In recent discussions it has often been suggested that idle RAM usage is a good measure of a system’s suitability for older machines and a benchmark for its responsiveness.
By “idle RAM usage” we mean the amount of memory consumed by the operating system itself and the preloaded components like the window manager or the desktop environment. Higher idle RAM usage means that there is less memory available for application programs.
Running `free -h` or `htop` before starting any other applications can give you some insight into this.
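As a minimal sketch of taking such a baseline right after login (output format varies between `free` versions and distributions):

```
# Memory totals in human-readable units
free -h

# The ten processes with the largest resident set size (RSS, in KiB)
ps -eo pid,comm,rss --sort=-rss | head -n 11
```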
On systems with very limited resources, like single board computers (SBCs) or very old hardware with, say, less than 2 GB of RAM, this is a valid consideration. For such use cases, you would probably not want to use desktop environments like GNOME or KDE, and you would try to minimize the number of little helper applications running in the background.
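One way to spot such background helpers, assuming a systemd-based distribution, is to list the currently running services:

```
# List all services that are currently active and running
systemctl list-units --type=service --state=running
```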
However, we should consider that many application programs, notably browsers, graphics editors and video renderers, can easily consume several times the memory of the bare operating system. A clever choice of applications, settings and tasks will do far more for the system’s overall responsiveness than the amount of memory used by the operating system.
When resources are not stretched to the limit, the amount of idle memory becomes irrelevant for a system’s performance; the contrary might even be true:
- Preloading frequently used applications into memory reduces their startup times, at the expense of idle RAM. The `preload` daemon is an optional userspace tool that observes which programs you run and loads their binaries and libraries ahead of time (see the sketch after this list).
- Storing applications’ temporary data in a dedicated RAM area (e.g. a tmpfs) instead of on the disk makes applications more responsive, at the expense of otherwise unused memory and some CPU.
- Actively monitoring running processes and re-prioritizing them (tech babble: “automatic renicing”) costs a bit of memory and CPU time, but often prevents applications from becoming unresponsive.
- And so on.
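A hedged sketch covering all three points: it assumes a Debian-like distribution with a `preload` package and service of that name, and the fstab sizing and the PID `12345` are placeholders, not prescriptions.

```
# Install and enable the preload daemon (package/service name may differ)
sudo apt install preload
sudo systemctl enable --now preload

# Keep /tmp in RAM by adding a tmpfs line like this to /etc/fstab
# tmpfs   /tmp   tmpfs   defaults,noatime,size=2G   0   0

# Manually lower a process's priority (tools like ananicy automate
# this continuously); a higher nice value means lower priority
renice -n 10 -p 12345
```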
In all these cases, higher idle memory and CPU usage is the price paid for better responsiveness, as long as we don’t push either to its limit.
When we discuss the virtues of different distributions or setups, there is never a single answer to the question of whether a specific optimization makes sense: it always depends on the environment and the use case.
It should also be said that installing more memory, replacing HDDs with SSDs and adding a dedicated graphics card probably have a far greater impact on the user experience than any software optimization, be it towards reducing the OS’s memory footprint in favour of applications or towards increasing it in favour of quicker access.
That is why most distributions’ developers try to strike a balance between austerity and optimization. The big all-purpose distributions aim to provide solutions that work reasonably well on practically all platforms whilst still providing enough features to make your work pleasant and easy; others cater for more specific use cases.