From my understanding this is an official statement, not a benchmark result.
> The change isn't about the core operating system becoming resource-hungry. Instead, it reflects the way people use computers today—multiple browser tabs, web apps, and multitasking workflows, all of which demand additional memory.
So it is more about the 3rd party software instead of OS or desktop environment. Actually, nowadays it's recommended to have 8+ GB of RAM, regardless of OS.
I just checked the memory usage on Ubuntu 24.04 LTS after closing all the browser tabs. It's about 2GB of 16GB total RAM. 26.04 LTS might have higher RAM usage but it seems unlikely that it will get anywhere close to 6GB.
4GB of RAM? What? I guess if your minimum is "able to start Windows and eventually reach the desktop", sure? I wouldn't even use Windows 11 with 8GB even though it would theoretically be okay.
> 4GB of RAM? What? I guess if your minimum is "able to start Windows and eventually reach the desktop", sure? I wouldn't even use Windows 11 with 8GB even though it would theoretically be okay.
Not okay as soon as you throw on the first security tool, lol.
I work in an enterprise environment with Win 11 where 16 GB is maxed out instantly as soon as you open the first browser tab, thanks to the background security scans and patch updates. This is even with compressed memory paging turned on.
I also believe this memory usage could be decreased significantly, but I don't know by how much (or how much of it would be worth it). Some RAM usage is genuinely useful, such as caching or things related to graphics. Some of it is cumulative bloat in applications, caused by developers not caring much or by duplicated libraries.
But I remember that in 2016, a decade ago, Fedora GNOME consumed about 1.6GB of RAM on my PC with 2GB of RAM. Considering that a decade later the standard Ubuntu GNOME consumes only 400MB more, and that my new laptop has 16GB of RAM (the system may use more RAM when more RAM is installed), I think the increase is not that bad for a decade. I thought it would be much worse.
But why that much? The first computer I bought had 192MB of RAM and I ran a 1600x1200 desktop with 24-bit color. When Windows 2000 came out, all of the transparency effects ran great. Office worked fine, Visual Studio too, plus 1024x768 gaming (I know that's quite a step down from 1080p).
What has changed? Why do I need 10x the RAM to open a handful of terminals and a text editor?
Partly because we have more layers of abstraction. Just an extreme example, when you open a tiny < 1KB HTML file on any modern browser the tab memory consumption will still be on the order of tens, if not hundreds of megabytes. This is because the browser has to load / initialize all its huge runtime environment (JS / DOM / CSS, graphics, etc) even though that tiny HTML file might use a tiny fraction of the browser features.
Partly because increased RAM usage can sometimes improve execution speed / smoothness or security (caching, browser tab isolation).
Partly because developers have less pressure to optimize software performance, so they optimize other things, such as development time.
I remember running Xubuntu (XFCE) and Lubuntu (LXDE, before LXQt) on a laptop with 4 GB of RAM and it was a pretty pleasant experience! My guess is that the desktop environment is the culprit for most modern distros!
Well, to start, you likely have two screen-sized buffers for the current and next frame. The primary code portion is drivers, since the modern expectation is that you can plug in pretty much anything and have it work automatically.
No, because as far as we know 26.04 won't enable zswap or zram, whereas Windows and macOS both ship memory compression of some sort. So Ubuntu will use significantly more memory for most tasks when facing memory pressure.
Apparently it's still in discussion but it's April now so seems unlikely.
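For what it's worth, on distros that ship systemd's zram-generator, turning on a compressed swap device is a small config file. A minimal sketch; the package name and defaults vary by distro (Ubuntu also offers a `zram-config` package), so treat the values as illustrative:

```ini
# /etc/systemd/zram-generator.conf -- sketch, assuming the zram-generator
# package is installed; systemd creates /dev/zram0 as swap on next boot.
[zram0]
# Compressed swap sized at half of physical RAM
zram-size = ram / 2
# zstd is a common speed/ratio tradeoff for swap pages
compression-algorithm = zstd
```

After a reboot (or `systemctl start systemd-zram-setup@zram0`), `swapon --show` should list the zram device.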
Kind of weird how controversial it is considering DOS had QEMM386 way back in 1987.
It's not just the applications, the installer doesn't even start up with 1GiB of memory. With 2GiB of memory it does start up. You could (well, I would :) ) blame it on the Gnome desktop, but it is very different from what I would have expected.
I just tested this with 25.10 desktop, default gnome. With 24.04 LTS it doesn't even start up with 2GiB.
I hear from a lot of Linux users who found the GTK 2 era on X11 pretty close to perfect. I know I ran Ubuntu back then and after boot it used far less than 1GB. The desktop experience was perhaps even slightly more polished than what we have today. Not much has fundamentally changed except the bloat and a regression in UX where they started chasing fads.
I suppose the biggest change in RAM usage is Electron and the bloated world of text editors and other simple apps written in it.
Just stick XFCE on a modern minimal-ish distribution (meaning not Ubuntu, mainly) and you'll have this with modern compatibility. Debian and Fedora are both good options. If you want something more minimal as your XFCE base, there are other options too.
XFCE is saddled with its GTK requirement, and GTK gets worse with every version. Even though XFCE is still on GTK3, that's a big downgrade from GTK2 because it forces you to run Wayland if you don't want your GUI frame rate arbitrarily capped at 60 fps.
For people wanting the old-fashioned fast and simple GUI experience, I recommend LXQt.
It makes it easier to treat the computer as part of your own body, allowing operation without conscious thought, as you would a pencil or similar hand tool.
Outside of gaming, not much. However, now that I'm used to a 144Hz main monitor, there is no world where I would go back. You just feel the difference.
So basically, it's of no use if you haven't tasted 120+Hz displays. And don't, because once you do, you won't go back.
I have a 165hz display that I use at 60hz. Running it at max speed while all I'm doing is writing code or browsing the web feels like a waste of electricity, and might even be bad for the display's longevity.
But for gaming, it really is hard to go back to 60.
> What use is there in display frame rates above 60 fps?
On a CRT monitor the difference between running at 60 Hz and even a just slightly better 72 Hz was night and day: unbearable flickering vs a much better experience. I remember having some little utility for Windows that'd allow the refresh rate to be 75 (not 72 but 75). Under Linux I was writing modelines myself (those were the days!) to get the refresh rate and screen size (in pixels) I liked: I was running "weird" resolutions like 832x604 @ 75 Hz instead of 800x600 @ 60 Hz, just to gain a little more screen real estate and a better refresh rate.
Now that monitors use flat panels, I sure as heck have no idea if 60 fps vs 120 fps or whatever changes anything for "desktop" usage. I don't think the CRT problem of the image fading too quickly at 60 Hz is still present. But I'm not sure about it.
The whole Linux stack got bigger though; just look at what you need now to compile stuff: cmake, meson/ninja, mesa, LLVM and so forth. GTK2 was great; GTK is now a GNOME-only toolkit, controlled by one main corporation. systemd increased the bloat factor too, and it also gathers age data of users now (https://github.com/systemd/systemd/pull/40954).
I guess one of the few smaller things would be wayland, but this has so few features that you have to wonder why it is even used.
I've been using cmake since the early 2000s when I was hacking on the VTK/ITK toolkits. Compiling a C++ program hasn't gotten any better or worse. FWIW, I always used the curses interface for it.
It's not FUD; the full name, email, and the rest weren't mandated by anyone but Meta and other corporations, which are lobbying for it so they can earn money from users' preferences. Take your spyware somewhere else.
If Meta's business model is not lucrative, that's not my problem.
>which are lobbying for it so they can earn money with users' preferences
Given it's a field where you can put absolutely anything in (and probably randomize, if you want), how is this different than the situation today, where random sites ask you for your birthday (also unverified)? Moreover Meta already has your birthday. It's already mandated for account creation, so claims of "so they can earn money with users' preferences" don't make any sense.
Yep. I still develop Gtk2 applications today. It's a very snappy and low resource usage toolkit aimed entirely at desktop computers. None of that "mobile" convergence. I suppose you could put Gtk2 applications into containers of some sort but since Gtk2 has (luckily) been left alone by GNOME for decades it's a stable target (like NES or N64 is a stable target) and there's no need for it.
Most of the bloat these days is from containers, and Canonical's approach to Ubuntu since ~2014 has been very heavy on using upstream containers so they don't have to actually support their software ecosystem themselves. This has led to severe bloat and bad graphical theming and file system access.
First, it sounds like this 6GB requirement is more of a suggestion/recommendation than a hard requirement. I'm also curious whether it actually actively uses all 6GB. From my own usage of Linux over the years, the OS itself isn't using that much RAM; the applications are, and that almost always means the browser.
Secondly, I haven't used Ubuntu desktop in years, so I have no real idea if this is something specific to them. But I do use Fedora, so I would imagine the memory footprint can't be too different. While I could easily get away with <8GB of RAM, you really don't want to if you're going to be doing anything heavier than web browsing or editing documents: dev work, CAD, design, etc. But this isn't unique to Linux.
Ubuntu just raised the minimum RAM requirement from 4gb to 6. While it might have been possible to run anything with a GUI on 4, I can't imagine that is a good experience.
When they turned Centos into streams, I cut my workstation over to Ubuntu. It has been a reasonable replacement. Only real issues were when dual booting Win10 horked my grub and snap being unable to sort itself on occasion. When they release 26 as an LTS, I'm planning to update. You are spot on - the desktop itself is reasonably lean. 100+ tabs in Firefox... less so. Mind you, the amount of RAM in the workstations I'm using could buy a used car these days.
I don't really get it. I have run fleets of thousands of devices running Chrome in a container on Ubuntu server, and it's a nice experience. It took a lot to make it nice, but once it was there it was rock solid. This was with 1GB of RAM on a Pi 3. When we swapped to the Pi 4, we just had thousands of gigabytes of RAM and thousands of CPU cores unused.
I happened to install Fedora Silverblue on a computer a few days ago and looked quickly at the memory usage after boot: it was about 6GB! I usually run Alpine or FreeBSD, so I thought: great, that thing consumes 10x the RAM.
I believe Fedora and Ubuntu use about the same set of technologies: systemd, wayland, Gnome, etc. so it is about the same.
Apart from working out of the box I do not really know what those distros have and I don't.
I just have to admit managing network interfaces is really easy in Gnome.
With the skyrocketing price of RAM this might finally be the year of the Linux desktop. But it is not gonna be Gnome I guess.
This is garbage writing. Linux’s advantages are numerous and growing. Ubuntu ≠ Linux. WRT RAM requirements, Win 11’s 4GB requirement isn’t viable for daily use and won’t represent any practical machine configuration that has the requisite TPM 2 module. On the other side, the Linux ecosystem offers a wide variety of minimal distributions that can run on ancient hardware.
Maybe I’m just grouchy today but I would flag this content if sloppy MS PR was a valid reason.
Agree. I'm able to do development, run multiple containerized services (including Postgres, NATS, etc), have 10 browser tabs open, all on an 8 GiB laptop running Arch. I have a desktop with 64GiB as well but realized there is no point using it most of the time.
Win11 barely works with 4GB. Like, you can have a browser with YouTube open and that's it: 90%+ memory usage. I know because that is one of my media PCs (instead of a smart TV).
Can't move to Linux because it's Intel Atom and Intel P-state driver for that is borked, never fixed.
Today's browsers tend to be huge memory hogs too. Software's attitude of "there's always more memory" is coming back to bite them as prices of ram increase.
IMHO, browsers might prioritize execution speed somewhat over memory. There's a Pareto-style tradeoff: it's unlikely you can optimize every parameter at once; if you optimize one, you are likely to sacrifice others. Also, memory consumption (unlike CPU) doesn't hurt power efficiency that much, so more memory might even help there by reducing CPU usage through caching.
Since the dawn of time, Microsoft has published the minimum system requirements needed to run Windows, not what you need to actually do something useful with it.
It says that Ubuntu increased the requirements not because of the OS itself but to give a better user experience when people have many browser tabs open. Then it compares to Windows, which has lower nominal requirements but higher requirements in practice to get a passable user experience.
> Canonical isn’t making 6GB memory a hard requirement for Ubuntu 26.04. It will still install on machines that fall below the minimum requirement, but users will have to deal with slower performance.
I think we have quite different definition of "minimum requirement", then.
God I miss OPENSTEP and CDE. They needed 16MB of RAM (yes, MB!) and together with a lightweight Firefox clone you'd have everything you need. Eye candy is nice to have, but not at this cost.
Windows 11's 4 GB minimum is dishonest. You cannot reasonably run it on that little, it is far too bloated at this point. Even LTSC benefits from 6 GB, and that is substantially cut-down compared to retail/enterprise.
I'd say Windows 11's real minimum is 8 GB in 2026, with the recommended being 16 GB.
PS - And even at 8 GB, it hits 100% usage and pages under moderate load or e.g. Windows Update running in the background.
It's Arch-based, with (IIRC) Hyprland as its "DE", so really not much memory, I'd guess.
My desktop runs Arch with Sway (so quite close), three monitors, and uses ~400MB ram after boot. Most of it are the framebuffers. All the rest is eaten by Firefox, rust-analyzer and qemu.
The article itself acknowledges that the headline is bullshit:
> The change isn't about the core operating system becoming resource-hungry. Instead, it reflects the way people use computers today—multiple browser tabs, web apps, and multitasking workflows
Basically the change reflects the fact that, at this level of analysis (how much RAM do I need in my consumer PC), the OS is irrelevant these days. If you use a web browser then that will dominate your resource requirements and there's nothing Linux can do about that.
With arch+hyprland I hit 5GiB for a zen browser instance with 15+ tabs and a kitty instance with 15+ windows across 5 tabs, with codex and vim running.
If RAM is a problem there are always alternatives. The impediment is always having to rethink your workflow or adopt someone else's opinions.
Trisquel 12 MATE (codenamed Ecne) with the Xanmod kernel to cover proprietary drivers: that's a more libre start than Ubuntu. If everything works with the libre kernel, you can toss the Xanmod kernel on the spot.
The article suggests that Xubuntu (which uses XFCE instead of GNOME) uses much less memory. I don't know how true that is, but it seems reasonable that XFCE uses somewhat less memory.
the problem here is more the programs you run on top of the OS (browser, electron apps, etc.)
realistically speaking you should budget at least 1GiB for your OS even if it's minimalist, and to avoid issues make it 2GiB of OS + some emergency buffer, caches, load spikes, etc.
and 2GiB for your browser :(
and 500MiB for misc apps (mail, music, etc.)
wait, we are already at 4.5 GiB and I still need OpenOffice...
even if XFCE would save 500 MiB it IMHO wouldn't matter (for the recommendation)
and sure, you can make it work: only have one tab open at a time, close the browser every time you don't need it, don't use Spotify or YT, etc.
but that isn't what people expect, so give them a recommendation which will work with what they expect; if someone tries to run it with less RAM it may work, but if it doesn't, at least it isn't your fault
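The back-of-envelope budget above can be sketched in a few lines; the figures are the rough numbers from this comment, not measurements:

```python
# Rough per-component RAM budget (GiB), using the ballpark figures
# from the comment above -- illustrative, not measured.
budget_gib = {
    "OS (incl. emergency buffer, caches, load spikes)": 2.0,
    "browser": 2.0,
    "misc apps (mail, music, ...)": 0.5,
}

total = sum(budget_gib.values())
print(f"total: {total} GiB")  # 4.5 GiB before an office suite is even opened
```

Which is why, even if a lighter DE shaves off 500 MiB, the recommendation barely moves.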
We expect XFCE to be much more efficient (it has more basic features), but is that the cause? Or are you just subtracting a big part from a higher baseline?
They compared Ubuntu's minimum recommended RAM to Windows' absolute minimum RAM requirement.
But Windows has monetary incentives (related to vendors) to say it supports 4GiB of RAM even if Windows runs very shitty on it; on the other hand, Ubuntu is incentivized to provide a more realistic minimum for comfortable usage.
I mean taking a step back all common modern browsers under common usage can easily use multiple GiB of memory and that is outside of the control of the OS vendor. (1)
As a consequence, IMHO recommending anything below 6 GiB is just irresponsible (iff a modern browser is used), _no matter what OS you use_.
---
(1): If there is no memory pressure (i.e. caches don't get evicted that fast, larger video buffers are used, no fast tab archiving, etc.), then having YT playing will likely consume around ~600-800 MiB. (Be aware that this is not just JS memory usage but the whole usage across JS, images, video, the HTML+CSS engine, etc. For comparison, webmail like Proton or Gmail is often roughly around 300MiB, Spotify interestingly "just" around 200MiB, and HN around 55MiB.)
I had a machine (an AMD 3700X with 32 GB of RAM and a fast NVMe SSD) on which I used to run Debian. Then about 2.5 years ago I bought a new one and gave my wife the 3700X: I figured she'd be more at ease with it, so I installed Ubuntu on it.
I couldn't understand why everything was that slow compared to Debian and didn't want to bother looking into it so...
After a few weeks: got rid of Ubuntu, installed her Debian. A simple "IceWM" WM (I use the tiling "Awesome WM" but that's too radical for my wife) and she loves it.
She basically manages her two SMEs entirely from a browser: Chromium or Firefox (but a fork of Firefox would do too).
It has worked so well for years now that for her latest hire she asked me to set up the new employee with the same config. So she's now got one employee on a Debian machine with the IceWM WM. The other machines are still on Windows, but the plan is to keep only one Windows machine (just in case) and move the others to Debian too.
Unattended upgrades, a trivial firewall (everything OUT allowed, nothing IN except related/established), and that's it.
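That kind of stateful "allow out, only established/related in" policy is a few lines of nftables. A sketch, assuming the common defaults (file location and chain names are the conventional ones, not taken from the comment):

```
# /etc/nftables.conf -- minimal stateful firewall sketch.
# Allow all outbound traffic; inbound only for loopback and
# replies to connections we initiated; drop everything else.
table inet filter {
    chain input {
        type filter hook input priority 0; policy drop;
        iif "lo" accept
        ct state established,related accept
        ct state invalid drop
    }
    chain forward {
        type filter hook forward priority 0; policy drop;
    }
    chain output {
        type filter hook output priority 0; policy accept;
    }
}
```

Load it with `nft -f /etc/nftables.conf` and enable the `nftables` service so it survives reboots.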
I had used ubuntu back in the day, and when I came back to linux a bit ago I immediately installed it again.
I don't remember all of my frustrations, but I remember having a lot of trouble with snap. Specifically, it really annoyed me that the default install of firefox was the snap version instead of native. I want that to be an opt-in kind of thing. I found that flatpak just worked better anyway.
I almost tried making the switch to arch, but I've been pretty happy running debian sid (unstable) since. The debian installer is just more friendly to me for getting encrypted drives and partitions set up how I want.
It's not for everyone, but I like the structured rolling updates of sid and having access to the debian ecosystem too much to switch to something else at this point.
I use sway with a radeon card for my primary and have a secondary nvidia card for games and AI stuff.
Maybe in some ways, yes. But there are distros out there that can run easily in as little as 1GB of RAM. And I've heard of people running them with far less.
I also remember hearing Ubuntu moved to default to Wayland, if true I have to wonder if defaulting to Wayland is part of the problem because Gnome / KDE on Wayland will use far more memory than FVWM / Fluxbox on X11.
FWIW, you can do a lot just from the console without a GUI w/Linux and any BSD, in that case the RAM usage will be tiny compared to Windows and Apple.
Not to mention that 'lower memory usage' is only one of many benefits and, at least before the prices went mad, hardly the most important one on the list.
> But there are distros out there that can run easily in as little as 1G RAM
It always makes me chuckle when I hear this. A default server (i.e. no GUI at all) installation of a RHEL derivative just outright dies silently with 1GB of RAM if there is no swap. Sure, with swap enabled it no longer dies, but to call the performance anywhere near acceptable is to lie to yourself.
RHEL is not the be-all end-all of minimalist linux, even sans GUI. Puppy Linux, with a full WM, is completely usable with a single gig of ram. That's obviously a different use-case from RHEL but the point stands.
3: Trisquel 12 Ecne exists. You might need Xanmod as a kernel with proprietary bits because of your hardware, but try blacklisting mei and mei_me first in some .conf file under /lib/modprobe.d. Value your privacy.
Trisquel Mate with zram-config and some small tweaks can work with 4GB of RAM even with a browser with dozens of Tabs, at least with UBlock Origin.
I was testing them on a HP laptop I bought for $200 with 4GB of RAM.
Windows, its default, used so much memory that there was not much left for apps.
Ubuntu used 500MB less than Windows in system monitor. I think it was still 1GB or more. It also appeared to run more slowly than it used to on older hardware.
Lubuntu used hundreds of MB less than Ubuntu. It could still run the same apps but had fewer features in the UI (e.g. search). It ran lightning fast with more simultaneous apps.
(Note: That laptop's Wifi card wouldn't work with any Linux using any technique I tried. Sadly, I had to ditch it.)
I also had Lubuntu on a 10+ year old Thinkpad with an i7 (2nd gen). It's been my daily machine for a long time. The newer, USB installers wouldn't work with it. While I can't recall the specifics, I finally found a way to load an Ubuntu-like interface or Ubuntu itself through the Lubuntu tech. It's now much slower but still lighter than default Ubuntu or Windows.
(Note: Lubuntu was much lighter and faster on a refurbished Dell laptop I tested it on, too.)
God blessed me recently by a person who outright gave me an Acer Nitro with a RTX and Windows. My next step is to figure out the safest way to dual boot Windows 11 and Linux for machine learning without destroying the existing filesystem or overshrinking it.
Consider a dedicated SSD for each OS. You should have a couple M2 slots in the laptop. What you can do is remove (or disable) the Windows SSD, install Linux on the second drive, and then add back the windows drive. Select the drive at startup you want to be in on boot and default the drive you want to spend most of your time in. I did that on my XPS and it was trouble free. Linux can mount your NTFS just fine, without having to consider it from a boot/grub perspective.
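Once both systems are installed, the Windows drive can be mounted from Linux with the in-kernel ntfs3 driver (mainline since 5.15). A sketch of an /etc/fstab entry; the device path and mount point are placeholders for illustration:

```
# /etc/fstab -- mount the Windows partition read/write via ntfs3.
# Placeholder device path: prefer a UUID=... entry taken from `blkid`.
# uid/gid make the files owned by the first regular user.
/dev/nvme1n1p3  /mnt/windows  ntfs3  defaults,noatime,uid=1000,gid=1000  0  0
```

One caveat worth knowing: disable Windows Fast Startup first, otherwise the NTFS volume is left in a hibernated, dirty state and Linux will refuse to mount it read/write.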
> Ubuntu used 500MB less than Windows in system monitor.
Those numbers mean nothing when compared across OSes. Depending on how they count shared memory and how aggressively they cache, they can look very different.
A realistic benchmark would be to open two large applications (e.g. Chrome + Firefox with YouTube and Facebook, to jack up the memory usage), switch between them, and see how the system responds when switching between tasks.
I imagine the choice of desktop environment has the most to do with RAM requirements on Linux.
Unrelated to this, despite Ubuntu’s popularity, I think it’s one of the worst distro choices out there, especially for including old kernels for essentially no discernible reason.
I wouldn’t go so far as defending Microslop but I do get tired of the Apple fanboys accusing Windows of being bloated and running poorly.
They seem to defend Apple’s 8GB machines by saying that Apple systems perform better than Windows with the same amount of RAM. This claim is entirely unsubstantiated.
Windows has a lot of problems but performance and memory efficiency is not one of them. We should recall that Microsoft actually reduced RAM usage and minimum requirements between windows 7 and 8 as they wanted to get into the tablet game, and Windows has remained efficient with memory since then as Microsoft wants Windows to come with cheap Chromebook-like hardware and other similar low-end systems.
MacOS handles memory pressure better than Linux imo (at least for interactive use cases)
I have seen MacOS overcommit up to 50% of memory and still have the system be responsive.
Yesterday I filled up my ram accidentally on Fedora and even earlyoom took several minutes to trigger and in the meantime the system was essentially non-responsive
From my experience it does not help much, and I still get occasional freezes when a program misbehaves on Linux. It’s not a huge problem, but it is a problem and it exists; I have been dealing with it for about 15 years with no significant improvement.
Best kernel recommendations? Mainly for extremely long running (2+ years) SaaS applications. Stability overall. Is running a handful of docker containers and some binaries.
Linux LTS if stability is the most important thing. Realistically, the "normal" Linux kernel works for it too; I'm not sure you have to care as deeply about which kernel you use as you seem to.
On my desktop I use linux-cachyos-bore-lto, which seems to give me a slight performance boost in compilation times compared to the regular kernel, but I've had at least one crash that I've been unable to attribute to any other specific issue, so it could be the kernel, I suppose. I wouldn't use it on a server, nonetheless.
For desktop use, I find that being on the latest stable kernels is best because you get things like recent AMD graphics drivers and support for recent hardware and laptops. I’m in an arch-based distro and my kernel updates all the time. I’ve never had an issue. The stability benefits of LTS seem completely useless in comparison. Just my opinion though.
If you’re running applications as in a server that’s an entirely different discussion. I have been assuming we are talking about desktop users who are not serving anything.
E.g., if I go out and buy a 2026 Panther Lake laptop with a new WiFi 7 chip or what have you, I’m going to want a distro with the latest kernels so that I don’t have hardware issues. If I install the default Ubuntu download it’s going to almost certainly have problems.
I am not an Apple fan, so I'll just tell my story. And yeah, it may be biased, and I may not understand something important.
So around ~3 years ago or so I bought a lightweight low-end laptop (Intel Core i3, 14 inch display, 8GB of RAM) for everyday stuff so I could easily bring it with me everywhere I need to go (I mean, everywhere I would need it). It came with Windows 11 pre-installed. Now, for you to understand, previously, like ~10 years ago or so I had a Windows 7 system and it was pretty neat. And I remembered when people were switching from Windows 7 to Windows 8 or 10, they blamed the new OS version just like right now the Windows 11 was blamed; yet everyone got used to it, it received some fixes, improvements, etc; so I thought "well, maybe Windows 11 is not so bad, I should try it out at least just for the sake of curiosity".
And now, the clean installation of Windows 11 that came with it was taking like ~20 seconds to fully boot up to the login window. I know my laptop is not the best of the best, but still... After startup, with no apps opened, there was like ~4 GB of RAM usage just out of nowhere; so effectively I was limited to ~4GB of RAM to run whatever I wanted. The Bluetooth drivers were terrible (at the time): sometimes I was able to connect to my headphones and sometimes I wasn't, while they worked with all of my other devices perfectly. Then there was also this hellish "Antimalware Service Executable". I know how it sounds; I have nothing against anti-virus software, but when it randomly shows up several times per day, eats all of your processing power (like ~80% of CPU usage, and note that I have 8 cores at ~3 GHz here), and heats up your laptop to the point that the fan starts screaming... that was not very good, to put it mildly. Battery life was also a disappointment: sometimes it couldn't last even 3 hours, while the heaviest thing I was doing during that time was compiling some software.
I was trying, I was re-configuring, I was applying patches... and finally I got fed up with all of this bloat, broken updates and other garbage. So I just backed up all of my important files and data to external drive and installed Linux Mint (because in this particular case I just needed working laptop). And wow, it just worked! Now at startup I get like ~1 GB RAM usage at most (this actually depends on the DE I use, so numbers could be different from time to time), battery life improved, no more weird Bluetooth issues, no more random bloatware... it just works, and that's it.
I know that distros like Mint are focused on stability and efficiency, so maybe the comparison is a bit unfair. But hell, even while I don't have anything against Windows 7 or Windows 8, the recent Windows 11 is a real combination of bloatware and spyware. So performance and memory efficiency is, actually, the problem here. Or at least it was a problem last time I tried it.
Now, again, I may be wrong somewhere, maybe I missed something out. If I did - please point it out.
I just logged in to my MacBook Air M2 (24GB RAM) with no programs open and it’s reserving 8.3GB of RAM and using 500MB of swap.
My Framework laptop running CachyOS with KDE Plasma with nothing open except System Monitor reserves 4GB with 500MB in swap (I enabled swap for sleep to hibernate, normally there’s no swap).
Reserving RAM doesn’t mean there’s a performance problem.
Most of the things you’re talking about in your comment have nothing to do with RAM usage and memory efficiency. You’re complaining about some annoying preinstalled OEM software [1], bad drivers, fan noise, battery life, and windows updates. That stuff isn’t great but a lot of it doesn’t have anything to do with Windows RAM efficiency itself.
If you download the Windows ISO from Microsoft and clean install you’ll have a pretty nice experience. I think Microsoft needs to crack down on OEM software additions.
As far as slow boot up times/slow initial setup I’ll remind you that Macs also have that as an issue during first boot and spend a lot of time doing initial indexing.
Linux mint is a great distro and I also prefer Linux to both Mac and Windows as well. Mostly my commentary is on the subject of people claiming Microsoft Windows is bad with RAM when we now see some Linux distros asking for more RAM than Windows. I think it’s quite clear that RAM isn’t the problem with Windows, it’s a lot of other things and the surrounding ecosystem.
[1] I have to assume you’re talking about some third party antimalware program because the Microsoft one absolutely does not behave how you describe.
> Reserving RAM doesn’t mean there’s a performance problem.
It does in my own experience (so it may not be a problem for you, I agree, but it is a problem for me). Because when the OS allocates ~50% of RAM for itself and won't let it go, other software simply can't use it. Therefore, you're limited: your potential performance is capped at a certain level just because your OS decided to allocate half or more of your system RAM. Why? Well, just because it wants to.
> have nothing to do with RAM usage or performance
Well, to be honest, most of them don't. But would you please explain, then, why it takes around 20 seconds just to boot up, while the aforementioned Linux Mint (and I'll clarify that it's currently 22.3 for me, the latest version; it was 22.1 at the time, as far as I remember) takes only around ~3-4 seconds to get me to the login screen and then another second (at most) to load everything after I have logged in? Could you also, please, explain how it happens that even GNOME's Nautilus file manager takes less RAM and far less CPU than Microsoft's Explorer (and I won't even mention Thunar, that's kind of unfair)? What about the "Start" menu in Windows, which spiked CPU usage just from opening/closing? There are a lot of performance issues, with both RAM and CPU usage.
I'm not saying that these problems are unique to Windows, no; but saying that Windows doesn't have any performance issues is not really true.
> I think it’s quite clear that RAM isn’t the problem with Windows, it’s a lot of other things and the surrounding ecosystem.
I agree with you here. That's true. A large part of the problem comes not from the actual operating system, but from the application software. I once thought that, well, if the RAM shortage lasts longer than just one or two years, that will be bad, but also, maybe - just maybe - some software developers will start to think at least a bit more about optimization...
> [1] I have to assume you’re talking about some third party antimalware program because the Microsoft one absolutely does not behave how you describe.
Editing without specifying that you have edited your reply is not very good, you know. But okay.
Actually, I'm talking about the Windows-shipped Microsoft Defender process (at least it seems to come from Microsoft Defender). I had not seen anything third-party installed on my laptop at the time, and it actually behaved just like I described. I should also remind you that it is a low-end laptop with just an Intel Core i3-N305; it's not the most powerful CPU in the world - just 8 cores, 8 threads and a 3.80 GHz max boost frequency.
If you think that I'm lying, then just search for "antimalware executable high CPU usage" in any search engine. You will find plenty of complaints and even some guides on how to deal with it.
> I do get tired of the Apple fanboys accusing Windows of being bloated and running poorly.
> Windows has a lot of problems but performance and memory efficiency is not one of them.
I can't even describe how much your experience differs from mine. I would never have imagined someone uttering such a sentence about Windows in this day and age.
For everyone else reading this, a couple of pieces of advice I have gotten that made me suffer less with Windows: replace Windows search with Everything (by Voidtools) and replace Explorer with Filepilot (filepilot.tech).
Maybe if FOSS was less focused on reverse engineering proprietary technology they could make products people LIKE. I say this as someone who learned about firmware because of several listeners and one group having the aim of reverse engineering my new Apple ecosystem that is now falling apart after signal traps. My crime was working for an ISP and the media, but I reported on Scienos not techbros. Yawn.
I knew they were fucking with my virtual memory cause theirs sucks, the partition schemes on this Mac mini were ridiculous and the helpers weren’t stealing my information.
Given that efficiency is one of Linux's most touted advantages, what in the world is Ubuntu's PR department thinking? Ubuntu isn't providing any more functionality than when its memory requirement was 4GB. What is hogging all that extra RAM?
> what in the world is Ubuntu's PR department thinking?
The same as any other corporate PR department: "At least now when people run it with N GB of RAM, we can just point to the system requirements and say 'This is what we support' rather than end up in a back-and-forth"
If you expect them to have any sort of long-term outlook on "Let's be careful with how developers view our organization", I think you're about a decade too late for Canonical.
No official reason given, so all the tech press is basically speculating (if someone finds a source that does a teardown, please share; I can't seem to locate one). I think my favorite piece of speculation is that it reflects an anticipated modern workload of using the OS as a vector to launch a web browser and open multiple tabs in it, which is just going to be a memory hog as experienced by most Ubuntu users.
Besides the correct answer that Canonical sucks, I would argue that “efficiency” is not a selling point to get someone to use a desktop operating system.
Mainstream users and business organizations don’t really understand that concept and would prefer to see how the operating system enables their use cases and workflows.
Apple under-speccing their machines like they’ve been doing since the dawn of time is not some kind of indicator of any trends. You can buy $350 PC laptops that come with 12GB of RAM (example: https://www.staples.com/asus-vivobook-x1404-14-laptop-intel-...)
RAM shortages will be quite temporary. Making predictions based on individual component shortages has never been a winning strategy in the history of the industry. Next you’ll tell me that graphics cards will be impossible to get because of blockchain.
From my understanding this is an official statement, not a benchmark result.
> The change isn't about the core operating system becoming resource-hungry. Instead, it reflects the way people use computers today—multiple browser tabs, web apps, and multitasking workflows, all of which demand additional memory.
So it is more about the 3rd party software instead of OS or desktop environment. Actually, nowadays it's recommended to have 8+ GB of RAM, regardless of OS.
I just checked the memory usage on Ubuntu 24.04 LTS after closing all the browser tabs. It's about 2GB of 16GB total RAM. 26.04 LTS might have higher RAM usage but it seems unlikely that it will get anywhere close to 6GB.
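For anyone who wants to reproduce that kind of check, here's a minimal sketch that reads /proc/meminfo on Linux (the same numbers `free -h` shows; assumes a kernel new enough to expose MemAvailable):

```python
# Rough equivalent of `free -h`: parse /proc/meminfo (values are in kB).
def meminfo():
    info = {}
    with open("/proc/meminfo") as f:
        for line in f:
            key, rest = line.split(":", 1)
            info[key] = int(rest.split()[0])  # strip the "kB" suffix
    return info

m = meminfo()
used_gib = (m["MemTotal"] - m["MemAvailable"]) / 2**20
print(f"used {used_gib:.1f} GiB of {m['MemTotal'] / 2**20:.1f} GiB")
```

MemAvailable is the kernel's estimate of what's actually claimable, so "used" here is lower than naive total-minus-free arithmetic would suggest.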
Also, the Windows 11 requirements are ludicrous.
https://www.microsoft.com/en-us/windows/windows-11-specifica...
4GB of RAM? What? I guess if your minimum is "able to start Windows and eventually reach the desktop", sure? I wouldn't even use Windows 11 with 8GB even though it would theoretically be okay.
> 4GB of RAM? What? I guess if your minimum is "able to start Windows and eventually reach the desktop", sure? I wouldn't even use Windows 11 with 8GB even though it would theoretically be okay.
Not okay as soon as you throw on the first security tool, lol.
I work in an enterprise environment with Win 11 where 16 GB is capped out instantly as soon as you open the first browser tab thanks to the background security scans and patch updates. This is even with compressed memory paging being turned on.
Win11 IoT runs great on 4GB if that matters :) I have a few machines in the field running it and my Java app, still over a gig free usually.
If you run Windows 11 with Microsoft Teams and Microsoft Outlook on a 4GB machine you're gonna have a bad day.
I know 2GB isn't very heavy in OS terms these days, but it's still enough to hold nearly 350 uncompressed 1080p 24-bit images.
There's rather a lot of information in a single uncompressed 1080p image. I can't help but wonder what it all gets used for.
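The arithmetic checks out, for what it's worth:

```python
# How many uncompressed 1080p 24-bit frames fit in 2 GiB?
frame_bytes = 1920 * 1080 * 3            # 3 bytes per pixel at 24-bit color
frames = (2 * 2**30) // frame_bytes
print(frame_bytes, frames)               # ~6.2 MB per frame, 345 frames
```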
I also believe that this memory usage might be decreased significantly, but I don't know by how much (and how much of it is worth keeping). Some RAM usage might be useful, such as caching or things related to graphics. Some is cumulative bloat in applications caused by not caring much, or by duplication of libraries.
But I remember that in 2016, a decade ago, Fedora GNOME consumed about 1.6GB of RAM on my PC with 2GB of RAM. Considering that after a decade the standard Ubuntu GNOME consumes only 400MB more RAM, and also that my new laptop has 16GB of RAM (the system might use more RAM when more RAM is installed), I think the increase is not that bad for a decade. I thought it would be much worse.
But why that much? The first computer I bought had 192MB of RAM and I ran a 1600x1200 desktop with 24-bit color. When Windows 2000 came out, all of the transparency effects ran great. Office worked fine, Visual Studio, 1024x768 gaming (I know that's quite a step down from 1080p).
What has changed? Why do I need 10x the RAM to open a handful of terminals and a text editor?
Partly because we have more layers of abstraction. Just an extreme example, when you open a tiny < 1KB HTML file on any modern browser the tab memory consumption will still be on the order of tens, if not hundreds of megabytes. This is because the browser has to load / initialize all its huge runtime environment (JS / DOM / CSS, graphics, etc) even though that tiny HTML file might use a tiny fraction of the browser features.
Partly because increased RAM usage can sometimes improve execution speed / smoothness or security (caching, browser tab isolation).
Partly because developers have less pressure to optimize software performance, so they optimize other things, such as development time.
Here is an article about bloat: https://waspdev.com/articles/2025-11-04/some-software-bloat-...
Higher res icons probably add a couple hundred megs alone
I remember running Xubuntu (XFCE) and Lubuntu (LXDE, before LXQt) on a laptop with 4 GB of RAM and it was a pretty pleasant experience! My guess is that the desktop environment is the culprit for most modern distros!
GNOME 50 and its auxiliary services on my machine use maybe 400MB.
The culprit is browsers, mostly.
Well, to start, you likely have two screen-sized buffers for the current and next frame. The primary code portion is drivers, since the modern expectation is that you can plug in pretty much anything and have it work automatically.
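For scale, two full-screen buffers for a single 1080p output is actually small change (a back-of-the-envelope sketch; real compositors keep additional buffers per window and per output):

```python
# Two 32-bit framebuffers (current + next frame) for one 1080p display.
buffer_bytes = 1920 * 1080 * 4           # e.g. BGRA, 4 bytes per pixel
total = 2 * buffer_bytes
print(f"{total / 2**20:.1f} MiB")        # ~15.8 MiB
```

So the framebuffers themselves explain tens of megabytes, not gigabytes; the rest is drivers, caches, and application code.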
How often do you plug in a new device without a flurry of disk activity occurring?
No because as far as we know 26.04 won't enable zswap or zram whereas Windows and MacOS both have memory compression technology of some sort. So Ubuntu will use significantly more memory for most tasks when facing memory pressure.
Apparently it's still in discussion but it's April now so seems unlikely.
Kind of weird how controversial it is considering DOS had QEMM386 way back in 1987.
Zswap is a no brainer. I have to wonder why the hesitancy.
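For anyone who wants it today without waiting on Canonical, a sketch of both options (zswap via a kernel parameter; zram via the zram-generator package, key names per zram-generator.conf(5)):

```
# zswap: add "zswap.enabled=1 zswap.compressor=zstd" to the kernel
# command line (e.g. GRUB_CMDLINE_LINUX in /etc/default/grub), then
# regenerate the grub config and reboot.

# zram: a minimal /etc/systemd/zram-generator.conf
[zram0]
zram-size = ram / 2
compression-algorithm = zstd
```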
QEMM386 for DOS did not have a memory compression feature. Only one of the later versions for Windows 3.1 did.
It's not just the applications, the installer doesn't even start up with 1GiB of memory. With 2GiB of memory it does start up. You could (well, I would :) ) blame it on the Gnome desktop, but it is very different from what I would have expected.
I just tested this with 25.10 desktop, default gnome. With 24.04 LTS it doesn't even start up with 2GiB.
I hear a lot from Linux users who found the GTK 2 era on X11 pretty close to perfect. I know I ran Ubuntu back then and after boot it used far less than 1GB. The desktop experience was perhaps even slightly more polished than what we have today. Not much has fundamentally changed except the bloat and a regression in UX when they started chasing fads.
I suppose the biggest change in RAM usage is Electron and the bloated world of text editors and other simple apps written in Electron.
Just stick XFCE on a modern minimal-ish (meaning not Ubuntu, mainly) distribution and you'll have this with modern compatibility. Debian and Fedora are both good options. If you want something more minimal as your XFCE base, there are other options too.
XFCE is saddled with its GTK requirement, and GTK gets worse with every version. Even though XFCE is still on GTK3, that's a big downgrade from GTK2 because it forces you to run Wayland if you don't want your GUI frame rate arbitrarily capped at 60 fps.
For people wanting the old-fashioned fast and simple GUI experience, I recommend LXQt.
What use is there in display frame rates above 60 fps?
It makes it easier to treat the computer as part of your own body, allowing operation without conscious thought, as you would a pencil or similar hand tool.
Outside of gaming, not much. However, now that I'm used to a 144Hz main monitor, there is no world where I would get back. You just feel the difference.
So basically, no use if you've never tasted a 120+Hz display. And don't, because once you do, you won't go back.
I have a 165hz display that I use at 60hz. Running it at max speed while all I'm doing is writing code or browsing the web feels like a waste of electricity, and might even be bad for the display's longevity.
But for gaming, it really is hard to go back to 60.
> What use is there in display frame rates above 60 fps?
On a CRT monitor the difference between running at 60 Hz and even a just slightly better 72 Hz was night and day. Unbearable flickering vs a much better experience. I remember having some little utility for Windows that'd allow the display rate to be 75 (not 72 but 75). Under Linux I was writing modelines myself (those were the days!) to get the refresh rate and screen size (in pixels) I liked: I was running "weird" resolutions like 832x604 @ 75 Hz instead of 800x600 @ 60 Hz, just to gain a little bit more screen real estate and a better refresh rate.
Now that monitors are flat panels: I sure as heck have no idea if 60 fps vs 120 fps or whatever changes anything for "desktop" usage. I don't think the problem of the image fading too quickly at 60 Hz that CRTs had is still present. But I'm not sure about it.
I, for one, lose track of the mouse way less often at 165Hz.
I lose track of the mouse less often at 1024x768!
MXLinux is really great for something like xfce and I really loved the snapshotting feature of it too. Highly recommended.
You spelled Debian wrong.
I used gtk2, it was ok, but I preferred Ubuntu's Unity interface when it came out.
Gnome 3 seems similar to Unity nowadays, and it is pretty good.
I find it much easier to use than Windows or Mac, which is credit to the engineers who work on it.
It's always the browser; each tab is at least 100MB, and Electron is also a browser. GTK or whatever is nothing next to the browser.
The whole linux stack got bigger though - just look at what you need now to compile stuff, cmake, meson/ninja, mesa, llvm and so forth. gtk2 was great; GTK is now a GNOMEy-toolkit only, controlled by one main corporation. Systemd increased the bloat factor too - and also gathers age data of users now (https://github.com/systemd/systemd/pull/40954).
I guess one of the few smaller things would be wayland, but this has so few features that you have to wonder why it is even used.
>The whole linux stack got bigger though - just look at what you need now to compile stuff, cmake, meson/ninja, mesa, llvm and so forth
Those are all development tools. Has the runtime overhead grown proportionally, and what accounts for the extra weight?
I've been using cmake since the early 2000s when I was hacking on the vtk/itk toolkit. Compiling a C++ program hasn't gotten any better/worse. FWIW, I always used the curses interface for it.
Is the option of legal compliance a bad thing? They have corporate customers.
If there's no opt-out, that's a different story.
It's plain FUD. systemd always had fields for the full name, email address and location. They were optional, just like the date of birth. Bad systemd!
It's not FUD; the full name, email and the rest were not mandated by Meta and other corporations, which are lobbying for this one so they can earn money from users' preferences. Take your spyware somewhere else.
If Meta's business model is not lucrative, that's not my problem.
>which are lobbying for it so they can earn money with users' preferences
Given it's a field where you can put absolutely anything in (and probably randomize, if you want), how is this different than the situation today, where random sites ask you for your birthday (also unverified)? Moreover Meta already has your birthday. It's already mandated for account creation, so claims of "so they can earn money with users' preferences" don't make any sense.
Keep gaslighting:
https://www.theregister.com/2026/03/24/foss_age_verification...
Good luck when most libre users toss RH/Debian because of this and embrace GNU.
Yep. I still develop Gtk2 applications today. It's a very snappy and low resource usage toolkit aimed entirely at desktop computers. None of that "mobile" convergence. I suppose you could put Gtk2 applications into containers of some sort but since Gtk2 has (luckily) been left alone by GNOME for decades it's a stable target (like NES or N64 is a stable target) and there's no need for it.
Most of the bloat these days is from containers, and Canonical's approach to Ubuntu since ~2014 has been very heavy on using upstream containers so they don't have to actually support their software ecosystem themselves. This has led to severe bloat and bad graphical theming and file system access.
Can you point us to some of these gtk2 applications that you’ve been writing recently?
Two things
First, it sounds like this 6GB requirement is more like a suggestion/recommendation than a hard requirement. I'm also curious whether it actually actively uses all 6GB. From my own usage of Linux over the years, the OS itself isn't using that much RAM, but the applications are - which almost always means the browser.
Secondly, I haven't used Ubuntu desktop in years, so I have no real idea if this is something specific to them. But I do use Fedora, and I would imagine the memory footprint can't be too different. While I could easily get away with <8GB of RAM, you really kind of don't want to if you're going to be doing anything heavier than web browsing or editing documents - dev work, CAD, design, etc. But this isn't unique to Linux.
Ubuntu just raised the minimum RAM requirement from 4GB to 6GB. While it might have been possible to run anything with a GUI on 4GB, I can't imagine that is a good experience.
When they turned Centos into streams, I cut my workstation over to Ubuntu. It has been a reasonable replacement. Only real issues were when dual booting Win10 horked my grub and snap being unable to sort itself on occasion. When they release 26 as an LTS, I'm planning to update. You are spot on - the desktop itself is reasonably lean. 100+ tabs in Firefox... less so. Mind you, the amount of RAM in the workstations I'm using could buy a used car these days.
I don't really get it. I have run fleets of thousands of devices running Chrome in a container on Ubuntu server, and it's a nice experience. It took a lot to make it nice, but once it was there it was rock solid. This was with 1GB of RAM on a Pi 3. When we swapped to the Pi 4, we just had thousands of gigabytes of RAM and thousands of CPU cores unused.
Does Firefox really not unload the tabs in that case?
It does. You can also do it by hand via the right-click on tab menu
I happened to install Fedora Silverblue on a computer a few days ago and looked quickly at the memory usage after boot: it was about 6GB! I usually run Alpine or FreeBSD, so I thought: great, that thing consumes 10x the RAM.
I believe Fedora and Ubuntu use about the same set of technologies: systemd, wayland, Gnome, etc. so it is about the same.
Apart from working out of the box, I do not really know what those distros have that I don't. I just have to admit managing network interfaces is really easy in GNOME.
With the skyrocketing price of RAM this might finally be the year of the Linux desktop. But it is not gonna be Gnome I guess.
> Linux's advantage is slowly shrinking
This is garbage writing. Linux’s advantages are numerous and growing. Ubuntu ≠ Linux. WRT RAM requirements, Win 11’s 4GB requirement isn’t viable for daily use and won’t represent any practical machine configuration that has the requisite TPM 2 module. On the other side, the Linux ecosystem offers a wide variety of minimal distributions that can run on ancient hardware.
Maybe I’m just grouchy today but I would flag this content if sloppy MS PR was a valid reason.
FWIW I find even KDE plasma on wayland perfectly viable on a 4 GiB budget notebook. Windows runs horribly on the same hardware.
Agree. I'm able to do development, run multiple containerized services (including Postgres, NATS, etc), have 10 browser tabs open, all on an 8 GiB laptop running Arch. I have a desktop with 64GiB as well but realized there is no point using it most of the time.
I agree. And even on Ubuntu, the performance vs same specs on Windows is ridiculously better.
Apps are still a huge gap on Linux, but as an OS, I choose it every time over Windows and MacOS.
Win11 barely works with 4GB. Like, you can have a browser with YouTube on and that's it, 90%+ memory usage. I know because that is one of my media PCs (instead of a smart TV).
Can't move to Linux because it's Intel Atom and Intel P-state driver for that is borked, never fixed.
Today's browsers tend to be huge memory hogs too. Software's attitude of "there's always more memory" is coming back to bite them as prices of ram increase.
IMHO, browsers might prioritize execution speed somewhat more than memory. There's the Pareto tradeoff principle: you're unlikely to optimize all the parameters at once - if you optimize one, you're likely to sacrifice others. Also, more memory consumption (unlike CPU) doesn't hurt power efficiency that much, so more memory might even help with it by reducing CPU usage through caching.
TBH your comments come off as either very misleading or just uneducated on the nature of performance. Troubling indeed.
Can you enlighten me why it's misleading or uneducated?
For comparison, here are the official hardware recommendations for Debian: https://www.debian.org/releases/stable/amd64/ch03s04.en.html
"With Desktop" has 1GB minimum and 2GB recommended - along with Pentium 4, 1GHz cpu.
Since the dawn of time, Microsoft has published the minimum system requirements needed to run Windows, not what you need to actually do something useful with it.
The framing of the article is very odd.
It says that Ubuntu increased the requirements not because of the OS itself but to have a better user experience when people have many browser tabs open. Then it compares to Windows, which has lower nominal requirements but higher requirements in practice to get a passable user experience.
> Canonical isn’t making 6GB memory a hard requirement for Ubuntu 26.04. It will still install on machines that fall below the minimum requirement, but users will have to deal with slower performance.
I think we have quite different definition of "minimum requirement", then.
Many commenters blow up here, but you have to see this from the non-informed consumer's perspective, I think.
What I mean is, yes, WE know Win11 barely works with 4GB and WE know that 6gb is quite generous for a Linux machine, but they don't.
The general public isn't as informed as we think they are (which is proven by 75 million people last election).
God I miss OpenStep and CDE. They needed 16MB of RAM (yes, MB!) and together with a lightweight Firefox clone you got everything you need. Eye candy is nice to have but not at that cost.
Windows 11's 4 GB minimum is dishonest. You cannot reasonably run it on that little, it is far too bloated at this point. Even LTSC benefits from 6 GB, and that is substantially cut-down compared to retail/enterprise.
I'd say Windows 11's real minimum is 8 GB in 2026, with the recommended being 16 GB.
PS - And even at 8 GB, it hits 100% usage and pages under moderate load or e.g. Windows Update running in the background.
I changed to Devuan; now it uses 75 MB of RAM at idle.
How much RAM does Omarchy use? Anyone running the OS after the media hyped it a couple of months back?
It's Arch based, with (IIRC) Hyprland as its "DE", so really not much memory I'd guess.
My desktop runs Arch with Sway (so quite close), three monitors, and uses ~400MB of RAM after boot. Most of that is framebuffers. All the rest is eaten by Firefox, rust-analyzer and qemu.
But we already know Ubuntu is the “worst” (most like modern windows, setup for media consumption, etc).
You can install Debian and it gives you all that you are familiar with from Ubuntu.
The article itself acknowledges that the headline is bullshit:
> The change isn't about the core operating system becoming resource-hungry. Instead, it reflects the way people use computers today—multiple browser tabs, web apps, and multitasking workflows
Basically the change reflects the fact that, at this level of analysis (how much RAM do I need in my consumer PC), the OS is irrelevant these days. If you use a web browser then that will dominate your resource requirements and there's nothing Linux can do about that.
Exactly. The headline is clickbait.
It doesn't matter how efficient your kernel or DE is if users expect to be able to load bloated websites in Chrome.
The headline is clickbait and the acknowledgement is LLM
I also feel bad for human em dash fans…
With arch+hyprland I hit 5GiB for a zen browser instance with 15+ tabs and a kitty instance with 15+ windows across 5 tabs, with codex and vim running.
If ram is a problem there's always alternatives. The impediment is always having to rethink your workflow or adopting someone else's opinion.
Why is this here? Extreme clickbait for those without tech literacy
Last time I touched an Ubuntu system, I had to diagnose why the machine suddenly had no available disk space.
1.5TB in /var/log
All from the Firefox snap package complaining every millisecond about some trivial Snap permission.
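If you ever need to hunt that down again, here's a small stdlib-only sketch that ranks the top-level entries of a directory by total size (/var/log is just the example from this story; point it anywhere):

```python
import os

def dir_sizes(root):
    """Total bytes per top-level entry under `root`."""
    sizes = {}
    for entry in os.scandir(root):
        total = 0
        try:
            if entry.is_file(follow_symlinks=False):
                total = entry.stat(follow_symlinks=False).st_size
            elif entry.is_dir(follow_symlinks=False):
                for dirpath, _, files in os.walk(entry.path):
                    for name in files:
                        try:
                            total += os.lstat(os.path.join(dirpath, name)).st_size
                        except OSError:
                            pass  # rotated away or unreadable mid-walk
        except OSError:
            continue
        sizes[entry.name] = total
    return sizes

if os.path.isdir("/var/log"):
    ranked = sorted(dir_sizes("/var/log").items(), key=lambda kv: -kv[1])
    for name, size in ranked[:10]:
        print(f"{size / 2**20:10.1f} MiB  {name}")
```

Same idea as `du -xh --max-depth=1 /var/log | sort -rh`, just without needing root to at least see the big offenders.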
I'm glad I chose an OS without goddamn Snap. It's been unadulterated pain every time I've ever interacted with it.
YES! Snap drove me to debian sid and haven't looked back. Snap is probably fine, but don't force me to use it.
Trisquel 12 MATE - codenamed Ecne - with the Xanmod kernel to cover proprietary drivers, that's a more libre start than Ubuntu. If everything works with the libre kernel, you can toss the Xanmod kernel on the spot.
Is this a Ubuntu issue or a Gnome issue? What about Lubuntu, Kubuntu, etc?
The article suggests that Xubuntu (which uses XFCE instead of GNOME) uses much less memory. I don't know how true that is, but it seems reasonable that XFCE uses somewhat less memory.
Sure, probably it even gets by with a bit less,
but I still would recommend 6 GiB,
no matter the OS.
The problem here is more the programs you run on top of the OS (browser, Electron apps, etc.).
Realistically speaking you should budget at least 1GiB for your OS even if it's minimalist, and to avoid issues make it 2GiB of OS + some emergency buffer, caches, load spikes etc.
And 2GiB for your browser :(
And 500MiB for misc apps (mail, music, etc.)
Wait, we are already at 4.5 GiB and I still need OpenOffice...
Even if XFCE would save 500 MiB, IMHO it wouldn't matter (for the recommendation).
And sure you can make it work - only have one tab open at a time, close the browser every time you don't need it, don't use Spotify or YT, etc.
But that isn't what people expect, so give them a recommendation which will work with what they expect; if someone tries to run it with less RAM it may work, but if it doesn't, at least it isn't your fault.
We expect xfce is much more efficient (it has more basic features) but is that the cause? Are you just subtracting out a big part from a higher baseline?
It is not actually an issue. The article isn’t based on any technical aspects of the OS, just the reported system requirements.
If it was a Gnome issue it would also be a Fedora issue though, no?
Depends on the packaging no? I'm not sure you get 100% the same experience even with the same Gnome version across Fedora, Ubuntu and Arch, do you?
I think this is a snap issue.
I’d imagine that all of Canonical’s flavors/spins ship with snap, so if the resources are lighter on say xubuntu then it’s probably not snap.
Snap still kinda egh though ;-D
Neither - they didn't measure anything.
They compared Ubuntu's recommended minimum RAM to Windows' absolute minimum RAM requirement.
But Windows has monetary incentives (related to vendors) to say they support 4GiB of RAM even if Windows runs very shitty on it; on the other hand, Ubuntu is incentivized to provide a more realistic minimum for convenient usage.
I mean, taking a step back, all common modern browsers under common usage can easily use multiple GiB of memory, and that is outside of the control of the OS vendor. (1)
As a consequence, IMHO recommending anything below 6 GiB is just irresponsible (if a modern browser is used), _no matter what OS you use_.
---
(1): If there is no memory pressure (i.e. caches don't get evicted that fast, larger video buffers are used, no fast tab archiving, etc.) then having YT playing will likely consume around ~600-800 MiB. (Be aware that this is not just JS memory usage but the whole usage across JS, images, video, HTML+CSS engine, etc. For comparison, web mail like Proton or Gmail is often roughly around 300MiB, Spotify interestingly "just" around 200MiB, and HN around 55MiB.)
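If you want to attribute numbers like these yourself rather than trust task-manager totals, PSS from /proc is the fairest per-process figure, since shared libraries are split proportionally between the processes mapping them (a Linux-only sketch; needs kernel 4.14+ for smaps_rollup, and you can only read other users' processes as root):

```python
# Sum PSS (proportional set size) per command name across all readable processes.
import os
import re
from collections import defaultdict

def pss_by_command():
    totals = defaultdict(int)
    for pid in filter(str.isdigit, os.listdir("/proc")):
        try:
            with open(f"/proc/{pid}/comm") as f:
                name = f.read().strip()
            with open(f"/proc/{pid}/smaps_rollup") as f:
                match = re.search(r"^Pss:\s+(\d+) kB", f.read(), re.M)
        except OSError:
            continue  # process exited, or not ours to read
        if match:
            totals[name] += int(match.group(1))  # kB
    return dict(totals)

for name, kb in sorted(pss_by_command().items(), key=lambda kv: -kv[1])[:10]:
    print(f"{kb / 1024:8.1f} MiB  {name}")
```

Browsers spawn one process per tab/site, so summing by command name is what makes the per-browser totals comparable to the figures above.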
The amount of people still on less than 8gb of memory is really small.
I won't stand for this erasure!
On the contrary, those are mostly really overweight people, so the amount of them is quite large. The number of them is, however, small. :)
Doesn't "amount" just mean "number"?
What the fuck
Switched to SuSE a few years ago, still love it
I had a machine (an AMD 3700X with 32 GB of RAM and a fast NVMe SSD) on which I used to run Debian. Then about 2.5 years ago I bought a new one and gave my wife the 3700X. I figured she'd be more at ease with Ubuntu, so I installed it on the machine.
I couldn't understand why everything was that slow compared to Debian and didn't want to bother looking into it so...
After a few weeks: got rid of Ubuntu, installed her Debian. A simple "IceWM" WM (I use the tiling "Awesome WM" but that's too radical for my wife) and she loves it.
She basically manages her two SMEs entirely from a browser: Chromium or Firefox (but a fork of Firefox would do too).
It has worked so well for years now that for her latest hire she asked me to set up the same config. So she's now got one employee on a Debian machine with the IceWM WM. Other machines are still on Windows, but the plan is to only keep one Windows (just in case) and move the other machines to Debian too.
Unattended upgrades, a trivial firewall "everything OUT or IN but related/established allowed" and that's it.
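That ruleset is only a few lines of nftables; a sketch of what I mean (not the exact config):

```
# /etc/nftables.conf - everything OUT allowed, IN only loopback and related/established
flush ruleset

table inet filter {
    chain input {
        type filter hook input priority 0; policy drop;
        iif "lo" accept
        ct state related,established accept
        ct state invalid drop
    }
    chain forward {
        type filter hook forward priority 0; policy drop;
    }
    chain output {
        type filter hook output priority 0; policy accept;
    }
}
```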
I had used ubuntu back in the day, and when I came back to linux a bit ago I immediately installed it again.
I don't remember all of my frustrations, but I remember having a lot of trouble with snap. Specifically, it really annoyed me that the default install of firefox was the snap version instead of native. I want that to be an opt-in kind of thing. I found that flatpak just worked better anyway.
I almost tried making the switch to arch, but I've been pretty happy running debian sid (unstable) since. The debian installer is just more friendly to me for getting encrypted drives and partitions set up how I want.
It's not for everyone, but I like the structured rolling updates of sid and having access to the debian ecosystem too much to switch to something else at this point.
I use sway with a radeon card for my primary and have a secondary nvidia card for games and AI stuff.
It has its warts, but I love my debian+sway setup
Fat chance, Satya!
>Linux's advantage is slowly shrinking
Maybe in some ways, yes. But there are distros out there that can run easily in as little as 1G RAM. And I heard people have used it with far less.
I also remember hearing that Ubuntu moved to Wayland by default; if true, I have to wonder whether that's part of the problem, because GNOME/KDE on Wayland will use far more memory than FVWM/Fluxbox on X11.
FWIW, you can do a lot just from the console without a GUI w/Linux and any BSD, in that case the RAM usage will be tiny compared to Windows and Apple.
Not to mention that 'lower memory usage' is only one of many benefits and, at least before the prices went mad, hardly the most important one on the list.
Practically speaking most people would want a GUI though.
> But there are distros out there that can run easily in as little as 1G RAM
It always makes me chuckle when I hear this. A default server (i.e. no GUI at all) installation of a RHEL derivative just outright dies silently with 1GB of RAM if there is no swap. Sure, with swap enabled it no longer dies, but to call the performance anywhere near acceptable is to lie to yourself.
RHEL is not the be-all end-all of minimalist linux, even sans GUI. Puppy Linux, with a full WM, is completely usable with a single gig of ram. That's obviously a different use-case from RHEL but the point stands.
1: ZRAM exists
2: Win11 is not usable with 4GB
3: Trisquel 12 Ecne exists. You might need Xanmos as a propietary kernel because of hardware, but try to blacklist mei and mei_me first in some .conf file at /lib/modprobe.d. Value your privacy.
Trisquel MATE with zram-config and some small tweaks can work with 4GB of RAM, even with a browser with dozens of tabs, at least with uBlock Origin.
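For anyone who wants to try the module blacklist mentioned in point 3: it's just a plain-text file. A minimal sketch (the file name is my own choice, and I'd put it in /etc/modprobe.d rather than /lib/modprobe.d, since /lib is package-managed and may be overwritten on updates):

```conf
# /etc/modprobe.d/blacklist-mei.conf (hypothetical file name)
# Prevent the Intel Management Engine interface modules from loading
blacklist mei
blacklist mei_me
```

You may also need to rebuild the initramfs afterwards (e.g. `sudo update-initramfs -u` on Debian-family distros) so the blacklist applies at early boot.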
The fact that I couldn't tell if point number 3 was a joke or not makes me confident we've still not seen the year of the Linux desktop.
I was testing them on a HP laptop I bought for $200 with 4GB of RAM.
Windows, its default, used so much memory that there was not much left for apps.
Ubuntu used 500MB less than Windows in system monitor. I think it was still 1GB or more. It also appeared to run more slowly than it used to on older hardware.
Lubuntu used hundreds of MB less than Ubuntu. It could still run the same apps but had fewer features in the UI (e.g. search). It ran lightning fast with more simultaneous apps.
(Note: That laptop's Wifi card wouldn't work with any Linux using any technique I tried. Sadly, I had to ditch it.)
I also had Lubuntu on a 10+ year old Thinkpad with an i7 (2nd gen). It's been my daily machine for a long time. The newer USB installers wouldn't work with it. While I can't recall the specifics, I finally found a way to load an Ubuntu-like interface or Ubuntu itself through the Lubuntu tech. It's now much slower but still lighter than default Ubuntu or Windows.
(Note: Lubuntu was much lighter and faster on a refurbished Dell laptop I tested it on, too.)
God blessed me recently through a person who outright gave me an Acer Nitro with an RTX and Windows. My next step is to figure out the safest way to dual boot Windows 11 and Linux for machine learning without destroying the existing filesystem or over-shrinking it.
Consider a dedicated SSD for each OS. You should have a couple M2 slots in the laptop. What you can do is remove (or disable) the Windows SSD, install Linux on the second drive, and then add back the windows drive. Select the drive at startup you want to be in on boot and default the drive you want to spend most of your time in. I did that on my XPS and it was trouble free. Linux can mount your NTFS just fine, without having to consider it from a boot/grub perspective.
https://community.acer.com/en/kb/articles/16556-how-to-upgra...
Looks like you've got space for 2 drives.
That's a terrific idea. It might address the other problem that I'd have little space for Linux apps. Thanks!
> Ubuntu used 500MB less than Windows in system monitor.
Those numbers mean nothing when compared across OSes. Depending on how they count shared memory and how aggressively they cache, they can look very different.
A more realistic benchmark would be to open two large applications (e.g. Chrome + Firefox with YouTube and Facebook, to jack up the memory usage), switch between them, and see how the system responds when switching tasks.
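To illustrate why raw "used RAM" numbers mislead: on Linux, much of what counts as "used" is reclaimable page cache, and the kernel publishes its own estimate of what applications can actually get. A quick sketch reading /proc/meminfo (Linux-only; field names are from the kernel's meminfo interface):

```python
# Sketch: naive vs. realistic "used memory" on Linux.
# MemAvailable is the kernel's estimate of memory available to new
# workloads without swapping; MemFree counts cache as "used".
def meminfo():
    fields = {}
    with open("/proc/meminfo") as f:
        for line in f:
            key, value = line.split(":")
            fields[key] = int(value.split()[0])  # values are in kB
    return fields

m = meminfo()
naive_used = m["MemTotal"] - m["MemFree"]            # treats cache as used
realistic_used = m["MemTotal"] - m["MemAvailable"]   # kernel's own estimate
print(f"naive: {naive_used / 1048576:.1f} GiB, "
      f"realistic: {realistic_used / 1048576:.1f} GiB")
```

The gap between the two numbers is typically gigabytes on a desktop that has been up for a while, which is exactly why a System Monitor figure on one OS isn't comparable to Task Manager on another.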
Thanks for the critiques and the tips. I might try that in future testing.
For the life of me I couldn't understand why anyone would downvote parent comment. Nothing offensive or disagreeable here.
windows always optimistically loads a lot, almost no matter how much ram you have
I imagine the choice of desktop environment has most to do with RAM requirements in Linux.
Unrelated to this, despite Ubuntu’s popularity, I think it’s one of the worst distro choices out there, especially for including old kernels for essentially no discernible reason.
I wouldn’t go so far as defending Microslop but I do get tired of the Apple fanboys accusing Windows of being bloated and running poorly.
They seem to defend Apple’s 8GB machines by saying that Apple systems perform better than Windows with the same amount of RAM. This claim is entirely unsubstantiated.
Windows has a lot of problems, but performance and memory efficiency is not one of them. We should recall that Microsoft actually reduced RAM usage and minimum requirements between Windows 7 and 8 because they wanted to get into the tablet game, and Windows has remained efficient with memory since then, as Microsoft wants Windows to run on cheap Chromebook-like hardware and other similar low-end systems.
MacOS handles memory pressure better than Linux imo (at least for interactive use cases)
I have seen MacOS overcommit up to 50% of memory and still have the system be responsive.
Yesterday I accidentally filled up my RAM on Fedora, and even earlyoom took several minutes to trigger; in the meantime the system was essentially non-responsive.
The plural of 'anecdote' is not 'data'.
It's exactly what it is
How do you think data is created? It's lots of anecdotes, normalised.
macOS uses solid-state drives to do swap to help increase virtual memory. I can run multiple browsers and IDEs smoothly on my 8GB MacBook.
This is with earlyoom/systemd-oomd enabled ?
From my experience it does not help much, and I still get occasional freezes when a program misbehaves on Linux. It’s not a huge problem, but it is a problem and it exists; I have been dealing with it for about 15 years with no significant improvement.
The earlyoom/oomd changes are quite recent.. I've had a 'better' experience, but I guess it's not really fixed yet.
Best kernel recommendations? Mainly for extremely long running (2+ years) SaaS applications. Stability overall. Is running a handful of docker containers and some binaries.
Linux LTS if stability is the most important. Realistically, a "normal" Linux kernel works for it too; I'm not sure you need to care as deeply about which kernel you use as you seem to.
On my desktop I use linux-cachyos-bore-lto, which seems to give me a slight boost in compilation times compared to the regular kernel, but I've had at least one crash that I've been unable to attribute to any other specific issue, so it could be the kernel, I suppose. I wouldn't use it on a server, though.
FreeBSD is touted for long running and stability.
For desktop use, I find that being on the latest stable kernels is best because you get things like recent AMD graphics drivers and support for recent hardware and laptops. I’m in an arch-based distro and my kernel updates all the time. I’ve never had an issue. The stability benefits of LTS seem completely useless in comparison. Just my opinion though.
If you’re running applications as in a server that’s an entirely different discussion. I have been assuming we are talking about desktop users who are not serving anything.
E.g., if I go out and buy a 2026 Panther Lake laptop with a new WiFi 7 chip or what have you, I’m going to want a distro with the latest kernels so that I don’t have hardware issues. If I install the default Ubuntu download it’s going to almost certainly have problems.
I am not an Apple fan, so I'll just tell my story. And yeah, it may be biased, and I may not understand something important.
So around ~3 years ago or so I bought a lightweight low-end laptop (Intel Core i3, 14 inch display, 8GB of RAM) for everyday stuff so I could easily bring it with me everywhere I need to go (I mean, everywhere I would need it). It came with Windows 11 pre-installed. Now, for you to understand, previously, like ~10 years ago or so I had a Windows 7 system and it was pretty neat. And I remembered when people were switching from Windows 7 to Windows 8 or 10, they blamed the new OS version just like right now the Windows 11 was blamed; yet everyone got used to it, it received some fixes, improvements, etc; so I thought "well, maybe Windows 11 is not so bad, I should try it out at least just for the sake of curiosity".
And now, the clean installation of Windows 11 that came with my laptop required ~20 seconds to fully boot to the login window. I know my laptop is not the best of the best, but still... After startup, with no apps open, there was ~4GB of RAM usage just out of nowhere, so effectively I was limited to ~4GB of RAM for anything I wanted to run. Bluetooth drivers were terrible (at the time): sometimes I was able to connect to my headphones and sometimes I wasn't, while they worked with all of my other devices perfectly. Then there was also this hellish "Antimalware Service Executable" - and I know how it sounds, I have nothing against anti-virus software, but when it randomly shows up several times per day, eats all of your processing power (~80% of CPU usage, and note that I have 8 cores at ~3GHz here), and heats up your laptop to the point that the fan starts screaming... that was not very good, to put it mildly. Battery life was also a disappointment: sometimes it couldn't last even 3 hours, while the heaviest thing I was doing during that time was compiling some software.
I was trying, I was re-configuring, I was applying patches... and finally I got fed up with all of this bloat, broken updates and other garbage. So I just backed up all of my important files and data to external drive and installed Linux Mint (because in this particular case I just needed working laptop). And wow, it just worked! Now at startup I get like ~1 GB RAM usage at most (this actually depends on the DE I use, so numbers could be different from time to time), battery life improved, no more weird Bluetooth issues, no more random bloatware... it just works, and that's it.
I know that distros like Mint are focused on stability and efficiency, so maybe the comparison is a bit unfair. But hell, even while I don't have anything against Windows 7 or Windows 8, the recent Windows 11 is a real combination of bloatware and spyware. So performance and memory efficiency is, actually, the problem here. Or at least it was a problem last time I tried it.
Now, again, I may be wrong somewhere, maybe I missed something out. If I did - please point it out.
I just logged in to my MacBook Air M2 (24GB RAM) with no programs open and it’s reserving 8.3GB of RAM and using 500MB of swap.
My Framework laptop running CachyOS with KDE Plasma with nothing open except System Monitor reserves 4GB with 500MB in swap (I enabled swap for sleep to hibernate, normally there’s no swap).
Reserving RAM doesn’t mean there’s a performance problem.
Most of the things you’re talking about in your comment have nothing to do with RAM usage and memory efficiency. You’re complaining about some annoying preinstalled OEM software [1], bad drivers, fan noise, battery life, and windows updates. That stuff isn’t great but a lot of it doesn’t have anything to do with Windows RAM efficiency itself.
If you download the Windows ISO from Microsoft and clean install you’ll have a pretty nice experience. I think Microsoft needs to crack down on OEM software additions.
As far as slow boot up times/slow initial setup I’ll remind you that Macs also have that as an issue during first boot and spend a lot of time doing initial indexing.
Linux mint is a great distro and I also prefer Linux to both Mac and Windows as well. Mostly my commentary is on the subject of people claiming Microsoft Windows is bad with RAM when we now see some Linux distros asking for more RAM than Windows. I think it’s quite clear that RAM isn’t the problem with Windows, it’s a lot of other things and the surrounding ecosystem.
[1] I have to assume you’re talking about some third party antimalware program because the Microsoft one absolutely does not behave how you describe.
> Reserving RAM doesn’t mean there’s a performance problem.
It does in my own experience (so it may not be a problem for you, I agree, but it is a problem for me). When the OS allocates ~50% of RAM for itself and won't let it go, other software simply can't use it. Therefore, you're limited: your potential performance is capped at a certain level just because your OS decided to allocate half or more of your system RAM. Why? Well, just because it wants to.
> have nothing to do with RAM usage or performance
Well, to be honest, most of them don't. But would you please explain, then, why it takes around 20 seconds just to boot up, while the aforementioned Linux Mint (and I'll clarify that it's currently 22.3 for me, the latest version; it was 22.1 at the time as far as I remember) takes only ~3-4 seconds to get me to the login screen and then another second (at most) to load everything after I've logged in? Could you also, please, explain how it happens that even GNOME's Nautilus file manager takes less RAM and far less CPU than Microsoft's Explorer (and I won't even mention Thunar, that's kinda unfair)? What about the "Start" menu in Windows, which spikes the CPU just by opening and closing? There are a lot of performance issues, with both RAM and CPU usage.
I'm not saying that these problems are unique to Windows, no; but saying that Windows doesn't have any performance issues is not really true.
> I think it’s quite clear that RAM isn’t the problem with Windows, it’s a lot of other things and the surrounding ecosystem.
I agree with you here. That's true. A large part of the problem comes not from the actual operating system, but from the application software. I once thought that, well, if the RAM shortages last longer than just one or two years, that will be bad, but also, maybe - just maybe - some software developers will start to think at least a bit more about optimization...
> [1] I have to assume you’re talking about some third party antimalware program because the Microsoft one absolutely does not behave how you describe.
Editing without specifying that you have edited your reply is not very good, you know. But okay.
Actually, I'm talking about the Windows-shipped Microsoft Defender process (at least it seems to come from Microsoft Defender). I have not seen anything third-party installed on my laptop at the time, and it actually behaved just like I described. I should also remind you that it is a low-end laptop, that's just Intel Core i3-N305, it's not the most powerful CPU in the world - just 8 cores, 8 threads and 3.80 GHz of max boost frequency.
If you think that I'm lying, then just search for "antimalware executable high CPU usage" in any search engine. You will find plenty of complaints and even some guides on how to deal with it.
Both Ubuntu and Trisquel have backports for mainline and LTS kernels. Also, GNU/Linux has zram; 4GB can work like 6.
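For context, zram creates a compressed swap device in RAM, which is how 4GB can stretch toward 6. Ubuntu has a simple zram-config package; on systemd distros that ship zram-generator, a minimal sketch of the config would be (file path and option names per upstream zram-generator defaults, so check your distro's packaging):

```conf
# /etc/systemd/zram-generator.conf — sketch, assumes the zram-generator package
[zram0]
zram-size = ram / 2          # compressed device sized at half of physical RAM
compression-algorithm = zstd
```

After a reboot, `zramctl` should show the device along with its compressed and uncompressed sizes, which gives you a feel for the effective ratio.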
> I wouldn’t go so far as defending Microslop
> I do get tired of the Apple fanboys accusing Windows of being bloated and running poorly.
> Windows has a lot of problems but performance and memory efficiency is not one of them.
I can't even describe how much your experience differs from mine. I would never have imagined someone uttering such a sentence about Windows in this day and age.
For everyone else reading this, a couple of pieces of advice I've gotten that made me suffer less with Windows: replace Windows search with Everything (by Voidtools) and replace Explorer with Filepilot (filepilot.tech).
On an older machine, I switched to Tiny10.
Linux needs to go back to engineering again.
Ubuntu is the issue.
Maybe if FOSS was less focused on reverse engineering proprietary technology they could make products people LIKE. I say this as someone who learned about firmware because of several listeners and one group having the aim of reverse engineering my new Apple ecosystem that is now falling apart after signal traps. My crime was working for an ISP and the media, but I reported on Scienos not techbros. Yawn.
I knew they were fucking with my virtual memory cause theirs sucks, the partition schemes on this Mac mini were ridiculous and the helpers weren’t stealing my information.
Given that efficiency is one of Linux's most touted advantages, what in the world is Ubuntu's PR department thinking? Ubuntu isn't providing any more functionality than when its memory requirement was 4GB. What is hogging all that extra RAM?
> what in the world is Ubuntu's PR department thinking?
The same as any other corporate PR department: "At least now when people run it with N GB of RAM, we can just point to the system requirements and say 'This is what we support' rather than end up in a back-and-forth"
If you expect them to have any sort of long-term outlook on "Let's be careful with how developers view our organization", I think you're about a decade too late for Canonical.
No official reason given, so all the tech press is basically speculating (if someone finds a source that does a teardown, please share; I can't seem to locate one). I think my favorite piece of speculation is that it reflects an anticipated modern workload of using the OS as a vector to launch a web browser and open multiple tabs in it, which is just going to be a memory hog as experienced by most Ubuntu users.
I don't know what Ubuntu is doing with the RAM but I'm constantly swapping with all of the 16GB RAM filled on my work laptop with Ubuntu.
At home I have a desktop running Arch plus Gnome with 32GB RAM and I am at 7GB on a normal day and below 16GB at all times unless I run an LLM.
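When a box is swapping like that, the first question is which processes actually hold the RAM. A quick sketch that reads per-process resident set size straight from /proc (Linux-only; `ps -eo rss,comm --sort=-rss` would give the same answer from the shell):

```python
# Sketch: top processes by resident memory, read directly from /proc.
import os

def top_rss(n=5):
    procs = []
    for pid in filter(str.isdigit, os.listdir("/proc")):
        try:
            with open(f"/proc/{pid}/statm") as f:
                rss_pages = int(f.read().split()[1])  # 2nd field: resident pages
            with open(f"/proc/{pid}/comm") as f:
                name = f.read().strip()
            # convert pages -> MiB
            procs.append((rss_pages * os.sysconf("SC_PAGE_SIZE") // 2**20, name))
        except (FileNotFoundError, ProcessLookupError):
            continue  # process exited while we were scanning
    return sorted(procs, reverse=True)[:n]

for mb, name in top_rss():
    print(f"{mb:6d} MB  {name}")
```

In my experience the answer is almost never "GNOME" itself; it's a browser plus a couple of Electron apps, which is consistent with Canonical's framing that the new requirement reflects workloads rather than the OS.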
The sad answer is: nobody cares.
Besides the correct answer that Canonical sucks, I would argue that “efficiency” is not a selling point to get someone to use a desktop operating system.
Mainstream users and business organizations don’t really understand that concept and would prefer to see how the operating system enables their use cases and workflows.
This is changing. 8 GB will be more normal given ram shortages. See apples neo..
No, 8GB is still the bottom of the market, and the ram shortage is past its worst days already.
Chromebooks still start out at 4GB (example: https://www.acer.com/us-en/chromebooks/acer-chromebook-315-c...), and it's not like Google Chrome is lighter on RAM than Ubuntu's default browser.
Apple under-speccing their machines like they’ve been doing since the dawn of time is not some kind of indicator of any trends. You can buy $350 PC laptops that come with 12GB of RAM (example: https://www.staples.com/asus-vivobook-x1404-14-laptop-intel-...)
RAM shortages will be quite temporary. Making predictions based on individual component shortages has never been a winning strategy in the history of the industry. Next you’ll tell me that graphics cards will be impossible to get because of blockchain.