It still puzzles me decades later how MS built the most functional, intuitive and optimised desktop environment possible, then simply threw it away.
It still is if you're an enterprise customer. The retail users aren't Microsoft's cash cows, so they get ads and BS in their editions. The underlying APIs are still stable and MS provides the LTSC & Server editions to businesses which lack all that retail cruft.
I'm an enterprise user and I find Windows 11 a complete disaster. They've managed to make something as trivial as right-clicking a slow operation.
I used to be a pretty happy Windows camper (I even got through Me without much complaint), but I'm so glad I moved to Linux and KDE for my private desktops before 11 hit.
https://massgrave.dev/windows_ltsc_links
Everything after Win 2000 was a bad idea. Enterprise or not.
Windows 2000 was the last version where Dave Cutler was fully in charge of Windows.
Things started going downhill after that.
In my day job, Explorer still freezes every second day, GUI interactions take several seconds and the sidebar is full of tabloid headlines and ads.
Do you mean Windows 1x Pro/Enterprise?
Yes. Enterprise, Pro, and Home are the enshittified, retail editions. Enterprise just adds a few more features IIRC but still has ads. The other versions I mentioned above don't have any of that.
Enterprise is not retail and is usually sold via volume licensing, but without any additional configuration it probably still has that stuff intact.
But you can use group policy etc. freely. I don't know how Win 11 is, though.
The problem with Windows after Windows 7 isn't really ads, it's the blatantly stupid use of web views to do the most mundane things and hog hundreds of MB or even GBs for silly features, which are still present in enterprise versions.
Start menu search requires 7 web browser processes, consuming ~350 MB of RAM, to be constantly running.
The pivot point was Windows 95.
Competition. In the first half of the 90s Windows faced a lot more of it. Then they didn't, and standards slipped. Why invest in Windows when people will buy it anyway?
Upgrades. In the first half of the 90s Windows was mostly software bought by PC users directly, rather than getting it with the hardware. So, if you could make Windows 95 run in 4 MB of RAM rather than 8 MB, you'd make way more sales on release day. As the industry matured, this model disappeared in favor of one where users got the OS with their hardware purchase and rarely bought upgrades, then never bought them, then never even upgraded when offered them for free. This inverted the incentive to optimize because now the customer was the OEMs, not the end user. Not optimizing as aggressively naturally came out of that because the only new sales of Windows would be on new machines with the newest specs, and OEMs wanted MS to give users reasons to buy new hardware anyway.
UI testing. In the 1990s the desktop GUI paradigm was new and Apple's competitive advantage was UI quality, so Microsoft ran lots of usability studies to figure out what worked. It wasn't a cultural problem because most UI was designed by programmers who freely admitted they didn't really know what worked. The reason the start button had "Start" written on it was because of these tests. After Windows 95 the culture of usability studies disappeared, as they might imply that the professional designers didn't know what they were doing, and those designers came to compete on looks. Also it just got a lot harder to change the basic desktop UI designs anyway.
The web. When people mostly wrote Windows apps, investing in Windows itself made sense. Once everyone migrated to web apps it made much less sense. Data is no longer stored in files locally so making Explorer more powerful doesn't help, it makes more sense to simplify it. There's no longer any concept of a Windows app so adding new APIs is low ROI outside of gaming, as the only consumer is the browser. As a consequence all the people with ambition abandoned the Windows team to work on web-related stuff like Azure, where you could have actual impact. The 90s Windows/MacOS teams were full of people thinking big thoughts about how to write better software hence stuff like DCOM, OpenDoc, QuickTime, DirectMusic and so on. The overwhelming preference of developers for making websites regardless of the preferences of the users meant developing new OS ideas was a waste of time; browsers would not expose these features, so devs wouldn't use them, so apps wouldn't require them, so users wouldn't buy new computers to get access to them.
And that's why MS threw Windows away. It simply isn't a valuable asset anymore.
It's quite common for a company to build a good product and then once the initial wave of ICs and management moves on, the next waves of employees either don't understand what they're maintaining or simply don't care because they see a chance to extract short term gains from the built-up intellectual capital others generated.
It's functional - yes, intuitive - maybe, but optimized is highly debatable.
The answer to maintaining a highly functional and stable OS is piles and piles of backwards compatibility misery on the devs.
You want Windows 9? Sorry, some code checks the string for Windows 9 to determine if the OS is Windows 95 or 98.
Millions of total computer noobs hit the ground running with Windows 95. It was a great achievement in software design.
He was talking about the user interface, not app compatibility.
Piracy. The consumer versions are filled with ads because most people don't pay for them.
Is this really the case? I feel like most windows users just bought a laptop with Windows already on it. Even if all home users were running pirated versions they would still become entrenched in the world of Windows/Office which would then lead to enterprise sales.
If you were able to wave a magic wand today and remove piracy, Microsoft would not remove ads.
This might offend some people but even Linus Torvalds thinks that the ABI compatibility is not good enough in Linux distros, and this is one of the main reasons Linux is not popular on the desktop. https://www.youtube.com/watch?v=5PmHRSeA2c8&t=283s
To quote a friend; "Glibc is a waste of a perfectly good stable kernel ABI"
AppImage, theoretically, solves this problem (or FlatPak I guess). The issue would really be in getting people to package up dead/abandoned software.
While true in many respects (still), it's worth pointing out that this take is 12 years old.
Maybe it's better now in some distros. Not sure about other distros, but I don't like Ubuntu's Snap packages. Snap packages typically start slower, use more RAM, require sudo privileges to install, and run in an isolated environment only on systems with AppArmor. Snap also tends to slow things down somewhat at boot and shutdown. People report issues like theming mismatches and permissions/file-access friction; Firefox theming complaints are a common example. It's almost like running a Docker container for each application. Flatpaks seem slightly better, but still a band-aid. It seems nobody is going to fix the compatibility problems in Linux.
I agree 100% with Linus. I can run a WinXP exe on Win10 or 11 almost every time, but on Linux I often have to chase down versions that still work with the latest Mint or Ubuntu distros. Stuff that worked before just breaks, especially if the app isn’t in the repo.
Yes, and even the package format thing is a hell of its own. Even on Ubuntu you have multiple package formats, and sometimes there are even multiple app stores (a Gnome one and an Ubuntu-specific one, if I remember correctly).
That's actually an intentional nudge to get the software packaged by the distro, which usually implies that it is open source.
Who needs ABI compatibility when your software is OSS? You only need API compatibility at that point.
So every Linux distribution should compile and distribute packages for every single piece of open source software in existence, both the very newest stuff that was only released last week, and also everything from 30+ years ago, no matter how obscure.
Because almost certainly someone out there will want to use it. And they should be able to, because that is the entire point of free software: user freedom.
It's really just glibc
It's really just not. GTK is on its fourth major version. Wayland broke backwards compatibility with tons of apps.
Can't we just freeze glibc, at least from an API version perspective?
Or just pre-install all the versions on each distro and pick the right one at load-time
Cool. Having major distributions default to using binfmt_misc to register Wine for PE executables (EXE files) would be nice, though. Next steps would obviously be for Windows apps to have their own OS-level identity, confined and permissioned per app using normal Linux security mechanisms, run against a reproducible and pinned Wine runtime with clearly managed state, integrated with the desktop as normal applications (launching, file associations, icons), and produce per-app logs and crash information, so they can be operated and managed like native programs. We have AI now; this should not be rocket science or require major investment. It's the only viable way Linux replaces Windows.
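For the binfmt_misc part, the kernel mechanism already exists; a distro (or you, as root) only has to register a rule mapping the PE "MZ" magic to the Wine loader. A minimal sketch in C, assuming binfmt_misc is mounted at its usual /proc path and Wine lives at /usr/bin/wine (Debian-family distros ship roughly this rule in a wine-binfmt package, if I remember right):

    /* Register Wine as the interpreter for PE ("MZ") executables via binfmt_misc.
     * Sketch only: assumes binfmt_misc is mounted, Wine is at /usr/bin/wine,
     * and we're running as root.
     * Rule format: :name:type:offset:magic:mask:interpreter:flags
     */
    #include <stdio.h>

    int main(void)
    {
        const char *rule = ":windows:M::MZ::/usr/bin/wine:";
        FILE *f = fopen("/proc/sys/fs/binfmt_misc/register", "w");
        if (!f) {
            perror("open binfmt_misc register");
            return 1;
        }
        if (fputs(rule, f) == EOF) {
            perror("write binfmt rule");
            fclose(f);
            return 1;
        }
        fclose(f);
        puts("The kernel will now hand PE executables to Wine directly.");
        return 0;
    }

After that, ./whatever.exe launches through Wine without any wrapper, which covers the "default" part; the per-app identity and sandboxing pieces are the genuinely hard bit.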
Crazy how, thanks to Wine/Proton, Linux is now more compatible with old Windows games than Windows itself. There are a lot of games from the 90s and even the 00s that require jumping through a lot of hoops to run on Windows, but through Steam they're click-to-play on Linux.
Wine works on Windows too. It's used by the Shorthorn project to get software for newer versions of Windows to run under XP.
My gaming PC isn't compatible with windows 11, so it was the first to get upgraded to Linux. Immediate and significant improvement in experience.
Windows kept bogging down the system trying to download a dozen different language versions of Word (for which I didn't have a licence and didn't want regardless). Steam kept going into a crash-restart cycle. The virus scanner was ... being difficult.
Everything just works on Linux, except that some games on Proton have some sound issues that I still need to work out.
>> some sound issues
Is this 1998? Linux is forever having sound issues. Why is sound so hard?
Sound (OSS, ALSA, PulseAudio, PipeWire...), Bluetooth, and WiFi are eternally problematic Linux paper cuts.
As always, it is Not Linux's Fault, but it is Linux's Problem.
It's one of the reasons why I moved to OSX + a Linux virtual machine. I get the best of both worlds. Plus, the hardware quality of a 128GB unified RAM MacBook Pro M4 Max is way beyond anything else on the market.
In some games I get a crackle in the audio which I don't get through any native application, nor in some other games run with Proton. I don't know if that's what he means, but it hasn't bothered me enough to figure it out. I use Bluetooth headphones anyway; I'm relatively insensitive to audio fidelity.
Linux sound is fine at least for me. The problem is running Windows games in proton. Sound will suddenly stop, then come back delayed. Apparently a known issue on some systems.
To be fair, you can have sound issues on Windows too. It's not usually an issue on Linux anymore either, though.
It kinda works both ways; just yesterday I tried to play the Linux-native version of 8bit.runner and it didn't work, so I had to install the Windows (beta) version and run it through Proton.
> There are a lot of games from the 90s and even the 00s that require jumping through a lot of hoops to run on Windows
What are some examples?
Lemmings Revolutions. Apparently, running it on anything that is not Windows 95/98/Me requires some unofficial .EXE patch that you could download from some shady website. The file is now nowhere to be found.
It's a great game, unfortunately right now I am not able to play it anymore :( even though I have the original CD.
Pretty much all the RenderWare-based GTAs have issues these days that only community-made patches can mitigate.
A recent example: in San Andreas, the seaplane never spawns if you're running Windows 11 24H2 or newer. It's all due to a bug that's always been in the game, but only recent changes in Windows caused it to show up. If anybody's interested, you can read the investigation on it here: https://cookieplmonster.github.io/2025/04/23/gta-san-andreas...
The last time I tried to run Tachyon: The Fringe was on Windows 10, and it failed. IIRC I could launch it and play, but there was a non-zero chance that an FMV cutscene would cause it to freeze.
I see there are guides on Steam forums on how to get it to run under Windows 11 [0], and they are quite involved for someone not overly familiar with computers outside of gaming.
0: https://steamcommunity.com/sharedfiles/filedetails/?id=29344...
Anything around DirectX 10 and older has issues on Windows these days.
One more popular example is Grid 2, another is Morrowind. Both crash on launch, unless you tweak a lot of things, and even then it won't always succeed.
Need for Speed II: SE is "platinum" on Wine, and pretty much unable to be run at all on Windows 11.
Isn't this because the Wine db has those tweaks preconfigured?
Windows used to be half operating system, half preconfigured compatibility tweaks for all kinds of applications. That's how it kept its backwards compatibility.
More a case of DirectX radically changing how it worked [0].
[0] https://learn.microsoft.com/en-us/windows/win32/direct3darti...
Building GUI utilities based on VB6 instead of status quo web technologies might actually be more stable and productive.
I would pick Delphi (with which you can build Windows, Linux, macOS, Android, and iOS apps - https://www.embarcadero.com/products/delphi)
Alternatively, RemObjects makes Elements, also a RAD programming environment in which you can code in Oxygene (their Object Pascal), C#, Swift, Java, Go, or Mercury (VB) and target all platforms: .NET, iOS and macOS, Android, WebAssembly, Java, Linux, Windows.
Yes, you can build cross-platform GUI apps with Delphi. However, that requires using Firemonkey (FMX). If you build a GUI app using VCL on Delphi, it's limited to Windows. If you build an app with Lazarus and LCL, you CAN have it work cross-platform.
> Alternatively, RemObjects makes Elements, also a RAD programming environment in which you can code in Oxygene (their Object Pascal), C#, Swift, Java, Go, or Mercury (VB) and target all platforms: .NET, iOS and macOS, Android, WebAssembly, Java, Linux, Windows.
Wait you can make Android applications with Golang without too much sorcery??
I just wanted to convert some Golang CLI applications to GUIs for Android, and I instead ended up giving up on the project and just started recommending that people use Termux.
Please tell me if there is a simple method for Golang which can "just work" as the Visual Basic-like glue code to mostly just glue CLI and GUI together.
> Wait you can make Android applications with Golang without too much sorcery??
Why don't you try it out: https://www.remobjects.com/elements/gold/
I would vote for Delphi/FreePascal, but share the sentiment.
I only had limited exposure to Delphi, but from what I experienced, it's big thumbs-up.
But if you liked that, consider that C# was in many ways a spiritual successor to Delphi, and MS still supports native GUI development with it.
Except on the AOT experience and low level programming, which only started to be taken seriously during the last five years.
Lua
Performance?
If there was sufficient interest in it, most performance issues could be solved. Look at Python or Javascript, big companies have financial interest in it so they've poured an insane amount of capital into making them faster.
Do you think that "most performance issues" in Python are solved?
Isn’t python still the slowest mainstream language?
Being slower than other mainstream languages isn't really a problem in and of itself if it's fast enough to get the job done. Looking at all the ML and LLM work that's done in Python, I would say it is fast enough to get things done.
I started with VB6 so I'm sometimes nostalgic for it too but let's not kid ourselves.
We might take it for granted, but the React-like declarative top-down component model (as opposed to imperative UI) was a huge step forward. In particular, there's no difference between an initial render and a re-render, and updating state is enough for everything to propagate down. That's why it went beyond the web, and why all modern native UI frameworks have a similar model these days.
Only if I don't need to do anything beyond the built-in widgets and effects of Win32. If I need to do anything beyond that then I don't see me being more productive than if I were using a mature, well documented and actively maintained application runtime like the Web.
That's not really true. Even in the 90s there were large libraries of 3rd party widgets available for Windows that could be drag-and-dropped into VB, Delphi, and even the Visual C++ UI editor. For tasks running the gamut from 3D graphics to interfacing with custom hardware.
The web was a big step backwards for UI design. It was a 30 year detour whose results still suck compared to pre-web UIs.
Honestly, it’s probably faster and less resource intensive through emulation than your average Electron app :-/
Wine Is Not an Emulator (WINE). It provides win32 APIs; your CPU will handle the instructions natively. There is no “probably” about it.
Whenever people bring this up I find it somewhat silly. Wine originally stood for "Windows Emulator". See old release notes ( https://lwn.net/1998/1112/wine981108.html ) for one example: "This is release 981108 of Wine, the MS Windows emulator." The name change was made for trademark and marketing reasons. The maintainers were concerned that if the project got good enough to frighten Microsoft, they might get sued for having "Windows" in the name. They also had to deal with confusion from people such as yourself who thought "emulation" automatically meant "software-based, interpreted emulation" and therefore that running stuff in Wine must have some significant performance penalty. Other Windows compatibility solutions like SoftWindows and Virtual PC used interpreted emulation and were slow as a result, so the Wine maintainers wanted to emphasize that Wine could run software just as quickly as the same computer running Windows.
Emulation does not mean that the CPU must be interpreted. For example, the DOSEMU emulator for Linux from the early 90s ran DOS programs natively using the 386's virtual 8086 mode, and reimplemented the DOS API. This worked similarly to Microsoft's Virtual DOS Machine on Windows NT. For a more recent example, the ShadPS4 PS4 emulator runs the game code natively on your amd64 CPU and reimplements the PS4 API in the emulator source code for graphics/audio/input/etc calls.
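For anyone who wants to convince themselves there's no instruction-level emulation involved with Wine, here's a minimal sketch: a Win32 program cross-built on Linux runs under Wine as ordinary native machine code, and Wine only supplies the libraries behind the API calls. This assumes the mingw-w64 cross compiler is installed:

    /* hello.c - a plain Win32 program, cross-built on Linux.
     * Build (assumes the mingw-w64 package provides this compiler):
     *   x86_64-w64-mingw32-gcc hello.c -o hello.exe
     * Run:
     *   wine hello.exe
     * The x86-64 machine code executes directly on the host CPU; Wine's job
     * is only to provide the DLLs behind Win32 calls like MessageBoxA.
     */
    #include <windows.h>

    int main(void)
    {
        MessageBoxA(NULL, "Hello from a PE binary running natively", "Win32", MB_OK);
        return 0;
    }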
Can somebody explain:
1. The exact problem with the Linux ABI
2. What causes it (the issues that make it such a challenge)
3. How it changed over the years, and its current state
4. Any serious attempts to resolve it
I've been on Linux for maybe two decades at this point. I haven't noticed any issues with the ABI so far, perhaps because I use everything from the distro repo or build and install it using the package manager. If I don't understand it, there are surely others who want to know it too. (Not trying to brag here. I'm referring to the time I've spent on it.)
I know that this is a big ask. The best course for me is of course to research it myself. But those who know the whole history tend to have a well organized perspective of it, as well as some invaluable insights that are not recorded anywhere else. So if this describes you, please consider writing it down for others. Blog is probably the best format for this.
The kernel is stable, but all the system libraries needed to make a graphical application are not. Over the last 20 years, we've gone from GTK 2 to 4, X11 to Wayland, and Qt 4 to 6, with compatibility breakages with each change. Building an unmodified 20-year-old application from source is very likely not to work; running a 20-year-old binary, even less so.
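To make the toolkit churn concrete, here's roughly what a trivial window looks like in GTK 3, with notes (from memory, so treat it as a sketch rather than a migration guide) on what GTK 4 changed:

    /* Minimal GTK 3 window. Even this toy program needs porting for GTK 4:
     *   - gtk_init() takes no arguments in GTK 4
     *   - gtk_window_new() lost its GtkWindowType argument
     *   - gtk_widget_show_all() was removed
     *   - gtk_main()/gtk_main_quit() were removed in favour of GtkApplication
     * Build (GTK 3): gcc hello.c $(pkg-config --cflags --libs gtk+-3.0)
     */
    #include <gtk/gtk.h>

    int main(int argc, char **argv)
    {
        gtk_init(&argc, &argv);

        GtkWidget *win = gtk_window_new(GTK_WINDOW_TOPLEVEL);
        gtk_window_set_title(GTK_WINDOW(win), "Hello");
        g_signal_connect(win, "destroy", G_CALLBACK(gtk_main_quit), NULL);

        gtk_widget_show_all(win);
        gtk_main();
        return 0;
    }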
You never ran into a GLIBC version problem?
Wasn't there also DLL hell on Windows?
My understanding is that very old statically linked Linux images still run today because paraphrasing Linus: "we don't break user space".
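Right, and that promise only covers the syscall boundary. A fully static binary exercises nothing but the stable kernel ABI, which is why decade-old static binaries still run. A minimal sketch (static linking with glibc has its own caveats around NSS/DNS, but not for a toy like this):

    /* Statically linked "hello" that depends only on the kernel's syscall ABI.
     * Build: gcc -static hello.c -o hello
     * The resulting binary needs nothing from the system's glibc/GTK/Qt at
     * run time, so "we don't break user space" is the only promise it relies on.
     */
    #include <unistd.h>

    int main(void)
    {
        const char msg[] = "still running on a kernel from the future\n";
        write(STDOUT_FILENO, msg, sizeof(msg) - 1);
        return 0;
    }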
The model of patching+recompiling the world for every OS release is a terrible hack that devs hate and that users hate. 99% of all people hate it because it's a crap model. Devs hate middlemen who silently fuck up their software and leave upstream with the mess; users hate being restricted to whatever software was cool and current two years ago. If they use a rolling distro, they hate the constant brokenness that comes with it. Of the 1% of people who don't hate this situation, 99% merely tolerate it, and the rest are Debian developers who are blinded by ideology and sunk costs.
Good operating systems should:
1. Allow users to obtain software from anywhere.
2. Execute all programs that were written for previous versions reliably.
3. Not insert themselves as middlemen into user/developer transactions.
Judged from this perspective, Windows is a good OS. It doesn't nail all three all the time, but it gets the closest. Linux is a bad OS.
The answers to your questions are:
(1) It isn't backwards compatible for sophisticated GUI apps. Core APIs like the widget toolkits change all the time (GTK 1->2->3->4; Qt does this too). It's also not forwards compatible: compiling the same program on a new release may yield binaries that don't run on an old release. Linux library authors don't consider this a problem; Microsoft/Apple/everyone else does. This is the origin of the glibc symbol versioning errors everyone experiences sometimes (see the sketch after this list).
(2) Maintaining a stable API/ABI is not fun and requires a capitalist who says "keep app X working or else I'll fire you". The capitalist Fights For The User. Linux is a socialist/collectivist project with nobody playing this role. Distros like Red Hat clone the software ecosystem into a private space that's semi-capitalist again, and do offer stable ABIs, but their releases are just ecosystem forks and the wider issue remains.
(3) It hasn't changed, and it's still bad.
(4) Docker: "solves" the problem on servers by shipping the entire userspace with every app, and being itself developed by a for-profit company. Only works because servers don't need any shared services from the computer beyond opening sockets and reading/writing files, so the kernel is good enough and the kernel does maintain a stable ABI. Docker obviously doesn't help the moment you move outside the server space and coordination requirements are larger.
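The sketch promised in (1): by default the linker binds a binary to the newest version of each glibc symbol on the build machine, so something built on a new distro refuses to load on an older one with a "version `GLIBC_2.xx' not found" style error. The usual workaround is to pin older symbol versions by hand. The version strings below are the common x86-64 ones, but that's an assumption; check your actual targets with objdump -T:

    /* Pin memcpy to an old glibc symbol version so a binary built on a new
     * distro still loads on older ones. Without the .symver directive the
     * linker binds memcpy@GLIBC_2.14 (or newer) on x86-64, and older systems
     * then refuse to load the binary.
     * GLIBC_2.2.5 is the x86-64 baseline version - verify for your targets.
     */
    #include <string.h>

    __asm__(".symver memcpy, memcpy@GLIBC_2.2.5");

    int main(void)
    {
        char dst[16];
        memcpy(dst, "portable", 9);   /* resolves against the pinned version */
        return dst[0] == 'p' ? 0 : 1;
    }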
Unironically, yes. It's time that Microsoft taste their own medicine of embrace, extend, and extinguish.
Reference to the famous "Win32 Is the Only Stable ABI on Linux" post
https://blog.hiler.eu/win32-the-only-stable-abi/
https://news.ycombinator.com/item?id=32471624
https://en.wikipedia.org/wiki/Longene
Again?
I'm back to running Windows because of the shifting sands of Python and WxWindows that broke WikidPad, my personal wiki. The .exe from 2012 still works perfectly though, so I migrated back from Ubuntu to be able to use it without hassle.
It's my strong opinion that Windows 2000 Server, SP4 was the best desktop OS ever.
I like this idea and know at least a few who would love to use this if you can solve for the:
'unfortunate rough edges that people only tolerate because they use WINE as a last resort'
Whether those rough edges will ever be ironed out is a matter I'll leave to other people. But I love that someone is attempting this, just because of the tenacity it shows. This reminds me of projects like Asahi and Cosmopolitan Libc.
Now if we're to do something to actually solve for GNU/Linux desktops not having a stable ABI, I think one solution would be to make a compatibility layer like Wine's but using Ubuntu's ABIs. Then, as long as the app runs on supported Ubuntu releases, it will run on a system with this layer. I just hope it wouldn't be a buggy mess like Flatpak is.
This is a really cool idea. My only gripe is that Win32 is necessarily built on x86. AArch64/ARM is up and coming, and other architectures may arise in the future.
Perhaps that could be mitigated if someone could come up with an awesome OSS machine code translation layer like Apple's Rosetta.
There's not much x86-specific about Win32, and you've been able to make native ARM Windows programs for years already. WinNT was designed to be portable from the start. Windows on ARM comes with a Rosetta-like system and can run Intel binaries out of the box.
I built a gaming VM and decided to go with Windows because the latest AMD drivers (upscaling etc.) only work there for now.
I wanted to be nice and entered a genuine Windows key that's still in my laptop's firmware somewhere.
As a thank-you, Microsoft pulled dozens of features out of my OS, including remote desktop.
As soon as these latest FSR drivers are ported over I will swap to Linux. What a racket, lol.
Starting with FreeBSD might be easier than starting with Debian then removing all the GNUisms. But perhaps not as much Type II fun.
Using Linux gets you much more hardware compatibility, especially for the consumer desktop and laptop systems this is targeted towards.
Yea! I love the spirit. Compatibility in computing is consternating. If my code is compiled for CPU arch X, the OS should just provide it with (using Rust terminology) standard library tools (networking, file system, allocator, etc.), de-conflict it with other programs, and get out of the way. The barriers between OSes, including between various Linux dependencies, feel like a problem we (idealistically thinking) shouldn't have.
Technically it's the only stable macOS ABI, too. The only way to run a legacy 32-bit binary on macOS today is a win32 exe running under Wine.
The difference between Win32 and Linux is that the latter never realized an operating system is more than a kernel and a number of libraries and systems glued together: it is also a stable ABI (even for kernel modules, so old drivers stay usable forever) and a single, default, stable API for the user interface, audio, and so forth. Linux failed completely not technologically, but in understanding what an OS is from the POV of a product.
Linux didn't aim to be an OS in the consumer sense (it is entirely an OS in the academic sense; in the scientific literature, OS == kernel, nothing else). The "consumer" OS is GNU/Linux or Android/Linux.
There really isn't that much GNU on a modern Linux system, proportionately.
This is going to be a bold claim but here goes.
This will never work, because it isn't a radical enough departure from Linux.
Linux occupies the bottom of a well in the cartesian space. Any deviation is an uphill battle. You'll die trying to reach escape velocity.
The forcing factors that pull you back down:
1. Battle-testedness. The mainstream Linux distros just have more eyeballs on them. That means your WINE-first distro (which I'll call "Lindows" in honor of the dead OS from 2003) will have bugs that make people consider abandoning the dream and going back to Gnome Fedora.
2. Cool factor. Nobody wants to open up their riced-out Linux laptop in class and have their classmate look over and go "yo this n** running windows 85!" (So, you're going to have to port XMonad to WINE. I don't make the rules!)
3. Kernel churn. People will want to run this thing on their brand-new gaming laptop. That likely means they'll need a recent kernel. And while they "never break userspace" in theory, in practice you'll need a new set of drivers and MESA and other add-ons that WILL break things. Especially things like 3D apps running through WINE (not to mention audio). Google can throw engineers at the problem of keeping Chromium working across graphics stacks. But can you?
If you could plant your flag in the dirt and say "we fork here" and make a radical left turn from mainline Linux, and get a cohort of kernel devs and app developers to follow you, you'd have a chance.
I think there's a quote from Linus himself saying this.
> What is this? A dream of a Linux distribution where the entire desktop environment is Win32 software running under WINE.
I might unironically use this. The Windows 2000 era desktop was light and practical.
I wonder how well it performs with modern high-resolution, high-dpi displays.
Xfce already exists and has less impedance mismatch. It’s almost as good in some ways, probably better in a few tiny ones.
I've also had the same thought...
I’m in if this is happening
But would you want to run these Win32 software on Linux for daily use? I don't.
Depends on what task you're doing, and to a certain extent how you prefer to do it. For example, sure, there are plenty of ways to tag/rename media files, but I've yet to find something that matches the power of Mp3tag in a GUI under Linux.
I use some cool ham radio software, a couple of SDR applications, and a lithophane generator for my 3D printer. It all works great. If you have a cool utility or piece of software, why wouldn't you want to?
Gamers have no other option, and thanks to Valve, game studios have no reasons left to bother with native Linux clients.
Just target Windows, business as usual, and let Valve do the hard work.
> Gamers have no other option, and thanks Valve, game studios have no reasons left to bother with native Linux clients
But they do test their Windows games on Linux now and fix issues as needed. I read that CDProjekt does that, at least.
Not really, most leave that to Valve.
CDProjekt releases native linux builds.
I don’t think Witcher 3 or Cyberpunk 2077 have Linux builds available for the common folk? Cyberpunk has a ARM64 Mac build, though.
> ...game studios have no reasons left to bother with native Linux clients.
How many game studios were bothering with native Linux clients before Proton became known?
More than now, I own a few from the Loki Entertainment days.
That's exactly the point. They weren't, so a Linux user didn't have an option to run a native Linux client in preference to a Win32 version.
That goes back to address the original question of "But would you want to run these Win32 software on Linux for daily use?"
Well, not having Proton definitely didn't work to grow gaming on Linux.
Maybe Valve can play the reverse switcheroo out of Microsoft's playbook and, once enough people are on Linux, force the developers' hand by not supporting Proton anymore.
For making music, as much as I love the free audio ecosystem, there are some very unique audio plugins with specific sounds that will never be ported. Thankfully, bridging with Wine works fairly well nowadays.
Thus reinforcing development tools that target Windows desktop even further, the OS/2 lesson repeats itself.
And failing everything else, Microsoft is in the position to put WSL front and center, and yet again, those are the laptops that normies will buy.
Not to worry, Microsoft can't escape Win32 either. They've tried, with UWP and others, but they're locked in to supporting the ABI.
It's not a moving target. Proton and Wine have shown it can be achieved with greater compatibility than even what Microsoft offers.
While true, people should pay attention that WinRT, the technology infrastructure for UWP, nowadays lives in Win32 and is what powers anything Copilot+ PC, Windows ML, the Windows Terminal rewrite, new Explorer extensions, the updated context menu on Windows 11, ...
It is a moving target. Proton is mostly stuck in the Windows XP world, before most new APIs started being a mix of COM and WinRT.
Even if that isn't the case, almost no company would bother with GNU/Linux to develop with Win32, instead of Windows, Visual Studio, business as usual.
I mean... isn't that just a light X11 compositor (like IceWM) with binfmt enabled?
Damn, they didn't miss a spot to add a Loss comic reference.
https://en.wikipedia.org/wiki/Loss_(Ctrl%2BAlt%2BDel)
Thank you. I was contemplating the logo but my brain could not make the connection.
This is amusing but infeasible in practice because it would need to be behaviorally compatible with Windows, including all bugs along with app compatibility mitigations. Might as well just use Windows at that point.
You have full control of a Linux system. win32/linux respects your rights in a way that Microsoft doesn't. That's the difference.
That is irrelevant to the feasibility of reimplementing the Win32 API on Linux.
WINE has been reimplementing the Win32 ABI (not API) for decades. It already works pretty well; development has been driven by both volunteers and commercial developers (CodeWeavers) for a long time.
There are many programs that still do not work properly in WINE, even though it has been developed for decades. This in itself demonstrates the infeasibility of reimplementing Win32 as a stable interface on par with Windows. The result after all this effort is still patchy and incomplete.
There are many programs that do not work properly in Windows 11, so using Windows to run Windows programs doesn't work either.
It's already been done, though. Wine has been around for 30 years and has excellent compatibility at this point.
5341 of the 16491 applications listed in the Wine AppDB have a compatibility rating of "garbage". This is not excellent compatibility.
How many of those entries have been tested with recent versions of wine or proton? Seems a poor metric.
Better to consider is the Proton verified count, which has been rocketing upwards.
https://www.protondb.com/
Relative to (64-bit) Windows 11, it might be.
This is only ever relevant for proprietary software. Free software does not require a stable ABI. Great that wine exists but it should be useless.
(That and Linux doesn't implement win32 and wine doesn't exclusively run on Linux.)
Stable interfaces and not being in versioning hell (cough libc) would actually be good for FOSS as well.
If you make a piece of software today and want to package it for Linux, it's an absolute mess. I mean, look at Flatpak or Docker: a common solution for this is to ship your own userspace. That's just insane.
Free software can still benefit from a stable ABI. If I want to run the software, it's better to download it in a format my CPU can understand, rather than download source, figure out the dependencies, wait for compiling (let's say it's a large project like Firefox or Chromium that takes hours to compile), and so on.