>Win32 (via Wine or Proton) is the most stable target for Linux right now
Tangentially, Winamp 2.xx from the '90s still runs and plays MP3s just fine on Windows 11 today. There are better apps for that now, but I still use it because nostalgia really whips the llama's ass.
Pretty wild that the same thing is not the norm in other OSs.
Even wilder is that I still have my installed copy of Unreal Tournament 99 from my childhood PC copied over to my current Win 11 machine, and guess what, it just works out of the box, 3D graphics, sound, everything. That's nearly 25 years of backwards compatibility at this point.
It really is mindblowing that Windows 11 is still capable of running 32-bit programs written for Windows 95, that's 28~29 years of backwards compatibility and environmental stability.
If we look back to programs written for Windows NT 3.1, released in 1993, and assume they run on Windows 11 (because why not?) then that's 30 years of backwards compatibility.
Did I say mindblowing? It's downright mythological what Microsoft achieves and continues to do.
There's no guarantee that all the older apps from the Windows 9x/XP days will work today, as some apps back then, especially games, made use of non-public/undocumented APIs or just straight up hacked the OS with various hooks for the sake of performance optimizations. Those are the apps guaranteed not to work today even if you turn on compatibility mode.
Personally I've had little luck even running XP applications on Windows 7. More generally, going by the difficulties many companies and organizations experienced in the XP->7 transition, it's hardly an isolated problem.
Perhaps Windows maintains the best backwards compatibility of any mainstream OS, however I would hardly describe it as "mythological".
The most fascinating example of that is SimCity: Microsoft noticed it didn't run under Windows 95, because Windows 95 reused freed memory pages while SimCity did a lot of use-after-free, which would have crashed under Win95. Microsoft's developers knew that people would blame Microsoft, not Maxis, so they added an extra routine in the memory manager which detected SimCity and then didn't reuse freed memory as aggressively.
I don't want to estimate how many such hacks they accumulated over time to keep things as compatible as they could.
Linux can do this; binaries from the 90s work today.
Something like xv (last release: 1994, although the binaries were built against Red Hat 5.2 from 1998) still works today, and the source still builds with one very minor patch last time I tried it.
And Windows has exactly the same problem, but there the tradition is to ship these libraries with the application rather than just assume they're present on the system. And you can "fix" it on Linux by getting old versions of the libraries.
You'll probably run into trouble with PNG and JPEG files, but e.g. loading/saving GIF and whatnot works fine. Note how libc and libX* work out of the box.
tl;dr: much of the "Windows compatibility" is just binaries shipping with all or most dependencies.
Much of the Windows compatibility is "just" stable API for Windows controls, GUI event handling loops, 3D graphics and sound (DirectX). Linux has stable API for files and sockets (POSIX), but that's all.
And I am saying you don't need to rely on any of that. You can just ship it yourself (statically link, or use LD_LIBRARY_PATH). That's what Windows applications that rely on GTK or Qt do as well, and it works fine; the same approach works fine on Linux too. The basics (libc, libX*, etc.) are stable, and the Linux kernel is stable.
And this is what Windows does too really, with MSVC and dotnet and whatnot redistributables. It's just that these things are typically included in the application if you need it.
It's really not that different aside from "Python vs. Ruby"-type differences, which are meaningful differences, but also aren't actually all that important.
Stop spreading FUD; X and OpenGL have maintained stable ABIs. There is Wayland now, but even that comes with Xwayland to maintain compatibility.
Sound is a bit rockier, but there are compatibility shims for OSS and ALSA on the newer audio architectures.
Stop claiming that I'm spreading FUD and show me at least one Linux app that was compiled to binary in 1996 where exactly that binary still runs under a modern Linux desktop environment with a visual style similar to the rest of the built-in apps.
Got no counterexamples? Then it's not FUD at all, but rather the plain truth.
> It's downright mythological what Microsoft achieves and continues to do.
This seems like it was meant in a positive way, but I really don't think that if compatibility with your system requires "mythological" efforts, that should be seen as a good thing for your system.
It's also worth noting that backwards ABI compatibility only matters when people limit their software by not distributing the source. Early UNIX software can run fine on modern GNU by just compiling it.
> It's also worth noting that backwards ABI compatibility only matters when people limit their software by not distributing the source. Early UNIX software can run fine on modern GNU by just compiling it.
Have you ever tried building decades old programs from source? It's not as easy as you claim.
Here's source for grep from v6 unix. I'd be interested to know the smallest set of changes (or flags to gcc) needed to get it to compile under gcc on linux and work.
Note that this is not a fair comparison since that code is almost 50 years old, predating ANSI C and before even the 8088 existed.
Still, there is only one actual error under gcc 13.2.1: the use of =| instead of the later-standardized |=. I'm not sure if that was a common thing back then or if it was specific to their C compiler; either way, I don't think gcc has a switch to make it work. Switching that around in the source gives linker errors, since it seems back then the way to print to or flush a particular file descriptor was to set a libc variable and then call flush or printf. If you had the right libc to link against, that one tiny change might be all you need, but you would likely need to set up a cross-compile to be able to use the right libc.
My understanding is that most of the compatibility issues on Linux are due to not having the right libraries rather than the kernel not supporting older system calls. It is just a lot of not-that-fun work to keep things working, and no one is that interested (instead, some people just use decade-old versions of Linux :/ and the rest use package systems to recompile stuff). NetBSD had better practical binary compatibility for a long time, although I think some of it was removed fairly recently since there isn't much commercial NetBSD software (there was one Lisp binary from 1992ish IIRC that some people were still using, and I think that compat was kept).
Thanks for the details. I did give compiling it a try and saw some of the things you mention, but my knowledge of C, pre-standard C, and libc wasn't enough to fully make sense of them. I'll agree it looked better than I expected; I've seen much worse cases of merely decade-old programs not compiling/working (one in Haskell and another involving C++ and Java, although those were much larger programs).
But I don't think it was unfair to use that as an example of early Unix software, or to point out that it was harder than "just compiling it". One defense against my argument could be that a version of 'grep' has been maintained through C versions and operating systems, with source code availability playing a part in that (although presumably GNU grep avoided using Unix source code).
My extremely limited experience with Java is similar; it seems to do much worse than C at compatibility over time. Of course you are right that it is a fair example of early Unix code; I just didn't read the comment you were actually replying to carefully enough :(. Pre-standard C has more issues, so I don't think that is fair vs Windows, but that is not what you were replying to. Source availability gives some additional options that you don't have with only binaries, but I think you are right that without it being open source the use is limited (and potentially negative, like how the BSDs were limited by the USL lawsuit in the early '90s). Useful open source software is likely to be maintained at least to the point of compiling, though it may take a while to break enough for someone to bother (sox is in this middle stage right now, with package systems applying a few security patches but no central updated repository that I know of).
What is it, really? An immense amount of resources (developers) poured into developing live patches to make applications work on each newer version of Windows (or into helping the application developers fix their applications). It's an interesting conceptual grey area - I don't consider it backward compatibility in a strict sense.
This is documented in the book "The old new thing" by Raymond Chen (it's possible also to read the blog, but the book gives an organic view).
It's fascinating how far-sighted Microsoft was; this approach, clearly very expensive, has been fundamental in making Windows the dominant O/S (for desktop computers).
It's because Microsoft understands and respects that computers and operating systems exist to let the user achieve things.
The user ultimately doesn't care if his computer is an x86 or an ARM or a RISC-V, or if it's running Windows or Mac or Linux or Android. What the user cares about is running Winamp to whip some llama's ass, or more likely opening Excel to get work done or fire up his favorite games to have fun.
Microsoft respects that, and so strives to make sure Windows is the stepping stone users can (and thus will) use to get whatever it is they want to do done.
This is fundamentally different to MacOS, where Apple clearly dictates what users can and cannot do. This is fundamentally different to FOSS, where the goal is using FOSS and not what FOSS can be used for.
It's all simple and obvious in hindsight, but sometimes it's the easiest things that are also the hardest.
It's amazing how people don't want Linux to "be like Windows"... but as far as I'm concerned Windows is close to ideal, just with a few flaws and places where FOSS can do better...
This has very severe drawbacks, so it's not unambiguously desirable.
Windows APIs are probably a mess because of this (also ignoring the fact that only a company with extremely deep pockets can afford this approach). There is at least one extreme case where Windows had to keep a bug because a certain program relied on it and couldn't be made to work otherwise.
Sure, from a user perspective, but not from an operational perspective: in the cases of live binary patching, Microsoft had to contact the application developer to be legally in the clear; in other cases, APIs behave differently based on the executable being run. There's a lot more to it than just keeping the API stable.
I get that my initial comment was a bit of a throwaway, but I can unpack it a bit. I think it’s a mistake to regard a working backward compatibility functionality as deficient because it requires maintenance and the cooperation of the parties involved. That’s just… engineering, right?
In a world where security flaws are so common, I'm not sure I want to run old software outside a virtual machine.
I also wish I could agree that Win32 is a stable target on Linux; it may run old software, but in my experience it is often quirky. It's usually a better use of my time to just boot Windows than to figure out how to get software to run under Wine.