> It's also worth noting that backwards ABI compatibility only matters when people limit their software by not distributing the source. Early UNIX software can run fine on modern GNU by just compiling it.
Have you ever tried building decades old programs from source? It's not as easy as you claim.
Here's source for grep from v6 unix. I'd be interested to know the smallest set of changes (or flags to gcc) needed to get it to compile under gcc on linux and work.
Note that this is not a fair comparison since that code is almost 50 years old, predating ANSI C and before even the 8088 existed.
Still, gcc 13.2.1 reports only one actual error: the use of =| instead of the later-standardized |=. I'm not sure whether that spelling was common back then or specific to their C compiler; either way, I don't think gcc has a switch to accept it. Swapping the operator around in the source then gives linker errors, since it seems back then the way to print to or flush a particular file descriptor was to set a libc variable and then call flush or printf. With the right libc, that one tiny change might be all you need, but you would likely have to set up a cross compile to be able to use that libc.
My understanding is that most of the compatibility issues on Linux are due to not having the right libraries rather than the kernel not supporting older system calls. Keeping things working is just a lot of not-that-fun work, and no one is that interested (instead, some people just use decade-old versions of Linux :/ and the rest use package systems to recompile stuff). NetBSD had better practical binary compatibility for a long time, although I think some of it was removed fairly recently since there isn't much commercial NetBSD software (IIRC there was one Lisp binary from around 1992 that some people were still using, and I think that compat was kept).
Thanks for the details. I did give compiling it a try and saw some of the things you mention, but my knowledge of C, pre-standards C, and libc wasn't enough to fully make sense of them. I'll agree it looked better than I expected; I've seen much worse cases of programs only a decade old failing to compile or work (one in Haskell and another involving C++ and Java, although those were much larger programs).
But I don't think it was unfair to use that as an example of early Unix software, or to point out that it was harder than "just compiling it". One defense against my argument could be that a version of 'grep' has been maintained through C versions and operating systems, with source code availability playing a part in that (although presumably GNU grep avoided using Unix source code).
My extremely limited experience with Java is similar; it seems to do much worse than C at compatibility over time. Of course you are right that it is a fair example of early Unix code; I just didn't read the comment you were actually replying to carefully enough :(. Pre-standard C has more issues, so I don't think the comparison is fair vs Windows, but that is not what you were replying to. Source availability gives some additional options that you don't have with only binaries, but I think you are right that without it being open source the use is limited (and potentially negative, like how the BSDs were held back by the USL lawsuit in the early 90s). Useful open source software is likely to be maintained at least to the point of compiling, though it may take a while to break enough for someone to bother (sox is in this middle stage right now, with package systems applying a few security patches but no central updated repository that I know of).
https://github.com/takahiro-itazuri/unix-v6/blob/0316b457acb...