Hacker News
Why Linux Succeeded (riskmusings.substack.com)
51 points by ivanvas on July 9, 2022 | 100 comments



Not to take anything away from Linus, but Linux was also in the right place at the right time. The BSD licensing issues were up in the air at the time and Microsoft had Windows 3.11, which was hardly a technological tour de force. There was a gap that was ripe for exploitation. It could have been Minix that won, or BSD, or the Hurd, but it wasn't, due to various missteps.

There's a lot of stuff out there that's good, but didn't make it for "various reasons". Time and chance happen to all men, as it says in the Good Book.


Don't underestimate the role of the Internet, websites, and web servers in pushing the rise of Linux. Linux took hold not because people dropped Windows on the desktop and adopted Linux, but because of the increasing adoption of servers and related technologies. Simply put, nobody cared about mass-scale use of Active Directory for LDAP or whatever Microsoft was selling as a web server. Linux servers were better positioned to evolve with the rise of the Internet.


ahem What of the free BSD derivatives?

The vision and individual will of Mr Torvalds cannot be discounted as a factor; yes, all the communities around Linux did the work, but they would not have crystallized around a different core.

If M$ hadn't been such pure shits in the late 80s and onward, they could have had a lot of that community themselves. I think, in hindsight, the world may have worked out better for it tho. I can imagine a "source available" xenix based world where M$ plays the role of Apple without the lovingkindness. Imagine a human face being stamped by a "windows update" wait screen... forever.


Had Microsoft taken POSIX support on Windows NT more seriously, Linux wouldn't have mattered.

As proven by hordes of UNIX developers using macOS and WSL.


Whatever happened to that? I've read the articles from the time how the NT kernel was designed to have Win32, POSIX and OS/2 as equal ways to talk to the kernel, via the subsystem architecture. And if you dig deep enough in modern Windows you find things that were probably invented for that (like support for case-sensitive names in NTFS), but the feature seems to have vanished as soon as it launched, only to be resurrected two decades later as WSL1.

I can just imagine Microsoft-internal politics went against the idea?


Yes, it was mostly for government contracts.

Eventually it got replaced by Interix, which was further replaced by SUA until Windows 8.

Naturally, nowadays the relevance of Linux syscall compatibility has grown to the point that even other UNIXes ship Linux compatibility layers, so WSL made more sense than plain POSIX.


I’m curious why Cygwin usually/always gets left out here, it’s been pretty damn good for a long time.


Cygwin is awesome. It feels just like I'm in a VM, but it's not a VM. And since it doesn't require admin rights, it doesn't need the IT department's blessing, unlike WSL.


Because it always provided a halfway experience outside of Windows itself, instead of embracing the whole platform.


Cygwin (which is in fact excellent) was only one of numerous Unix-on-Windows toolkits. Others included MKS Toolkit (from Mortice Kern Systems) and UWin (from AT&T / David Korn, of Korn Shell fame).


There might've been an alternate universe where Sun didn't "do" Slowlaris but instead brought an affordable 32-bit Unix to desktops with SunOS. Linux would've been much less relevant in that world, too.


Indeed, until 2005 I only used Linux at home.

Work was all about HP-UX, AIX and Solaris, the exception being CERN in 2002.

They were slowly transitioning from Solaris into Linux for the clusters, while most researchers were either adopting OS X or Windows 2000.


I will never forgive Oracle for killing the Sun.


They didn't. They extended its life by eight more years and still make a profit from Sun hardware sales today. The alternatives, a sale to IBM or staying independent, were not going to end well either.


Now, did it end well?


Oracle was the best thing to happen to Sun in 2010. You have to understand the market dynamics among UNIX vendors at that point. Look around: there are zero UNIX server vendors any more.


Amazon and Azure and Google etc are UNIX server vendors: what they vend is just for rent rather than purchase.


They mostly run on Linux though, not Unix.


To be fair, IBM still has them. But IBM doesn’t care about AIX much anymore either.


Mostly they have a sustaining team, just like Oracle. Solaris is still supported, but purely in paid sustaining mode for legacy customers until 2034.


Oracle was the only company that even bothered to actually buy Sun assets.

Everyone else let it fall hard on the floor after management ran the company into the ground.

It was Oracle that delivered into production the first UNIX with hardware memory tagging, still the best one at it today.

It was Oracle that made Maxine VM into a real product with GraalVM.

It was Oracle that pushed Java design from its long term stagnation.

Yes there were victims along the way.

Without Oracle, not even the former would have taken place.


Oracle bought Sun because they wanted MySQL. To me, it looks like they noticed everything else in their portfolio only after they bought it.

“Oops. We accidentally bought a CPU design, a complete enterprise OS, a virtual machine, an Office suite and quite a handful of other things. Let’s just pretend we didn’t.”

Or what exactly has survived the sellout?


Nothing would have survived, that is the point.

Java would have died at version 6, never open source (process was finalized under Oracle's management).

GraalVM would never have existed.

Solaris would have died.

All Sun assets would have been taken by the debt collectors and dealt with accordingly.


And now, Solaris has, in fact, died, and other projects left 'Orrible before it was too late. It's all over and gone because Oracle bought it and does not care.

(One could debate whether keeping Java alive was a good thing.)


Still better than dying in 2010.

Solaris licenses are still being sold; then again, Oracle haters tend to prefer free beer to understanding how to keep the lights on in a running business.


> Oracle bought Sun because they wanted MySQL.

You're way off in the weeds.


Can we stop blaming every bit of bullshit on Oracle? Every company is just a paperclip machine optimizing for money (past a not-too-big size). They will all chop off your hand without a second thought if you stick it inside.

Kindness or care from a company only means that pretending to that emotion toward something is financially better than not caring about it, or eradicating it.


If AT&T hadn't been such a PITA in the late 80s, Linux would not exist. BSD became "free" one year too late, sadly.


Linux succeeded because of these reasons:

1) Over time it changed and morphed to include every feature people wanted. It started as a 386-only, single-core kernel, whereas now it supports every processor and thousands of cores.

2) Linux always had the least amount of friction to get stuff working. Back in the early days, trying to get any rando program to compile and work on SunOS, AIX, HP-UX, etc. was a touch-and-go, pain-in-the-ass process. If your system had a compiler, you were probably missing any number of dependencies. On Linux, you could always get stuff to compile and work.

3) Desktop - Yes, the "year of the desktop" hasn't happened. However, the 30+ years of trying has yielded the best options for a *NIX based environment.


> Linux always had the least amount of friction to get stuff working. Back in the early days, trying to get any rando program to compile and work on SunOS, AIX, HP-UX, etc. was a touch-and-go, pain-in-the-ass process. If your system had a compiler, you were probably missing any number of dependencies. On Linux, you could always get stuff to compile and work.

Yeah, the proprietary vendors really shot themselves in the foot with unbundling, or the practice of moving packages out of the base install and into "products" people had to buy separately. Linux distros could be batteries included because of the GNU tools and, to some extent, the BSD tools that got ported, not to mention other ecosystems like X and LaTeX and various browsers, etc.


For those using tarball distributions like Slackware, not really.


Linux and its ecosystem are under the most aggressive EEE cycle from Microsoft in its history. Just look at WSL, GitHub, Python, ...


I would agree that the first two Es are happening. I'm just not sure how they would accomplish the Extinguish step from there.

Microsoft's strategy, as I interpret it, is to embrace Linux as the server OS most will choose, but, to stay relevant, to double down on providing the development and deployment experience. .NET Core makes their flagship backend framework available on Linux, and WSL makes that case easy to test from the comfort of your Windows computer, using Visual Studio. The same Visual Studio that just so happens to integrate well with MSSQL and Azure.

If you aren't an enterprise, there is Visual Studio Code for the languages you like, with even bigger emphasis on making Linux development a smooth experience on a Windows computer, along with extensive Azure integrations if you want them.


The third E seems to be evolving from Extinguish into coexisting directly for Profit.

Decisions and investments made over time while using a cloud platform are their own stickiness.


They learned from the community adoption of Apple products that the majority only cares about POSIX CLI and not really Linux as such.

If they cared, they would be buying Linux OEM hardware instead of giving money to Apple.


Wait, what is MS doing with python?



Perhaps they are referring to Pylance (Microsoft's proprietary Python tools for VS Code)? https://www.reddit.com/r/linux/comments/k0s8qw/vs_code_devel...

IIRC there's a similar problem with VS Code's official C++ and C# tools.


But let's be honest with ourselves:

GitHub is an incredibly good tool, unparalleled even.


Not really. It's easy, but the feature set is rather limited compared to what KDE (the FOSS project I'm closest to) has.


And so is WSL; they have good value. But it's undeniable that they are used as part of an EEE cycle.


GitHub is toxic though, it’s focused on social justice these days. Meritocracy is dying and everyone rejoices, probably because they can’t bring merits.

Has GitHub stopped violating our licenses already?

http://www.mirbsd.org/permalinks/wlog-10_e20170301-tg.htm


5 years ago, sure, but GitLab?


Easy answer.

The GPLv2

Linus Torvalds

—-

GPLv3 would have flopped. BSD results in variants and splinters. As a dictator, Linus made a ton of good calls, and a dictator making good calls is pretty efficient.

The part that makes me laugh is the FSF ignoring the views of their number-one success as they fragmented copyleft with GPLv3. Linus has built THE most successful community collaboration. Instead, the FSF brought in the lawyers.


> The GPLv2

Though this helped a lot, I believe the commandment "Do not change user space" was a much larger factor. In some upgrades of any of the BSDs, some user program will crash depending on what changed in base. On Linux, that hardly ever happens; I have never had an issue with user space on Linux upgrades.


> In some upgrades of any of the BSDs, some user program will crash depending upon what changed in base.

For FreeBSD at least, only in major version releases (12.x->13.x); with minor updates (13.1->13.2) there is API/ABI stability. (Free)BSD also has good kernel compatibility, so you don't have to worry about kernel drivers breaking 'randomly' with each update.

However, if you install the compatibility libraries then even recompiles may not be necessary:

* https://www.freshports.org/misc/compat12x/

* https://www.freshports.org/misc/compat11x/

* https://www.freshports.org/misc/compat10x/

* https://www.freshports.org/misc/compat[4…9]x/


A good point. MS has done well with their compatibility efforts too. Wasn't it "don't break" vs. "don't change"?


Isn't Linux kinda shitty at binary backwards compatibility? Sure, the kernel interface doesn't break, but that doesn't mean much in practice. Good luck running a binary that was compiled one Ubuntu LTS version before the current one, let alone multiple years ago.

Windows does actually have superb backwards compatibility.


yes, you are correct, it is "Do not break user space", thanks.


The other issue is that the FSF is much more focused on ideology than code.

When attempting to be relevant to people who can enrich your product (i.e. hackers), working code is way more important than whether you write letters and a slash before the "linux" part of the name.

Linus is a hacker first and foremost, and ideology takes a backseat. FSF has the cart before the horse.


I'd just like to interject for a moment. What you're referring to as Horse, is in fact, Cart/Horse, or as I've recently taken to calling it, Cart plus Horse.


You think principles unimportant? :)


Different principles is not a lack of principles.


> BSD results in variants and splinters.

Please count the Linux-based systems.


Not OP so not sure what OP meant but I think the reason GPLv2 beats BSD here is that code derived from BSD-licensed code is often taken closed source.

Industry modifications to FreeBSD and Minix (and maybe NetBSD too?) were never sent back upstream or were sent back with a delay.

Splinters in Linux-based systems are still open source so good modifications can be added to upstream.


> Industry modifications to FreeBSD and Minix (and maybe NetBSD too?) were never sent back upstream

The most popular software that was derived from (a number of predecessors and) FreeBSD, Apple’s Darwin, is free software, nobody stops anyone from taking code from there. In fact, quite a few FreeBSD developers are (were?) paid by Apple.


We only learned by accident in 2017 that Minix is deployed on the Intel Management Engine[1] and we don't know what it does. To this day we don't have the code for the Minix variant that it runs. This wouldn't be the case if Minix was released under GPLv2.

Sony also used FreeBSD in PlayStation 4[2] and to my knowledge their modifications have not been open sourced. I may be wrong though.

[1] https://en.wikipedia.org/wiki/Intel_Management_Engine#Design

[2] https://en.wikipedia.org/wiki/PlayStation_4_system_software


At least, the assumption that “nothing” was sent back is wrong. I agree that the BSD license allows to take the code and run away with it, but you don’t have to. In a way, that makes BSD more free than the GPL.


> At least, the assumption that “nothing” was sent back is wrong.

That is true, I should have phrased it more carefully.

> In a way, that makes BSD more free than the GPL.

Yes, BSD is a permissive license and GPL isn't. But the original question was whether GPL or BSD leads to more splinters.

I think that the copyleft nature of GPL leads to fewer "effective" splinters since good modifications can be merged into upstream, and what is left unmerged tends to be less important to the success of the parent project. This has helped Linux gain momentum and unfortunately hasn't much helped the BSDs or Minix. I would like to see the BSDs succeed but I suspect that the permissive license has been one of the impediments to that.


You might take code from there. But you cannot take code from OS X and add it to Darwin. And that is why I cannot just install Darwin and run it on my computers if I wanted to use e.g., network cards.


OS X is Darwin.


> Industry modifications to FreeBSD and Minix (and maybe NetBSD too?) were never sent back upstream or were sent back with a delay.

And the companies that did this paid a price for it when FreeBSD moved forward and they were left behind holding a whole bunch of patches.

After learning the hard way about withholding non-secret-sauce patches, many companies contribute back anything that they find in the 'common code' that wouldn't threaten their value proposition. You will regularly see patches with the "Sponsored by" tag in the commit logs:

* https://www.freshsource.org/commits.php

Here's one from a few hours ago by Netflix:

* https://www.freshsource.org/commit.php?message_id=75ad24775b...

NetApp, Dell-EMC Isilon, Juniper, iXsystems, pfSense, etc. are often seen in base, but also in ports and drivers (Intel, Chelsio, Mellanox).

* https://en.wikipedia.org/wiki/List_of_products_based_on_Free...


In the words of Count von Count: One. One Linux kernel. Ah-ha-ha-ha!

Or at least one that matters.


Which of the several kernels is it? Android’s? Chrome OS’s? The anti-free Ubuntu’s?


ChromeOS developer here. You can see our kernels here: https://chromium.googlesource.com/chromiumos/third_party/ker... (just linked one of them, we use a few)

Everything is open source, we mostly keep up with Linux development and upstream all of our patches but some take time so we have our own branch. Still entirely open source.


The fragmentation here was, I think, a Linus misstep: when wakelocks and other ideas came out, some subsystem maintainers were fussing.

But code talks and talk walks, the android folks were solving real issues - so my own view would have been to be supportive of their efforts which were not insignificant.


All those kernels are legally required to be free by the GPL. Those OSes have "non-free" userspaces.


> Please count the Linux-based systems.

And they all take the same source from upstream and maintain compatibility. Packaging and some user-space dependencies aside, binaries run on any Linux system.


It's irrelevant because they are all GPLv2 and intercompatible.

Distros are just that: different distributions of the (largely same, interchangeable) software. There is no splintering where it matters: legally and technically.

In the BSD world you will have companies like Sony take BSD into their closed source world. The linux equivalent would be some semi-free environment like Android.


So you can reliably install rpm's in debian and vice versa with .deb's? glibc versions? X11 vs Wayland? System-d vs init.d ? Android?


Yes, to all of those, though some take more work than others.

It says a lot that the question was "systemd vs init.d" instead of "systemd vs sysvinit".

The question is not about the ecosystem, though, but about the kernel itself. Yes, there are embedded vendors (mostly networking equipment, and especially routers) who violate the GPL by not publishing sources, but generally speaking, hardware vendors who choose Linux publish their sources.

Those sources may not ever make it into mainline, but the device trees/drivers for embedded stuff do provide a reference for a cleaner, better implementation.

Compare to FreeBSD for the Playstation's management layer. Where are the contributions for Cell support? Where are the patches Sony certainly needed to make for wifi and bluetooth? Where are the patches from Juniper for validated FIPS?


Nvidia graphics cards. Just saying.


I think it's only feasible for nvidia to release GPL-violating closed source graphics drivers because of the leverage that BSD provides them against Linux.


>So you can reliably install rpm's in debian and vice versa with .deb's?

Yeah, just use a chroot. It's the same kernel. You could also just distribute static binaries or have your dynamic libs bundled in a single .AppImage file.
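The "same kernel" claim is easy to see for yourself: swap out the entire userland via a chroot or container and the reported kernel release stays the host's. A minimal sketch, assuming a distro that ships /etc/os-release:

```shell
# The kernel is shared; only the userland differs per distro.
uname -r                             # kernel release: same inside and outside a chroot
grep '^PRETTY_NAME' /etc/os-release  # userland identity: this is what a chroot swaps out
```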

>glibc versions?

Linux guarantees source compatibility, but doesn't guarantee binary compatibility (unlike most proprietary software, like MS Windows).
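One concrete place where the binary-compatibility limit shows up is glibc symbol versioning: a dynamically linked binary records which glibc symbol versions it needs, and an older glibc refuses to load it. A sketch for inspecting that, assuming binutils' objdump and a glibc-based system:

```shell
# List the glibc symbol versions a binary depends on. If any listed
# version is newer than the target system's glibc, the binary fails
# at startup with "version `GLIBC_x.yy' not found".
objdump -T "$(command -v ls)" | grep -o 'GLIBC_[0-9.]*' | sort -u
```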

>X11 vs Wayland?

X11 and Wayland are different protocols, but X11 clients still run under Wayland compositors via the XWayland compatibility layer.

>System-d vs init.d ?

I don't know much about init systems, but I don't use systemd, I use OpenRC and it works fine with "systemd-dependent" software like GNOME or KDE by using elogind[0][1] for example.

>Android?

For android apps, you can containerize an android userspace with tools like Waydroid[2].

[0] https://wiki.gentoo.org/wiki/Elogind [1] https://github.com/elogind/elogind [2] https://waydro.id/


On chroot installs.

Debian's installation manual has a section on how to perform chroot installs. That is, there's a documented process for how to install Debian within another Linux distribution should you choose / need to do that.

It came about because the original author was hoping to try out Debian on a Red Hat system.

It's all userland over the kernel.

These days, even at the process level (or using tools such as jails), it's possible to define largely independent contexts for each individual process. CPU, memory, and kernel are common and/or shared, but other resources can be strongly segregated.


The fact that the article calls Linux a product right from the beginning shows one of its flaws: Unlike other free systems like OpenBSD, Linux suffers from being required to satisfy “the market”. The results: systemd and the choice between being “like Windows” (KDE) or “like macOS” (Gnome).

“In some sense, you only hit what you aim at. What was the goal of the Linux community--to replace Windows? One can imagine higher aspirations.”

- Bill Joy, 2010.


> the choice between being “like Windows” (KDE) or “like macOS” (Gnome).

Or, you know, neither, like Window Maker.


Which I prefer to use myself, but it was born as “like NeXT” (and it still is); so, just another “like an actually successful desktop”.


So what "actually successful desktop" are the tiling window managers like?

They're probably more successful than Window Maker.


I vaguely remember using KDE in the Konqueror days, and Gnome in the early foot days, but I haven't used either for 20 years, using black/fluxbox and more recently xfce


Xfce started as a CDE clone. I wonder what made them resemble Windows instead.


I feel some credit should also go to Canonical and their Ubuntu free CDs in the mid-naughties.


When multiple different people and multiple different circumstances align in just the right way, I do believe it's magical. Literal magic.

My experiences as a person tell me that people and egos and circumstances constantly conspire against goals, ambition, and vision. There's always something that stymies a project. Maybe it's someone's ego. Maybe it's a problem that drains someone out completely. Maybe it's just a bad day for someone.

When networks of people are able to join together and fight entropy enough to sustain the life of a project... Wow. Just wow.

Anyone who added any type of energy into Linux that sustained it for everyone has my admiration.


For my personal use cases, I use Windows and macos.

For running web apps and hobby projects, I always use Linux (Debian and Ubuntu distros).

To me, Linux is the operating system for anything where I only interact with via the command line. This usually means VMs or containers for my web apps and hobby projects. Most cloud environments that I know of (AWS, Linode, Digital Ocean, Vultr, GitHub actions) support Linux.

While I have been curious about trying out other OS's like OpenBSD or CoreOS, my experience with Linux seems to be the most portable. It would never occur to me to even try running a web app on a VM or container that isn't Linux.
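That "command line only" role is exactly what a throwaway container captures. A hypothetical minimal setup (the image tag, file names, and app are illustrative, not anything from this thread):

```dockerfile
# Illustrative only: a tiny Debian-based image for a hobby web app.
FROM debian:stable-slim
RUN apt-get update \
    && apt-get install -y --no-install-recommends python3 \
    && rm -rf /var/lib/apt/lists/*
COPY app.py /srv/app.py
EXPOSE 8000
CMD ["python3", "/srv/app.py"]
```

The same image runs unchanged on any of the cloud providers mentioned above, which is the portability argument in practice.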


>For my personal use cases, I use Windows and macos. For running web apps and hobby projects, I always use Linux (Debian and Ubuntu distros).

It's the same for me. Windows for desktop and Linux for servers.


Looking forward to that desktop success.


I think at this point it's pretty clear that it will never happen. The specialist software on Linux is often great, but those specialists are technical people. Software for business users is light-years behind, and nobody would convince me to use GIMP (I did try) or OpenOffice (also tried) in a business environment.


Governments would be a great beachhead to start increasing Linux adoption, if only the Microsoft kickback schemes to corrupt officials and derail competition could be curbed for good. See for example https://www.washingtonpost.com/technology/2019/07/22/microso...


It's not impossible, but it needs someone to figure out how to monetize desktop Linux so they can then put in the marketing effort and pay a few companies to port big-name apps such as Photoshop, which would give it some momentum and mindshare. A single profitable desktop would then give other companies a good target to port their own desktop software to, which would encourage more users and lead to a virtuous circle.

But without that (and no one has figured out how to do it so far) I agree, I can't see it happening.


Can you share one or two aspects of FOSS office software that caused problems for you?

(from my perspective, it's a modern miracle to be able to run "apt-get install libreoffice" at a command-line and have freely-available, no-license required office software available within a few moments including nearly like-for-like functionality and file-format compatibility with other office suites)


Sure. I'll take my technical hat off and put on 'the average office user' hat: I log in to my Ubuntu account and head to the Ubuntu app store. It's a galore of half-baked apps with a few well-developed gems among them. "apt-get install libreoffice" is great for us geeks, but if I went to my office colleagues and told them to run some commands in the shell, they'd think I'm mad.

I have no doubts that both GIMP and OpenOffice are close to feature parity, but the UX is just not there. The user interface has to be super slick everywhere, because the average user is very spoilt.

One hope I have is that Windows has been going down the drain usability-wise, so hopefully they'll screw things up even worse in Windows 11/12 etc., so the competition can pick up on it.


Thanks for the response!

> if I'd go to my office colleagues and tell them to run some commands in the shell, they'd think I'm mad.

Depends on the colleagues, potentially - I often feel like I underestimate what other people are capable of learning, and that the resulting conversations can seem unintentionally condescending as a result of that (i.e. not preparing and demonstrating what's possible for fear that someone may not understand).

> The user interface has to be super slick everywhere because an average user is very spoilt.

Yep, that makes sense. However, whether I'm an employee, a business owner, an investor, or a partner who wants to see a business succeed: if I learn that the company is spending on software when there are lower-cost alternatives available that are ignored largely due to look-and-feel concerns.. some cognitive dissonance may develop. Especially if the potential cost savings could be pooled with others towards resolving those issues.

(on a potentially more practical note: what I hear from you is that user experience frustration can lead to dissatisfaction with software; I'm not sure what the best routes forward there are, other than encouraging further feedback and finding ways to improve and promote product design in user-facing FOSS)


I've been using Linux on my private machines for 20+ years, so to me, every year is the year of the Linux desktop.

But I also think it's actually a good thing I'm part of a tiny minority. Malware authors target Windows because it dominates the desktop. If suddenly 25% of the world's desktop users switched to, say, Ubuntu and GNOME, that would probably change. Right now, I enjoy all the benefits Windows users get from their desktops, without the drawbacks.


Truly this is the year of Windows on Cellphones!


It isn't, but neither is it for GNU/Linux.

If you are thinking about Android, go have a chat with the Termux guys about how their adoption of Java APIs is going.


> There was a degree of central organization since the project leader, Linus Torvalds, approved kernel changes and set the philosophical tone, but Linux probably could have survived the removal of its leader. It was decentralized enough to have no single point of failure.

I submit that Torvalds' genius could be subdivided into technical heft vs. laissez-faire management.

He made the in-vs-out calls more or less optimally, and leveraged the "libertarian" feel of the GPL, without getting bogged down by FSF ideology.

It was a small target to hit. The pattern might be less difficult to replicate now, but would a Theo de Raadt be able to carry the torch?

Torvalds nailed the timing/location problem.


None of these are why. Linux succeeded because the incumbents were so intent on fighting Windows NT without making real changes that they created a terrible situation for developers and operational staff. Entire companies existed to make multi-vendor Unix frameworks, and the vendors themselves weren't in great shape by the mid-to-late 90s.

Given the choice between standards with varying compliance (there were plenty of these, and so #ifdef IRIX... never went away) and just picking one sub-optimal solution that was available on commodity hardware, the outcome was obvious.

The same situation is playing out in other industries as well right now.


GPLv2

Think about it. Most of the reasons people are giving for "why Linux succeeded" also apply to BSD. And these things have all applied to BSD for a long time.

Maybe it's some variation on the "first mover principle" where Linux just got there first. But I don't think that's a very good explanation.

Personally I think that the license has a lot to do with it.


Substack is awesome to read. Loads fast, great typography, good content. Exactly what the web should be like.


I'm guessing the author wasn't a Linux user, and possibly wasn't alive at the time.



