Hacker News
The SCO lawsuit, 20 years later (lwn.net)
332 points by chmaynard on March 3, 2023 | 254 comments



I was actually the "author" of one of the pieces of code that SCO claimed had been stolen from them - SCO's implementation of the Berkeley Packet Filter.

SCO had a collaboration with IBM called Project Monterey, in which SCO and IBM were merging their operating systems in order to be better positioned for IA64 (which at that point people still thought would be good).

One minor detail of the Monterey deal was that SCO got the right to port IBM's superior networking stack into SCO's operating systems. One part of IBM's networking stack was their implementation of the Berkeley Packet Filter, and that was one of the parts I ported over (I'm pretty sure the file still had a BSD copyright notice on it).

That was back in the days when SCO was a cool UNIX vendor that thought of itself as an extension of UC Santa Cruz. Back then employees couldn't really imagine it turning evil, and it made me wary of what my subsequent employers (Google and Facebook) could turn into.


SCO the "cool UNIX vendor" was not the company that sued. In 2001, the cool UNIX vendor was struggling and sold its UNIX business to Caldera. Caldera renamed itself "the SCO Group" and filed the lawsuit. [1]

[1] https://en.wikipedia.org/wiki/Santa_Cruz_Operation


Let's not forget the following [0]:

> SCO's Linux lawsuit made no sense and no one at the time gave it much of a chance of succeeding. Over time it was revealed that Microsoft had been using SCO as a sock puppet against Linux. Unfortunately for Microsoft and SCO, it soon became abundantly clear that SCO didn't have a real case against Linux and its allies.

In a way this was a proxy war of Microsoft vs Linux.

---

[0]: https://www.zdnet.com/article/sco-linux-fud-returns-from-the...


Microsoft is very very American.


In my mind, I associate Caldera with that Linux distribution that was pretty OK, but nothing to write home about.


Caldera’s big trick IIRC was its installer. You could play Tetris in the installer while it copied packages onto the hard drive. It was also a better installer for new users than a lot of others. I think the desktop experience was pretty polished for the time, too.


They had Minesweeper too (I can't remember whether it was a choice, or whether there were two distro flavors: one with Tetris and one with Minesweeper).


That's brilliant! I wish more installers did that.


Caldera, pre-SCO, was actually a pretty solid 'professional' desktop OS. A place I worked at used it, and we had a bunch of developers and researchers with Windows backgrounds who were using Linux for the first time. They all found it a pretty smooth and polished desktop and development experience overall.


I remember seeing original SCO UNIX in our server room in 1998. They were a legitimate player then, but I also remember all new orders going for SunOS (and later for Sun Solaris).


Internally, SCO used to joke that they were “the most popular operating system you haven’t heard of”.

It was (at that time) the best Unix on cheap x86 hardware, so it was deployed everywhere. In gas station pumps, in cash registers, in ATMs, in back office databases. I think SCO made up the majority of Oracle database instances.

Most people interacted with SCO code on a daily basis but didn’t know.

Of course they rapidly lost the x86 UNIX crown to Linux. At the time I was there SCO could still do a lot of stuff well that Linux couldn’t, but the advantages were rapidly going down and the writing was on the wall.


The original SCO wasn't particularly cool either. I worked for a competing company that got called in to help them when they couldn't get the 286 MMU to work. The deal was that we would get a copy of the resulting code; we got it working, but in the end got zilch.


I can’t speak to what it was like to work with them from the outside, but (at least when I was there) they definitely had a fun internal culture.

Small enough to not have a “big company” feel, yet profitable enough (at that time) to not have “startup panic”. Plus lots of cool hackers, including some of the original authors of UNIX, who had joined through the AT&T deal.


I had to deal with fallout from this. At the time I worked at eBay, and we were not allowed to use any OS except Windows or RedHat Linux. The reason we had to use RedHat? Because we had a paid license from them that included them absorbing all liability from IP claims regarding Linux.

But if anyone was around then, you'd probably remember that RedHat was one of the worst distros of Linux at the time. So we were forced to use an inferior product because it came with IP indemnity. Thanks SCO!


While I agree that the SCO FUD surrounding Linux was real at that time (we all felt it), Red Hat was one of the most respected, functional, and widely-implemented distributions for the entire time SCO was in the news headlines.


It was also a huge pain if my experiences (early-mid 2000s) with Fedora are any indicator. Red Hat and co. were great because, given enough effort, they could be tuned into unshakable systems for any purpose. And you paid for that with your time and energy (or a support contract with Red Hat itself) and occasional frustration at things that didn't work like any other distribution.


> But if anyone was around then, you'd probably remember that RedHat was one of the worse distros of linux at the time.

There's no better microcosm of the early Linux world than this. "Yeah, that giant market-driving lawsuit was bad, I was there too and suffered along with the rest of you. But that's not the important thing: let me tell you about how bad RPM is compared to DPKG! Did you know the underlying archive format was cpio? CPIO!"

That's not intended to be too much of a barb (I was a Debian nut too, FWIW), but really, all that stuff seems pretty silly in hindsight.


> Did you know the underlying archive format was cpio? CPIO!

And what's wrong with that? deb's underlying format is ar, which is basically the same class of simple archives that cpio is.

It solves the problem with two nice benefits:

1. You're not reinventing an archive format, the existing ones are perfectly adequate.

2. If your system is borked enough that you don't even have a working package manager, you can manually extract your package manager from a package file with nothing more than the cpio or ar programs.
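As a toy illustration of that recovery path (using a scratch archive rather than a real .deb, since the container format is just ar; the file names here are made up):

```shell
# A .deb is an ar archive (wrapping debian-binary, control.tar.*, data.tar.*).
# This toy demo packs and unpacks a file with nothing but the ar tool, the
# same way you could pull dpkg's files out of a real .deb on a borked system.
set -e
dir=$(mktemp -d)
cd "$dir"
echo "important tool" > dpkg-bin     # stand-in for a package payload
ar rc demo.deb dpkg-bin              # pack it into an ar archive
rm dpkg-bin                          # simulate the broken system
ar x demo.deb                        # extract with no package manager at all
cat dpkg-bin                         # the payload is back
```

RPM is the same in spirit: `rpm2cpio pkg.rpm | cpio -idmv` unpacks a package with nothing fancier than cpio.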


> let me tell you about how bad RPM is compared to DPKG!

I'm pretty sure the person you're replying to was talking about the distro as a whole, not the package file format.


Yes, but the arguments of that era always seemed to revolve around minutiae like the package manager, or default filesystem, or choice of default desktop.


You're being dismissive of issues that were far bigger than you make them look.

>like the package manager

It wasn't about, for example, dpkg vs rpm, but apt-get versus... nothing, because Red Hat had nothing to solve dependencies. Installing software on Red Hat was a truly hellish experience.

>default filesystem

I have never in my life seen people dismiss a distro because of a default filesystem. I question whether you've truly experienced the early days of Red Hat.

They shipped broken development snapshots of GCC that led to serious compatibility issues:

https://linux.slashdot.org/story/00/10/07/0027218/gccs-respo...

They employed one of the most abrasive and divisive personalities to helm the development of the libc:

https://news.ycombinator.com/item?id=2380062

They were one of the most barebones distros in terms of tooling to configure and manage them. Debian was more fully featured on the terminal-tooling front, while Mandrake and SUSE had a lot of GUI wizards for people more used to the Windows ways.

Even much later, in the Fedora days, when kernel developer Alan Cox questioned the sanity of releasing a completely broken distro (version 18 of Fedora), this is what they had to answer:

>So yeah: in case you didn't get the memo, F18 has a new installer and a new upgrade tool. They are both v1.0s. As in the case of all v1.0s, you may want to exercise some frickin' caution. If you want a Fedora release whose installer and upgrade tools were stabilized over a period of several years and 20+ releases, Fedora 17 is right in the torrent list

Uh, like... okay? But maybe don't release it until, you know, it's not garbage?

And then there's the whole thing with what Red Hat did to take over the whole userland desktop stack with systemd, logind, dbus, pulseaudio, wayland... all of which caused more issues than they ever solved. You still can't make software like AutoHotkey on Wayland. All those pieces of software are heavily interconnected with each other, making it less and less tenable to maintain a distro that doesn't package them.

There are many, many reasons to be sour on what Red Hat contributed to Linux.


> You're being dismissive of issues that were far bigger than you make them look.

You whooshed on the humor above, so I should spell it out. The point is exactly the opposite. This focus on tribal distro warfare obscures the objectively much more important threat posed by this lawsuit to the entire ecosystem.

I mean, no, you're simply wrong. SCO was a far, far, far larger threat to the Linux world than the fact that Red Hat inexplicably dragged their feet shipping yum on their enterprise distro.

And it's important, as a matter of history, to remember that period and the players and the resulting and continuing effects on our culture. While on the flip side, no one cares (or rather: no one should care) about the lessons of apt vs. yum, all of which have been recapitulated a thousand times since. Let it go.


+1 to this. I remember installing software in 1998 on Red Hat and it was absolute hell. It was a hit or miss endeavor and when it failed, it failed badly. All those broken dependencies took ages to resolve.

Then I switched to Debian Potato and never turned back. It was such a clean and sane experience.

Unfortunately, I could not escape RPM hell at work, as Fermi Linux, and then CERN Linux, were both based on Red Hat.


> It wasn't about, for example, dpkg vs rpm

No, it totally was. APT vs. nothing was a big deal, but RPM per se had a lot of mistakes. Debian also had massively better standard tooling around conffiles, diversions, alternatives, etc. They also had publicly-published packaging standards so even out-of-distro debs were often pretty good, while third-party RPMs were a garbage fire.


Nah, RH was very much its own little world. Many of us were forced to use it because they offered a package (indemnification, support) that appealed to the bean counters and lawyers. For a few years they patched _everything_ that mattered (compilers, kernel, core libraries) to a massive extent in an attempt to differentiate and to give the appearance of adding value. I don't have fond memories.


There were several good versions of Red Hat (IIRC 7.3 was pretty good) and then RH9 had NPTL but unfortunately was incompatible with the rest of the world (very custom kernel, userspace, compiler). IIRC the egcs included in that led to all sorts of interesting outcomes.


7.3 was my first Linux distro, I distro hopped for a few years and settled on Gentoo in 2004...

Flash forward to a few weeks ago, I tried installing good ol' 7.3 again and it wouldn't even finish the installer...

I don't remember if there was a trick to it, or whether bit rot and architecture changes make it not installable anymore.


I'm confused. RH9 is the latest release, how is it custom? What are you referring to exactly?


RH 9 is different than RHEL 9.

Just like Windows 11 is different from Windows 98 (98 doesn't mean it's newer).

They went through a rebranding from RedHat Linux to RedHat Enterprise Linux over a decade ago, during or just after the SCO nonsense. Early versions were just called RedHat Linux because at the time it was like any other Linux distro, starting from those early hobbyist roots. RHEL was meant to be stable and have commercial add-ons for big companies.

https://en.wikipedia.org/wiki/Red_Hat_Linux

https://en.wikipedia.org/wiki/Red_Hat_Enterprise_Linux (at the top it notes, not to be confused with Red Hat Linux)

What could be thought of as Red Hat Linux today is the Fedora distro (which it was spun off into): the upstream of RHEL, more frequently released and up to date (and thus less stable), and targeted at the hobbyist or enthusiast user. I'm one of the weird people who still uses it. Once something useful goes through its paces on Fedora, it'll eventually get integrated into RHEL, where reliability is of higher value than the latest features.

I started on RH 5.2 myself, in '99, and literally got it on disc for $40 at BestBuy. I also lived near RedHat's offices (in Durham, NC) at the time and got to visit them once with coworker friends. Exciting time, I was very young, and wish I knew enough to buy stock during its IPO at the time.


No, the latest release is rhel9, not rh9.

At the time it was just “red hat linux” while today it is “red hat enterprise linux”.

The split (and the counter reset) was made to mark the split between a regular distro (fedora core, at the time) and an enterprise-oriented distro (red hat enterprise linux).


Red Hat 9 (https://en.wikipedia.org/wiki/Red_Hat_Linux) was released in 2003. I may be misremembering some details, since I thought RH9 brought in egcs but it was 7.3 that did that.


There was a distro called Red Hat, which more-or-less mutated into Fedora, which is and has always been a different distro from Red Hat Enterprise Linux.


RH 4.2 ran half the planet; where are you getting this from?


From my experience as a Linux sysadmin? RedHat ran half the planet because they had the best sales force and legal team (as exhibited by my example), but they did not have the best technology.


You are not wrong...


Judging from the other people's comments, it looks like RHEL zealots never die. RHEL had plenty of problems, just like all the other OSes and distros.


I think you're romanticizing it. Until RHEL and the LTS releases of Debian/Ubuntu, with most distros you never knew if running an update was going to break something, because there simply wasn't effective quality-control testing in the hobbyist distros. The best you could do was run a version behind, but that hurt if you needed security updates.

There were plenty of people, small highly knowledgeable shops, and academics who thought 6 hours fixing a bug after custom-compiling a patch was fine and normal (and a RHEL subscription at least meant RedHat would have a team doing that part for you if absolutely necessary), but it's not the way companies operated. RHEL at least meant the release was stable and an actual QA team put patches through their paces on various hardware and configurations (especially those enterprise high-end server configs with special SCSI/RAID controllers, high-end network cards, and other chipsets that other distros simply didn't have the means to test on).

The QA/support team wasn't bug reports and guys on Usenet going "it works for me; you should have gotten the exact same hardware I have, or be willing to go through the code, figure it out, patch it, and submit it upstream, like a good user should." Or telling you to go back to Micro$oft if you want support for the storage controller whose kernel module worked fine in the last version. Those were the zealots; the rest were sysadmins with too many other things on their hands to deal with Slackware or whatever the hot distro was on DistroWatch.


My Redhat experience always seemed to devolve into "this package that I want has a dependency that isn't listed yet..." (cue 2 hours of recursively and manually tracking down dependencies on the early web).

But I was a lot younger and didn't know a lot of what I do now, so was probably doing everything RPM wrong.


I had the dependency-spiral issue on every distro (I played with quite a few); compiling with something like Gentoo made it worse. RPMforge existed. I think the Linux experience in general was bad back then, and RedHat actually was one of the least problematic. Until Ubuntu, it was the easiest and most approachable to use for novices.


Using Red Hat before yum meant visiting rpmfind.net and manually collecting what you needed.

In some ways, RHEL is still like that, because popular packages are usually a major version or two behind, if they're there at all. You have to hunt down an EPEL package that has whatever you need.


Yeah, I don't miss the old days. Like another poster though, I remember apt-get being decent while pre-yum RedHat was still pretty bad.

But I also realize my perspective was that of a hobbyist, not an enterprise sysadmin who was probably upgrading to well-known versions through known paths.


I worked with sysadmins who used rpm based distros back then and their experience was basically mine: hunting down the right rpms that both satisfied the constraints and actually worked.


I think you are exaggerating.

Computers of all stripes are more reliable now. In the late nineties I ran apache + mod-perl built from source.

We had a lot of problems, but I cannot recall ever having a problem with those two.

We tested every update, of course, but it was not a huge burden.


If you were running Apache, fine: I think Linux in the late 90s was the workhorse of web servers. That might have been the one thing that just worked. I was using it as a workstation and for everything else. Try getting your window manager to work with your X server and graphics card. Upgrade a package that needs a newer package, that needs another package that no one has built yet, so now you need to custom-compile the library; but if it replaces the existing library, it breaks something else that relies on the older library.

That's in between figuring out how to get things to compile, and the dependencies of dependencies, etc.

I got Linux to work, but it was a love-hate relationship. When I got it working, it worked and worked for months, but I had almost a PTSD reaction when it was time to upgrade anything; I knew what was coming and I was afraid.


You bring back memories. In my early Linux days, every attempt to upgrade an RPM-based distribution led to me needing to wipe the system.

The first time I did a major version upgrade on Ubuntu, I was shocked it worked.


Wrong. Two reasons actually:

1. RHEL was the first distro directed at enterprise deployment (meaning a strong preference for rock-solid stability and predictability over constant churn), which made it the only distro the Dells and HPs of the world recognized and agreed to support.

2. RHEL was created on the legacy of RedHat Linux, which was the best distro for non-hobbyist environments (from reproducible deployments to the breadth of packages available) from 3.0.3 onwards. RedHat JUST. WORKED.


My experience disagrees strongly.

1. Debian always had more stability and predictability than Red Hat in practice. Too much so. The Dells and HPs of the world didn't recognize it because it was not a company.

2. My impression was again that Debian was technically better than RedHat in every way I might care about. We happily installed it at $work, and the experienced Unix sysadmins I knew could use RedHat but didn't like it so much.


> Dells and HPs of the world didn't recognize it [Debian] because it was not a company.

IIRC Debian maintainers formed companies for that reason.

I was a Debian user because it suited me. It was obvious (to me at the time) that suits were going to choose RedHat, and we ran it a bit to have experience.

How wrong I was. Ubuntu made Debian every bit as much a "choice of the suits" as RedHat.


I don't think "Wrong" is a very interesting, helpful, or productive response to someone's lived experience.


[flagged]


Wrong. Two reasons, actually:

1. Culture is a dynamic and malleable thing, and through thoughtful criticism, it's possible to help people not be pedantic assholes on internet message boards.

2. If a post is both unnecessarily abrasive, and doesn't meaningfully engage with the post it's responding to, it adds no value to the conversation, and thus is not worth defending.


Are you stating that someone's identity is an excuse for them being disagreeable?


Isn't that the current definition of neuro-divergent tolerance?


This was pre-DNF/YUM days. RedHat could be a pain, installing stuff from RPMs that had crazy dependencies back in the day.


The first time I installed Linux, I was trying to install Gaim on it.

I managed to compile Gaim before I managed to install it from RPM packages.

Using Debian was a moment of pure epiphany.

Good times.


I remember using Mandrake (later Mandriva); it also used RPM. Dependency hell was an issue there too; it probably had to do with the state of RPM more than the distro itself.


Dependency hell was a thing for sure. At the time I had made a shell script to easily install RPMs by searching the mirrors. It didn't handle dependencies, but it would show what was missing. Looking back so many years later, a big issue was really just a lack of understanding of RPMs, given all the other OSes with RPMs, different architectures, and versions. yum was a welcome change, especially since apt had long since solved dependency issues.


up2date was before then


My main experience with RedHat has been needing to bend over backwards to support its wildly outdated library versions. Because RedHat “supports” operating systems for about a decade, there’s always an argument that a library should be written to support whatever toolchain is provided by the oldest supported RHEL version.

RHEL’s “support” should be seen as “your software will continue to run unmodified when on this system”, but is frequently interpreted as “this is a sane platform for current development”.


For a while, Python binary wheels (packages) for Linux had a standard that was defined, essentially, in terms of RHEL (or rather, CentOS) as a baseline platform for this exact reason.

https://peps.python.org/pep-0513/

https://peps.python.org/pep-0571/

https://peps.python.org/pep-0599/


Current development was better done on Fedora, the upstream of RHEL, I think they were pretty clear that's what it was for. RHEL was for when you needed everything to work no matter what. Fedora was for new development and latest and greatest.


The problem with needing "everything to work no matter what" is that it just doesn't exist. The type of stability that RHEL provides is conditional on never using third-party libraries. RHEL backports security fixes to packages that it provides, but you're on your own for anything else. For example, RHEL 7 was released in 2014, has production support for another year, and extended support until 2026. As libraries are requiring C++17 support today, maintaining compatibility with stock RHEL 7 requires running older versions of a library.


That level of backwards compatibility with legacy versions exists on other platforms (Windows, z/OS, AIX), especially those implemented in enterprise environments. If I have a Windows app written against the Win32 API, it'll run on Windows 11 on a 64-bit architecture. That's what companies want, and that's how Linux became viable for those customers. Even Edge has an IE mode, because that's what enterprise environments need: support for older libraries and APIs. I won't deny RHEL was higher technical debt for sure, but that was the trade-off, because that's what enterprises prioritized: support, stability, and knowledge that an update during their monthly patch cycle and change window for rebooting the server won't break the business applications they rely on for fundamental operations, at least not on the same major release version.

RHEL was never supposed to be the latest; I think at every new major release they'd be on a kernel and library version that was two years behind but had been through its paces. IOW, it was a feature, not a bug, and it's what companies were paying for (their customers were boring legacy Fortune 500s, not startups). As a business model it was solid, and the most successful in the commercial market by far, because it catered to their customers' needs, even if those needs aren't our own. Even Ubuntu Server took pages from their book.


They're talking about stuff that predates RHEL by half a decade.


True; I was more adding it in agreement that RedHat was consistently and intentionally behind the curve.


Windows ran half the planet, where are you getting this from?


These statements are not mutually exclusive. Math checks out :)


Only if MS and RH were the only thing that existed 20 years ago, which we know is false. Sun Microsystems, for one. Debian for another.

And the thing is I was being conservative about MS running half the planet. They had their grubby hooks in everything.


For sure. My comment was in jest.


> Windows ran half the planet

That half of the planet did not run....


That was Debian and FreeBSD not RH.


> Because we had a paid license from them that included them absorbing all liability from IP claims regarding Linux.

Not a trivial thing by the way, if you think about it.


Ahh yes. We also kept our hands and feet inside the roller coaster with SLES and RHEL.

The ever-looming “threat” might be laughable in hindsight had it not stifled innovation at such an early stage of growth.

IT Directors everywhere should atone.


What were the best distros then?


When I was a CS major in 1999-2003, the big names I remember people using were RedHat, SUSE, and Debian, and Mandrake was gaining market share for desktop use. The installfests I went to were all RedHat. There seem to be a lot of complaints here about RedHat after that era, but my memory is that RedHat was the main Linux distro circa 2003.


Mandrake! What a blast from the past. I also remember Gentoo and Slackware from around the same time, though apparently the latter predates the former by a decade or so.



I'm equal parts gobsmacked that funroll-loops is still around, and that it's thanks to Shlomi Fish who is still writing his creepy-ass fanfic.


Slackware in the 90s was amazing. Super stable. I think I continued to run Slackware until around 2002 to 2004 (can’t say for sure but it was to run an early(ish) version of Ableton for laptop DJing - as I was creating a concept set that just wouldn’t have been possible with vinyl alone), and wanted a distro that was a little lower maintenance given the advances to Linux at that point.

I’ll always have a soft spot for Slackware even if I’d never dream of running it any longer.


Aaaaah, so that's why <company I interned at a few years ago> used RHEL!


One thing that the author of this article left out is that SCO was not SCO during this dark period of litigation.

SCO (Santa Cruz Operation) was an x86 UNIX vendor that wasn't great, but enjoyed a lot of market share. I'd estimate they were #2 to Sun in installations, because it ran on commodity x86 hardware. But by the late 1990s, they knew their time was up given the pressure from Linux. When the company was sold to Caldera in Utah, very few original SCO people stayed, and those that did left quickly, because Ransom Love (yes, that was his name) made it very clear the Caldera culture was not the old SCO culture, and that you'd have to relocate to Utah.

Following this, Caldera rebranded themselves back to SCO (The SCO Group) in an effort to convince people they were the same company. After all, they needed all the help they could get.

So, essentially, litigation SCO was not the original SCO. And in many ways, litigation SCO tarnished whatever respect the original SCO had.


Caldera even had a Linux distro for a time, before the lawsuit.

https://en.wikipedia.org/wiki/Caldera_OpenLinux


Fond memories of OpenLinux because (iirc) the installer had off-brand Tetris you could play while it laboriously copied files from CD to HDD. Always thought that was a little funny given the antics they got up to later.


Ubuntu has been including quadrapassel in their installer discs for a long time, I love playing Tetris instead of watching their lame slideshow.


Yes, the days of SCOC were great, but the days of SCOX were dark. I left just before the Caldera takeover.


>But by the late 1990s, they knew their time was up given the pressure from Linux.

They were savaged by Windows NT before Linux became an appreciable factor, which was one of the main arguments against Linux having caused faux-SCO (as an industry colleague of mine liked to call them) economic harm, even if their claims were true--which they weren't.

But also, yeah, charting the corporate identities through that period was hard to keep straight. There was also a branch of Santa Cruz desktop products that ended up with Sun.


And litigation-SCO deliberately tried to confound the two, so that people assumed that litigation-SCO had all the IP rights that original-SCO once had. They didn't, though, as the court case showed.


Interesting side effect of this (to me :-)) is that when Blekko was acquired by IBM I got stuck doing what is called "Blue Washing" where IBM tracks down the licenses, origins, and usages of all the source code the acquired company is bringing to IBM. According to people I worked with when doing this, the entire process and toolset was an outgrowth of the work they did to disprove SCO's claims in the lawsuit.

I found it was both invasive and thorough, and generally engineers didn't like it when the answer was "You have to delete that, we can't verify we have rights to it."


There’s an entire cottage industry around this. As one example:

https://www.synopsys.com/software-integrity/security-testing...

Frequently used for things like internal audits, compliance, due-diligence during financing, etc.

It's remarkably thorough but not always completely accurate: it flagged an open source project I created, which we were using, as GPL. I had to correct them, which was entertaining.


They are almost always inaccurate. What they always are is expensive.


As CTO I generally tracked software and applicable use license(s) across the entire codebase.

My internal sheet more or less lined up with their results, with the notable exception being the project I referenced. FWIR it was something like 400 entries, and I'd put it at roughly 95% accurate, as an anecdotal rough estimate.


That is surprisingly bad. I would expect it to be some sigma level of accuracy for the money they charge. Perhaps I misunderstand the problem.


This was over 10 years ago. Not only is my memory fuzzy I would imagine (hope) it's gotten significantly better since then.

Per usual for these kinds of things what you're really paying for is the name and "trust" associated with it.

In the grand scheme of things when you're raising tens of millions - billions of dollars the cost of these tools in terms of total services across due diligence is nothing. Additionally, tools like this are selected by the investor and you don't really have much say in the matter.

Basically, you use "expensive" $SOLUTION the investor prefers, check the box, and move on to the hundreds/thousands of other due diligence items in the transaction.

Where it gets really interesting (to me) are cases with super-aggressive investors like Tiger Global, and anything in a bubble (crypto), where close times are days and almost no due diligence is performed.

As one example I doubt anyone used these kinds of tools at FTX...


Anecdotally I have observed the same but I don't understand the problem. Why is that? It seems like one could fairly easily search for code in the older project, find it in the new project, and link to it in the report. What am I missing?


I was at IBM around the time of the SCO lawsuit and had to help out with blue washing a few projects. It was also around that time that there had been some high-profile cases of GPL code getting shipped accidentally. We had a tool that basically grepped through all the source for keywords like "Copyright", "License", or "GPL", and then we had to compile a report and get the lawyers to sign off on it. It didn't seem to me to be a very thorough way of proving provenance because it relied on proper attribution.
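A minimal sketch of that style of keyword scan (the keywords and the scratch source tree are illustrative; this is not IBM's actual tool):

```shell
# Toy provenance scan: grep a source tree for attribution keywords, which
# only catches files where authors left proper attribution in the first place.
set -e
src=$(mktemp -d)
cat > "$src/attributed.c" <<'EOF'
/* Copyright (c) 1999 Example Corp. Released under the GPL. */
int main(void) { return 0; }
EOF
cat > "$src/unattributed.c" <<'EOF'
int helper(void) { return 1; }   /* borrowed code, header comment removed */
EOF
# -r recurse, -i case-insensitive, -l list matching file names only
grep -ril -e copyright -e license -e gpl "$src"
```

Only attributed.c gets flagged; the file with its header comment removed sails through, which is exactly the weakness of relying on attribution.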


I wonder if it'll be much easier going forward. Take a suspect piece of code, run it through an LLM to explain it, pass the explanation to another LLM to generate code from the explanation. An automated pipeline for clean room reverse engineering.


As long as you can prove that you had a license to all the code that was used to train the LLM! :)


Hah! It appears that no one training these models is doing that. GPT doesn't generate the correct GPL/MIT/BSD/Apache license information and attribution for the source code it generates from its training set.

All of these code-based LLMs are willfully violating copyright already. It's doubtful this goes anywhere in the courts, however.


Someone would first have to prove you used an LLM, no?


That’s what discovery and depositions are for


IANAL but it seems that if you rely on looking at the suspect piece of code to see what it's doing (even if done by an LLM), it's not exactly clean room.



Personally, the most interesting thing about that event was groklaw. (http://www.groklaw.net/index.php) It is hard now to reconstruct the experience of having a website dedicated to something I cared passionately about. Groklaw is an example, in my opinion, of what the internet could be, should be and isn't. A website that provided a community for people with a common cause.

I don't mean to diminish the importance of Linux at all.


I really miss Groklaw. Sometimes I wish they could come back, the insight of the author and the community around the site was really valuable and allowed me to have a more informed opinion.


One of my greatest white elephant gifts was a set of SCO stock certificates. It was a real challenge to figure out how to spend $20+ on a bit of paper as they were getting close to delisting. Really wish I had kept one for myself.

Miss groklaw... what a loss when that shut down.


This is a hilarious quote, in the article, attributed to Linus, related to the concept of source code pedigree and the developer certification of origin for any contribution.

> For example, in the case of "ctype.h", what made it so clear that it was original work was the horrible bugs it contained originally, and since we obviously don't do bugs any more (right?), we should probably plan on having other ways to document the origin of the code.

The concept that bugs would prove the origin makes me giggle. Because yes, if the code was stolen, it would have been stolen without the bugs included.


Well, mapmakers often sneak errors here and there to catch plagiarists. https://en.m.wikipedia.org/wiki/Phantom_settlement


It works both ways: very similar bug patterns can be a strong indication of copying, can they not?


I would love to get a signed picture of Darl McBride for my bathroom. Because he paid for it. I heard about this lawsuit, called up a couple of IP lawyers, talked to a bunch of programmers, and read everything on groklaw. I called both analysts who covered the stock -- one had a price target of $5, while the other had a price target of $45 (stock was about 20 at the time). Then I shorted it with most of the money I had. It went down and I used the equity to short even more of it. In retrospect, the analysis was correct (i.e. don't sue a company that has more lawyers than you have employees without a good case), but the level of risk I took was insane, and it kept me up late at night for a long time.


OK, that beats my "I made $25,000 on the VA Linux IPO simply because I was an early sourceforge user and got invited to the friends and family program".


I had a buddy in college who made $25K on the VA Linux IPO for the same reason. He used it to pay for a used car and a couple of years' tuition. Good times.


Ours paid for part of a new car. I later ended up working with the people who set all this up (VA Linux employees) and they got nothing because the stock crashed after that.


Eric Raymond (ESR) was a director of VA Linux. I remember seeing on Yahoo Finance in '00, but haven't been able to find again, that on Dec 30th or 31st, 1999 he exercised options for a 'gain' of around $2 million. He was locked from selling until around April. At that point all of his stock was worth around $6 million, which might have covered the AMT tax bill.

Edit: Found the log, not sure if "Aquired via Exchange" was a taxable event. If it is, there are a bunch https://web.archive.org/web/20000617155153/http://biz.yahoo....


I wonder if anybody here can correct my memory.

When Microsoft sold Xenix to SCO, I seem to recall that part of the deal was Microsoft agreeing to not compete in the Unix market. Today, Microsoft is embracing Linux with their WSL stuff and that certainly impacts the Unix market.

Maybe the 1997 agreement between the two companies ended Microsoft's exile from Unix-land or maybe when Caldera bought SCO's Unix business the Microsoft agreement didn't transfer?

And I realize that, unlike macOS, Linux isn't Unix but it does compete in that market.


That sounds like Apple agreeing not to go into the music business in its settlement with Apple Corps, and then just doing what it wanted anyway. Microsoft, being so large, could just do it and settle later.


Back in 1999 Microsoft bought Interix, and marketed it as Windows Services for Unix under their own banner. So the agreement was probably non-binding as of 1999 at the latest.

Microsoft just didn't see Unix, and especially Linux, as competition because they felt what they built was better. Interix/SFU was a compatibility path to migrate Unix devs to Windows NT more than anything; it was more like Cygwin than like WSL.

They figured that Linux, and the dirty hippies that built it, would be easily swept away by Microsoft's mighty hand. Only when the Gates-Ballmer dynasty stepped down would the company be convinced otherwise.


Side note: I purchased Interix and used it to port a million-line C++/X11/Motif/OpenGL app to windows (successfully, although there wasn't really any demand for the result). Must have been '96 or '97. I didn't find it compelling, and the authors of the code decided to rewrite a new system from scratch, using Python and Tk (along with C++ and OpenGL) to be portable to all the commonly used platforms. That worked on Windows, Linux, IRIX, and later Mac OS X for at least 2 decades(!) with very little platform-specific code (except in the Tk dependency). They finally switched to Python/C++/Qt/OpenGL a few years back which I expect will work for another 20 years.


Interix/SFU/SUA was only the last pre-Linux iteration of Unix on top of Windows—a POSIX subsystem was included in NT from the beginning, even if it was practically useless to the point of having no network access (I believe early design documentation explicitly says it’s there to satisfy government contract requirements).


There was definitely a period in the 90s when a lot of people and companies expected, or were resigned to, Windows NT completely dominating both the desktop and the server. Even companies that were probably less convinced, like IBM, had backup plans (Monterey in IBM's case), which factored into the SCO lawsuit.


To be fair, around WinNT 4.0 and Win2k, Microsoft had a much better operating system than Linux. FreeBSD and Solaris were much stronger competitors.


Having worked in a Solaris shop for a while, I still wonder what could have been if Sun had decided to open-source Solaris about 5-10 years earlier than they did. I think Solaris would have been a much better base for the Linux ecosystem than Linux ever was. Though the licensing issues would have had to be resolved for this to work out, and knowing Sun/Oracle, that would never really have worked out...


What's-his-name, the Sun CEO, hated both Microsoft and Open Source.

Even when they open sourced Solaris, after he left, they did it as a half-measure with an anti-GPL license and then I think Oracle pulled the plug and doomed Solaris to the dustbin of history.


McNealy was "just" chairman of the board when OpenSolaris was released. Also, I'm not sure it's fair to describe CDDL as anti-GPL unless you think the Mozilla Public License is as well.

That said, it is reasonable to ask whether Solaris should have simply been placed under the GPL or a permissive license.


I think it’s fair. They went out of their way to make the license incompatible so it couldn’t be mixed with GPL’ed code.

Had they moved sooner to open source, and embraced it instead of trying to create a moat around their stuff, they probably could’ve stayed relevant.


OK, to rephrase that: they licensed OpenSolaris in a way that made it explicitly incompatible with Linux's GPLv2, or at least sowed enough doubt about whether the licenses were compatible.

That didn't scream confidence in their ability to migrate to Open Source or their overall technical superiority, despite some very nice features.


>such a better base for the Linux ecosystem than Linux ever was.

That's so true.

But OmniOS and OpenIndiana still work excellently.


I was surprised by the success of Linux around 2000.

NT and Solaris were actually better, and BSD variants had been around for a while.

Samba and Apache were the driving forces back then. In Germany, SUSE had great influence.

But Linux's success is still amazing.


And what's interesting: if the SCO lawsuit had gone any other way, Windows NT would probably have ended up completely dominating. Linux wouldn't be here, and probably neither would SCO Linux. What a dire state that would be.


I'm pretty sure that even if SCO had all the rights to Unix that it thought it did [0], IBM would have found some way to make the problem go away even if involved buying SCO or effectively paying it protection money. Though from what I've seen of SCO's claims (I co-wrote one of the expert witness reports), it's hard for me to imagine they had much of a chance, Boies or no Boies.

[0] It remains one of the most inexplicable aspects of the whole case to this day whether Novell pulled a fast one, SCO's lawyering was just incompetent, or both.


Yes, but honest question: do you really think that if IBM had purchased SCO (and maybe thrown Novell in there too), Linux would still have had any chance? I don't know what the outcome would have been, but I just think there were too many possibilities at that point for Linux to stay hobbyist without corporate adoption (fearing IBM licensing/lawyering at that point).


At the time of the SCO lawsuit, IBM had already made a huge bet on Linux. [1] Basically, that's why SCO sued them. And they had their own Unix (AIX) which they could have ported to x86 if they had wanted to--but they didn't.

Here's what the person who headed IBM's Linux initiative told me a couple of years back:

And I still remember very well, in December of '99, I called Sam Palmisano, the head of IBM Systems Group. And I said, Sam, the task force recommends that we should embrace Linux. And Sam said, okay, Irving, we will do that. But you have to now come over and run an IBM Linux initiative. And I said to Sam, okay, we were pretty much done with our internet strategy, so I was no longer needed to run the Internet division. And I said to Sam, when do you want to announce it? And Sam said, how about now? And I said, Sam, it's the Christmas holidays. Maybe we should wait until the new year. And in the second week of January of 2000, we made a major announcement saying that IBM would embrace Linux across all of these offerings. And in fact, later that month, in January of 2000, I gave a keynote at LinuxWorld, which was taking place in the Javits Center in New York City, about IBM's Linux initiative.

[1] https://blog.irvingwb.com/blog/2006/01/ibms_linux_init.html


Thank you for taking time to reply. It's awesome to see the insider's view a bit here.


No, Novell didn't pull a fast one. SCO didn't have the money to buy everything they wanted to buy, so the deal was deliberately structured so that they bought less. At the time, both sides knew it. And then most everybody who knew it left Caldera...

And, mind telling us which expert witness report?


I know that the deal was structured in the way it was because SCO didn't have enough money but I still find it bizarre that they entered a high-profile lawsuit with a fundamental misunderstanding of the cards they held. Did they really think that Cravath Swaine wouldn't notice?

They don't have titles as far as I can see. But basically around Unix history, whether it was reasonable that Linux caused SCO's distress (relative to Windows NT), supposed economic damages, credibility of various claims that weren't thrown out because of actual Unix ownership, Project Monterey, etc. My boss's name is on it so don't want to say more than that.


As I recall, Sun Microsystems was fond of reminding everyone that they'd bought and paid for a perpetual, irrevocable license to Unix, so there wasn't and couldn't be any legal issue with Solaris.

I found a commentary (https://landley.net/writing/halloween9.html) that says something similar:

> UnixWare and OpenServer were always minor versions of Unix, the versions belonging to other vendors have always been more important and more lucrative: Sun's Solaris, HP's HP-UX, and IBM's AIX being three surviving profitable examples. All of these companies have purchased "irrevocable, perpetual" licenses to AT&T's old Unix codebase, and will never owe SCO another dime for it.

So, if SCO had managed to kill Linux, then Solaris, HP-UX, and AIX could have stepped in to compete with Windows NT.

One of the reasons Linux was eating commercial Unix's lunch was that x86 hardware was cheaper than RISC workstations and had finally hit a point where it performed as well (and eventually better). RISC systems were on their way out, and commercial Unix companies didn't seem to want to give up that fight.

So commercial Unix vendors would have needed to sell x86 ports of their Unix systems to compete with Windows NT, but as a technical matter, those x86 ports already existed (Solaris x86 certainly did).

TLDR: If Linux had been killed, there would have been other Unix options to compete with Windows NT.


I recall the Halloween memos...


Linux is not unix, by the way.

Besides the obvious things (Linux is just a kernel), the UNIX specification has many mandates in terms of API (for compatibility), in terms of behaviour, and even in terms of commands and utilities (a specification of a vi editor is also in the UNIX specification, IIRC).

So yeah, as long as Microsoft doesn't try to get anything certified as UNIX, they're fine, I guess (but I'm not a lawyer).

Worth noting: macOS is UNIX (every release gets certified, IIRC).


Amusingly, Microsoft used to sell Xenix, which was a certified Unix IIRC.

That predates the SCO lawsuit by about 2 decades.


One of the weird side stories from the SCO saga was that of Dan Lyons, who at the time was a reporter with a little cottage industry in stories praising the work of SCO and predicting IBM’s imminent loss and comeuppance for all the Linux neck beards.

It later turned out he was also behind the briefly-popular “Fake Steve Jobs” blog, and on the strength of that reputation he left journalism to go into tech company marketing, with a brief side job writing for the show Silicon Valley.


Lyons has also made a "career" out of hating on technology and SV in general. Besides writing for the TV show, he's also written several books about why technology and SV really, really suck and are to blame for practically everything wrong in the world (e.g. Lab Rats: How Silicon Valley Made Work Miserable for the Rest of Us). Oddly, he did actually try to work at a high-tech startup on the east coast, and when he was booted out for incompetence, he wrote a book about why startups are just such bullshit, man. I have personally spoken to him, by sheer coincidence and long before any of this came to pass, and he was very interested in working at a startup someday, although he was writing for Forbes at the time. In other words, if you want my take, dude has serious envy and has spent years trying to exorcise it by punishing everyone who has what he covets.


He DID admit he was completely wrong with his predictions, so it's not like he was a blind cheerleader.


Only when forced to.


It's still better than lying.


This article doesn't give Microsoft enough credit for funding the whole thing.

I was going to write that the Internet remembers, but apparently that has its generational limitations. :(


> Microsoft, which had not yet learned to love Linux, funded SCO and loudly bought licenses from the company.

They could go into more detail how SCO vs IBM ended up being a proxy war for larger corporations and interests, but it's a short article and it gets most of the high points.


> but it's a short article and it gets most of the high points.

I disagree. It's a pretty detailed article (~1800 words) and Microsoft's role just gets a passing mention.


Huh, I'd forgotten about that but you're exactly right.


It's a friend now, remember?



Interesting choice of Ballmer as Dr. Evil... I always felt that Bill Gates's speaking voice resembled Dr. Evil's.


Ballmer did crusade against Linux for a number of years.


Even when he was CEO, though, he was Gates's Number Two.


How's that Chrome download page hijacking thing goin'?


With friends like that, who needs enemies?


How different would the history of Linux have been had FreeBSD (and its forks) not been encumbered by lawsuits in the 1990s? Would Linux have gained the mind and market share it did or would the BSDs have won?


> Would Linux have gained the mind and market share it did or would the BSDs have won?

From the horse's mouth:

> Linus: Actually, I have never even checked 386BSD out; when I started on Linux it wasn't available (although Bill Jolitz's series on it in Dr. Dobb's Journal had started and was interesting), and when 386BSD finally came out, Linux was already in a state where it was so usable that I never really thought about switching. If 386BSD had been available when I started on Linux, Linux would probably never have happened.

- https://gondwanaland.com/meta/history/interview.html


I suspect that if Linus had tried to actively contribute to BSD he might have gotten fed up and done his own thing anyway.


I think Linus would have forked it and made LinBSD, and more people would have joined his effort. The Linux community was a lot more friendly and helpful to me than the BSD crowd in the mid '90s.


The BSD community still has its ups and downs today.

Not all the BSD communities are equally friendly.


That's what I was thinking too: when Linux was released there was no viable Unix-like OS for the PC hardware of the day.

Edit: actually there was SCO, but it was costly. So above I really mean "no viable free Unix-like OS for...PCs".


I've talked with a lot of people about this. The general opinion is that it was a combination of lingering concerns from the AT&T lawsuit and, perhaps more controversially, a somewhat difficult to engage with community. However, most of the same people also think that, had Linux not happened, BSD in some form would have occupied the space that Linux came to.

I did a series on this and related questions a few years back. (Podcast and transcript) http://bitmason.blogspot.com/2020/05/podcast-was-open-source...


BSDI was legally in the clear in January 1994 and FreeBSD had an unencumbered version out by the end of 1994. Linux distros were still toys at that point. I think it's fair to say that the lawsuits didn't help BSD, but the lawsuits don't seem like they should have been a decisive reason for Linux overtaking them in mindshare and marketshare.


Linux wasn't a toy in '94: I had a full X11 with emacs, g++, and all the other goodies and easily ported scientific software from the grown-up UNIX in the lab, using it to write my undergraduate thesis. This was using Slackware from 4 floppies. At the time I was offered two pills: Slackware or BSD. I picked linux since it sounded "newer and hotter".

It was a toy in '92-93 and they still didn't have good dynamic linking in '94, and I'd argue things were pretty dicey before glibc (I stopped using debian for years during the bo->hamm transition) but it was usable for production work at the time.

(none of this should be taken as a statement that BSD wasn't in great shape in '94; I had heard of the lawsuit but I don't think it affected my decision to go linux at that time)


In '94 it was definitely not a toy, but it certainly needed a lot of work from the user. But to me (at the time), that's also what made it so much fun. Recompiling the kernel every single weekend. Big part of my childhood.


> Linux distros were still toys at that point.

I had Slackware 1.1 running in 1Q94. It had X11 and you could run Mosaic on it plus all of the shell userland. It was pretty usable.


I installed Slackware 2.1 in the fall of 1994. I had everything working. X11, networking, sound, Mosaic. I compiled Spice 3f4 and used it for my EE circuit sims.

The next year I asked some questions on a FreeBSD usenet forum and was told to buy a SCSI card, SCSI hard drive, and new network card. No thanks. I just stayed with Linux because the hardware support was much bigger.


I wasted so much money on SCSI back in those days. This was the thing that finally killed it for me: https://en.wikipedia.org/wiki/Jaz_drive


I ended up buying a SCSI card in fall 1995 so I could use this. The drive was about $500 but each 650MB cartridge was only $30. I could format it with a ext2 and use it as a regular filesystem. No need to make an ISO image to burn. CD burners weren't cheap yet.

https://en.wikipedia.org/wiki/Phase-change_Dual

This technology was later turned into DVD-RAM which was never as popular as DVD+/-R(W)


FreeBSD didn't support shared libraries while Linux did, and that made a huge difference trying to run X11 on the 386 systems of the time, specifically because of hard disk space required.


Linux's shared libraries were pretty limited in '94. I forget the technical reason, but it had to do with the runtime linker not being able to compute unique addresses for each .so, so each one sort of had to be assigned a "range" of address space that it owned (https://www.linux.co.cr/free-unix-os/review/1994/0914-a.html). Probably a.out vs. ELF.


Yup, that was the limitation of the old a.out format. ELF was a breath of fresh air, but it didn't arrive until after Linux 1.2.


What an interesting question!

I'm a bit glad that BSD didn't "win", because as time has been passing, Linux is getting to be an increasingly bad fit for me. I'm looking at switching to BSD (I haven't yet because that switch will be a lot of work -- I have a lot of machines) as a better alternative.

So BSD looks like an escape hatch to me and I'm glad that it's there.


>because as time has been passing, Linux is getting to be an increasingly bad fit for me.

Same for me; I switched to FreeBSD about 6 years ago.


It's not really a matter of Linux or BSD "winning" in the 90s. It wasn't the kernel and base system that was super important but what you could run it on and what you could run on it. A Free Unix(like) was an important base component but went hand in hand with FOSS software running on top.

With the likes of Apache and Samba PCs running Linux (or a BSD) could replace many thousands of dollars in software licenses for server products for zero licensing dollars. It didn't really matter if the base OS was a BSD or Linux, as long as the services ran on top on relatively inexpensive hardware it was a massive win for a lot of organizations.


IIRC, the BSD lawsuits were mostly about userspace stuff; ISTR the kernel was known to be clean. I wonder how big the license incompatibility was thought to be at the time, such that it prevented releasing a BSD with a GNU userspace. I don't even know if the GNU stuff was complete enough.


I hadn't heard of Darl McBride (CEO of SCO), so I looked at his LinkedIn; this is his side of the story from his time at SCO, as he puts it in his work experience section. I found it super interesting.

  All of my work experience before and after SCO was of the startup/entrepreneurial variety. Fun stuff, new, exciting and positive. Then some former colleagues on the board of SCO (previously Caldera) convinced me to come in and try and turn around a company that had fallen from a billion dollar valuation down to a measly six million. They only had $8 million of cash and they were burning $4 million per quarter so I basically had 6 months to complete the turnaround. 

  What they didn't tell me was the company was in a dispute with IBM over disputed UNIX software code. This led to us filing a lawsuit against IBM, retaining David Boies to represent us, raising $76M to fight the battle, seeing our stock go from $0.66 per share to $22 per share, then falling back down to pennies per share after losing an important trial. 

  IBM teamed up with Linux programmers worldwide to go against me and I showed up on the cover of Fortune Magazine as "Corporate Enemy Number One". Hey, at least you can say that I was number one at something huh? I was as popular in the tech industry as Donald Trump hanging out at an Oscars after party with a bunch of Hollywooders.

  The legal battle is not actually over, 15 years later, the case is in review at the 10th circuit court of appeals in Denver.

  Silver Lining: While the courtroom battles were raging, we started a mobile apps business that I later bought out of SCO with some friends. That is where Shout came from. Back to the worlds I know and love - tech startup tied to sports and entertainment.


And today he's a Marjorie Taylor Greene-retweeting, bankrupt conspiracy theorist. Funnily enough, his personal website <title> is "Billionaire Cheat Codes".


IDK, if I was put in charge of a company losing money like crazy and was genuinely trying to turn it around, my first thought would not be "oh, let's hire David Boies".


I know! For HN readers not familiar with the legal industry, Boies was considered the superstar of the legal world at the time and was well known to demand $50 million retainers.


As I recall, Microsoft put up a ton of money to fund the SCO lawsuit. They didn't hire Boies out of Unix profits.


I waded into this thread to point that out.

Some people are mystified as to why so many long-time free software people don't think Microsoft is suddenly a good actor these days.


My point isn’t that McBride was doing something financially irresponsible, my point is that he’s obviously lying.


He's technically only claiming that turning the company around was his job description when recruited. It could be that this was a lie told to him.


Even if he's getting fed nothing but lies by someone else about how IBM really did steal their source code and winning this is in the bag and will totally turn the company around, and somehow has no ability to discern the truth himself, you still don't hire David Boies to deal with that.


You've decided to throw down with IBM in a lawsuit that could result in billions in damages. You've convinced yourself you're going to win, lies or no. Microsoft is shoveling money at you to do it. And......you aren't going to go hire a gold-plated superstar litigator, the guy who's already spanked Microsoft in court, "to deal with that"?

With all due respect...the hell you aren't.


> gold-plated

In 2003? Boies was probably best known for losing one of the most watched court cases in American history.

He's a great choice... if you want a savvy operator with a weak moral compass to pump your stock. Not so much if you want to win because you believe the company is in the right and has a promising future. (He might do that too, but no better than dozens of other effective lawyers.)


...and he's the (real) bad guy in the Theranos story. [Bad Blood, by John Carreyrou]


It's weird. Boies got a great rep vs. Microsoft representing the Justice department, and obviously for working on the side of angels in Bush v. Gore, but he's also taken on some really ethically questionable clients -- among them, obviously, is SCO, but also tobacco companies. And, not for nothing, he both represented and served as a director for THERANOS.

And that's without even MENTIONING his involvement with Weinstein and the use of Israeli private intel company Black Cube to dredge up dirt on both Harvey's victims and the reporters covering the case.


http://www.groklaw.net/pdf/IBM-835-Exhibit_224.pdf gives a good explanation of SCO's tactics at the time. During the lawsuit, a dedicated server company, RackShack (later EV1Servers), bought a license covering all their users. The owner, Robert Marsh, later said that in the end he felt SCO had been misleading and had provided false information.


His personal website contains gems like this one:

> "Darl McBride will be remembered as one who fought for the rights of the individual against tyranny, for freedom against slavery, for intellectual liberty against herd mentality. In short, he is a Howard Roark (Ayn Rand's protagonist in 'The Fountainhead') for our age."

> Francis Erdman, Blog Kinetic

And other heavily narcissistic quotes. Not surprising that he saw nothing wrong in appropriating other people's voluntary work and calling them long haired stinkies at the same time.

What a fellow :)


If the stock went so low, couldn't IBM or whoever just buy SCO? Or was there not a controlling number of shares available?


The LWN article discusses this point:

> it was widely assumed at the time that SCO's real objective was to prod IBM into acquiring the company. That would have solved SCO's ongoing business problems and IBM, for rather less than the amount demanded in court, could have made an annoying problem go away and also lay claim to the ownership of Unix — and, thus, Linux. ...

> IBM, though, refused to play that game; the company had invested heavily into Linux in its early days and was uninterested in allowing any sort of intellectual-property taint to attach to that effort.


I shit on a lot of what IBM does, but at the end of the day, I will always give them respect for what they did against SCO. If they had rolled over, that would have given validity to SCO's claims, and who knows where Linux would be today.


If you can win, why set that precedent? The negative result for SCO is very useful to the rest of society.


It would have been more interesting to see SCO/Caldera turned into a profitable Unix company, rather than going the lawsuit route. All development of SCO's products just stopped; there was never an attempt to turn the company around.


>retaining David Boies to represent us,

That name!!!

I followed the SCO lawsuit closely through Groklaw and /. back in the day.


During Project Monterey I had to set up all of the services that supported SCO/IBM/Sequent on the IGS/IBM side. A few short years later I was tasked with pulling all of the IBM AIX design docs and source code, going back to the beginning, for discovery.

One has to recall that what sued IBM was Ray Noorda's Canopy Group, aka Caldera, which also levied lawsuits against Microsoft. Far from the ocean view of Santa Cruz.


I was recently bitten by this ghost of the past in the present day! Since the TLA⁺ project is trying to join the Linux Foundation, they've recently started enforcing Developer Certificate of Origin (DCO) signing on all of the repos under the github.com/tlaplus org. The CI workflow started insta-failing on my PRs since none of my commits included a DCO. Fixing this isn't terribly difficult, github gives you a rebase command to run so you re-apply every commit locally with DCO signoff, then force-push. Then in the future you have to remember to run git commit -s -am "commit msg" instead of git commit -am "commit msg". The -s flag adds a DCO message to the commit.
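For anyone who hits the same wall, the fix sketched above looks roughly like this. The repo, name, and email here are placeholders, and on a real repo you'd force-push after the rebase:

```shell
# Throwaway repo to demonstrate retrofitting DCO sign-offs.
cd "$(mktemp -d)"
git init -q demo && cd demo
git config user.name "Jane Dev"
git config user.email "jane@example.com"

echo 'hello' > file.txt
git add file.txt
git commit -q -m "add file"         # no sign-off: would fail a DCO check

# Retroactively add Signed-off-by trailers to existing commits.
# --root re-applies every commit; use HEAD~N to fix only the last N.
# On a real repo, follow this with: git push -f
git rebase -q -f --signoff --root

# Going forward, -s adds the trailer at commit time.
echo 'world' >> file.txt
git commit -q -s -am "update file"

git log --format='%B' | grep 'Signed-off-by'
```

After this, every commit body carries a `Signed-off-by: Jane Dev <jane@example.com>` trailer, which is what the DCO check looks for.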


BSD

The BSDs were hurt worst by all of this. Too many companies were scared to use them due to the lawsuit.

Really unfortunate in hindsight, and Linux forever benefitted.


That causality seems backwards. Linux "benefitted" because it was the target of the lawsuit in the first place, which wouldn't have been aimed at Linux had Linux not already been the market winner. It was the USL v. BSDI lawsuit of a decade earlier (which to be clear absolutely did involve copied Unix code) that effectively killed BSD in the market by holding it back as Linux left the launchpad.


The Linux and BSD cases actually turned out to be pretty similar. There were a handful of cases of literal copying but not to anything close to the degree claimed and pretty much trivial. (And SCO also made all sorts of claims around Linux taking knowledge, including negative know-how, and techniques from Unix that were completely ridiculous.)


And, as I recall, there were allegations that some of the AT&T code was actually Berkeley code originally.


The packet filter, for one thing. Yup.


If you like Star Wars, check out the "bit of fun" that the author linked in the article's comments: https://lore.kernel.org/lkml/20120415121413.111a7461@lwn.net...


>Microsoft, which had not yet learned to love Linux, funded SCO and loudly bought licenses from the company.

I'm disappointed LWN would blindly rehash the "MS loves Linux" propaganda like this. Microsoft still doesn't love Linux. Their whole business model is still based on proprietary software.


I think that line is sarcasm. Corbet has a pretty dry sense of humor


IMO the high-water mark was set in "Debian and the Hot Babe Problem":

The Debian developers raised the obvious, predictable objection to the inclusion of this utility: the associated images were covered by a non-free license.

<https://lwn.net/Articles/113644/>

That was ... also very nearly 20 years ago. The story left its mark.


SCO was just a sockpuppet of Microsoft.

They were providing the financial ammo.


Anybody know anything about the part where SCO actually won? I'm referring to the final event years later where IBM made a settlement payment to them. I assume it didn't make up for the years of effort, but I'd love to know what it was based on.


I think it was based on claims from the Project Monterey contract. In fact, I think it kind of had to be - there weren't any other claims that were still active in the case, IIRC.


Providing legal protection against lawsuits can be a lucrative business. However, I'm not aware of any BSD-based companies that have capitalized on this opportunity. Are there any untold stories?


Microsoft had the same approach to fighting Android: it threatened OEMs with patent lawsuits, but never made public the patents it was asserting. Some of the settlements included agreements to manufacture Windows Mobile devices - an operating system that was clearly on its deathbed.


FWIW, I think Darl may still be playing some games with his bankruptcy filings. Just an opinion, but I'd be curious if he's had multiple recent filings and if any have been dismissed (i.e., failure to disclose, etc.). I took a quick glance at a court docket - I'm no expert, but it didn't look good at first glance.


Here's a choice for you all:

(1) copyright as the IP protection for software, with cleaned up laws and procedures about "insignificant changes" and "derivative works" -- or --

(2) the current patent system, post CLS Bank, which has eliminated a lot of business method patents and some, but not all, software patents.


> (1) copyright as the IP protection for software, with cleaned up laws and procedures about "insignificant changes" and "derivative works" -- or --

I'd go with this. But ideally with a 10-20 year limit on copyright[1].

I do think that there are some software patents (RSA, for example) that seem worthwhile. But the system as a whole seems like it does more harm than good.

---

1. My ideal proposal for copyright is exponentially increasing fees (after a short initial grace period) for a linearly increasing time period. But that seems like several bridges too far.


IP paying property taxes would be nice. Could even let owners set the value, if it were coupled to licensing fees etc.


Note that there are maintenance fees for patents, which are non-trivial.

They're every 4 years, though.


Thanks.

Please note that whatever a lawyer says about copyright law now is usually limited to what the courts have said, plus the latest statutes and what judges are likely to rule.

The topic for today is harder: it's what should copyright law for software say? We're assuming that Congress does its job for once. The lawyers will never do it.

Don't like it that Disney had a forever copyright on Mickey Mouse? Me neither, but that's because copyright law is one-size-fits-all. The proposal is a carve-out where software copyright law is different from literary copyright law.

So property taxes or shorter terms are definitely on point. "Which Unix is better" is not.


Whatever choice "zero patents for software" is. Already, no one shares their code, which would be in the spirit of the patent system. The patent system is designed to keep the design details open and public, but gives a legal monopoly to the holder of the patent. Currently, companies can write software implementations that are completely closed and secret, but still hold a patent for whatever the "thing" that algorithm does. Clicking one button to check out? Patented, and I still don't know what the detailed code implementation looks like from Amazon. The patent itself only shows the most simple, rudimentary steps as block diagrams. The most open software we have is that which is unencumbered by patents!


This is actually the topic of the first of (planned) four posts on software. For you lawyers, it's section 112.


> with cleaned up laws

Which one I'd pick depends a whole lot on what those "cleaned up" laws consist of. But generally, this looks like an exercise in determining which is the lesser evil.


"No IP protection for software at all" is your position, then?


For proprietary software you could have licenses. In fact, the reason that IBM first licensed some of its software for the IBM/360 was that it was unclear at the time whether software could be copyrighted.

However, absent copyright, it would probably be impossible (IANAL but have talked to lawyers about these topics) to enforce any usage provisions on open source software where you haven't explicitly agreed to a license. (Open source licenses basically give you rights that you wouldn't otherwise have under copyright law.)


See above comment: the question is not "what does copyright law say now?" but "is some variation of that appropriate as a replacement for patents?"

In other words, all options are on the table. We're assuming that Congress does its job, for once, and being that they're politicians, they do sometimes respond to public pressure.


Well, it's not just Congress. There's the Berne Convention which, with minor variances, governs copyright in most of the world. The US can do whatever it wants I suppose but it's not as simple as Congress saying: We're going to do our own thing. Who cares about Europe etc.?


I'm not familiar with that.

Congress passed the Sonny Bono Disney Protection Act (being facetious here). Was that before the Convention was signed?


I am not a copyright law expert. But the Berne Convention originally dates to 1886.

There is apparently some flexibility within the Berne Convention on copyright terms. There are also limitations on public domain works in continental Europe that differ from common law countries like the US. However, it does govern copyright in broad strokes. For example, the US used to require that a creator explicitly assert copyright while most of the rest of the world did not.

In any case, the US is generally aligned with copyright in most of the rest of the world. So any broad change in copyright law (other than perhaps somewhat shortening terms) would make it an outlier.


No, that's not my position.

Given the legal frameworks we have available, copyright is the right direction, IMO. Patents seem entirely inappropriate.

But US copyright law is a royal mess and is not really a great tool. It needs to be reformed -- but as the last copyright reform demonstrated, not all reform is an improvement, so my position is that ultimately where I'd fall between those two choices depends on what, exactly, copyright laws look like.


Information deserves to be free.


Is there anything about Unix -- then or now -- which is unquestionably better than Linux? And if yes, are the Linux versions of the features or capabilities a significant enough constraint to warrant using Unix instead? It seems that certain networking and NAS applications are measurably better on Unix (Netflix is still a big user of FreeBSD caching servers, IIRC), but can't Linux close the gap?


Well, that very much depends on the UNIX in question.

If one defines UNIX as POSIX then, for the most part, or at least a pretty good part, Linux IS UNIX, because Linux implements much of POSIX, and in some cases does it better: e.g., certain functions for which POSIX is quiet or negative about thread safety are thread-safe on Linux, and there are almost always thread-safe variants, e.g., strtok_r.

Outside of POSIX, one has to consider OS-specific features, two of which come to mind as superior, though this is a judgement call: Solaris Zones and Mac OS.

I was a Linux desktop user for years, but I found that over time the desktop just started getting in my way (this was tennish years ago, so I am sure things have changed). I'd curse how GNOME or KDE had once again broken my UX, then I'd help my wife or daughter with their Macs, enjoy the experience, then curse when I got back to my machine.

One day, after a particularly frustrating battle with whatever desktop I had then (I’d gone from Ubuntu to Mint to Debian to I cannot remember what, trying to find the sweet spot), I cursed, yelled that I would be back in 90 minutes, and bought my first Mac.

They’ve been a joy to use since. If Apple does something stupid, e.g., stopping at an old, brain dead bash, well, homebrew, et al, to the rescue.

Solaris Zones were wicked. One could get full MAC with labelled networking and restricted root with ease.

It’s finally possible to get close to Zones with a mix of capability management, namespace management, SELinux, et al, but not as easily.

So, yeah, there are individual UNIXisms that are better than Linux, but overall, at least on the server side, Linux is UNIX improved.


I've been using XFCE for about 15 years. It feels like nothing has changed during that time and I think that is one of its best features.


That's a major benefit of numerous older desktops.

XFCE is one of my own fallbacks (I usually prefer WindowMaker), though fvwm, twm(!), the boxes (open-, black-, flux-, hacked-, etc.), tiled WMs, etc., are all perfectly serviceable.

Some may look vaguely dated, but tend to be rock stable and blazingly fast.

XWinMan is still live, I find: <http://www.xwinman.org/>


I'm not sure what window manager was on AIX in 1991 on an RT. Maybe it was twm. That is the first time I used Unix / X11.

When I started college in 1993 our default environment was mwm but I quickly switched to vtwm and had a really cool setup. I switched to AfterStep because I loved the NeXTSTEP look and used that and WindowMaker for 10 years before moving to XFCE.


Some Unices come with things that Linux can technically do, but they were built-in, not bolt-on.

ZFS is an example of one that's recognizable, you can bolt it onto linux and it (now) works quite well, but it was integrated much more into Solaris.

There are also Unix platforms that fully support hot-swappable just about everything, which you CAN do with Linux but it's not as simple.

Most of that stuff is gone by the wayside now, Linux is "good enough".


I seem to recall that Sun put ZFS under a license that has been deemed incompatible with the GPL by pretty much everyone except Canonical, and so that it can't be integrated with the kernel but must remain a "bolt-on" for legal reasons.

It won't surprise me if one of these days Oracle decides that Canonical is a ripe lawsuit target over ZFS.


Why would they if they haven't already? It's not like Canonical has a lot of money. But, yes, anyone else who Oracle might actually elect to sue won't go anywhere near ZFS.


Likely because they want to keep that powder dry until someone with deep pockets gets involved, Microsoft, or IBM or someone.

Because there's a moderate chance that it goes against them.


Btrfs has also been improving. Certainly not for all use cases but it's the default for desktop Fedora now and some Synology NASs use it, among others.


On one hand ZFS has held down btrfs because "why bother" when you already have something that does much of what you want, but on the other hand it shows what can be done.

I hope btrfs becomes quite stable and usable.


Yeah, but even if you bolt it on (which Ubuntu does for you; on any other distro it's easy to do), it's still a "bolt-on" that doesn't integrate as fully as it could if it were the native filesystem, allowing full system rollback, snapshotting, boot management, etc.

It's similar to the "polish" available on macOS or Windows: Linux has all the pieces, but you have to assemble them yourself, even today.


Oracle itself deploys DTrace with its Oracle Linux.

That whole CDDL/GPL incompatibility is BS.


This would be a more convincing argument if your facts weren't over half a decade out of date. DTrace for Linux was relicensed under the UPL (with kernel components under GPLv2, just like the kernel) way back in 2017.

(speaking as the guy who wrote the scripts to change all the license text, though the actual relicensing was due to a lot of work on my boss's part.)


Well, true, but Oracle Linux included DTrace for more than half a decade.

And Canonical includes ZFS, and I think they have more lawyers than you and me.

Do you really think they would take that risk, especially with Oracle, to include ZFS (which is not even their main fs)?

Have some common sense.


> Is there anything about Unix -- then or now -- which is unquestionably better than Linux?

Back when, Unices were better at real multiuser work, while Linux was much faster at all sorts of I/O- and CPU-bound things in the personal-computer, single-login context. Like massaging massive amounts of text data (which everybody was doing to webify it): Linux was blindingly fast, but don't try to do much else at the same time, or it would bring your machine to its knees.


interesting but not what I asked


It's a heady picture because there were many Unixes. Most were me-too and fairly generic for the most part, but some had novel capabilities in specific areas: IRIX for graphics, Solaris for Zones and ZFS, but none of them had everything. Often what was special was down to proprietary hardware.


Wow, thanks for the blast from the past, SCO were my first Unix servers along with IBM AIX ironically, a long time ago in a galaxy far, far away...


The use of acronyms without definition is exhausting. A three letter one which has multiple uses all the more so.


SCO? It's the name of the company. Like IBM.



As someone who works building technology for commodity trading in the energy industry... you get used to it. Kinda.


Thanks, now I feel old.


Amazing read.


I missed the SCO v. Novell verdict by 5 minutes. Got a heads-up phone call from a Groklaw contact and ran for my car, but didn't quite make it.



