RMS starts with a grandiose vision, which (at the time of writing) he hadn't even begun. He then asks for time, money, and equipment, before the end of the first paragraph. The very first thing promised, a kernel, has never been effectively delivered.
Linus starts with a modest disclaimer, then asks for feedback as to what other potential users might want most.
Night and day. Yet, could we have had one without the other?
I find it fascinating that RMS made his pronouncement in '83 and Linus in '91, eight years later. At the time (and I was at Sun at that time) 'GNU' was still pretty much just a concept; all of the effort from 1983 to 1991 went into the C compiler (gcc), binutils, and emacs. USL was on the verge of suing BSDi over their BSD project (AT&T had been lobbing threats at the GNU effort for years, especially over the use of the word 'UNIX' in the expansion of their acronym).
Post-lawsuit, RMS wrote his manifesto and the GPL was born, and the various BSD flavors of UNIX were the only UNIX kernels that had had their software provenance litigated.
Into that environment Linus wrote his entirely new kernel, which was 'unix like' but not UNIX at all on the inside (generally the userland felt like UNIX because MINIX felt like UNIX).
And because Linux had a pretty complete history from birth to present (making it possible to trace software ownership), RMS annexed it as the GNU "kernel", trying to get everyone to call it GNU/Linux for a while.
I think you're discounting the state of GNU in 1991 by quite a bit. It's probably true that the bulk of the "effort" was spent on the toolchain (because, well, toolchain), but almost all of the GNU userspace (bash, coreutils, make, flex/bison, etc...) was present and working at that time. It was routine on the proprietary Unix boxes I was working with for someone to have built all the GNU stuff and left it in /usr/gnu/bin for use. Frankly the userspace stuff was already better in many ways (c.f. all the feature/bloat flames, "cat has arguments", etc...) than the proprietary equivalents.
When Linux arrived, it was booting to a working free userspace within months. The FSF had, from their perspective, the right plan. They certainly weren't just a compiler.
I was also at Sun in '91. Michael Tiemann worked there for a while, on gcc.
Prior to my time at Sun, I'd made contributions to gcc, gdb, gas, and emacs (all in support of the Convex architecture). (My former boss from Convex was the president of BSDi, too. Most of the engineers from Prisma, his company between Convex and BSDi, came to Sun when Prisma folded, becoming the early version of Sun's RMTC.)
You do some disservice to rms here. Easy enough to understand (rms is easy to dislike).
By '91, GNU had a compiler and emacs, both of which Linus used to develop his kernel.
Not that this is particularly relevant to anything, but AFAIK even back then Linus was using MicroEMACS, not gnu emacs.
Having said that, I agree that people are generally discounting rms' immense contributions* a bit too easily here.
* (both technical contributions and in terms of just the entire culture of open source, which he really helped shape in a very significant way, and I say that as someone who mostly dislikes the GPL and prefers MIT/BSD style licenses)
I was a student at the time, but it seemed like GCC was the Linux of its day, causing a lot of upheaval in the compiler market. The code GCC generated was so fast compared to what was out there.
My impression, as someone who had a used Sun 3/50 in the early 90s, who later used Linux a lot, and who used various Unix workstations (IBM, Sun, Next, SGI, HP) in college:
For the bulk of Unix user/developers, the big thing was that GCC reintroduced the idea that you should be able to get free dev tools for your Unix OS, during a time when some vendors had decided that you should pay more to get the dev tools in addition to the base OS.
I agree, though it's notable that open source is quite distinct from free software. I recall the beginning of the open source movement well. I would argue that the GPL is Stallman's greatest contribution to programming.
"RMS annexed it as the GNU "kernel" trying to get everyone to call it GNU/Linux for a while."
As dllthomas said, RMS never wanted the kernel to be called "GNU/Linux", he wanted (wants) the Linux kernel with the GNU userland to be called "GNU/Linux".
I see that you link to the Wikipedia page on the naming controversy. "Operating system", at least as RMS uses it, doesn't refer to the kernel; it refers to the kernel plus userland.
And to dllthomas' point, the controversy is the 'it'. Naming things is a pain, and naming computer things is a bigger pain. That was the root of the controversy.
Today most people who use a Linux distribution on their machine will say they run Linux or "distroname" (like Ubuntu or CentOS) on their machine. They don't say they run Gnome 2/Linux Kernel 3.x/gcc 4/gnu utils 2.8. By RMS' reasoning (and he is very precise in this) the "Operating System" is not "Linux", it is "GNU/Linux". They even added a flag for it to uname(1) (-o).
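You can see the distinction directly on a typical glibc-based distro (output illustrative):

    $ uname -s     # kernel name
    Linux
    $ uname -o     # operating system name (a GNU coreutils extension)
    GNU/Linux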
The "Operating System" is a combination of the kernel, the editors, and user tools, and the compilers. Go back and read his original announcement, this is his definition:
"To begin with, GNU will be a kernel plus all the utilities needed to write and run C programs: editor, shell, C compiler, linker, assembler, and a few other things. After this we will add a text formatter, a YACC, an Empire game, a spreadsheet, and hundreds of other things. We hope to supply, eventually, everything useful that normally comes with a Unix system, and anything else useful, including on-line and hardcopy documentation."
By that original definition, we shouldn't even bother calling it GNU/Linux, should we? It's just GNU. I expect taking that position, though, would be quite unpopular.
As the Wikipedia page alludes to, there were at least two camps at the time. For many, like myself, who 'grew up' on the UNIX side of the house, the naming hierarchy was 'kernel->[userland]->window system'. So it was SunOS->SunTools or SunOS->X/News. If you called something a "UNIX" system it meant that the kernel was based on the design architecture of Thompson, Ritchie, et al. from Bell Labs. SunOS, IRIX, System V, and Unicos were all "UNIX" systems even though their userland code and window systems (if they had one) varied. Even in RMS' original statement of purpose, the first component of GNU was a kernel, and some other bits.
So to someone with the same background as mine you could call it Linux/GNU (hierarchy preserved) or "Ubuntu" based on Linux with GNU tools. But Linux as the kernel architecture is at the top of the naming chain, not GNU.
For folks who came up from the DOS/PC side of things the 'window system' was the primary naming tool (in this case Windows). It was PC-DOS/Windows for a while but Microsoft decided to go all out and make "Windows" the brand/trademark they hung their hat on, and we got Windows/DOS, Windows/NT, Windows 98, Windows XP, Windows Vista, etc.
As dllthomas illustrates, the interpretation of the 'it' part was at the root of the disagreement, and which side of the disagreement one resonated with seemed strongly correlated with one's early exposure to naming conventions.
My belief is that the emotion in the discussion came from egos, or people feeling person A's or person B's contributions or impact were over- or under-valued. We can see some of that in the comments here, where people emotionally jump to the conclusion that I'm down on RMS (I'm not), or even partisan in this debate (I'm not that either). There is ample documentation that Linus' goal with Linux was to write a new OS from scratch, kind of like (but better than :-) MINIX. And there is ample documentation that RMS' goal was to create an operating environment that was unencumbered by onerous restrictions. It was fortuitous that Linus' work was so successful, and it was fortuitous that Linus could leverage the work done in the preceding 9 years in making tools available for him to use.
But it was also clear that Linus wasn't specifically supporting the GNU project by writing Linux (although he subscribed to the philosophy). When RMS started claiming it as part of the GNU project, it set up this little tempest. Perhaps the term 'annex' is too emotionally charged to be used for that action.
FWIW, I frequently refer to the system I run as "Linux" - with the fact that there's a sizable amount of GNU code pretty much assumed by the fact that I'm running a Linux kernel and it's a server or desktop and not an embedded device or something. I just think the case for "GNU/Linux" is better than it is frequently made out to be, and (as mentioned) I disagreed about Stallman's position on the issue.
"RMS annexed it as the GNU "kernel" trying to get everyone to call it GNU/Linux for a while."
This is not how I remember it. RMS has not, to my knowledge, ever pushed for the Linux kernel to be called GNU/Linux, but for systems running a combination of GNU software and the Linux kernel to be called GNU/Linux.
When Linus Torvalds was asked in the documentary Revolution OS whether the name "GNU/Linux" was justified, he replied:
Well, I think it's justified, but it's justified if
you actually make a GNU distribution of Linux ... the
same way that I think that "Red Hat Linux" is fine, or
"SuSE Linux" or "Debian Linux", because if you actually
make your own distribution of Linux, you get to name the
thing, but calling Linux in general "GNU Linux" I think
is just ridiculous.[34]
It came up in the documentary because there had been a huge firestorm over RMS saying everyone should call it GNU/Linux. Various allegations of 'usurpery' etc. and generally bad feelings. Linus' counter, which he made in the documentary and which I heard in person at a Usenix conference, was that if the GNU project made a distro (and this is still a valid point) where everything in it was free and wanted to call it "GNU/Linux", that would be great, but trying to change the kernel name from Linux to GNU/Linux is just silly (or ridiculous, as Linus put it).
[34] ^ Moore, J.T.S. (Produced, Written, and Directed) (2001). Revolution OS (DVD).
That's somewhat oblique, easily attributed to misunderstanding (I'm not sure what actual question was asked - it's been a decade since I saw Revolution OS), and poorly thought out: a "Red Hat Linux" system at the time was running more GNU code than Linux code (and overwhelmingly more than Red Hat code).
Every damned other thing in there seems to argue precisely the opposite.
I defy you to find anything Stallman said (as opposed to characterizations by others, which could easily be deliberate or accidental mischaracterizations) that implies a position that a system without GNU code (or with insignificant amounts) running Linux should be called GNU/Linux. If such exists, then I will agree with you that Stallman once held the absurd position you attribute to him.
Edited to add:
"It came up in the documentary because there had been a huge firestorm over RMS saying everyone should call it GNU/Linux."
I agree that there was a huge firestorm over RMS saying everyone should call it GNU/Linux. I disagree with what that "it" refers to.
A Red Hat system was running more code for GNOME/KDE than either coreutils or Linux. Should we call it GNOME/GNU/Linux? Probably, because GNOME was designed from the beginning to be a usable desktop for users. There is a GNU, and it is running on their system, but it is just part of the system they use.
Also, LOC is a really poor metric for comparing the importance of a piece of software for a system. Many applications have far more LOC than the entire OS itself, and you wouldn't call that app more essential than the OS.
The world of computing had changed a lot in the 8 years between announcements: Consider the number of "Americans Online," the massive drop in the cost of computing power, and the increase in the number of computers in homes and offices between 1983 and 1991.
Stallman, a seasoned academic researcher talking to a small community, was quitting his job. Torvalds was a student finishing his undergraduate degree at a time when Unix was widespread, Usenet had become popular, and computing power was ubiquitous enough that writing a kernel made sense.
...starts with a grandiose vision, which (at the time of writing) he hadn't even begun. He then asks for time, money, and equipment, before the end of the first paragraph. The very first thing promised ... has never been effectively delivered.
That sounds like a Kickstarter pitch. RMS was ahead of his time in yet another way.
That the same guy then went on to write both emacs and gcc1 from scratch, mostly unassisted, seems to argue against the theory.
What I find interesting about RMS isn't that his productivity was suspect, but frankly that it just ended in the late 80's. He still writes code, surely, but there's a big wall in his history where he just decided that the code was there and what the world needed was moral evangelism. That's a fairly alien attitude to most of us, I suspect. I certainly could never have been happy with a pivot like that.
More generally the idea that the FSF was just lolling about not getting anything done until Linux landed in their lap is a weird modern fantasy. And ironically it's exactly the kind of thing they tried to treat by publicizing (or "stealing" if you swing that way) the "Linux" brand with the "lignux"/"GNU/Linux" nonsense. They wrote a ton of working code, most of which we're still using today. What they didn't write was Linux.
The GNU utilities, the C library, gcc, all that stuff was just spinning out into the PC world. The software was being ported to everything from Amigas and Ataris with no MMU, and sometimes no hard drive, to Sun workstations. I thought it was pretty amazing that a lot of hobbyists like me were turning our personal computers into little unix-like machines.
GNU was like this sun, and we were starting to move toward it. It was like we were sunflowers looking for sun... and then meteors being pulled by gravity.
For some people, the FSF and GNU project were more important than DOS or the Mac.
Oh please don't get me wrong, Stallman's contributions are immense and shouldn't be ignored.
Was the problem just in the different approaches between Stallman and Torvalds? That being: Stallman wanted to create GNU from the outside in (editors, then tools, then compilers, then the kernel) whereas Torvalds went inside-out?
>Stallman wanted to create GNU from the outside in (editors, then tools, then compilers, then the kernel) whereas Torvalds went inside-out?
A new editor or a new command-line tool was immediately useful to a user of a proprietary Unix.
In contrast, a new kernel would not have been useful to the user of a proprietary application or a proprietary command-line tool unless perhaps pains were taken to give the new kernel binary compatibility with proprietary kernels -- and even then there would have been worries about whether the user had the legal right to run the proprietary tool on the new kernel.
In summary, the order in which Stallman chose to build the pieces was the order that grew the user base the fastest, which in turn grew contributions from developers the fastest, in a virtuous cycle.
Grew the user base of portions of the GNU system the fastest, yes, and was arguably a correct choice for his aims. A number of things contributed to a faster rise of use of Linux compared to Hurd - which then saw the same virtuous cycle.
I think part of the difference in approach can be explained by the arrival of the 386 processor. At the time Stallman makes his announcement, he needs Unix workstation vendors to donate their proprietary hardware for him to work on, because at that time that was the only way to get a running Unix environment. By the time Torvalds makes his announcement, you've got 386 processors widely available, plus Minix as a viable and cheap (but not free in either sense) alternative to Unix running on top of it.
In some ways you could argue Stallman's approach was disrupted by the arrival of cheaper commodity hardware.
I think it boils down to two differences in approach: 1) Linus was more radically open about including others in development of the kernel, and 2) Linus was more focused on "working" than "right" in kernel design. These happened to combine beautifully, as a working kernel meant people using the system and wanting to make it better.
Linus also benefited tremendously from having the GNU toolchain and userspace already available.
Ah, yes he did, except he was obstinate about not using anything fancier than email. As I remember it, though, there were a lot of ports of the GNU tools to all the unices, and the PC was still not as good as the Unix workstations. So where's the motivation to make Hurd, you know?
People weren't really clamoring for Hurd or a kernel for GNU, because they got an OS with their computer, and got the GNU tools and libc. The really major geeks would buy old, used workstations, which were built like tanks and were plenty fast for running single-user unix.
There's also the more obvious difference in design. Hurd was supposed to be a micro-kernel whereas Linux was a much more straightforward monolithic kernel. One requires the inter-operation of lots of systems whereas the other is just one big system that can keep track of everything inside.
I'm pretty sure no one could build a functioning micro-kernel that'd be useful and able to compete at the time the Hurd project was still relevant. Then again, maybe I'm just making excuses for the FSF.
Well, that's more or less what I'd meant by 'focused on "working" rather than "right"' - GNU was trying to do what theoreticians suggested made the most sense; Linus was trying to do what had worked before. Both have pros and cons, but the latter is "safer" and certainly seems to have paid off here.
GNU emacs was begun after the 1983 announcement I believe. The first release was in late 1984. It's true that "emacs" the editor idiom had existed for years, but GNU Emacs the software was begun under the auspices of the FSF and written mostly/entirely in its early days by RMS.
"The last piece of Gosmacs code that I replaced was the serial terminal scrolling optimizer, a few pages of Gosling's code which was proceeded by a comment with a skull and crossbones, ... "
Right, but almost no one uses that original Emacs or the imitations being referred to here: almost everyone uses the Emacs Stallman released after this 1983 announcement.
Side note: It actually turns out that if you announce how to achieve that goal, you're more likely to achieve it. Announcing without a plan reduces the likelihood.
I think HURD's kernel would have had more pressure to deliver something (instead of endless academic bickering) without a serviceable Linux kernel. And GNU userspace has been pretty good!
Or it may have still fallen into endless academic bickering with nothing delivered and the GNU userspace may have therefore failed to take off at all. We'll never know what would have happened!
Linux met the need of people wanting to run UNIX at home, on a PC. To that market, it became available earlier than BSD by a couple of years. And it was free (save for the cost of a long-running internet connection to download it). Were BSD to have been available at the same time, there might (=would) be no Linux. Linus himself has said he would have just used 386BSD.
I'm not sure that demand for running UNIX at home, on a PC existed in 1983. What was the status of the 386 architecture in 1983?
And if it's true that the demand for a UNIX to run on a PC was absent in 1983, then it makes RMS's suggestions all the more interesting. As others have said, RMS was as much ahead of his time as he is weird.
The 386 was not available in 1983... The 286 was released in 1982. Linus was partially working on Linux initially to get to know the processor in his new machine, the 386.
Anyhow, I don't know why a 386 processor is required for people to want to run Unix. Unix was well established as 'the' OS for serious computing, and MSDOS was not a great alternative. Actually, Microsoft released a lousy *nix themselves in 1980: http://en.m.wikipedia.org/wiki/Xenix
So, the idea of running Unix on PC hardware was not unusual and made plenty of sense. Not necessarily for home users, but for small businesses and education professionals, who have been a significant market for personal computers all along.
Really? I thought OS/2 was on the 386. I thought that the 286 had some limit on the size of a process, but you could execute multiple processes. I don't know why, but I think it was 64k per process. Also, the other thing I recall was thinking that it was a nice idea.
I was only familiar with 68k so I'm probably wrong here.
The 286 is a 16-bit CPU so there would have been various memory limits. http://www.os2museum.com/wp/?page_id=313 has some details on what it can do, but not much mention of limits...
The lack of preemptive multitasking didn't mean that people didn't want to run a Unix-like OS at home, if that's what you mean. MSDOS was horrible, and regardless of the capabilities of your machine one would probably rather use something else.
I had Amigas with 68000s, and one with a 68030. AmigaOS didn't have any special capabilities on a 68030 that it didn't have running on a 68000. AmigaOS was a Unix-like OS with full preemptive multitasking that could operate in 256k of address space, on a 68000.
Oh yeah, DOS sucked. I remember Xenix and MKS and some other unixy toolkit. Oh yeah also Coherent. I ran a shell called Gulam on an Atari, and it was nice. Fake pipes were better than no pipes :)
Did all the processes run in the same address space on the Amiga??
The 386 had both protected mode and a large enough address space to make that protected mode useful. The 286 had protected mode as well, but a 16-bit protected mode doesn't let you write programs large enough for it to be worthwhile.
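The rough arithmetic (as I remember the architecture; offsets are 16-bit on the 286 and 32-bit on the 386):

    286 protected mode: 16-bit offsets -> 2^16 = 64 KiB per segment
    386 protected mode: 32-bit offsets -> 2^32 = 4 GiB of flat address space

So the 64k-per-process recollection above is about right: anything bigger on a 286 meant juggling segments.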
Given that I remember when we upgraded from a 286 to a 386, and I was born in 1983, I feel pretty confident the status of the 386 architecture in 1983 was "non-existent". Though certainly there were PCs at that point (wikipedia says original "IBM PC" was an 8086 in 1981, and personal computers predated that). Was GNU even targeting PCs originally, though? I recall that RMS's joke bio (early 80's) said he was recently separated from his PDP, which was a mini-computer.
Yes, non-existent. That's what I was suggesting. As I remember it, a lot happened between 1983 and 1991. I did not own a PC that could run UNIX in '83, but I did in 1991. Perhaps it was the same for Torvalds.
No, I don't think RMS's GNU was targeting "PC's", i.e., non-shared computers, in 1983. Few people would have owned them. My point, perhaps missed by others, was that Linux was aimed at a new bit of hardware that many people had acquired by 1991, the 386 PC.
I live in Brazil. I remember when I went to the local university to grab a 40+ diskette copy of Slackware. A few months later I subscribed to a CDROM compilation (can't recall the name, something like imagemagic) that would mail me a 6-CD case containing many of the early distros every couple of months - back in the day downloading it was too expensive (things have improved a lot, but telecommunications here are still very expensive compared to the USA).
"Individual programmers can contribute by writing a compatible duplicate of some Unix utility and giving it to me. For most projects, such part-time distributed work would be very hard to coordinate; the independently-written parts would not work together. But for the particular task of replacing Unix, this problem is absent. Most interface specifications are fixed by Unix compatibility. If each contribution works with the rest of Unix, it will probably work with the rest of GNU."
This stood out to me. Back in 1983 online collaboration was unheard of, and it was only the incredibly modular nature of Unix which made the project seem at all plausible.
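The interfaces in question are tiny: argv in, bytes on stdin/stdout, an exit status out. Something like this classic word-frequency pipeline (file name hypothetical) works no matter who wrote each stage:

    # each stage can come from a different author (GNU tr, the vendor's sort,
    # a hobbyist's uniq) and the pipeline neither knows nor cares
    $ tr -cs 'A-Za-z' '\n' < announcement.txt | sort | uniq -c | sort -rn | head -3

Replace any one tool with a compatible duplicate and the rest keeps working, which is exactly the property the announcement leans on.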
> Back in 1983 online collaboration was unheard of
Not really. Stallman mentions Chaosnet, which the AI Lab had used for, among other things, internal mail (not yet called "e-mail") between developers about changes to programs. He also mentions UUCP, which was the godawful Unix way of sharing files (and which could be used for mail purposes).
This is really, IMO, why no new community OS has ever taken off. Only clones of existing architectures. I was briefly involved with TUNES many years ago. Getting people on the same page is an immense task, when creating something new and different. You get stuck in the bike shed tar pit. Focusing on a well-specified Unix clone was definitely the right thing to do.
"I am Richard Stallman, inventor of the original
much-imitated EMACS editor."
Stallman may have significantly improved Emacs, but he isn't the inventor. Guy Steele and David Moon are. Stallman only took over development after it had become the standard AI text editor. Stallman wasn't even the first one to implement Emacs in Lisp; Dan Weinreb did it first. "Inventor of the original" makes it sound like Emacs was his original idea.
On that page Guy Steele says "(3) RMS is responsible for the names “E” and “EMACS”. RMS still deserves 99% or 99.9% or 99.99% or 99.999% of the credit for taking a package of TECO macros and turning it into the most powerful editor on the planet, twice (first in TECO and then with ELISP),"
I really dislike the term invent when applied to creating software. Emacs was never invented; it was written (or if you prefer, coded). Text editors were invented long before Emacs. Just iterating and improving on a concept doesn't make you an inventor.
To be fair: "text editors" really hadn't been invented "long before emacs". At the time (the early 1970's) editing text was a subject of much research and experimentation. Differing paradigms were being tried on new and exciting hardware (the glass tty). Screen editors, as they came to be known, were a brand new curiosity -- they were equally disruptive (if not more so) as the "web application" or "capacitive touchscreen interface" would be decades later.
And emacs was one of the very first screen editors. It invented lots of the stuff that would later seem "obvious".
So no: I think it's entirely appropriate to say that emacs was "invented" in the same way the browser was.
It's before my time, but my impression was that both TECO and EMACS originated as teletype line editors (as with ed and ex, the direct predecessors to vi), and were only later adapted to fancy new screen terminals. Am I mistaken?
I likewise was never a user. But the EMACS package of TECO macros was intended specifically to enable screen editing on terminals, as I understand it (though surely there was overlap). Basically EMACS:TECO as vi:ex.
People's thinking about some of these things (questions of terminology, etc.) was still evolving 30 years ago. I'd be interested to see if he still refers to himself as the inventor of EMACS. At the time he may have considered it an alternative way of saying "author" or "creator".
There is also an argument to be made that EMACS was not an improvement over TECO. TECO had a very similar command structure to vi — optional count, command, and optional argument terminated by ESC. (I have not seen any evidence that vi imitated TECO rather than arriving at the same structure independently.)
> There is also an argument to be made that EMACS was not an improvement over TECO.
Full disclosure: I've been using emacs since 1981.
ISTR TECO under TOPS was a PITA.
e.g., from Sec 4-1:
Some characters, like <CTRL/U>, are both regular TECO commands and
immediate action commands. The command string ^Uqtext` enters the specified
text into Q-register q. However, <CTRL/U> typed while entering a command
string is an immediate action editing character which deletes the current line.
Thus you cannot type a <CTRL/U> (or any similar sequence) directly into TECO
as part of a command string. Nevertheless, <CTRL/U> is still a valid TECO
command; should TECO encounter it in a macro or indirect file, it will have its
regular TECO effect (^Uqtext`).
Sure; that's an effect of having a terminal line edit character (^U for erase line) also in use as an editor command. I'm not defending the details of TECO's particular commands.
But it reminds me of the first time I tried EMACS, actually, and my reaction to backspace bringing up a help file.
Hopefully in 30 years people will be able to read such seminal messages. Now that a lot of discussion is happening on proprietary platforms without a standard it may not be the case.
It's already the case with some stuff from the '90s: there was important stuff announced and discussed on CompuServe and AOL forums that has not been publicly archived. There are some personal archives out there, but they are pretty spotty. Often the only surviving (or at least accessible) record comes in the form of occasional quotes in other venues, like an academic paper from the 1990s that included a blockquote and a citation to a now-unreachable network address.
I wonder if GNU Guile fills this requirement. It exists, but isn't common. I've never personally encountered it. Python and Lua seem much more prevalent as extension languages on a typical GNU/Linux system.
This has been a vague goal since the mid-'90s, and given the popularity of elisp etc, would probably be a nice boost for Guile, but the gulf between neat-idea and messy reality has always seemed so vast that I never thought it would actually happen.
[I haven't used Guile since the '90s... it was a bit of a disaster back then, but from what I hear, it's improved immensely (i.e., largely been rewritten) since.]
It is not too late. StumpWM is written in Lisp. If I had more time, I could write a manifesto like RMS's about a system entirely written in Lisp. At least I've started to gather a few files in a git repository.
Try Fedora; they have Guile 2.0 working out of the box, Guile binds C libraries quite easily, and the documentation is decent (and of course under the GFDL). Fedora is pretty much the "GNU/Linux" RMS wanted (except some firmware, which RMS is touchy about). Fedora has very good tools for developers (all libre/open source), and they even have a feature slated for Fedora 20 called dev-assist, which helps set up several development environments and manage them pretty easily.
Man I miss playing this game. The summer of 1990 we had an Amiga Empire tournament at our high school (I was going into grade 11), where a classmate had digitized the world map with Europe, Asia, Africa and Australia, and he had set up a modem so we could call in and play our turns every day.
I learned the hard way that you never, ever want to play as the Levant. After being destroyed by India, Africa and Europe, I restarted in New Guinea and was able to take over Australia. This went a long way toward helping me understand why remote, relatively homogeneous cultures tend to be quite stable.
"GNU will be able to run Unix programs, but will not be identical
to Unix. We will make all improvements that are convenient, based
on our experience with other operating systems. In particular,
we plan to have longer filenames, file version numbers, a crashproof
file system, filename completion perhaps, terminal-independent
display support, [...]
Linux has, eventually, started to fulfil the promise: technologies like cgroups, dm, uevents, kdbus, alsa..., and the respective userspace (systemd, dmraid, lvm, udev, pulseaudio) show that GNU/Linux is not UNIX but better in some respects.
I can’t believe you seriously consider PulseAudio to be a technological improvement. Having used PA for years, I have encountered and reported numerous bugs and performance issues. After trying to debug some PA issues, I lost all respect for Poettering. The only worse sound server that I’ve encountered is AudioFlinger, and at least that has the excuse of being optimized for battery life over latency.
It’s clear that systemd will break compatibility with BSD. What isn’t clear yet is whether this is truly a better system. I have nothing against change. I’m all for Wayland. I just have no confidence in Poettering's work.
Most of the other improvements that you listed have been in BSD for years. Its sound subsystem, OSS4, is also a much cleaner and better-performing API than ALSA.
I agree. And I think that FreeBSD and DragonFly stand above the rest in this respect; both have incredible technology in their guts. Obviously OpenBSD has done wonders teaching people that you can write sane, safe and audited code that works, and NetBSD... Oh well, stagnant despite the really good technology that shows up in there frequently, but it runs on toasters!
Innocent question: what happened to GNU? Lotsa good tools available, but as Minix spawned Linux, which now runs a good chunk of the world, GNU as an OS seems but a legend.
"Linux" is the GNU OS - most of the pieces are the same, and many of them have histories that you can trace back well into the '80s. It's just the kernel that's different, but that's one small piece that interoperates with the rest of it.
There are a lot of parts in a "typical" Linux system these days that are non-GNU, and a lot of GNU pieces have become less central or been displaced in many distros (dash in place of bash by default in Debian, for instance).
This is not to take anything away from GNU, or the far more central role GNU code played in earlier Linux systems; I think it's totally legitimate to say that modern Linux systems are a success of the GNU project and the GNU vision. But it's less true these days that (desktop) "Linux" is "A GNU system with the Linux kernel."
> dash in place of bash by default in Debian, for instance
Dash takes the place of the POSIX binary /bin/sh for performance and memory reasons. The default shell for users remains /bin/bash; this is what a terminal application will run, which is what a “shell” is supposed to be. The /bin/sh thing is not supposed to be viewed as the shell, and its replacement is a non-issue in this context.
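Easy to check on a Debian-family box (output illustrative):

    $ readlink /bin/sh                      # the POSIX sh binding points at dash
    dash
    $ getent passwd "$USER" | cut -d: -f7   # but the login shell is still bash
    /bin/bash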
Hm, I think I recall some install of something that had only dash by default, but I could easily be misremembering. Regardless it has made bash slightly less central, and there are probably other examples of GNU code being displaced.
Originally you said “a lot of GNU pieces have become less central or been displaced in many distros”. I challenge this statement and put it to you to provide examples.
Simply saying “there are probably other examples” will not do. This sounds like simply wishful thinking to me.
So you don't think bash is now "less central"? Off the top of my head, Clang now provides an alternative C compiler. Arguably the windowing system has become more central as there are more users that don't touch the shell, and that has never been GNU. None of this is to say there is not still a substantial amount of GNU code, just that it is a smaller fraction of a typical system than it used to be.
It's by no means wishful thinking - I am a little more a GNU partisan than not.
> The replacement of /bin/sh is not replacing bash. Bash has not been replaced.
Bash has been replaced in its role as /bin/sh. If you're disputing that, I'm confused or baffled. If you're not disputing that, and simply pointing out that most systems are configured to have bash installed and to use bash as the default shell, you're addressing a point I already conceded instead of answering the question I posed: how is it not fair to say that replacing bash with dash in the fairly-central role of "shell which executes system scripts" makes bash "less central"?
It's everywhere on Linux, and even a few other places... quite successful.
Building the kernel is one of the harder parts of that goal, and when Linux took off the gnu kernel lost a lot of steam. That they chose a more difficult design didn't help. But the question is moot now.
Even at that early stage, he realizes that GNU doesn't have to write everything. Having a large enough body of libre-software is the real goal. And who knows, in 10 or 20 years it may be time for a new kernel design?
GNU is alive and well -- the GNU kernel isn't (or has been making very slow progress). The project just diverged from its original technical goals, which is common.
Stallman would say GNU is an OS without a kernel. Debian GNU/Linux would be the complete OS. Someone pointed out on HN recently that many things in userspace on a typical "Linux" box come from non-GNU projects, and yet we don't call it x11/GNU/Linux, or gnu/FreeBSD, or llvm/FreeBSD. So yeah, I sort of see their point.
Especially hard to catch up with Linux when Linux began development in 1991 and Hurd began in 1983. Linux is only -8 years ahead of Hurd in development; you'd think Hurd could have caught up by now.
Linux was well ahead of where Hurd was at the time (and, in fact, ahead of where Hurd is now in terms of practical usability) when Linux attracted those investments (which mostly occurred after 2.0 in the mid-1990s, when Linux was 5 or so years old and Hurd over a dozen years old).
Kind of ironic, in retrospect, is Linus's original comp.os.minix posting announcing the project, saying in part, "I'm doing a (free) operating system (just a hobby, won't be big and professional like gnu) for 386(486) AT clones."
There's been definite progress with HURD recently. I'm not sure mid-90's Linux is still more usable than HURD (though I'm certainly not sure of the converse, either).
> I'm not sure mid-90's Linux is still more usable than HURD
Mid-90s Linux was more usable on then-current hardware than the Hurd is on now-current hardware, though I suspect that mid-90s Linux would be less usable now (on current hardware) than the Hurd.
That's likely the case. I remember some struggles with Linux in those days, but any direct comparison is clouded both by memory (I was pretty young at the time) and by a lack of any direct experience with HURD.
Light years ahead in what sense? I think it's an anachronism from a theoretical standpoint. I'd like to see more people run with newer ideas from Plan 9 / Inferno / Hurd / all the other operating systems that aren't a monolithic 1970s design.
Usability for pretty much any of the purposes for which one uses a computer operating system other than exploring theoretical ideas about how to architect a computer operating system.
> I'd like to see more people run with newer ideas from Plan 9 / Inferno / Hurd / all the other operating systems that aren't a monolithic 1970s design.
So would I. That doesn't make the Hurd any more ready than Linux for any use other than exploring newer OS ideas.
Or even the most ready for such uses (or even for the exploring-OS-architecture use) of the "Plan 9 / Inferno / Hurd" set.
In the sense of being a working kernel you can use as the center of an operating system. I like to see people run with other ideas too. But hurd didn't have that idea, and they aren't running with it. They are stumbling around blindly. Ironically, minix is now light years ahead of hurd, and it is also a microkernel.
Also, plan 9 doesn't explore much of anything in the kernel. It is a plain old boring monolithic kernel. The exploration in plan 9 was in the userland.
I used linux back then too, and I don't recall anything of the sort. IBM was still fighting against linux up until around the time of the redhat IPO. Looking back I can't find anyone referencing any IBM contributions that early.
> Apparently, they are still making progress, but it must be hard to catch up with Linux.
It's probably hard with far fewer active developers, and even harder with the amount of Hurd's development effort that's gone into a series of dead-end efforts to replace the underlying microkernel.
Well, GNU has traditionally been awful at systems programming. They worked for more than a decade on the HURD kernel even after gcc and coreutils were long completed. Eventually, Linus got tired of waiting and created his own operating system using the GNU utils (I so wished he had used the 386BSD tools instead), and HURD pretty much just went away.
The mailing lists are full of spam and progress is at a crawl. It's awful. If you want progress, just use Linux.
In fact, I think he may have once said that if 386BSD had been available at the time, he wouldn't have bothered doing Linux, or at least making it into what it is now.
They are a smaller code base; commands take fewer options and do fewer things. It's closer to the original UNIX userland.
A minority of people prefer these characteristics; history suggests that the majority just enjoys the added features: the GNU userland was widespread before Linux existed, and people today install (parts of) the GNU userland on Mac OS X using fink/macports/homebrew.
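A trivial illustration of the difference in flavor: GNU tools uniformly grew long options and self-identification that the classic userland never had (version string illustrative):

    $ ls --version | head -1    # GNU long options; a classic ls would just error out
    ls (GNU coreutils) 8.32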
For those interested in the history of open-source "UNIX" clones but who weren't in the industry at the time, I stumbled across this last night and thought it worth sharing: http://www.cs.vu.nl/~ast/brown/. Read the "original comment", "follow up", "code comparison", and "rebuttal".
Perhaps a novice question coming from a mostly-Windows user: I thought an OS handled processes, hardware, etc. It can of course be handy to have a linker/editor/compiler shipped with it too, but why is it considered part of the OS? Is gcc/vi/emacs more a "part" of GNU/Linux than notepad is a part of Windows, i.e. just a convenience? Or is Linux unusable without a complete C-based tool chain? I know the term "OS" is a fuzzy one, which is probably what causes my confusion.
My personal opinion, which might not line up with anyone else's: GCC is a 'part' of GNU/Linux because of the goals/aims of GNU/Linux, which is to have a freely modifiable and distributable operating system. If you put yourself in the time when this project was started, there was no free compiler; they simply didn't have one. But in order to meet the goals of the project they would need one (you might not need one to run the system, but you sure do need one to develop it), so it's a very core part of the project.
You don't need a c-based tool chain installed to run a GNU/Linux system or make it usable for a non-dev though, no, Ubuntu doesn't have it by default for example.
Right. Not so much. That was exactly the last thing to happen as it turned out. He and everyone else in the effort worked on the tools and apps.
By contrast:
> I’m doing a (free) operating system (just a hobby, won’t be big and professional like gnu)
> and things seem to work. This implies that I’ll get something practical within a few months
Anyhow, Torvalds had already put in the work on the kernel and wanted feedback.
Torvalds had a kernel and no tools. RMS had tools but no kernel, so the inevitable happened and they were wed. Torvalds even chose the RMS version of a marriage license.
The marriage is fruitful, but RMS is frigid and bitchy the whole time, solely over the name of the child -- even though the child has matured and gone on to a brilliant career they can both be proud of.
This would also be a good time to note that Torvalds is not the one who chose the name of Linux -- usenet chose it.
I have great respect for RMS and admire his courage (and self-discipline) very much, but I really wish he'd let that thing about the kid's name go. Just let it go already Richard.
Well no, Torvalds had a kernel and all the tools GNU had created.
And Linux is (entirely legitimately) the name of the kernel - the dispute[1] is over the name of systems running a bunch of GNU code on top of a Linux kernel.
[1]: Elsewhere it has been contended that RMS has pushed for the name to apply to the kernel itself as well. I don't believe that was ever the case (though will certainly update my beliefs if presented with evidence) but am far more confident that it is not his current position.
http://www.thelinuxdaily.com/2010/04/the-first-linux-announc...