Hurd is late because there isn't any need for it anymore; we now have plenty of free kernels. And GNU isn't sponsoring it for that reason, just like it isn't sponsoring a GNU MTA.
But work continues because a few people are still interested. There's a lot of neat ideas in Hurd. You get a lot of nice things once you move most of the kernel to user-space.
The best example of this is custom user-mounted filesystems (translators). On Hurd you can, e.g., mount a remote FTP site in ~/ftp without being a superuser; this is something GNOME and KDE implement on their own because users can't do it on other systems without special permissions.
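Roughly what that looks like on Hurd, from memory (the exact `settrans` flags and translator paths here are as I recall them from the Hurd documentation, so treat this as an illustrative session rather than something authoritative):

```sh
# Attach the hostmux/ftpfs translators to a node in my home directory.
# No root needed: the translator runs with my own privileges.
# -a = make it active now, -c = create the node if it doesn't exist.
settrans -ac ~/ftp /hurd/hostmux /hurd/ftpfs /

# Any FTP host now appears as a subdirectory:
ls ~/ftp/ftp.gnu.org/gnu/
```

The point is that this is ordinary filesystem plumbing available to any user, not a privileged mount.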
Linux has been accumulating a lot of microkernel-like features over the years: dynamic loading of device drivers, FUSE, etc. In time it'll probably absorb every Hurd feature worth having.
I'm not convinced that Hurd's translator idea is a hack, or that it's doing it in the wrong layer.
That every program that presents FTP on my system (GNOME, KDE, lftp, Emacs/TRAMP) has to implement its own VFS is a hack; so is having to use iptables or another forwarder to run a non-root daemon on a port below 1024, and having to recompile and reboot to enable some minor kernel feature.
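For the low-port case, the usual Linux workaround is a NAT redirect, run once as root; a minimal sketch (standard iptables REDIRECT syntax, port numbers are just examples):

```sh
# Let an unprivileged daemon listening on 8080 answer for privileged port 80:
iptables -t nat -A PREROUTING -p tcp --dport 80 -j REDIRECT --to-ports 8080
```

Which is exactly the kind of root-mediated indirection that a translator-style design makes unnecessary.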
EDIT: The last two paragraphs are redundant because I misread rbanffy's reply, and I oddly can't reply to his new one so I'm putting this here.
This is essentially treating the hypervisor as an exokernel operating system. And it seems to become more common.
Perhaps we are really coming to exokernels in a very roundabout way.
I see exokernels as the future, but I'm slightly biased, as I'm involved with one...
Still an active project as far as I know. Mick Jordan has lately been hacking on Maxine and I've seen GuestVM-related commits in Maxine.
> However, it makes no sense when you start development to try to guess what calls you might end up using.
I agree. That's why you will be using libraries to duplicate what your OS did for you. The Exokernel guys call this a library OS. Libraries are already quite good at providing abstractions.
The difference from a conventional architecture is that the kernel only provides secure multiplexing between applications on the same machine, while the abstractions over the hardware are provided by libraries. It's easier to experiment with libraries than with operating systems.
The MIT-Exokernel-Operating-System page (http://pdos.csail.mit.edu/exo.html) explains this in more detail. If you look at the first picture, and replace `Exokernel' with `hypervisor', you get modern virtualisation.
I am uncomfortable having what amount to sci-fi versions of Unix and VMS to choose from. What would be a cutting-edge, legacy-free OS these days?
Sadly that page is from 1998, but some people are still working on exokernels. You can meet them here on HN.
For instance, the Berkeley people were also working on free stuff around the same time (indeed, the article points out that BSD was a contender for the GNU kernel).
I'm wondering if GNU is to free software what, say, Alexander Graham Bell was to the telephone.
Also, note that Alexander Graham Bell invented the first practical telephone. RMS is an advocate. Much of his software work is specifically not original: he was trying to emulate UNIX. He and GNU are more like founding fathers of the US than inventors of a device. Yes, eventually someone else might have declared independence in the western colonies and gone to war with Great Britain, but we credit GW and company.
The problem with BSD is that it creates (or at least doesn't remove) an incentive to take whatever you can and run with it, an incentive that has proven irresistible for companies. Every proprietary Unix has appropriated large portions of BSD and, with few notable exceptions, none gave improvements back or freely added original work to the common code pool, because their competitors could take whatever you gave them and compete against you with it.
GPL-like "viral" licenses negate the threat by ensuring any code you contribute cannot be used as a competitive advantage against you.
If it weren't for RMS and the invention of GPL-like licenses, I seriously doubt we would have a healthy open-source ecosystem.
No one can tell what would have happened if this were the case. And expressing your personal opinion doesn't change that.
>The problem with BSD is that it creates (or at least doesn't remove) an incentive to take whatever you can and run with it that has proven irresistible for companies.
And what is wrong with that? Developers know what they're getting into when they license their software under the BSD/MIT licenses. It's better that companies take high quality BSD/MIT licensed code instead of reinventing the wheel by creating their own crappy implementation. I don't even recall any successful high profile proprietary fork of popular BSD/MIT licensed software.
This is why all of my friends and I distribute code under a BSD compatible license rather than the GPL. Well said.
You can look at the market and see if you could form a Red Hat around BSD. Call me back when you get funded.
> Developers know what they're getting into
And that's precisely why stuff like BtrFS is not BSD-licensed. Because the following week, Microsoft would launch their new and improved next-generation NTFS. There may be no successful BSD-branded ("ClosedBSD"? "ArrestedBSD"?) fork (BTW, is JUNOS open? I couldn't download the source) but certainly many pieces of BSD software end up inside proprietary software, and nobody knows exactly how those wheels were modified.
I don't really see how monetizing BSD/MIT source code would be different from monetizing GPL code.
>Call me back when you get funded.
I am sure Apple, Microsoft and Adobe make more money than Red Hat. So again, what is your point? Not everything in this life is about money.
>And that's precisely why stuff like BtrFS is not BSD-licensed.
And that's precisely why FreeBSD folk have ZFS and Linux folk don't.
>Because the following week, Microsoft would launch their new and improved next-generation NTFS.
Porting ZFS from OpenSolaris to FreeBSD hasn't been easy. What makes you think that porting another modern and complex file system from Linux to Windows would be easy for Microsoft? And anyway, it would be awesome if we could get native read/write on Windows partitions from Linux.
>many pieces of BSD software end up inside proprietary software, and nobody knows exactly how those wheels were modified.
>>Developers know what they're getting into
Does Apple have a real model that benefits open source/free software compared to the much smaller Red Hat? The only companies that I knew that contributed the same amount to open source/free software as Red Hat were Sun and Google.
But to be fair, Apple has released some open source code (GCD comes to mind) and, as far as I know, has funded some open source software (LLVM, for example).
"I think it's highly unlikely that we ever would have gone as strongly as we did without the GNU influence," says Bostic, looking back. "It was clearly something where they were pushing hard and we liked the idea."
<sarcasm>And no one else would have ever gotten to the task of creating a BSD/MIT licensed kernel, since everyone would have thought "if the Berkeley people didn't do it, why should we?".</sarcasm>
Now he just eats detritus from his foot: http://www.youtube.com/watch?v=I25UeVXrEHQ
Some very good and lasting work got done back then, in spite of our rather unconventional work habits. I'm thinking especially of all the work done laying down the foundation of GNU libc. Roland McGrath got a lot of code rolling and, perhaps more importantly, established a pretty good standard of coding conventions and quality expectations.
I did not myself work directly on the HURD but in our small office I did have chats with McGrath and Bushnell about it. The sentiment around the design was, I think it fair to say, somewhat giddy. The free software movement was (and is) all about freeing users from subjugation to those who provide software. The HURD's microkernel architecture and the structure of the daemons would securely free users from subjugation to system administrators - each user could securely invoke a set of daemons to create the operating environment he or she wished, no special permissions required.
It was well understood back then, and even a point of discussion in academia, that a microkernel architecture posed some difficult problems for performance (related mostly to a greater number of context switches as messages pass between daemons rather than syscalls being handled by a monolithic kernel). Rashid's work had suggested that this problem was not so terribly significant after all. And so, at least to me, it felt like the GNU project was not only doing this shoestring-budget freedom-fighting hacking, but also leading near the bleeding edge of CS research made practical. Well, that was the theory, anyway, and we were mighty proud of ourselves and generally excited to be there.
Not much, but some of the hacking of the core staff took place "from home". You must remember that this was before any kind of data-over-voice or particularly high-bandwidth connection was commonplace, so that hacking was over a modem connected to a text terminal. Mostly we hacked in a shared office which, if you saw it, you'd think "Wow, that's a slightly large closet." We were, at that time, guests of MIT.
With all due respect for RMS, and I don't think he'd especially disagree with this (though I could be wrong): he was an absolutely terrible project leader for the hacking part. As history has shown, his popularity among some notwithstanding, he's extraordinarily good at the political part of his work. Leading the technical project? Not so much.
It wasn't so much that he dictated bad technical choices. Even the choice to use Mach might have worked out. On the contrary, he was relatively "hands off" in most technical matters, only micromanaging if you really dragged his attention to some detail. It was more that he lacked any coherent overall strategy for completing GNU and his broad directives involved underestimations of the amount of work involved and were sometimes scattered, even bordering on inconsistent. It just wasn't his strength.
The original vision for the GNU system, at least as I understood it, was to - sure - grow a unix clone, but then to build a user space that much more closely resembled that of lisp machines. Emacs (with its lisp extensibility) was taken to be a paradigm for how interactive programs might work. Originally, it was even envisioned that the window system would be lisp based.
One early change to the original GNU vision occurred when it became clear that X11 worked pretty well and was here to stay and would be free software. As a practical matter: just use that.
Later, as mentioned in other comments here, the EGCS fork of GCC caused issues - ultimately leading to the displacement of an FSF-appointed project leader. There is some back story to that. The company Cygnus (later acquired by Red Hat, founded by M. Tiemann et al.) had been advertising to customers that not only could they develop customized extensions to GCC, but that they could shepherd those extensions into the "official releases". There was frustration at Cygnus and some other firms that the FSF branch was not merging these changes quickly enough or was arguably being too prickly about the nature of the changes. As nearly as I can tell those sentiments led to the EGCS fork and RMS was ultimately put in the position of having to choose between "blessing" that fork or simply losing any claim at all to the future of GCC.
Around this time, I am told but can not myself verify, RMS was also under pressure from some key FSF advisors or supporters to exit the software development business and focus on the politics. Whatever the motivation, the FSF shed most of its in-house development efforts.
The pattern of losing the original GNU vision continued in the controversy over Gnome vs. KDE. Originally, KDE had licensing issues and did not pass muster with the FSF as being free software. Those problems have since been fixed but at the time it led to RMS' proclamation that Gnome would be the desktop for GNU -- a radical departure from what was originally conceived. Later, as you may have read, RMS came to describe Miguel as a traitor to the free software movement.
Somewhere in there - I'd have to look things up to get the timelines exactly right - Debian took off, in part to try to fill a void in the FSF's leadership at assembling a complete GNU system. Bruce Perens penned the now famous "Debian Free Software Guidelines".
A small group of relatively wealthy influencers, including Tiemann, met with Eric Raymond and conjured up the allegedly business-friendly "open source" notion. The main differentiation they sought from the FSF is that they would not condemn proprietary software or describe themselves as a freedom movement - they sought to emphasize the economic advantages of having volunteers do work for no pay. In my view, their main purpose upon founding was to attempt to politically marginalize RMS (a project in which they've had some success).
Bushnell moved on to a different stage of his life and, I guess it's fair to say, a higher calling. McGrath moved on to what I gather is a sweet job for Red Hat. The GNU project was gutted. Its institutional memory and such momentum as it may have had was gone. This was in part because RMS was not so great as a project leader but also, in large part, because the project was under significant attack.
In my humble opinion, there would be plenty good to come of a resurrection of the GNU project. I don't necessarily mean a resurrection of the HURD although I suspect we can do better than the Linux kernel. I do mean a return to a concentrated effort to build the kind of user space originally envisioned. While such a project could have enormous social benefit, I don't see any way to institute it and find support enough to carry it out.
Then you should look at Smalltalk. Just deploy with the compiler and dev tools in the image. You can quickly visually inspect every object in the image and write a script against it, and run it instantly.
> I do mean a return to a concentrated effort to build the kind of user space originally envisioned.
Can you describe more about what you see as this originally-envisioned user space?
Two aspects were of particular interest to me, although if you ask the HURD developers from back then they could likely point out others:
1. Interactive programs should be uniformly customizable, extensible, and self-documenting - roughly in the manner of Emacs. That sounds like a trite thing. After all, the programs we wound up with have all three traits. For example, Gimp can be customized, new commands can be written in extension languages, and it has a great deal of on-line documentation. I mean something more specific that is hard to convey concisely. Most of Gimp is not written as extension packages, which betrays an architectural weakness in contrast to Emacs. Gimp's interaction and customization models are awkward and ad hoc, compared to Emacs. The on-line documentation of Gimp is not designed in a way such that its enhancement is a natural part of writing extensions. I don't mean that Emacs is perfect in these regards - it certainly isn't. I do mean that the architectural approach it takes is vastly more sane than what the interactive programs we wound up with use. (I don't mean to pick on Gimp. The observations apply as well to the Open Office suite, to Gnome, to Firefox, and more. We have a big, barely maintainable heap of discordant and vaguely conceived functionality.)
2. On a more mundane level, even the command line got horked. You know how (nearly) all GNU command line programs have --version and --help options and so forth? At least when I was working at FSF the notion was that that coding standard was a stepping stone to a shell much more like the much-loved shell on the Tops-20 operating system. The standard was supposed to be gradually refined so that those standard "--help" messages could be automagically used by the shell to intelligently prompt for arguments to a program.
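As a rough illustration of the idea (everything here is hypothetical; this is just a sketch of how a shell could mine the standardized `--help` format to discover a program's options, not anything the GNU tools actually shipped):

```python
import re

# GNU-style help output lists options in lines like:
#   "  -o, --output=FILE   write result to FILE"
# A shell that can parse these could prompt for arguments intelligently,
# somewhat like TOPS-20 command recognition.
HELP_LINE = re.compile(r'^\s+(?:-\w,\s+)?(--[\w-]+)(?:=(\w+))?\s+(.*)$')

def parse_help(help_text):
    """Extract (option, argument-name-or-None, description) triples."""
    options = []
    for line in help_text.splitlines():
        m = HELP_LINE.match(line)
        if m:
            options.append((m.group(1), m.group(2), m.group(3).strip()))
    return options

# A made-up program's help text, in the standard shape:
sample = """Usage: frobnicate [OPTION]... [FILE]...
  -o, --output=FILE   write result to FILE
      --verbose       explain what is being done
"""

for opt, arg, desc in parse_help(sample):
    print(opt, arg, desc)
```

Real GNU help output is messier than this regex allows, of course; the point is only that a uniform format makes this kind of shell-side completion and prompting feasible at all.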
I guess I should add that a lisp - most of us that I knew assumed Scheme - would figure prominently as the main extension language (rather than Emacs lisp). Why a lisp? Well, it had a proven track record and that was our aesthetic -- I'll leave it at that.
Could you be a little more specific about this? E.g. who was involved, or at least how it played out and whom it affected.
And, yep, a good user space of the sort you've described would be great (and is in fact something I'm trying to get myself jump started working on, in a FoNC style: http://vpri.org/html/work/ifnct.htm)
The EGCS schism was, in my view, purely and simply a successful effort to wrestle control over GCC away from the FSF. The FSF sought to develop GCC <i>for the aims of the GNU system</i> but a few firms, and their employees, sought to develop GCC for the aims of those proprietary software firms or the aims of proprietary software firms which were their customers.
The invention of "open source" was, in my view, purely an attempt to disassociate the resource of GNU source code and free software practices from the FSF's freedom mission. To put it crudely, and you can see this even in Raymond's original essays - they sought to recast the movement to give users software freedom into a movement to give business free labor.
After the Linux kernel started to take off there was, additionally, the non-trivial task of assembling complete and supported distributions. Although there was a community process underway that showed some promise (Debian), the firms that took the lead and grew quite large turned their back on that process, kept their system integration work internal and proprietary, and meanwhile solicited volunteers to work on other matters. That is to say that while the community might have been far further along by now than it is at distributing a decent GNU system, those firms fought (and won) to prevent that from happening.
Those are some examples. There are more but I did say I'd be brief.
At least Cygnus took input from contributors and even actually bothered to work on the project themselves. The audacity!
That is their right, under law and under free software licensing terms. Nevertheless, it is a wrong in the sense that they are refusing to help the community that has so benefited them, and are actively attempting to keep the larger community from self-organizing to eliminate the need for that closely held infrastructure. These firms sing a song about the benefits of community cooperation, but they do not practice what they preach. They take free labor from others. They give back labor in areas that are strategic to them. But they withhold labor that would actually advance software freedom in substantial ways.
When I am done building my ferris style open source web and video game empire, I'll be focusing on stuff like that.
"Linus Torvalds has said that if 386BSD had been available at the time, he probably would not have created Linux."
As noted by tl, EMACS suffered a nasty fork, and it bears pointing out that RMS started with a fully functional version of EMACS written by none other than James Gosling of Java fame. RMS replaced Gosling's MockLisp with a real Lisp (both bytecoded), but the guts remained largely the same.
That was a matter of much debate that started to conclude when RMS did GNU Emacs, in 1984 when Common Lisp was also officially released for the first time, the latter's biggest breaking change perhaps being static scoping.
Were you thinking of anything else?
But I also understand that Lisp isn't a family of functional languages. Functional is just a style that's possible and encouraged in Lisp, not required.
Obviously some (many?) implementations of Common Lisp provide it, but I'm not sure it's something you can particularly depend upon (I got out of the Common Lisp community in ... 1984 and haven't seriously used it since then).
Then again, I myself wouldn't necessarily describe Common Lisp as a modern Lisp anymore; it's essentially been frozen in amber since it was standardized and it's filled with legacy cruft, perhaps most especially its being a Lisp-2. Scheme's standardization process has become glacial, but there is one (well, two, RnRS and the SRFIs). Clojure is much like Lisp Machine Lisp in the early days, although it should be past the worst of its breaking changes by now.
There's nothing inherent about Emacs that ties it to being an editor, it could have been rewritten a long time ago in Scheme or acquired a FFI. Then things like GNOME or TextMate might have been rewritten as part of the Emacs ecosystem, while looking exactly like they do today.
Instead Emacs is a very specific thing to a small crowd, and adding things like a FFI to it have been vetoed by Stallman in the past. That's one of the reasons I think that we still don't have Lisp as a first-class systems programming language on GNU as the initial announcement promised: http://www.gnu.org/gnu/initial-announcement.html
GCC is much the same. There was a proposal many years ago to make a libgcc. Stallman vetoed that on the basis that things like C++ support might be written without contributing the changes back to the FSF.
That's a fair point, but LLVM+Clang are quickly replacing GCC in some areas precisely because of GCC's monolithic architecture. You can't (easily) write tools that use GCC as a backend for working with C source code, but you can with the LLVM tools.
So RMS fumbled the execution of what is still an extremely good toolkit, one that could have made the GNU project even better known for its technical quality.
If anything, the GNU project is a tremendous success, even if it is sometimes dogged by its own short-term political interests. If he had paid less attention to his political interests and focused instead on the technical quality of his work, he could have achieved something much greater for the GNU project and his movement.
It's also losing some users like Apple and the BSDs due to the GPLv3 switchover.
I think that in the long term the GNU project will become increasingly irrelevant due to most of their crown jewels being hard to maintain, and there being political opposition to changing that.
I'd support that project. And I suspect that many other people would too.