We did contribute to the floppy driver. The original driver read one block at a time, incurring a rotational latency between blocks. I had had enough of this slowness with SCO Xenix, so I added a track buffer to speed it up on Linux. My friend wrote the original generic SCSI driver, to support a film scanner.
I think we may have made one of the earliest products using Linux. We worked for an entrepreneur who started a business selling medical image archiving and teleradiography equipment. In those days, CT scan machines and MRI machines had no networking. To get images from them you had to capture the image sent to the console screen or use the 3M laser camera parallel interface. (My job was to make cards for this capturing.) Image transfer was over phone lines using sz/rz and Telebit Trailblazer modems.
Anyway, at some point one machine did have networking (I think it was some proprietary interface, not DICOM). The problem was that Linux did not yet have a networking stack. The solution? Use "KA9Q", the amateur radio TCP/IP stack, in userspace for this.
At some point Pat Volkerding (Slackware) also worked with us. I remember he was a big deadhead at the time, and wrote Slackware while living in his parents' basement. We were half installing / half developing a teleradiography system after hours in Eastern Long Island Hospital.
Then I jumped in the library and got myself a manual based on UNIX System V R4, and I'm still amazed that 15 years later all the commands still make sense (BTW, a link to a PDF copy of the SVR1 manual was posted here on HN a few days ago - I _so_ recommend it to anyone starting out).
Dissatisfied with RedHat, Mandrake and SUSE I fell in love with the minimalist Slackware - that was my first true love, the one you will never forget.
Good memories, made of long nights recompiling the kernel, failing and wondering why nothing boots anymore, giggling when modprobing the module for the wireless device, and dreading leaving the home server up at night with port 22 open (did I harden it enough?).
If you want an OS which doesn't require reboots, OSX is the answer.
But seriously - I was at a customer site yesterday with some colleagues and Wifi wasn't working for any of the Mac users. They were all told to reboot and then it worked. OS X is an excellent system, but its shit stinks like the rest of us.
The slow manual install process? I can't remember any of it, and would have to read a howto again if I had to do it -- but I think it went a LONG way towards demystifying things: everything was just user-editable config files and sources. There's no magic, and almost everything is something I can change with compiler flags.
Of course, I now heart ubuntu and debian, because I tend to agree with the choices of most package maintainers as far as compilation options, and my computer is fast enough that I don't think I'd notice if my office binary were maximally optimized or not. ;) However, I might not have gotten here had I not been exposed to Gentoo and Slackware long ago.
I agree, though, that sticking to Ubuntu or Fedora for new users is a great idea. They are both excellent operating systems with regard to simplicity.
By any chance are you running OS X on Apple hardware? If we took OS X and installed it on an Ubuntu Certified Dell PC we may find that it has the same stability issues; so if you are comparing it to OS X running on Apple hardware, it's not a very fair comparison, as OS X is being run under ideal circumstances and Linux is not.
In those circumstances you shouldn't have any "stability" issues because it's a like for like set of engineering - basically.
In general, in terms of your other comparison with "any" hardware on the Ubuntu side - the two most likely causes of problems are hardware drivers and user fiddling (i.e. mixing repos and kernels). Lack of engineering support from manufacturers for client hardware is a big issue that's outside the Linux distros' control. Nonetheless, if you use common supported hardware and standard installs then you shouldn't be having stability issues.
Personally Debian Sid is my top choice, with Arch a close second, and Slackware third for sentimental value.
Ubuntu always breaks on me after a while. Mint destroyed two separate and different installations with its full disk encryption. Gentoo is too much work. Fedora... I just can't bring myself to use something called yum. The Mandrakes, Mandrivas, PureOS's, et al., and the derivative distros just feel like crutches to me. In my opinion nothing comes close to the simplicity and straightforwardness of Debian.
They use dnf now.
I started by using Fedora, which worked well for me, but there were occasional annoyances, like having to install a lot of software to /opt and /usr/local instead of making custom packages because RPMs are a pain to make, and some major, difficult-to-compile-and-configure software (specifically Mono) being too out of date to satisfy dependencies for something I wanted to use (the KnightOS development tools, written in a recent version of C#). When I wanted to edit the configuration of some software, it was much more difficult than it needed to be, because it was already running a custom configuration highly tailored to Fedora. Rather than just finding support online and doing what it said, I had to invest quite a bit of time into figuring out how the software worked, and how it integrated into the Fedora system, before having a high enough level of understanding to figure out the changes that needed to be made myself.
After just 6 months of using Fedora I migrated to Arch. The installation process was slightly cumbersome. I had never configured wifi using anything other than a graphical wizard, and even though netctl is really easy to use, it took some getting used to. Partitioning my drive without gparted or OS X Disk Utility was daunting. But none of these were too big a challenge, because the beginners' guide, and the rest of the Arch Wiki, were so thorough and easy to follow.
I installed GNOME and never ran into any issues. Nothing ever broke, except when I installed incompatible video drivers, but that was my fault and easily fixable by booting from the installation disk. Unlike in Fedora, almost any software I could ask for was in the official repositories, configured minimally. If it wasn't, then it was in the AUR. And for the software that no one but me uses, creating my own package was so incredibly trivial that it was almost easier than manually compiling from source.
A couple months later I added a second Arch installation to my machine with full disk encryption (keeping around the old one because I had nowhere to back up to at the time). And then a couple months after that I replaced both installations with a cleaner (gummiboot instead of isolinux, LUKS on LVM rather than LVM on LUKS), bigger (occupying my whole drive rather than just half) Arch Linux installation.
Just a couple days ago I reconfigured my system to run Debian and Arch in dual boot with a shared home partition, both in an LVM configuration over LUKS. While the Debian installer was powerful relative to Fedora's and Ubuntu's, I found that I had to drop down into its very underpowered busybox shell several times to achieve the configuration I was going for. In comparison, the Arch installer is just a full live Arch installation with arch-chroot and pacstrap bundled in to be able to manage packages on a foreign system.
tl;dr I think that if you encourage new users to focus on the fundamentals of using linux, the "easy to use" distros will quickly become harder to use than the more "advanced" distros.
My point, which I now realize I thoroughly obfuscated, is that by only providing low-level tools and focusing on making them simple and transparent and well-documented, Arch makes its inner workings much more accessible to beginners, at the cost of maybe seeming impenetrable to the total noob.
You can also install a debootstrap binary for most other distros and bootstrap a Debian chroot system in less than 3min ; )
>I trust that all the smart people using debian are onto something.
The big contrast between Debian and Arch is their package maintenance policies and release cycles.
The Debian community strives for a system that works well together and is stable above all else. Software must conform to standards of quality and stability, might be modified by its maintainers to that end, and will be staged through the whole experimental -> unstable -> testing lifecycle before being deemed stable. This is a big deal for systems that you rely on or that you'd like to set and forget. The guarantees that software works and that updates won't break the system are also important.
From my experience, Arch is much more liberal about their packaging and is closer to the Debian sid/unstable experience, in that you have access to bleeding edge packages that work 95% of the time, but sometimes you need to get your hands dirty to get your system working.
I remember that doing ./config failed, and I honestly had no clue in the beginning. So, while some friends of mine suggested that I abandon Linux and use Windows "because it works, and you don't have to waste your time" I decided to `waste` my time.
So I googled, asked around, watched the masters, and most importantly, learned. I tried again, failed again, and faster, but not on the same pitfall. Always on something new. And in the process I increased my knowledge of the system, of the programming languages used, I learned how to patch the software, clever and not-so-clever tricks.
The best part? I'm still failing, and I'm still learning.
I'm not into OS-wars - I just loved what I did, still do, and I'd do it again a million times :)
Don't get me wrong, I like Apple products too, but it's a bit harder to get some things on there like GCC.
If you can't even handle GNU/Linux, though, what does that say about people's receptiveness to a research OS that completely shakes established paradigms? Are we just going to keep reinventing what is most convenient to our preestablished biases? Given the things coming out of GnomeOS and Freedesktop.org, it sure seems like it.
...then again, most people don't install Windows... on hardware as heterogeneous and diverse as what GNU/Linux frequents.
Before you buy a computer you plan to install Linux on, you check the compatibility lists. Before you buy a laptop to run Linux on, you check whether Linux has drivers for the Wifi chipset. Buying a graphics card? Does Linux have drivers that can fully exploit it?
Arguments about Linux's alleged superior hardware compatibility seem to always be based on anecdotes about that one time a relative's Windows computer wouldn't talk to some printer.
Walk into any Fry's or other similar store, pick a random piece of hardware off the shelf, and there's a significantly better chance of it both a. working at all and b. being fully supported on Windows over Linux.
About your last point, biases take a huge amount of energy to overcome. I really wonder when such a shift will happen.
If you want an OS which doesn't require reboots, OSX is the answer.
That OSX doesn't require reboots (and that it has a security track record that's better than Windows) is the Slashdot-esque 'inverse FUD' you realize you unconsciously bought into only after the fact.
However, I think it's great that people have those tools for Linux. Sure, they might need to be configured sometimes, but most of the time they're something someone online thought they needed so they made it. It's a great community.
However, while I have had to make lots of configuration changes with Linux (even if it's just adding API keys, changing some simple and documented display parameters, and so on) I've never experienced stability problems with software. I mean, sure there exists bad software, but I've always had great stability on Linux.
I had a very similar journey. For me, I got hooked on computers when my mom (a programmer at a defense contractor) would take me in to work when she didn't have a sitter. I got to hang out in the computer room and play adventure on the minicomputers they had in the late 70s. I had the typical Apple II experience many of us had in the early 80s. I learned to type by typing in those games from Byte (and how to debug by finding my typos), eventually writing my own games and learning 6502 assembly.
Flash forward to university and my first Unix account on an always-melting-down sparc server. There were dozens of DEC VT220 terminals in our public lab at UB in 1989, but only a few Xterminals which were highly coveted. My friends and I jumped at the chance to use them. Most of us eventually got on-campus jobs or internships with the goal of getting unfettered access to Sun or DECstation workstations. I STILL use some of the same keyboard shortcuts from my 1990 .twmrc, and my .cshrc, .Xdefaults and .emacs files have all just evolved from then (and yes, I still use tcsh).
The first time I installed Linux was from a stack of 3.5" floppies sometime in 1993 when I was a grad student at a different school. I was a huge Linux fan.
A year later, I met Linus at the '94 Linux BOF at the USENIX in Boston. At the time, I was trying to convince our dept to buy PCs rather than Suns or DECs to replace some wheezing grad student workstations. However, I ran into a problem where we used LaTeX, and it kept its fonts on a central NFS server (these were the days when disk space was precious, there were no package managers, and everything was installed by hand --- so installing stuff to NFS was very common). The 12MHz Mips R2000 DECstation 2100s we had (our slowest machines) would render a page of text in a second or two using xdvi. However, a test Linux machine (which was a blazingly fast 66MHz 486) would take 10x as long. I eventually figured out that this was because Linux NFS did not do any file caching at the time, and xdvi was seeking around byte by byte in the font files.
I was very excited to meet Linus, and was really looking forward to the USENIX BOF. So when I asked Linus about NFS file caching, he blew me off in what I now know is a typical Linus like fashion, and said he didn't care about NFS. So I went to the very friendly and welcoming FreeBSD BOF, found out that NFS works, and never looked back.
I have always wondered: if Linus had been a bit more friendly, would the Linux community have gone mainstream a lot faster?
I don't think it would have made a difference. *BSD has never really become mainstream unless you count Apple, which I don't. I think the main barrier to Linux success on the desktop is the fact that in my life I've rarely seen advertisements for Linux, while Apple and Microsoft seem to be everywhere.
I think if a Linux distro had a huge advertising campaign it could actually snag a sizable portion of the desktop market.
Actually, there is quite a bit of advertising, particularly in BRIC countries to that segment. It works, too. In many cases you're talking to people who are buying their first or second PC - but even here you'd be surprised by the market perceptions you have to overcome: people believe they can "get a job" if they know Windows, that Macs are stylish (aka a status symbol), and that Linux is too hard or for geeks only.
A lot of general users are surprised when you show them that Linux can do everything that a 'normal user' wants their computer to do - quickly and easily - at a price that is way better than the other options. No command lines involved.
In the spirit of root cause analysis you might ask, 'why is there no advertising?' at the level you're talking about. And the basic answer is that Linux is a difficult proposition for the hardware OEMs: it's a cost to them, and the users perceive it as 'free', so it's hard to pass that cost onto them. Whereas Windows is margin to them, and obviously Apple makes most of its margin on hardware sales. If you go a step further back, the underlying issue is that the client PC sector doesn't make a lot of money - all the large manufacturers struggle. So it's a pretty conservative segment. It's a low margin market that demands a lot of volume: the equivalent dynamic to supermarkets. That's why RedHat said it wasn't a segment for them, and why Ubuntu/Canonical puts a lot of energy into shipping volume in developing markets.
Ultimately, I think user networks are far more important than traditional distribution. Advocates and friendly support networks work. That's why it's much more important to convince others to use Linux, and to greet all Linux users as part of the same family - rather than fighting over whatever flavour of <init/desktop env/editor/etc> is the "true way"!
In any event, Linux not being mainstream might well be a good thing depending on your POV. I'm becoming more convinced that the GP dodged a bullet as I go on.
It was a completely non-technical discussion and Linus had a moment. I found it to be very off-putting.
He doesn't project the image of a Visionary, the way Jobs did. He doesn't project the image of an inspired Nerd, the way Gates did. He projects the image of a really smart and driven but annoying guy with whom you wouldn't choose to hang out recreationally.
Already knew AIX and Xenix back then, but having UNIX at home was great, and I became a bit too much of a FOSS zealot.
Nowadays I use all OSes, and the zealot has been replaced by a pragmatic guy who uses whatever makes sense for the business.
Never looked back, and it's amazing to me today to see just how far we've come. Truly a phenomenal technology ..
Courage to you nonetheless! Time to find a job where you're root on your machine? :)
The funny thing is, I remember a few years later going for interviews and being slightly bashful about our use of free software. But it was only after I subsequently got a job at a much bigger company using very expensive commercial software, which was an order of magnitude slower and mind bogglingly unreliable not to mention completely opaque, that I came to realise just how good some of the free software is.
My local ISP gave away shell accounts I could telnet to and access a home directory that became a free website, i.e. theirispname.whatever/~myusername
I logged in, used HTML and Perl to do what I thought at the time to be the most amazing stuff in the world. Found out they used a variety of Linux as I sniffed around the commands available to me (like I had done with MS-DOS over the past couple years).
Thought it was awesome, and I ended up putting Slackware on a couple of my old systems people had donated to me and run little servers in my bedroom. Eventually went to RedHat, then CentOS, now mostly Ubuntu.
At the time I saved up for, and bought, any books that said "Linux" on them that came with CD's stuck to the back with a distro I could install :)
No Internet at home though so I had to get multiple boxes of floppies from stores and download everything at work and then cycle home with them - I seem to remember X and its applications being the single largest chunk of stuff to install.
Been a linux user ever since :)
Then a year or two later, getting DECwrite running on a V8650 to display on the X11 server of a Linux box elsewhere on the campus instead of one of the creaking DECstations we normally used.
Now I'm sitting in front of a laptop running Ubuntu and have an Android phone in my pocket. Good times (then and now).
I installed several Linux servers and hired a couple of contractors I knew to manage the deployment. They set up these Linux boxes and consulted on supporting them....
I went to one of them one day and said "you guys should just start a consulting company offering Linux support!"
A few weeks later, one of my consultants came back and said "Hey guess what! We are starting a Linux consulting company!"
I was excited... we talked briefly about me joining them, but that didn't work out...
A little later - they were valued at over $1B!
Those consultants that worked for me on this in '99? Dave Sifry, Art Tyde and Chris DiBona.... They founded LinuxCare.
I later met Linus at one of the conferences and chatted with him for a bit, I don't recall him not being friendly though... but that was the only time I met with him.
The only difference is that it was Duke Nukem on my floppies. I bet this guy has a better paying gig these days :/
I share the love of computers. I spend all my free time in front of computers, programming and reading. It's been like this for my whole life.
Due to a twist of fate (couldn't get a scholarship) I ended up in law school. I have no interest in anything beside computers. I can't live without them, but it's killing me to know that I can't study computer science.
Your passion will take you further than your college programme.
My second semester, we worked in groups of four for an elective Robotics course. Our group was composed of 3 seniors and me. One was super competent, I noobed hard but contributed working code and the other two were very eager to do the write-ups.
And in the workplace there are plenty of people who know enough to do their programming jobs, but don't have a real interest in it, don't code at home, etc.
People with passion for it will shine eventually, whatever road they take.
This was so they didn't have to teach the basics in CS and EE courses.
At my Portuguese university, we got to code in standard Pascal, C, C++, Prolog, Smalltalk, Caml Light, PL/SQL, 80x86 and MIPS assembly, and Java.
Many of the courses were composed of an exam + a mini-project.
Anyone that made it through without much coding was getting a free ride in group work; on individual projects it was either code or fail.
That being said, I feel like CS is one of the few things you can study in depth for free using the internet. I'm not saying it's the only one, but I really do feel like you can. Actually, that's why I decided to go with engineering at university: I felt learning CS without school would probably be easier than learning rocket science without school, because of the material freely available online. This way I could learn both aero engr and CS! (The hard part with this reasoning isn't finding the material, it's having the work ethic to read, understand, and apply what you learn alone.)
Maybe your law school has an online library also? Back when I was doing my Bachelor's, I wanted to learn more about math; aero engineering wasn't broad enough and wasn't in depth enough either. So I went on my online library and read pretty much every graduate textbook in math I could find. It was so simple: I'd just type what I wanted to study, and I would read the books accessible online. I didn't even need to be on campus anymore. I'm saying this because if you can find CS books that you can read for free and that are easily accessible/legal, you can learn CS this way. Of course, it'll be a lot harder to learn CS (anything, actually) alone than by being 'forced' by your prof to read the books, turn in homework, and study your stuff for the exams. But with discipline, of course it's possible. I didn't learn engineering by going to class; I learned and earned my BS in engr by reading the books, doing my homework, and studying for my exams, not by going to class. I actually only very, very rarely showed up to class. It may be very hard while also going to law school though :(
Anyway, if math isn't a barrier/scary to you, definitely look into alternate ways to fund your CS education/make your own.
If I were you I would leverage your unique perspective as both someone who understands law and code.
I struggled for nearly two years after I found Linux before I changed my major to computer science. I thought I was too far along toward my degree to make any change.
After three semesters of hand-wringing, Advanced International Tax Accounting helped me realize it wasn't too late actually. :-)
The other commenters are correct: You don't need a degree in computer science to do what we do now. In fact, as others have already pointed out, these days a more well-rounded background / education is a major advantage, whether that background / education is law, design, writing, etc.
Best of luck to you.
Almost all of them have the same interest, passion and enthusiasm as you have. Studying computer science won't get you as far as that.
Plus because they have varied backgrounds, they can bring additional colour, learning and stories.
I broke the family PC a whole bunch of times (software, not hardware) before I managed to get Linux onto it and then onto the internet; after that nothing came close as a platform to work on, and so it's been nearly 20 years and I still run it to this day.
I don't remember the reasons why, but I didn't download a distro and instead went to Staples and bought the cheapest flavor they had for sale. I can't remember what it was, but it had a GUI and text editor and all the backend programming stuff I needed to do the project much more efficiently.
I ended up keeping that Linux partition around and used it to teach myself Apache, MySQL, and PHP, which is what got me my first job out of college. That company was an Apple shop, so that's when I was first introduced to OSX and the need for a Linux desktop was eliminated for me.
Updating and driver support was weak back then. Thankfully this has improved a lot. Ubuntu is also a more novice friendly alternative.
The later versions had xEarth installed, which was so cool.
After that it was Suse and Mandrake in half-serious fashion.
Right now though i am using Gobolinux, after having it sit as a CD for a time until i had Windows blow up on me for the nth time.
I should really do a clean reinstall, as some major changes have come down the Linux pipeline since then. But at the same time, it basically works as it stands.
And in a telling expression of where Linux is heading, the guy that prepared the last iso for Gobolinux claims he spent more time getting the desktop parts (Consolekit, polkit, dbus, etc) working than the kernel etc.
I always felt a bit out of place for having this origin story compared to 'normal computer science' students. I know it kind of sounds ridiculous, but it never stopped me from feeling it. Murdock's story really gave me the feeling that it's just a silly thought I'm having :)
Let's say you had the 2.6.0 source in /usr/src/linux.
You would FTP to ftp.kernel.org, download patch-2.6.1.gz into /usr/src, and run:
# cd /usr/src
# gzip -d patch-2.6.1.gz
# patch -p0 < patch-2.6.1
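For anyone who never saw this workflow, the same decompress-and-apply sequence can be reproduced on a toy tree. Everything below is illustrative (a fake source tree and a hand-written one-hunk diff), but the mechanics match the steps above: the patch's paths start with "linux/", which is why `-p0` from the directory above the tree worked.

```shell
# Set up a toy "kernel source tree" (names are made up for the demo).
mkdir -p /tmp/kpatch-demo/linux
cd /tmp/kpatch-demo
printf 'VERSION = 2.6.0\n' > linux/Makefile

# A hand-written unified diff in the same shape as an old kernel patch:
# both paths begin with "linux/", so -p0 keeps them intact.
cat > patch-2.6.1 <<'EOF'
--- linux/Makefile
+++ linux/Makefile
@@ -1 +1 @@
-VERSION = 2.6.0
+VERSION = 2.6.1
EOF

gzip patch-2.6.1          # patches were distributed gzipped
gzip -d patch-2.6.1.gz    # decompress, as in the steps above
patch -p0 < patch-2.6.1   # apply against the tree
cat linux/Makefile        # prints: VERSION = 2.6.1
```

A common shortcut at the time was to skip the intermediate file entirely with `zcat patch-2.6.1.gz | patch -p0`.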
Side note: there would often be other patches by various kernel developers (e.g. Alan Cox) and they would be named something like "patch-2.6.1-pre6-ac.gz". I recall once coming across a patch by someone I had never heard of -- Don Tuse was his name, apparently. I proceeded to download, e.g., patch-2.4.8-dontuse.gz (or whatever version it was), apply it, and recompile my kernel. It was only later that I realized Don Tuse was actually "Don't Use".
Version control worked by posting your patches to the mailing list. Linus would apply your patches to his tree, which he'd occasionally tar up and upload to the mirrors.
For those that haven't seen Revolution OS I would highly recommend it for much of the Linux backstory.
That said Microsoft is still an innovative company. They still create good applications and have robust data software.
It really isn't.
I finally ended up with an Arch system running xmonad and huge numbers of terminal based apps. I'm finally happy with my system again. What I realized is that I specifically don't want a Mac-like/Windows-like/shrinkwrapped experience. I want choice and freedom, because what I like is not necessarily what the masses (or Mark Shuttleworth or Redhat or Gnome developers) want.
I kind of picked on your post because even though it is kind of negative, I think there are quite a lot of people who think the same way. It's a valid opinion, but a bit unfortunate. The advantage of a (dare I say Gnu/) Linux system or a BSD system is the freedom from being told how you are going to use your computer. It's not having to put up with some stupid design decision just because it was pushed by a popular company and now the masses are used to it. It's the ability to explore, experiment and create with absolutely no boundaries.
It is popular enough and gives me more of everything I want than OSes that are more visible to the masses.
But, giving general users xmonad and a terminal isn't going to encourage more people to use Linux. Consequently, you end up building applications that general users want and trying to make the environment more accessible to them.
Ah, but you might say that "more users" isn't an important goal for you.
But, the challenge is that below a certain level of users the hardware and software ecosystem isn't incentivised to make things "work with" Linux. If you have 5% of the PC user-base then Intel cares that WiFi chips work, Barclays cares that you can log in to their web bank, etc. So even if you don't want Linux to be 50%, you probably do want it to be an important platform.
I don't really buy into the idea that Linux can't be both mainstream and for expert users. The "general users" easy environments and the expert-user xmonad environments can co-exist. First, because the Linux distributions are an easy way for users to self-segment - particular types of users are attracted to different distributions. The slight downside of this being that we get tribal wars over distros. And really, the differentiation between distros is mostly their default choices - you can run xmonad on Ubuntu (I run i3, for example) and I'm sure you can run KDE on Arch. It's pretty much the same software underneath.
And actually, one of the things I dislike is when people say "the Community wants X", which is totally bogus because a lot of people have different goals; there is no "one" community.
You might recall the '90s/early '00s, when there was lots of Web tech (ActiveX, for example) that made sites not work when you were browsing from Linux.
It's really the wave of the future.
Really? Looking at the docs (http://xmonad.org/manpage.html#default-keyboard-bindings) it looks like it has mod-KEY and mod-shift-KEY chords. Nothing wrong with that, of course.
It's easy enough to set up StumpWM to support those bindings, if one wishes. Not saying that everyone should be using it, of course! xmonad's a fine WM I'm sure.
It felt very liberating to suddenly be free of all the Gnome3/Unity politics, developer fragmentation, and complex desktop systems that still feel inferior to the commercial OS X/Windows alternatives.