I've been using Linux for many years now and I'm still investing time daily sharpening my tools. It's an investment that continues to pay exponentially increasing rewards. Nearly all of the remote computing resources I use in my job are running Linux. That probably is true for most HN users. Is Linux really that much more difficult to use on the desktop? I find getting all my laptop hardware running smoothly is one of the easiest system administration tasks I face. And I only face that task once in a blue moon when I upgrade laptops.
- The battery life is very, very good and combined with the screen (Retina 13"), made it perfect for me.
- It's still a POSIX environment so I can still use most of what I had from Linux without issue.
- It doesn't require as much maintenance as Linux. I don't have to fiddle with things to get it to work the way I want, it already does that.
- The hardware support has been great whenever I've had issues.
What maintenance does Linux require? The only thing I can think you might be implying is that updates frequently introduce system-breaking bugs, and that's simply untrue. Software updates have been known to cause issues on every system, and I can find plenty of results on Google showing OS X having issues after updates. Linux, in my experience, is the "set it and forget it" operating system. OS X is great though, just not for everyone (personally the environment doesn't have the capabilities I look for in an OS).
It's even worse when the Nvidia drivers don't automatically recompile for the new kernel. I've had many upgrades that left me with no functional X Window session, so I had to drop to the terminal to download and recompile the video drivers.
I realize that's a problem with binary blobs rather than open source drivers, but it's also a reality.
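For what it's worth, on most distros you can sidestep this by registering the proprietary module with DKMS, which rebuilds it automatically for each new kernel. A rough sketch (the driver version string is just an example, and distro-packaged drivers often set this up for you):

```shell
# See which modules DKMS is tracking and whether they're built
dkms status

# Register, build and install the module for the running kernel
# (nvidia/340.46 is an example version string)
sudo dkms add nvidia/340.46
sudo dkms build nvidia/340.46 -k "$(uname -r)"
sudo dkms install nvidia/340.46 -k "$(uname -r)"

# Or rebuild everything DKMS knows about for the current kernel
sudo dkms autoinstall
```

Once the module is under DKMS, a kernel upgrade triggers the rebuild instead of leaving you at a console.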
That's funny. After using a Mac for several years, that's my biggest gripe about working on most Linux boxes. OS X separates text viewing and editing functions (C-f, C-k, C-a, etc., in emacs parlance) from system functions (S-c :: copy, S-v :: paste, S-w :: close window). In Linux, Ctrl-a will either get you a cursor at column one, or select everything. I wish it were easier to define these actions system-wide the way OS X does.
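For reference, those Emacs-style text bindings on OS X come from the Cocoa text system, and you can even extend them per user via `~/Library/KeyBindings/DefaultKeyBinding.dict`. A minimal sketch (the selector names are Apple's standard text-system actions; the specific bindings chosen here are just illustrative):

```
/* ~/Library/KeyBindings/DefaultKeyBinding.dict
   ^ = Control, ~ = Option, $ = Shift, @ = Command */
{
    "~f" = "moveWordForward:";      /* M-f: forward one word */
    "~b" = "moveWordBackward:";     /* M-b: back one word */
    "^w" = "deleteWordBackward:";   /* C-w: kill previous word */
}
```

These take effect in any Cocoa text field, without touching Cmd-c/Cmd-v and friends.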
I mean, compare: http://www.apple.com/macbook-air/
(ok, you could also buy an Asus laptop or something that would be nice looking enough, and cheaper than a novena laptop).
Another point is that when you're at home with a Linux laptop, you can spend two days configuring the wifi. But when you're out at a conference, MacOSX shows its worth.
Oh, and for several MacOSX versions now you can map the modifier keys as you wish, in the System Preferences / Keyboard panel.
MacOSX is not entirely proprietary. Darwin is open sourced (in large part). And as developers, we use a lot of free software on MacOSX, starting with emacs, and all the linux tools you can get from MacPorts or Homebrew. Right, because between ourselves, the BSD tools provided by Apple suck.
But actually, it's the economy, stupid! ;-)
As a Linux hacker, I much prefer to earn the money I need to pay for taxes, internet, electricity, food and shelter (oh, and clothing, sometimes you get to go to conferences), developing applications on a unix machine like MacOSX, or iOS, rather than on MS-Windows. (Yes, there's also Android, but java...)
So make no mistake, the main workstation is a Linux box, and even while developing MacOSX applications, most of the code is written from the Linux box, even if it's compiled by Xcode on the MacOSX system nearby. Because, honestly, Linux is much more ergonomic! (if only because you really can configure it as you wish (and also, strangely, the keyboard feels much more responsive with Linux than with MacOSX, let alone MS-Windows)). Anyways, remember, the Network is the computer! But most customers are still MS-Windows users, or MacOSX or iOS users (or Android users, right). So while you keep Linux close to the heart, you have to have a MacOSX system to earn your living.
(And there's also the possibility to run MacOSX on a Virtual Machine on Linux, but let's be honest, it means that legally you still have to own Apple hardware).
OS X has plenty of problems, see: http://www.imore.com/os-x-mavericks-problems-drive-me-nuts-h...
I've personally experienced many of these myself. OS X is cool, but I wish the fan base would stop acting like it never has any bugs or issues. OS X does get bugs, OS X does crash, OS X can break, OS X does not always 'just work'. Just go to discussions.apple.com and look at all the issues everyone is having. OS X is a great OS, but it's not infallible.
Not sure how any of the various Linux package managers are more "standard" than brew on OS X...
Come on now, it's not 2005. Ubuntu runs smoothly on just about everything I've installed it on and I usually end up having to do more tweaking on Windows 8 than Ubuntu. As far as OS X goes, it's not perfect out of the box either, especially for anyone coming from a different environment who needs to tweak it to meet their standards.
OSX comes with the best hardware and you can develop in any programming language... not the same for Windows
And you can, with a bit of effort, run Linux on a MacBook. Yes, it's not for everyone, and it took a good 10 months from the last MacBook Air release before I could run Linux natively on it (without VMWare), but for those of us who really like Linux, it's worth it.
I totally get that the effort is not worth it for lots of people, though.
a) Battery life. For the most part, shoehorning Linux onto a PC is going to negatively affect battery life.
b) Despite being a bit of a "hacker", I have a day job and don't want to spend too much time tinkering with my OS. OS X is more polished than Linux and is less likely to go wrong.
I'm guessing that while most HN'ers aren't in exactly the same situation as me, they have more important things to do (i.e. working on their ideas) rather than wrangle with Linux.
I'm sure people will immediately chime in and say that Thinkpads work well with Ubuntu or that System76 does the job, and I'm sure it does, but the experience of running OS X on Apple hardware is simply unmatched.
>the experience of running OS X on Apple hardware is simply unmatched.
Eh. Seems a lot of people on HN believe this, but it's not always the case. Just a quick visit to discussions.apple.com will show all sorts of issues people have with the OS, including OS-level bugs like http://www.cnet.com/news/some-lion-users-plagued-by-black-sc...
Edit: just some things I noticed in the short time I used OS X:
The Gmail integration is buggy. I didn't get messages on time, and I had to either quit the app or take Mail offline and then online again before new mail would start streaming in. Sometimes I couldn't get it to quit at all.
The 'quicklook' thing seemed really slow, which seemed to be a problem a lot of users had.
Multi-monitor support isn't that great. I had multiple windows on multiple desktops for some reason and it just felt weird.
I had issues with audio stopping and not working again until I did a reboot.
In some apps scrolling was really rough.
Apple seems to think I should unitask on a single monitor. That's not how I work, so it's not for me.
People say they have to spend a lot of time configuring Linux, but in my experience that's not the case (once it's installed, obviously, in the case of Arch, which for me is a worthwhile investment).
I do admire the polish of OS X's desktop though.
Now for hackers, you already get a long way with the built-in BSD commands, but one word : http://brew.sh
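For example, once brew is installed, putting the GNU userland alongside Apple's (arguably dated) BSD tools is a one-liner. A sketch (`coreutils` and `gnu-sed` are real Homebrew package names, but check `brew search` if in doubt):

```shell
# Install GNU coreutils and sed; the tools are prefixed with "g"
# so they don't shadow the system BSD versions (gls, gdate, gsed, ...)
brew install coreutils gnu-sed

gls --version   # GNU ls, next to the stock BSD /bin/ls
```

If you want the unprefixed names, brew also installs a "gnubin" directory you can put at the front of your PATH.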
Once you've done that, you could install Linux or Windows or FreeBSD on it, but if you don't have an overwhelming requirement, you'll keep it on Mac OS.
And if you're like most hackers, you have one work laptop and you carry it around everywhere, and it's your primary machine.
Personally, I don't carry a laptop back and forth: I have a Linux desktop at the office, and a Linux desktop at home, and a Mac laptop for travel and moving around the house.
I've only switched once in 20+ years. I started with Windows 3.1, loved 2000, then used XP, but switched to OS X around the time of Vista. Got tired of the constant virus issues. Tried Linux, but I've used OS X exclusively for 10 years or more now.
I'll admit that I still feel much more at home on Windows (when I have to fix the kids' laptop); it's like putting on an old pair of slippers.
It's not because the hardware will never be up to it. In many cases it is already. I think it's the OSes. In my opinion it's because the feudal security model and closed app store model does not scale out to professional applications, developers, or complex usage scenarios. A computer that I can't do whatever I want with can never be my primary computer, nor can one where somebody has to whitelist what applications I can run or what capabilities I can extend it to have.
MS could disrupt here but won't. They're slavishly imitating the app store feudal model.
For creators, nothing beats a great big screen coupled with a keyboard, and (mouse|trackpad|graphics tablet). Plus, as you say, for a certain type of creator, the kind of freedom that a walled garden just doesn't offer.
Windows 8.1 is really only there for day-to-day tasks such as web browsing, email, and gaming. All serious work is done on virtual machines running a number of different Linux and BSD distributions. I have found that this works better in the long run than dual booting Linux and Windows.
Laptop is OS X (but dual-boots into Win7 as needed), work workstation is Win7, work servers are Linux, HTPC is a ChromeOS box; phone, tablet, and second HTPC are Android.
I'm a workaholic with ADD. I'm using any number of the above devices at the same time whether at the office or on the couch.
The main reason is that I no longer agree with the tradeoffs of a proprietary OS, for both privacy and philosophical reasons.
Me: After about four years of splitting my time between Arch (desktop) and Ubuntu (laptop), I've switched to Debian (Wheezy) on my new laptop. So far so good - I haven't missed anything from Ubuntu, and if anything it's a bit easier to manage overall.
More important to me than the OS is the details of my desktop setup, which is the same under any of the above: I run GNU screen in urxvt and ratpoison from screen. That way, any output from applications launched from ratpoison goes to screen 0. I make various aliases for things I use regularly, so e.g. e starts an editor in a new screen window. I can use screen or ratpoison window splitting if I need to see multiple windows at once. This is by far the least annoyance setup I have found on any OS. There are menu apps that work with ratpoison, but I find the text file + command line + aliases method to be much less annoying than any app launcher I've used on any OS.
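Concretely, that setup fits in a few lines of dotfiles; a minimal sketch (the alias names mirror the comment above, and the choice of editor and mail client are assumptions):

```shell
# Shell rc fragment: each alias opens a program in a new window of the
# current GNU screen session ("screen -X screen CMD" asks the running
# session to create a window running CMD)
alias e='screen -X screen vim'    # "e" = editor; vim is an assumption
alias m='screen -X screen mutt'   # same pattern for mail, etc.
```

In ~/.ratpoisonrc you'd then bind a key to spawn the urxvt hosting screen, e.g. `bind c exec urxvt -e screen -DR main`.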
What is not primary is my tablet, that runs Android.
Solaris for most Dev work; Linux for a little, Windows for the rest.
Would switch to Ubuntu, but I need Photoshop/Illustrator and we only really get IT support for Windows.
I work on all sorts of linux virtual machines though so that makes up for it I suppose.
At work, windows box. At home, I have a windows computer, which I use maybe 2 hours a week. Then I have an android smart phone, which I use 2-4 hours per day.
Which one is my primary computer?
The one you use most. I hope, for the sake of your employer, that's the work windows box.
Ask HN: Most productive OS for web developers?
I was a diehard Linux nazi until I got my first internship, where everyone used Windows. Initially I was appalled; I had brainwashed myself with the idea that "configurability is power" to the point that no one, not even I, could get any work done on any system I touched. This philosophy carried over when I started using the work computer, which I plastered with "productivity software" in an attempt to morph the system into something more like my personal laptop. I became part of the "Cult of AutoHotkey." In retrospect it is absolutely absurd that there exists a subculture which revolves around changing the settings of computer software.
The first week of the job my boss had to do something on my computer, and within thirty seconds asked me, "What have you done to this poor computer?" He was a Smart Guy; he is highly educated, has been working with computers for longer than I have been alive and has adapted to modern standards, (successfully) administered company server farms, directed software projects (embedded, web, and everything in between), and most importantly has a life outside of science and technology. It was at this point that I woke up. Obviously there was a problem with me. Now I leave every conceivable default untouched unless I absolutely cannot avoid it. I use Windows 8, IE (Firefox with no addons when that breaks), Visual Studio, Windows Photo Gallery, etc. Most of the less common software I must use is run out of a folder in my home directory. Now I, and everyone else, can be productive on my system. The only thing I have lost since switching is the smug self-righteousness of Linux veganism and Lifehacker articles.
There is no default on Linux, so there can be no expected behavior. That is why I use Windows.
I also refuse to hobble myself with custom desktop toys. If my machine fails I want to be productive again as soon as I get another machine on my desktop. Forget all the imaginary gains of diddling with exotic editors etc. Truth is, you can get used to anything. So get used to the defaults, and leave the whole discussion behind.
I just could not live with a stock Mac or Windows setup (or Linux, though, to be fair, there's really no such thing as a "stock" Linux setup) and still feel productive.
Then again, regarding Windows specifically, I just cannot feel productive on Windows anymore, period. In the 90s I would have considered myself a Windows power user, but switching to UNIX-y systems blew away anything I could ever do on Windows. Perhaps things have gotten better, but I have zero motivation to go back when I already have the real thing right in front of me.
In the end, though, everyone is used to what they're used to, and switching costs can be pretty damn high.
Balanced against a feeling of productivity?
* I find their paradigm very bizarre. The nail(s) in the coffin for me were the inconsistent and incomplete patchwork of shortcuts and modifier keys. Even with universal keyboard access enabled, there is no way to do some very basic things via keyboard out of the box (e.g. move, maximize, or minimize windows). Every time I use OS X I feel like I'm fighting the system. The only way I can get anything done on it is via the CLI, which defeats the purpose of switching (at that point, why not just ssh into it from Windows?)
If I had started out on OS X it would probably be great but as it is, the learning curve and opportunity cost is too steep for me.
(Oh god, remember Win32s?)
It's been years since I've ever been unable to do any task on OS X, and do it with the best tools available.
There's so much useful, quality, well picked stuff built into OS X that the "quantity" of software available is irrelevant.
- I simply prefer the experience. I feel like I can navigate around the environment more quickly. Also: not being tied down to Mac hardware is freeing. I've found the HP ProBook to be a decent Windows laptop. It shamelessly mimics a lot of the MacBook Pro form; in fact, the chiclet keyboard can use an MBP silicone keyboard cover.
Macs just seem like too low-end hardware for the price, and an 8-year-old Mac wouldn't run this well. I also couldn't get over the lack of a right mouse button. Linux is fine for the server side, but after years of MythTV I got tired of getting things to work.
I'd dual boot if I could, but my primary computer is a laptop and it's too much effort.
I'm not trying to diss or dismiss GNU, but I think it's possible to have a functional linux desktop without any GNU software.
(Busybox, plus μClibc for example.)
Modern-day "Linux" desktop systems are pretty useless without GNU on top. I'm certainly not going to go all RMSy and insist on "GNU/Linux" on ideological grounds, but it is more correct than simply "Linux".
 Stop being pedantic. By "anyone", I mean, "a minority large enough to be more than a rounding error".
Used to dual-boot, but I recently got rid of my Windows 7 on account of never really using it anymore.
I got tired of Windows screwing up GRUB every time it updated, so I tried running Linux virtually and found the display to be the weakest link, even with 3D acceleration turned on.
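For what it's worth, recovering GRUB after a Windows update overwrites the boot sector is usually a few minutes from a live USB rather than a reason to give up on dual booting. A rough sketch for a BIOS/MBR setup (device names are examples and depend on your partition layout):

```shell
# From a live session: mount the installed system and chroot into it
sudo mount /dev/sda2 /mnt             # /dev/sda2 = your Linux root (example)
sudo mount --bind /dev  /mnt/dev
sudo mount --bind /proc /mnt/proc
sudo mount --bind /sys  /mnt/sys

# Reinstall GRUB to the disk's MBR and regenerate its config
sudo chroot /mnt grub-install /dev/sda
sudo chroot /mnt update-grub          # re-detects Windows via os-prober
```

On UEFI machines the steps differ (you mount the EFI partition instead of touching an MBR), but the idea is the same.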
Otherwise, I prefer Windows 7.