> I have two monitors on my Linux desktop. A month ago full screen on video stopped working. [...] In this regard, both Windows and OSX just work.
Just because it's never happened to you on OS X or Windows doesn't mean it doesn't happen. OS X 10.6.7 broke the output on my 13" Macbook for either of the two external displays I own. Both worked fine previously, when booted from the install CDs, or from Linux on the same machine.
Plugging in my Firewire audio interface on the same machine spun the CPU up to 100% and kept it pegged there. A lot of good a nice mic pre-amp does when you get a high-pitched fan whir added gratis to all of your recordings.
It's silly to pretend that Mac is somehow perfect in these matters. In my experience it's only been marginally better than Linux, if at all. And with Linux you have some hope of finding a solution, whereas for OS X you're pretty much hosed.
> It's silly to pretend that Mac is somehow perfect in these matters.
Straw man. Nobody said OS X was perfect.
> In my experience it's only been marginally better than Linux, if at all.
I used Linux on my desktop for several years and have now used Macs as well for several years. I won't say this is bullshit because I don't think you are lying about your experience, but I think you are extrapolating way too far.
> And with Linux you have some hope of finding a solution, whereas for OS X you're pretty much hosed.
Forums and mailing lists are "hope for a solution" while the genius bar is "pretty much hosed"? How on earth did you arrive at this conclusion? That just doesn't seem reasonable.
I don't see any material difference between "hardware compatibility just works" and "hardware compatibility is perfect".
As for fixing the issue, both of my issues were kernel issues, as are most hardware compatibility issues. On OS X reverting to a previously working kernel would have meant backing up all of my data, reinstalling the OS from the DVDs, installing the combo update to the last working version of the complete OS (10.6.6), restoring my data (but not using Time Machine) and then never allowing another OS upgrade, including not updating the apps that come with the OS.
On Linux it would have been a matter of downloading the previous kernel package and installing it. That's it. The package managers are usually smart enough to figure out that you've forced a previous version of the package and won't replace it in future updates.
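On a Debian-style system, the "force a previous version" step can be expressed as an apt pin; a minimal sketch, assuming an apt-based distro (the package name and version below are illustrative, not taken from this thread):

```
# /etc/apt/preferences.d/pin-kernel  (hypothetical example)
Package: linux-image-3.2.0-4-amd64
Pin: version 3.2.41-2
Pin-Priority: 1001
```

A pin priority above 1000 tells apt to keep, or even downgrade to, the pinned version even when a newer one is available.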
Neither solution would have been doable for a novice, but the Linux solution is far easier for an expert and generally more palatable. And note, this was Apple's OS breaking Apple's hardware.
I still hold that Linux has a lot of "usability." It is an extremely useful tool, extending itself into every possible use I can think of.
That said, maybe Linux in its current state isn't destined for the pick-up-and-use approach of Windows or OS X. I believe that Linux is not following the same path as Windows or OS X, and because of that it may not end up at the same destination.
I agree with the original article; Linux is produced differently, often with different goals than Windows or OS X. Trying to shoehorn it onto the same path as Windows or OS X may not prove fruitful until someone with the drive to change everything steers it onto the design path that Windows and OS X are taking.
To stress it again: OS X has the poorest third-party hardware support, even compared to Linux. Just "think different" and buy some hardware and you'll see what I mean.
Oh, well, now that you've said it, the argument is settled. You can't just say OS X has the poorest hardware support ever and not back it up with anything except "go buy some hardware and see what works and what doesn't". Is there maybe a table online that lists these incompatibilities you can link to? Please don't say "do some research", because the onus is on you to prove yourself right, not on me to prove you right.
You also said "even compared to Linux". That's interesting. That statement gives the impression that you're more interested in defending your own beliefs and/or choices than in explaining to us which OS objectively has terrible hardware support. "Even compared to Linux" sounds like something an apologist would say. Then when you added the "think different" line, you made it seem even more like your comment was based on some kind of blind loyalty to Linux rather than loyalty to facts.
Me? I've used Mac, a handful of Linux distros, and Windows for a long time. I don't know which has the best or worst hardware support, but I do know when someone says something based on what camp they're in rather than what the facts are.
> Then when you added in the "think different" line you made it seem even more like your comment was based off some kind of blind loyalty to Linux rather than loyalty to facts.
Don't read too far into things, and "...loyalty to facts" eye roll.
Anyway, it's pretty common knowledge to Linux users that hardware/driver support on Linux can be a challenge. Sometimes things work right out of the box; other times you have to do a lot of work and a lot of Googling.
That being said, I don't agree with the parent. For basic hardware needs, Windows and OSX have a fairly high success rate of plug-and-play functionality. For me on Linux, it's about 50/50.
On Windows, a fairly high success rate? This must be a joke. Try installing a graphics card without a driver and you'll see how well it is supported under Windows. I consider that basic functionality. Same for wifi dongles: they usually need drivers to be installed on Windows, while this is all taken care of at the kernel level in Linux.
In Linux you very rarely have to install any driver. True, some hardware remains unsupported, but the list of compatible hardware that requires no installation at all is pretty long. On Windows, you'll need to install something for most new hardware you plug in, no matter what.
OS X was the first system where a printer worked without even a popup saying "drivers are being installed" (and I'm not even the biggest Mac fan; I'm typing this on Windows 7...).
The real difference is that on OS X, if something doesn't work, it doesn't work. Under Linux it may work, or work badly, or you can probably hack it into working well. I used to think I liked the latter, but there's something appealing about the binary clarity of OS X.
Thank you, that's exactly how I think; it's even this way with Windows. On other systems I would waste day after day of my life setting things up "just the way I want", only to find that it's never going to be quite that way. Macs just let me accept things as they are and get back to work. Very often, I'd see after a while why things work the way they do (say, window management).
Anyway, this has been almost 100% true from 10.3 through 10.6.x. Recent developments on both Apple's and MS's side have closed the gap between the two systems quite a bit. I still prefer OS X, but it's gotten close. The real reason I continue to buy Macs nowadays is the hardware quality.
>On other systems I would waste days after days of my life to set things up 'just the way I want' only to find that it's never gonna be quite that way. Macs just let me accept the things as they are and get back to work.
This. It's almost as if Linux is a collection of parts and tools from which a sufficiently good designer can build a usable desktop whereas OS X is a usable desktop. That's an exaggeration, but for me there is some truth to it.
Note that Linux was my only desktop from 1997 through 2009.
OSX seems to have had this design paradigm: 'Make everything work in the most streamlined way possible for the default use case (80%). Give some limited options for a few more specific use cases (18%). Cover everything else with CLI integration / defaults system, such that the final 2% can get around with a bit of googling'
This has been highly successful; only I'm afraid that the Unix people inside Apple are losing influence. The sandboxing in ML looks like a mess to me. Not that it's a bad idea per se, but look what it did to our beautiful '~/Library/[ApplicationSupport|Preferences]'! It's tacked on, and it's obvious that this was a political decision from management rather than engineering. Makes me angry.
On a different note: I've never really used it extensively because I switched to Macs in '05, but wasn't Ubuntu pretty close before they switched to Unity? I remember that the last time I installed it, for the first time I had a Linux that worked out of the box, including network and graphics drivers.
I've actually switched to doing my work on Ubuntu (using xmonad - I can't stand unity) and iOS (iPad as SSH client) as OSX only cares about the 80% use case now.
Which is fine, it's Apple's decision to take, but it's no longer for me.
(As for Ubuntu - I'm using pretty standard hardware and never had a problem)
Part of it is you won't spend as much time thinking about it. Say you have a wireless keyboard that is a little dodgy on Linux, but works most of the time. But every now and then it fucks up and really pisses you off. If it never worked on OS X you wouldn't even remember it, because you would have sent it back or resold it.
Printer drivers seem fine. Scanner drivers seem fine. USB mice and keyboards work fine. Most Apple products can't really be expanded in many other ways.
I'd say the hardware support is fine for what little customisability Apple allows on the hardware that they sell. Perhaps the Mac Pro might have some issues with internal cards (I honestly wouldn't know ...), but those devices are not affordable for 99% of the people.
Then you've been selective about what you've bought. It's trivial to find devices that are Windows specific and/or have only gotten Linux and/or OS X support after extensive rounds of reverse engineering.
Low end printers and scanners in particular are notorious for skimping on the logic in the device itself and depending on a large blob of Windows driver code, with manufacturers that couldn't care less about supporting anything other than Windows.
I have no idea where this sentiment is coming from. I'm typing this from an MBP with a das keyboard, navigating with a razer mouse and viewing with a samsung monitor, everything works fine.
All keyboards are the same, all mice are the same, spectral analyzers are not the same, software-defined radios are not the same, USB-Serial adapters are not the same, FPGA interfaces are not the same, sensor boards are not the same, laser cutters are not the same, need I go on?
I bought a Samsung Galaxy S3 recently, plugged it in via USB, and was not able to browse it because, as far as I understood, Mac OS X has no support for "media devices", whatever that is. I had to download some obscure Samsung "Kies" software, through which I was able to get to the filesystem and upload some files. Wasted hours.
Starting with Honeycomb, the storage partition is exposed through MTP (or PTP); this way it can be accessed concurrently by both Android and your computer. Before Honeycomb, said partition could be accessed as a USB mass-storage drive, but that required it to be unmounted, making all data unavailable to Android while it was connected to a computer.
In my experience only Windows 7 has passable MTP support. On Linux I had mixed results with gvfs and mtpfs (slow and crashy). jmtpfs [1] is a nice replacement, though. Google also released an application for OS X [2] but I can't try it since I don't own an Apple computer.
My solution to the problem was to install a WebDAV server (such as [3]) on my device (Galaxy Nexus) and access it wirelessly.
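For reference, the jmtpfs route mentioned above is just a FUSE mount; a rough sketch, assuming jmtpfs is installed and a device is attached (the mount point is arbitrary, not from this thread):

```shell
# Mount the first attached MTP device with jmtpfs (a FUSE filesystem).
# jmtpfs and the attached device are assumptions; install jmtpfs from
# your distro's repositories first.
mkdir -p "$HOME/mtp"
if command -v jmtpfs >/dev/null 2>&1; then
    jmtpfs "$HOME/mtp"          # mount the device's storage
    ls "$HOME/mtp"              # browse it like any other directory
    fusermount -u "$HOME/mtp"   # unmount when done
else
    echo "jmtpfs not installed"
fi
```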
Why can't Apple support MTP? Linux gets cited for lack of hardware or protocol support (MTP to an Android device works fine in Linux, btw), but it's not Apple's fault when they can't or don't provide drivers for common, standardized consumer electronics?
My Samsung phone is seen as a USB device (and it works on Mac OS X), but maybe in the newer models they removed this functionality (and called it a feature).
People may complain that iPhones need iTunes, but then again, it's iTunes, not the gigantic pile of crap that is Kies.
It was a pure engineering tradeoff to improve things in the long term. If anything, OEMs would have preferred the old "just works", sub-optimal, gives-an-excuse-to-obsolete-a-phone, lower-support-burden solution. The ball is now in the operating systems' court to support this standard in a way it wasn't envisioned to work a few years ago.
OS X has perfectly fine support for media devices, and has worked fine with every digital camera, SD card or other card reader, etc., that I've attached via USB in the past decade!
I've been using OS X going back to the betas. I've never seen OS X have problems with third party hardware that conformed to standards.
I have seen problems with third party hardware that required its own (generally poorly written) driver, or that was half-assed crap designed to work with Windows but marketed as supporting standards like "USB".
Meanwhile, Linux has trouble dealing with the computer itself, let alone getting hardware attached to it. Constant pain in the ass.
Currently I'm in a country in a different hemisphere from the USA. I walked into a store and bought an off-the-shelf no-name-brand DisplayPort-to-HDMI adapter. Worked no problem.
I've gotten used to being able to do that... and haven't had much trouble getting things to work with OS X in years.
Course, I also stopped buying things that require custom drivers; that's a clear sign you're not going to have a good time.
To claim that Linux is better in this regard is, quite frankly, asinine.
In my experience, Linux works great with standards-compliant hardware. Linux is typically the first platform with support for a new standard. The problem is there is so much hardware that is not standards-compliant.
Your comment looks about as asinine to me as the parent comment does to you.
Being the first platform supporting it in the kernel, sure, but then you have to wait another few months (or years, in the case of Debian) until the driver finally lands in your distribution. The alternative is compiling kernel modules yourself, which then get broken again whenever the distro updates the kernel without warning. And even to compile one, you have to follow hints on several websites until you figure out how to build a module for the exact kernel version you're running. (This is all from experience trying to get a Wacom tablet running. I gave up on it when it broke again after a distro update and the build mechanism also no longer worked for some reason. Now waiting for Debian Wheezy to save me.)
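For what it's worth, DKMS exists to automate exactly this rebuild-on-kernel-update problem; a rough sketch, assuming the driver source ships a dkms.conf (the module name and version here are illustrative, not from this comment):

```shell
# Register an out-of-tree module with DKMS so it is rebuilt automatically
# whenever the distro installs a new kernel. Names are hypothetical.
if command -v dkms >/dev/null 2>&1; then
    sudo dkms add ./input-wacom-0.16.0       # register the source tree
    sudo dkms install input-wacom/0.16.0     # build + install for the running kernel
    dkms status                              # list modules DKMS will keep rebuilt
else
    echo "dkms not installed"
fi
```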
"Currently I'm in a country in a different hemisphere from the USA. I walked into a store and bought and off the shelf no-name brand displayport to HDMI adapter. Worked no problem."
That really has nothing to do with the OS you're running. The OS isn't driving the DisplayPort<->HDMI protocol conversion in any way (your 'no-name' brand adapter is probably relying on DP++ and just passing through the HDMI signal anyway).
This is like saying your VGA cable is compatible with Macs with a VGA port. Of course it is.
> I've never seen OS X have problems with third party hardware that conformed to standards.
The problem is that hardware often doesn't really conform to standards, even those it claims to.
One of the reasons device support in Linux (and other OSes which target a wide range of platforms) is difficult, is that it's necessary in practice to cope with and have workarounds for buggy and out-of-spec hardware and firmware. Just "coding to the standard" isn't good enough.
Such buggy hardware/firmware is rarely documented as such, and finding these problems and the appropriate way to handle them is painful and difficult work. In some cases the only practical way to figure out what actually works is to reverse-engineer what Windows does (the hardware manufacturers generally make sure that Windows works with their hardware, but rarely make such information public).
Apple's main goal is their own hardware, over which they obviously have a lot of control and information, so they really don't need to worry so much about this.
If the hardware conforms to standards, Linux is the most likely to be able to support it of all three - and that's even without having to install a driver.
See, in my opinion, you're missing the point. Things DO break in OS X as well, just a bit less often. Apple has the liberty of controlling both the (primary) hardware and the OS that runs on it, so it has an edge over Linux in terms of compatibility.
Having said that, barring a few "recent" desktop distros, as they call them, like Ubuntu, Linux has always been a bottom-up approach: you install the base, then the userland, and so on. For someone who is looking for ultimate customization, Linux makes sense.
But when I am on a notebook, I seriously want everything to work, even at the cost of some performance and customizability.
I always run either FreeBSD or Gentoo on my servers. But my notebook/desktop continues to be OSX.