Running OS X Mavericks under QEMU with KVM (definedcode.com)
196 points by kvmosx on June 4, 2014 | 106 comments

Given that a lot of the work presented here was not "preliminary work that was adapted" but was actually made entirely by him and copy-pasted by you, I think you could at least do him the courtesy of linking his site[1] and his work somewhere in the first sentence instead of in a side note in the closing statement.

Copy-pasting his work and adding a line or two in the middle is not, in my opinion, adapting it. Making a simple script that runs the whole thing from start to finish, for example, would be a nice adaptation that I am sure he and many other people would appreciate.

Maybe the choice of your wording and the structure of your post is a bit unfortunate. If so, consider editing it.

Edit: sorry if I came across as harsh. In the tiny little world in my head, the one thing engineers care about/enjoy (but don't expect) after an accomplishment is praise and credit.

I was aware of this when posting. I am very thankful for that information and for everyone who contributed to the KVM, KVM-KMOD and QEMU repositories. I have moved the credit to the top of the article.

This was meant more as a reference and a quick guide (also because I couldn't get Mavericks working via his research alone, so I have added that here).

Thanks anyway.

What graphics card does Mac OS X think it's using when running in QEMU/KVM as you describe? Are you able to get different (more than 1200x800) resolutions? One of the major shortcomings of most of the "Mac OS X in a guest" efforts is that 3D hardware acceleration is disabled (unimplemented) in the guest video driver, which the Quartz compositing engine assumes will always be there. This results in weird video behavior, like certain things not showing up or FLV video not rendering in a web browser. Are you able to view web video with this Mac guest?

When OSX can't pick a GFX driver (or you forced safe mode), it usually falls back to a generic VESA driver along with a software OpenGL implementation (hence GUI effects still work but some apps such as Pixelmator crash), which I suppose is not entirely unlike LLVMpipe. With a non-crap CPU it's even quite usable.

The 3D hardware acceleration is not present. The purpose of this exercise was to have virtualized Xcode build slaves.
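For anyone curious what such a build-slave VM launch looks like: the rough shape of the invocation below follows Gabriel L. Somlo's published recipe, but the exact flag set is approximate, the file names are placeholders, and the SMC "osk" value must be extracted from a real Mac you own (it is deliberately left as a placeholder here).

```shell
# Chameleon's "boot" file is passed via -kernel; mac_hdd.img is the guest disk.
qemu-system-x86_64 -enable-kvm -m 2048 -cpu core2duo -machine q35 \
  -usb -device usb-kbd -device usb-mouse \
  -device isa-applesmc,osk="<osk-key-from-your-own-mac>" \
  -kernel ./boot -smbios type=2 \
  -drive id=MacHDD,if=none,file=./mac_hdd.img \
  -device ide-drive,bus=ide.2,drive=MacHDD \
  -netdev user,id=net0 -device e1000-82545em,netdev=net0 \
  -vga std -monitor stdio
```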

If that's what you're trying to do, you might want to explore cross-compiling Mac apps on Linux:


See also Mozilla Bug 921040 - Cross-compile Firefox for Mac on Linux. https://bugzilla.mozilla.org/show_bug.cgi?id=921040

Couple of notes:

* It is easier to just install a Linux 3.15 kernel. I installed the latest RC on Ubuntu 14.04 and that did the trick.

* You should not have to install Mountain Lion first. I made a Mavericks ISO using the script mentioned and it installed without any issues.

* The Qemu that ships with Ubuntu 14.04 is new enough. No need to compile your own. Those patches are pretty old and if your distro is up to date then you most likely already have a Qemu that just works.

This posting also has more info http://blog.ostanin.org/2014/02/11/playing-with-mac-os-x-on-...
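The version notes above can be checked with a quick script before attempting anything. The 3.15-kernel and QEMU-2.0.0 thresholds are taken from this thread; the `version_ge` helper is a generic `sort -V` trick, not something from the article.

```shell
#!/bin/sh
# Succeeds if $1 >= $2 under natural version ordering (GNU sort -V).
version_ge() {
    [ "$(printf '%s\n%s\n' "$1" "$2" | sort -V | head -n1)" = "$2" ]
}

kernel="$(uname -r | cut -d- -f1)"
if version_ge "$kernel" "3.15"; then
    echo "kernel $kernel: ok"
else
    echo "kernel $kernel: older than 3.15, consider upgrading"
fi

if command -v qemu-system-x86_64 >/dev/null 2>&1; then
    # QEMU 2.0.0 (as shipped in Ubuntu 14.04) is reported new enough here.
    qemu-system-x86_64 --version
else
    echo "qemu-system-x86_64 not installed"
fi
```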

It would've been helpful if the guide mentioned the required KVM and QEMU versions by their version numbers. Users like me (Arch Linux, Gentoo, etc.) and future Ubuntu and Fedora users could then check and avoid installing from source.

Looking up qemu-kvm in Ubuntu 14.04 now, it's version "2.0.0~rc1+dfsg-0ubuntu3.1", which is similar to the version in my Debian Jessie/Testing machine (2.0.0+dfsg-6). Time to try this out, methinks.

Edit: I'll have to try this out another time - you need access to an OSX machine for various steps of the process.

Run the file utility over the DMG that you have. If it comes up as having a boot sector in it (aside from a hfsplus filesystem), you should be able to use qemu-img to convert it to something qemu or VirtualBox can use.
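Sketched as commands ("OSX.dmg" is a placeholder path; `qemu-img` ships with QEMU):

```shell
# Inspect the image; look for a boot sector in the output rather than
# only an Apple HFS/HFS+ filesystem.
file OSX.dmg

# If a boot sector is present, convert to raw, which both QEMU and
# VirtualBox can attach directly.
qemu-img convert -O raw OSX.dmg OSX.img
```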

Are you speaking from experience? Because I think that hasn't worked since 10.8. You need to use the script to turn the DMG into an ISO. It is not just a format change; it actually changes the structure of the contents.

Ah, my source image may have already been modified. Thanks for the pointer.

I did this recently and was able to start directly with Mavericks with no trouble.

This is also a useful article that I read in addition to Gabriel's work to get this working: http://blog.ostanin.org/2014/02/11/playing-with-mac-os-x-on-...

It's great that we can finally virtualize OS X with no hacks to the guest.

I wish there was a reasonable way to run OS X on non-Apple hardware.

Just today I was looking at the current Mac lineup, and there's not a single setup that works for me. Everything comes with underpowered Intel graphics, unless you pay the ridiculous price for a top-of-the-line MacBook Pro.

Oh, if only I could have a Surface Pro 3 running OS X... I could live with the Intel graphics if it had Surface-style portability.

> unless you pay the ridiculous price for a top-of-the-line MacBook Pro

If you consider the time you'll spend tweaking your hackintosh into a useful state and the risk of bricking it on every update, it's actually a bargain.

I wouldn't buy a Mac to run Linux, but I wouldn't buy anything else to run OSX.

I built a Hackintosh 2.5 years ago and haven't had any issues. I highly recommend it to anyone with minimal Google skills.

Yep, just upgraded mine to Mavericks and had absolutely no issues. You just need to make sure you buy components that are tried and tested by the hackintosh community.

How often have you had to re-install the boot loader and/or kexts? Each time you upgrade OS X, sometimes, or never?

Never. To upgrade to Mavericks I just created a USB installer using unibeast, booted from USB and followed the usual OSX install process. All my files and apps were left untouched.

That was my experience with "Hackintoshing" as well, back in the Leopard days. It was a fun, fun hack but the constant "will the next update break or brick my setup?" factor made it more of a curiosity than something that could be relied upon.

Speaking of "ridiculous price", just look at the Surface Pro 3. Sure, the 64GB version is fairly inexpensive, but it quickly escalates from there. Strapping on a keyboard will cost you another $130.

At that point the 13" MacBook Air is a bargain.

Unless you're doing intensive 3D, which is mostly reserved for gaming, then the Intel graphics will be just fine.

You could hackintosh. It's not that difficult, though that statement isn't true on some hardware configurations.

Also, how is Intel underpowered? Seems perfectly adequate for me.

In my experience, trying to install OS X on a laptop is an exercise in frustration. I did it with Snow Leopard on a Dell Inspiron-1525, almost identical to a polycarbonate white MacBook of that time and one of the easiest machines to tinker with.

I read elsewhere that desktop computers are much easier to work with, but the OP is explicitly talking about portability and, in that case, it is just better to shell out the money to buy a MacBook... even though they get expensive really quickly.

I did it in about an hour or two, with most of the time being spent making the USB drive or actually installing. No problems.

The catch is that I specifically found known-good laptops and then went and bought one. Ditto for the wifi card.

Could you share your source of known-good laptops?

If you want a really nice hackintosh laptop with good build quality and great performance for the money you can't go wrong with this:

A second hand i5 HP ProBook with 4GB RAM. £126. Add to this a 120GB Samsung EVO SSD for £55 from Amazon, and a £6 wifi card from eBay. Less than £200 all in.

Everything works out of the box on this laptop except the wifi, which you have to replace (or use a USB adapter).

I configured one last week. Mavericks runs perfectly. Quartz Extreme, sleep/wake: it all works.

I bought a used Dell Latitude E6520 because it was the only laptop I could find with a reasonable price that had a good full HD screen.

As a bonus, it has an i7.

I didn't want/need discrete graphics because hackbooks don't have discrete graphics support (yet?). The only thing I HAD to replace was the WiFi, but I upped the RAM to 8 GB. Other people have upgraded to 16 GB with no problems.

If you're in the market for a new desktop, I couldn't agree more with the suggestion of a hackint0sh. After fiddling with difficult laptop installs ~2006, I seized the opportunity to jump back in last Black Friday and have been working on a remarkably stable/powerful desktop at a fraction of a Mac Pro cost thanks to the resources/community over at http://www.tonymacx86.com . With their monthly buying guides (recommending compatible, proven hardware) and their installation tool, it was an incredibly easy process. (Personally, triple booting with Linux for coding fun, Windows for games, and Mac for the majority is great).

The one thing you give up, of course, is the guarantee of future compatibility. That said, there is some security in hardware that works in 10.9 likely also working in 10.10+ (or whatever Yosemite is), just as old Macs keep working with future OSX versions, with the additional ability to easily swap/upgrade components.

What kind of graphics capabilities do you require?

My current MacBook Pro can switch between an AMD Radeon 6770M and Intel HD Graphics 3000. I've found that the Intel is fast enough, and the Radeon is hot and uses a lot of energy.

I usually don't play games, I don't do graphics development, and I don't use Adobe products. So in practice, I rarely need the Radeon.

I also have a MacBook Pro with dual graphics, but I do graphics development and use Adobe products, so I ended up switching completely over to the Radeon.

I'd like a new Mac, but I'd really prefer to have something better than the basic Intel HDwhatever integrated graphics... Based on benchmarks, the Iris seems like it could be good enough for non-Retina MacBooks. But of course Apple has chosen to use it on the Retina MBPs where it's still underpowered. Sigh.

The higher end 15.4" MacBook Pro Retinas feature both the Intel Iris and an NVIDIA GeForce GT 750M.

I'm guessing that the guy you're responding to games, and yet still prefers OS X (like myself).

There are plenty, but they are all a big no-no because it is a huge legal issue; Apple is very aggressive about that. So we have a few Mac Pros in a basement, access them via VNC (not the OSX native garbage), and run our tests there. It is absurd and feels like we are back in the 90s. Ridiculous.

Also, why would you want OSX on a Surface? It is silly. OSX thrives on tiny little buttons; it is hell to use even with a mouse (it's impossible to live with OSX without keyboard shortcuts). I can't imagine anyone wanting that on a touch device.

" Everything comes with underpowered Intel graphics" exaggeration , exaggeration and exaggeration. My iMac comes with NVIDIA 755m (actually all but one , of the iMac models come with NVIDIA). Macbook pro offers also NVIDIA . MacPro of course is the top option in terms of GPU power coming with AMD.

Macs are not cheap; you get what you pay for. Mac OS is streamlined for Macs, and it makes little sense for me to run such an OS on a PC and deal with crashes and other problems.

> I wish there was a reasonable way to run OS X on non-Apple hardware.

The reason that Mac OS X updates are currently free is because the software development (and that of iLife and iWork and iCloud etc) is subsidised by the hardware. You can think of the Mac as a very elegant functional dongle, if you like. But the fact is, that's the way Apple structures its business.

What motivates Microsoft updates? Fear of market-share loss?

Not to mention the... well, weird keyboards!

> Virtualizing OS X is a thing that can today be done very easily, with VMware and VirtualBox fully supporting it under OS X hosts

Having tried this in the past with Virtualbox there were some serious caveats.

If you still can't make Seamless mode in Virtualbox work under Linux and have accelerated graphics drivers then it's not worth trying to run OS X virtualized. It's a crap experience.

Wish somebody had an up-to-date guide to installing Mavericks on Xen.

I wonder what crazy low level technical thing is preventing a clean mavericks boot while an upgrade works.

Does this build of Chameleon still rock FakeSMC? FakeSMC is unambiguously illegal: it violates Apple copyright.

Zoom on iPad sucks. The stupid Home/About navigation bar just doesn't go away.

Open source darwin has failed. That is the problem.

This is awesome. OS X in the cloud!

Pro-tip: don't bother upgrading to Mavericks. Mountain Lion is better and lighter (even accounting for the memory compression thing). Don't upgrade unless you really need something specific in Mavericks.

Are you kidding? The extra battery life alone is worth it.

In a virtual machine?

Timer coalescing may help reduce host/VM context switches, which may ease virtualisation. Also, even desktops and servers consume watts, which ultimately get converted into heat and bills.

Tell that to everyone who wants to play with Swift.

You do know this violates the EULA for OSX?

No OSX version allows for virtualizing it as a guest VM.

(just pointing it out)

Honest question: has anyone on HN ever cared about an EULA, except for bickering about getting Apple's poorly compatible consumer OS to run on regular consumer hardware?

I can't think of a single case.

To be honest, I don't think it is a huge issue for personal use. But when you're running a business, it becomes important to pay attention to software licenses. If you intend to get big enough to be audited for investment or sale, you wouldn't want your technology relying on improperly licensed software products.

That being said, I'm among the people who would love to run virtualized OSX.

It is not just an EULA in that case; they've taken companies out in Silicon Valley with other claims. But that was during Jobs' time, and I doubt he needed anything legal to go after anybody.

As long as it is Apple hardware, as defined by[1], then it can be run, virtualized or not. KVM runs on an OS X host and therefore this is totally acceptable.

[1]: http://kb.vmware.com/selfservice/microsites/search.do?langua...

Which is a real shame. It means that testing OSX software is significantly more expensive than testing software on any other popular OS due to the need for a Mac host.

Being an Apple developer is very (overly, maybe excessively?) expensive.

Well, not really. It is a popular argument though.

If you must buy special hardware just to compile code, well... that's not normal for any other ecosystem's developers. Nor is having to pay for the privilege of being a developer, nor paying for dev tools, etc.

You don't have to pay for dev tools (Xcode is free). You only have to pay if you want to distribute through the App Store or, for iOS, test on a device.

But I already own a mac, and it's fantastic hardware that I can do development for anything on, and the dev tools for OS X are free.

But you can't pay a fraction of a cent per hour to run your unit or integration tests for every commit unless you set up the infrastructure to do so yourself.

Though now that Travis CI has beta support for OSX, this is changing. But probably at a much greater cost and complexity than it would have had without the VM restriction in the license.
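As a sketch of what that support looks like (hypothetical project; the scheme and project names are placeholders, and `language: objective-c` is what selects an OS X worker on Travis CI's then-beta support):

```yaml
# Hypothetical .travis.yml for running Mac tests on Travis CI's beta OS X workers.
language: objective-c
script:
  - xcodebuild -project MyApp.xcodeproj -scheme MyApp test
```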

Actually, it is:

> The grant set forth in Section 2B(iii) above does not permit you to use the virtualized copies or instances of the Apple Software in connection with service bureau, time-sharing, terminal sharing or other similar types of services.

In the same PDF, in fact in the very section 2B(iii) you are referring to:

"If you obtained a license for the Apple Software from the Mac App Store, (...) you are granted a (...) license (...) to install, use and run up to two (2) additional copies or instances of the Apple Software within virtual operating system environments (...)".

So "No OSX version allows for virtualizing it as a guest VM" looks like an incorrect statement to me. Instead, some OSX versions DO seem to allow for virtualizing as a guest VM, under some conditions.

Hmm, appears you are correct.

Previous versions just flat-out banned any sort of virtualization.

Appears this release now allows you to have up to two OSX guests running on top of an OSX host.

(A pathetic attempt at allowing virtualization imho, my Xen hosts at the office have 10-30 guests running concurrently per box)

Well, it's not completely unreasonable given that Apple is in the business of selling hardware. Of course they want you to buy macs. :)

In all my years, and colo'ed across 3 DC's... I have yet to see any OSX server hardware besides a mac mini...

There are a few companies that host Mac VPS instances; the one I used back in 2011 was running on Xserves using what looked like a bare-metal VMware solution (Parallels had one too, I believe). Rare even back then, though; Minis are definitely more common.

You don't even need it to be an OS X host, only Apple hardware. VMware allows you to install OS X on ESXi hosts that run on Mac mini or Mac Pro (but disables that functionality on all non-Apple hardware).

In sensible countries the EULAs don't have any legal value.

The requirement, if you actually read it, is only that the metal must be from Apple.

(just pointing it out) :P

This is false. Since Lion (I think it was), the wording allows installation of up to two additional copies as VMs, provided the underlying hardware is an Apple Mac.

The mystery around the Hypervisor API in Yosemite is even more intriguing then.

I find it humorous that getting OSX to run in a controlled environment, useful for testing, simulation, etc., takes more hacking and more work than any Linux distro ever did, even in the early days.

Despite this, Mac people insist that getting Linux to work is "hard", when it mostly works out of the box on any hardware. Amazing.

"Despite this, Mac people insist that getting Linux to work is 'hard', when it mostly works out of the box on any hardware. Amazing."

This kind of divisiveness is not necessary. Clearly you have some dim folks in your midst.

"than any Linux distro ever did" - Remembering my stack of 32 floppies to install slackware and the level of understanding it took of drivers and hardware, I'm pretty sure that's not true.

The difference is, in those days you'd install Linux and maybe you don't get a working sound card or network or modem (since that was a thing). So you have to seek out replacement cards that have drivers. The worst is if your IDE controller doesn't work and it can't find the disk, but that was rare. I guess I also had a motherboard here and there that would cause panics on boot. But, the important point in all of this, is in those days it was never the case that the installation disk was actively trying to prevent you from booting. You usually got somewhere, and the stuff wasn't actively working against you, it was mostly just missing drivers.

And of course, today a stock distro kernel has more drivers in it than Windows or OS X, so it is a long time ago that you're talking about.

While I remember "oh no, disk 17 isn't working; better re-download and try again tomorrow", this is the closest to that level of pain I've encountered in a long time. What's sad is that, despite VMs for testing being the norm, the vendor is actively causing pain for devs and admins.

For quite some time I was downloading things over a BITNET FTP-to-email proxy, since we didn't have raw IP available. Crazy times...

Yes, circa 1996 maybe...

"THAN ANY LINUX DISTRO EVER DID," said josteink. Mentat already quoted it once. I thought it was pretty clear.

What does installing OS X on hardware it was never tested on have to do with the number of Mac users not wanting to use a OS that requires work beyond turning the computer on?

Linux is harder than OS X to get running, when you follow the OS X EULA.

Very much so not true:

Download an ISO and install it.

I think you will be surprised. Pop the disk in after burning to a CD/DVD. Click "install", reboot and you are done.

Doesn't matter what your hardware is... it will install and work.

Other than getting a computer with a pre-installed OS, it really can't get much simpler. Modern Linux Desktops do not require you to do anything but click the "install" button. You don't have to mess with partitioning, packages, etc. You can understand absolutely nothing about Linux and install it easily and use it easily.

It's at least 100x more difficult than opening a laptop screen and it already being done.

Then buy a laptop with Ubuntu pre-installed on it. Example: http://www.dell.com/us/business/p/xps-13-linux/pd

How do you do that in VMWare ESX?

Not sure what your point is here... it's actually easier than this Mac install on ESX (of course, because it's supported natively with no hackery required). Just point ESX at the ISO and it installs.

I agree that linux is very easy to install these days, but the point is still valid. OS X was never intended to run the way the author is using it. If you use it as intended, installation is every bit as simple as Ubuntu or Fedora, and there's no need to mess with partitioning or drivers.

It will install and "work". But the 3d graphics won't. And even though it's pretty easy nowadays, it's still not easier than OS X, especially if you want to transfer files and settings from another machine.

And then the sound will glitch and cut out, with the graphical FPS varying wildly in games and sometimes frames being delayed by hundreds of milliseconds or more. That's my personal experience, anyway; I really wish I could say Linux works well, but on a 2008 Dell PC with a common chipset the sound does not even work reliably with the latest version of Ubuntu, nor does a common AMD Radeon chipset with the official fglrx drivers.

It's not like it doesn't work, it worked fine 90% of the time. But 90% is not enough; when even Half-Life 2 is completely unplayable (on hardware which easily runs it in Windows), something's wrong.

Every issue you described is an issue from 2-3 years ago. They have all been solved, especially in the major distros. A current version 14.x of Ubuntu, or version 20+ of Fedora, will work out of the box with zero tweaking. Honestly. (I run full-time Linux desktops on everything from old repurposed warehouse workstations bought in the Windows 2000 days all the way through to dual-booting my modern gaming rig... it just works.)

And to touch on your gaming comment: actually, Valve has found that the same games perform better on Linux using OpenGL than they do on Windows. So much so that they were surprised at how well they performed. So much so that many other gaming companies are now re-releasing and/or planning future releases on Linux. The day of Linux gaming is here...

>Every issue you described are issues from 2-3 years ago.

Sorry, these issues were experienced by me this year. On hardware that is not defective as it works flawlessly on Windows.

>They have all been solved, especially in the major distros.

No they haven't. It has not been fixed. Neither my GPU issues nor my Audio issues have been fixed.

>A current version 14.x of Ubuntu, or version 20+ of Fedora will work "out of the box" with zero tweaking.

It was running out of the box. It just didn't work properly.

>And to touch on your gaming comment -- actually, Valve has found the same games perform better on Linux using OpenGL than they do on Windows.

Yes, I know. OpenGL works, and all that. However, the harsh reality I have found is that the graphics drivers didn't work properly and frames would be delayed (not elongated; delayed) making Half-Life 2 unplayable under both WINE and natively.

> It just didn't work properly.

I echo this 100%. Even after significant tweaking it wasn't smooth enough. I can handle lower FPS than Windows, but delays, full blown glitches, flat out incompatibilities just rule it out for me.

Well, I"m sorry your experience was bad. I assure you, it is not normal what you are describing.

Valve has (figuratively) bet the house on Linux as tomorrow's gaming platform... and I'm confident they would not have done so without Linux being ready for prime-time.

As a note, don't game under WINE. WINE is great for applications and such, but a game designed to run on Windows won't work quite right even on WINE. But if you install the native Linux Steam client and have Steam download/install the native Half-Life 2 for Linux... I assure you, it works flawlessly, as do all Source games now, Unity games, Unreal games, and CryTek games (or at least they can be ported natively now).

> I assure you, it is not normal what you are describing.

Phoronix seems to assure otherwise. It feels like every day I read about a minor performance bump with RadeonSI and then a 10fps regression. The proprietary drivers are clunky and don't seem to play nice with my multimonitor setup. Needless to say, my attempts on Ubuntu 14.04 this year on my very standard i5 3570K / AMD 7950 rig have not been very successful. I had this one awful text rendering bug (artifacting) with both the proprietary and apt-get open drivers. Only when I compiled Mesa from Git did I fix it. Taking a step back, running Hackintoshed OS X felt more reliable in the 3D graphics department, and I used it for over 6 months before dropping back to Windows 7 full time.

I wish, wish, wish to use Linux full time. I really do. I'm sure if I had older/integrated graphics hardware my problems would disappear. Unfortunately I need dedicated hardware to drive a dual link DVI (Korean 27") and two more HDMI/single link DVIs.

I'm sure the driver support will come with time. Maybe around the time Wayland/Mir take off. I'll be waiting. My MacBook is my rock, of course.

Hah, I only resorted to Wine because the Linux ports were so bad. The result was no better, though.

I'm not sure what you mean by "3d graphics won't [work]"; they do just fine. Transferring files/settings is also a non-issue. I think it's been a long while since you looked at or used a Linux desktop. They have come a long, long way in the past 5 years.

Not sure how it can be any easier other than getting a computer pre-installed (which you can now too, but, admittedly not as common as I would like). I mean, you have to put a disk into some hardware at some point and click some install button...

It's not a like-for-like comparison when you restrict what OS X will run to what Apple say it will run on (ie. only their hardware) but you don't keep the same expectation by requiring your Linux distribution to run on some other, random hardware.

Buy an Ubuntu certified laptop (http://www.ubuntu.com/certification/) and then tell me that the 3d graphics won't work on Ubuntu.

That's probably because it's so new in QEMU.

VMWare, for one, have been able to do this for quite a while with the effort of ~1 mouseclick.

Edit: Not sure if troll or not?! Getting Linux to run as a guest in Xen 0.0.1 probably took some work, too.

> VMWare, for one, have been able to do this for quite a while with the effort of ~1 mouseclick.

Unless things have recently changed, this is factually incorrect. It might work on OSX, but that's not really the point, is it.

For whatever reason, you may want to be able to run a general purpose X86 OS on your general purpose X86 hardware, or general purpose X86 cloud/cluster.

For instance, getting OSX to run in a regular ESX cluster (for testing, etc.) can't be done without an immense amount of hackery, and probably requires custom kext files, injected on another machine before dumping the image and what not.

At this point poor compatibility is just that. I fail to see why someone would try to sell that as something good.

VMWare Fusion on OSX does it that easy.

Nobody's selling OSX as a "general purpose X86 os" for "general purpose X86 hardware" though. It's an OS for Apple hardware. It just so happens that the current versions run on x86_64 CPUs.

(Plus I don't think there exists a license allowing OSX to run on a regular ESX cluster. So if it breaks, obviously you get to keep both halves)

VMWare Fusion isn't all that stable. It works, but I wouldn't count on it... though I have had to up until this point.

Getting Linux to work on random hardware is certainly harder than getting OSX to run on Mac hardware. That's not an unfair comparison, because those are their intended use cases.

You're looking at a Porsche and bitching that it sucks at offroading. Well, yeah. That's not what it's for. If you work offroad and need to carry cargo, a Porsche is not the vehicle for you, but that doesn't mean it's a bad car.

> That's not an unfair comparison, because those are their intended use cases. You're looking at a Porsche and bitching that it sucks at offroading.

Your comparison would make more sense if it was about making a car made by Porsche drive on roads not made by Porsche. Which everyone agrees is absolutely reasonable.

What you're trying to sell me is the idea that poor compatibility is a feature? It was designed to have poor compatibility despite being built for a standardized platform, and that makes it a Porsche?

When I try to get OSX running in our VMWare cluster so that we have clonable, snapshottable machines for testing and debugging and it takes a week of effort not getting it working, you're telling me that's a "feature"?

Yeah, sorry. I don't buy that.

> What you're trying to sell me is the idea that poor compatibility is a feature?

No. I'm telling you that broad compatibility is a feature that is genuinely unnecessary for a significant set of users.

I'm not one of them, BTW. I've got my MacBook Pro for toting around, my Windows desktop for gaming, a Linux box for streaming, sharing, and storage, and an Android phone for phone stuff. Each of them does what I need it to. Horses for courses.

But there are plenty of people whose use case is "I need an OS that works as soon as I unbox the laptop." For those people, compatibility is not an issue, nor should it be.
