> many of my programers are too busy being productive to be interested in dicking around with making things work anymore, and increasingly I hold this view
This. I switched to OS X from Ubuntu a few months ago for exactly this reason. I was tired of tinkering and configuring endless .conf files, adjusting my GNOME/Xfce panel layouts for optimum productivity, and changing my display manager because all of them sucked.
In my final days with Linux I was running Ubuntu Minimal with twm -- I went completely old-school. Fewer things at the core, fewer things to go wrong.
I caved and bought a MacBook Air. While I miss the endless configurability of Linux, I find that most of the time I spend trying to configure OS X is only spent to make things more suitable for my workflow -- changing the number of desktops I have, changing the applications in my dock, or adjusting Exposé to show all app windows.
I never have to configure something because it breaks or isn't compatible with something else. I could have gone the FreeBSD way, but then, well, you know. OS X just works. As a Ruby and Java developer I find it perfect. I probably will try the 13.04 release on the Mac (I tried the beta and liked it), and I'll see how it goes.
It looks like a solid release, so I'd really like to keep it (but I probably won't because I'm cheap and only have the 120gig SSD).
Except for a few driver issues (Nvidia and Optimus, doh!), I haven't had an issue with Ubuntu or Mint. Any other configuration issues I would also have had with Windows or OS X, since they were application- or server-specific.
But I don't spend days tweaking the UI; I have a baseline of UI expectations that's not too difficult to meet. (Windows 8 doesn't meet it, but that's 'nother story for 'nother day...)
I've had tons of issues, especially around power management, multi-monitor support, etc. When I "Hibernate" I also need to roll a d10 to know if it'll actually come back up. This is anecdotal, but on the same hardware there are no issues under Windows and everything works as expected. Eclipse seems less stable and slower under Ubuntu, which is probably related to the Java runtime. Firefox is also less stable. The webcam sometimes doesn't work. Audio suddenly stops playing until you touch the volume. I couldn't get WebEx working under Ubuntu (I tried lots of suggested workarounds). The kind of stuff that just works in other OSes.
I have the same issues on an Asus laptop. It was well supported on 12.04 LTS, but 12.10 is obviously a big failure.
Not to mention the system slowing down on its own, some swap mess I don't have time to fix, and the growing overall instability of the system tools: Memcheck segfaults, then the crash-report tool activates itself, inspects, segfaults, and launches again...
I also upgraded a friend's PC from a good old Athlon XP to a shiny new i3, and guess what: the official libre graphics driver is simply unable to run correctly, apparently because of a bad switch on an internal opcode.
Something is wrong here. The massive shift of devs from Linux to OS X should be a warning to some people somewhere about the overall quality of their stuff -- a quality we have been proud of for decades.
My printer, scanner, graphics tablet, and sound card won't work in Windows (the printer manufacturer explicitly refuses to support 64-bit, the graphics tablet bluescreens Win7 when plugged in, and the others just do nothing). All work flawlessly out of the box on Ubuntu. Hooray for anecdotal evidence \o/
He explicitly said he misses the configurability, so I wouldn't blame him for that.
As a Mac user who runs triple boot, the "just works" mentality in OS X holds only as long as you don't step outside the typical use cases. It's great to be, say, a writer on OS X; there are very good tools and apps for writing in a minimal, aesthetic environment (WriteRoom and others). On the other hand, as a "hacker", I've always had trouble in OS X -- from Page Up/Page Down not registering properly in Terminal, to full NTFS and EXT4 support (FUSE used to work, but I think it was not maintained), to getting vim, gcc, and other tools of the trade working on OS X (yes, there are ports-like systems on OS X, but hardly ones that I'd say "just work").
On the other hand, in the pure Linux world, brightness controls, backlight controls, volume controls, and multiple-sound-card support were all rather tricky when I tried Gentoo on the same Mac for a while. Some of those I solved, but others I just gave up on, content that they kinda-but-not-completely work.
Ubuntu seems to be tested enough on Macs, so those things work on my laptop now without having to configure too much. It may not be the perfect "just works" experience, and I have seen a lot of things going the wrong way on Ubuntu, but currently it's a good mix of being easy to maintain and being easy to start doing actual work with.
I didn't leave the Linux world because I didn't like Ubuntu dictating a desktop environment to me. I left because OS X suits my needs better than any Linux distribution (except perhaps OpenSUSE) could.
It's strange because the overwhelming consensus so far has been that the latter releases have been much more tested and stable than any before, with the sole exception of LTS releases. Still, it's always been good advice to hold off on upgrading until a release has been out for a month or so, and for most users staying with the LTS release is probably good enough anyway.
There's a huge QA process for Ubuntu now -- packages have to go through a large swath of automated tests, and the alpha/beta versions are kept working on a daily basis. Only two years ago or so, most Ubuntu developers didn't even expect the unstable release to be usable until well past the feature-freeze milestone.