I'd say the biggest change in the development methodology happened when Bertrand Serlet was replaced with Craig Federighi.
With Bertrand, we would move in giant monolithic releases where every group would just dump in whatever they had ready, and the whole thing would get released with nightly builds. With Snow Leopard in particular, I remember three dozen builds in a row where Xcode was unusable due to Obj-C garbage collection issues. Random stuff you didn't expect, like CoreGraphics, would have showstopper issues; we'd report them and they'd get fixed by the next week.
This resulted in extremely late releases that had a ton of bugs that we piled patches onto as time went on.
Craig moved the organization onto a sprint system, where we would develop new features for 2 weeks and then spend a week fixing bugs. After 10 or 12 or 16 of these cycles, we would deem it ready and ship it out.
I felt this produced more stable but more conservative software. It seemed like giant rewrites and massive features would be very difficult to introduce and if they did get done, wouldn't happen until two thirds or so into the release cycle.
On the other hand, Craig has consistently been able to release on time with most of the features promised.
I was only there up to the release of Lion (the first Craig release), so I don't know how updates and patches worked from then on. Maybe they're worse now.
But I've been using OS X all this time, and honestly I don't think it's any worse than before.
What has changed is that releases and features happen more often. Tiger and Leopard had a good two years to mature and get patches while their delayed successors missed target dates. In the meantime, though, they stagnated, with ancient Unix tools, Safari builds, QuickTime frameworks, graphics drivers, etc.
They felt stable because they were just old, sort of like Debian stable. Meanwhile, the development versions of Leopard and Snow Leopard (the two I spent most of my career at Apple developing) were downright horrible and unreleasable. Each of those releases went gold and had an almost immediate .1 release to fix glaring issues.
It's just that you remember them better because they had a longer history as a stable legacy OS than the modern versions.
Does this tell us anything about the state of modern commercial OS development? Users like stability and predictability in their software. Sure, they like new whizz-bang as well, but that excitement lasts for a week after you buy a new machine, and then they want stability every day for years.
That conflicts with commercial pressures to sell Shiny New Things every few months. Flat icons? Let's make them skeuomorphic. Skeuomorphic icons? Let's make them flat! Tabs are stupid, let's make them all windows. Windows are stupid, let's make them all tabs! Multitasking is great, let's have more dashboards. Single-tasking is great, let's have more fullscreen! And so on and so forth, rinse and repeat.
I think hardware waves used to hide most of this game, since every year new hardware was so much more powerful than before, you could justify this constant churning as "adding more stuff". This has not really been true on the desktop for almost a decade now, so the long-term patterns emerge and they are quite ugly to see.
> Flat icons? Let's make them skeuomorphic. Skeuomorphic icons? Let's make them flat! Tabs are stupid, let's make them all windows. ...
My guess is the other problem is simply this -- they hired so many developers, designers, managers. They produced a great stable OS, worked hard, then it was making billions of dollars. Ok, what next? Institute 4 day weeks? (maybe not a bad idea, but I am being hyperbolic here a bit).
So there is perhaps some pressure from the inside to develop new features. Designers say "flat is best now" so they get to work and feel busy. Developers want to build the next internal restructuring. Managers probably want to be responsible for some new awesome feature and seal their legacy.
> They produced a great stable OS, worked hard, then it was making billions of dollars. Ok, what next? Institute 4 day weeks? (maybe not a bad idea, but I am being hyperbolic here a bit).
It's really a business problem. Once you have a successful high margin product, investors expect you to continue making those numbers. If you sit on your duff then competitors will catch up and you find yourself in a low margin commodity market.
But it isn't always obvious what to develop, especially once the low-hanging fruit is gone. Sometimes there isn't a right answer and the market is simply mature enough that it's destined to become a commodity market.
And if you can't find something good to do, people have an aversion to doing nothing. So instead they do something bad.
I agree, and that was the feeling I got while watching the 2014 WWDC. Heaps and heaps of unrelated announcements (and the feeling that nothing of exceptional quality would come out of it).
I wish Apple had instead focused, for the whole year, on getting iCloud Photos 100% right. Unlike "flat" redesigns, this could even have turned into passive income for Apple. It hasn't even shipped on OS X and I've already given up on it.
If they'd just frozen OS X at 10.9, everyone could still work on cloud functionality for years before they'd be "finished".
> So there is perhaps some pressure from the inside to develop new features.
Oh yeah, these pressures are both internal and external. There is a reason concepts like NIH, bike-shedding and "reinventing the wheel" are endemic problems in the software world. I guess we're just seeing that Apple is not immune from them, after all.
>Users like stability and predictability in their software. Sure, they like new whizz-bang as well, but that excitement lasts for a week after you buy a new machine, and then they want stability every day for years.
After years of Linux use, I was always excited to see new releases and how it was catching up with Windows and OS X. Ubuntu by version 10.04 was great, and I loved it, but then I hit the upgrade button and it borked the install (actually that happened a lot before 10.04, with various distributions).
Nowadays I am very wary of system upgrades, because more likely than not something won't work and I can't be bothered to reinstall everything the way I like it.
Ten years ago I enjoyed installing the various flavours of Linux and seeing what new shiny whizz-bang they contained. Now I see it for what it is: some novelty eye candy that rarely adds any real usability. Now I prefer rolling releases, and I am using XFCE, as it is basic and works.
The thing is, it has taken me dozens of Linux installs to realize that the shiny new whizz-bang features are usually a lot of crap, and my guess is that most "normal" users are still at the stage where shiny is far more exciting than stable. I would expect that Apple fans are even more geared to that attitude. Shiny new crap is what sells, not stable and boring (think of how many people you know who use Debian stable, compared to Ubuntu users).
It depends on the user, really. A lot of the reason why it took so long (and indeed, is still taking so long) for Windows XP to go away is because it's what people are used to; people don't like change, especially if they don't think it's for the better. This is especially common among the elderly (where there's less receptiveness to learn new approaches when the old approaches work fine) and in low-income households (where there's less time/energy available to learn new approaches when the old approaches work fine). I've found Windows XP -> openSUSE w/ Xfce to be a relatively-suitable migration path for most users, mostly because of their relative similarities (a clear "Start menu" equivalent (that can even be easily renamed to "Start"), the "Mozillascape Firegator" already being installed and ready-to-go, a somewhat-decent office program that looks similar to Office 2003 already being installed and ready-to-go, etc.).
It's also the same reason why Debian and RHEL/CentOS and SUSE are more likely to be found in big enterprises (and/or government/military agencies) than Ubuntu or Fedora or openSUSE or Arch or whatever. Those sorts of users need things to stay consistent. NASA would rather install Debian Stable on the International Space Station and stick with that release (along with perhaps security patches and bugfixes) for half a decade or more than go with Ubuntu and have to retrain its astronauts every two years with each LTS release (and deal with installing the new release on computers that are perpetually falling around Earth at high speeds in the first place). Granted, that's a bit of an exaggeration (Unity isn't that volatile), but I'm willing to bet it factored into NASA's decision to use Debian rather heavily (for example).
I definitely second the preference for rolling releases, though not nearly at the pace of, say, Arch or Debian Sid. I run Slackware-current on a couple of my desktops, and I'll probably end up switching to the Tumbleweed repos on my openSUSE-running laptop very soon. Having to deal with two different levels of upgrades is annoying, but so are things spontaneously breaking because changes weren't adequately tested; openSUSE Tumbleweed and (interestingly, thanks to Slackware's conservatism when it comes to software versions) Slackware-current seem to be good balances between those two extremes.
current apple engineer... the sprint (milestone) development system is still in place... it's not the problem though; the problem is the focus on new useless [imo] features at the expense of core functionality and quality
hope marco, geoff and others keep writing these articles so that eventually tim or someone sees one and shakes things up. pressure from the bottom has not worked so far
In case you know someone in charge: the wifi performance issues on yosemite are still not completely solved. This is my only gripe with this version so far, but having to turn off bluetooth to get a good connection from time to time really feels like working on broken software.
Exactly my experience. There are so many hypothetical fixes on the Internet; some seem to improve things intermittently and temporarily, some don't work at all. But toggling Bluetooth seems to be able to get a reaction from the OS at times. It feels like a ritual where I'm praying to the OS god.
Then a fortnight ago, I chanced upon another explanation. Some people speculate that because WiFi and Bluetooth signals both sit in the 2.4 GHz band, they clash, and the WiFi suffers. I was using an Apple Wireless Keyboard & Touchpad, and a friend with a similar setup seems to have the same issue.
So the next day, I spent 464 USD to upgrade my router and get a wired keyboard and a wired mouse.
I definitely splurged on this hardware, but at that point I was so frustrated and desperate that I needed to rule out any hardware issue. These were the only variables I could control; everything else felt hopeless.
I then jumped onto 5 GHz WiFi and stopped using Bluetooth. My WiFi situation has improved a lot since then. Not perfect, but not unusable.
The point is that I could have saved that money. Having to spend money to work around a bug felt so Windows-ish. It used to just work, and glitches were bearable; now I always worry about upgrading.
Anyway, that money could have gone into buying more Apple stuff.
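For anyone else chasing this: before spending money on new gear, it may be worth confirming which band you're actually on. Here's a minimal sketch using the public CoreWLAN framework (available since 10.10); the output formatting is my own illustration, not anything Apple documents for this problem:

    import CoreWLAN

    // Print the SSID, channel, and band of the current Wi-Fi connection,
    // to check whether the Mac is sitting on 2.4 GHz next to Bluetooth.
    if let interface = CWWiFiClient.shared().interface(),
       let channel = interface.wlanChannel() {
        let band: String
        switch channel.channelBand {
        case .band2GHz: band = "2.4 GHz"
        case .band5GHz: band = "5 GHz"
        default:        band = "unknown"
        }
        print("\(interface.ssid() ?? "<no ssid>") on channel \(channel.channelNumber) (\(band))")
    } else {
        print("No active Wi-Fi interface")
    }

If it reports 2.4 GHz, forcing the router to prefer 5 GHz (or giving the two bands distinct SSIDs) is a cheaper first experiment than replacing hardware.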
I've heard the same thing, and my setup indeed includes a wireless keyboard and touchpad.
But the weird thing is that I don't think I experienced those issues with the previous OS version.
I also read somewhere it was related to the Handoff protocol using the same bandwidth as wifi. This sounds more plausible. Then having Bluetooth + regular wifi + Handoff traffic is too much, and the connection suffers. In which case there's not much Apple can do, unfortunately, except rethink the whole thing.
I'd say in any case the future in that matter doesn't look bright.
would be interesting to hear what the distinction is between useless and core features.
Maybe I'm just not hitting core features with OSX 10.10, but the features I'm using seem fine. And not seeing stability issues with third-party software.
Useless feature: Tagging files 'Home' or 'Work' in Finder
Core feature: A filesystem that doesn't get so easily corrupted and need constant Disk Utility scans to stay healthy
Now, I know some people will really want tags and I remember them spending a bunch of time in the keynote talking about them but I'm pretty sure I've never used them. It's not a bad feature… I guess it's nice and I bet a lot of people use it, but I would prefer if it was built on top of a more solid foundation.
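For what it's worth, tags aren't tied to the Finder UI; they're ordinary file metadata you can read and write through the URL resource-value API (macOS 10.9+). A minimal sketch, with a hypothetical file path:

    import Foundation

    // Hypothetical file; the tags live in the file's metadata.
    var url = URL(fileURLWithPath: "/Users/me/Documents/report.pdf")
    do {
        // Read whatever tags are already set.
        let existing = try url.resourceValues(forKeys: [.tagNamesKey]).tagNames ?? []
        print("Current tags:", existing)

        // Append one; Finder should pick the change up.
        var values = URLResourceValues()
        values.tagNames = existing + ["Work"]
        try url.setResourceValues(values)
    } catch {
        print("Tagging failed:", error)
    }

Note the tag is an arbitrary string, not just one of the Finder colours; the colours are presets layered on top.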
It was my impression that Tags were just Apple's last attempt to avoid having "Folders" in iCloud. Wisdom on the internet is that the file system is bad and must be hidden from users. But if you add folders to iCloud, then what is it, if not a file system?
Then iCloud Drive introduced Folders one version later, and now Tags is kind of a legacy feature.
I don't think anyone but management ever really wanted tags.
Folders allow for a simple hierarchical navigation option and allow you to preserve "structure" when round-tripping resources that rely on nested folders in and out of a cloud service. They allow for strong naming and namespaces.
Tags can be user-supplied, crowd-supplied, or even generated by an AI/expert system, and allow for cross-cutting exploration of stuff that can't be indexed the way text can. (Although even documents benefit from tags, to ease finding them again in a haystack.)
Tags require more effort to maintain but are essential when you are dealing with a complex collection.
But there is no way to implement user-supplied or crowd-supplied tags when you are practically limited to the seven colours used for tagging, and it only works if you can even remember which colour is used for which tag.
I was also experiencing something like this. I would open lots of tabs, and at some point the machine would freeze and I would need to do a hard reboot. This got worse after installing Yosemite, where I could rarely go a day without having to reboot my system.
I was about to take my laptop in to see if it was a hardware issue when a coworker pointed me to a forum where someone suggested turning off automatic graphics switching. I did that about two weeks ago, and since then I haven't had a single occurrence of the issue. You may want to try the same thing to see if it helps.
Interesting! Wonder if it's a quality control issue with the code handling the graphics switching. Afaik, Apple engineers write custom video drivers for every supported hardware device. Wonder if Intel is now contributing more to the graphics driver updates and maintenance.
No, you can't single him out like this: since Mavericks, OS crashes have been reported very often, and Chrome tabs are often reported as a cause. I never had crashes with Snow Leopard, and I see them often in Yosemite. That's exactly why we see articles reporting that Apple has lost ground on quality.
I have exactly the same problem on multiple machines.
Plus, the wifi fiasco, over which more than one person should have been fired.
I am barely sticking with Apple for now, mostly because in startups it is the default, but I don't plan to replace my apple equipment with more apple products.
It's just brutal in Yosemite. This is the first release to which I regret upgrading. I've never had so many issues on my laptop, and it has never been so slow. My FileVault encryption has been stuck at 99% since I upgraded and enabled, and Apple marked my radar report as a duplicate. They can't/won't tell me if I'm forever hooped or if it will ever be fixed. At this point, I'm looking to spend a day rebuilding my whole system.
I was running the beta and it was pretty rough but that's what happens when you run betas. But then for over a month after gold-master it was STILL really problematic.
Yikes. What you're telling us is that bug fixing and the deep, deep technical debt being incurred aren't seen as a problem. Unfortunately this debt is also being shackled to third-party developers, since we have to compensate for Apple's bugs. Not fun.
wait, this is causing friction?
they fired forstall to get total gleichschaltung and now ppl elsewhere are revolting? i hope this'll come to fruition.
this whole cozy-collaboration-and-nobody-saying-no thing has to stop.
But how did things like the NSURL default cache and other base components get broken in iOS 8? There seemed to be work on the fundamentals that broke them, right? It's not just that the new flashy stuff stuttered or crashed...
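For reference, the kind of setup that reportedly regressed (presumably NSURLCache's shared cache) is nothing exotic; apps configure a cache once at launch and expect protocol-level caching to just work. A minimal sketch in current Swift spelling (in the iOS 8 era this was NSURLCache and +setSharedURLCache:):

    import Foundation

    // Configure a shared URL cache once, early in app launch.
    let cache = URLCache(memoryCapacity: 4 * 1024 * 1024,   // 4 MB in memory
                         diskCapacity: 32 * 1024 * 1024,    // 32 MB on disk
                         diskPath: "urlcache")
    URLCache.shared = cache

    // With the default policy, cacheable responses should then be served
    // from this cache -- the behavior people reported breaking on iOS 8.
    var request = URLRequest(url: URL(string: "https://example.com/")!)
    request.cachePolicy = .useProtocolCachePolicy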
The good thing about the nightly builds was you didn't have to use them, and people could respond quickly to showstoppers rather than wait for a sprint. There was a quick-look team to catch bugs, which stopped bad nightly builds from being released to general dev, and unless you really needed a particular build (to test a new API) you didn't install until quality was restored. The Xcode bug was therefore unusual, as they could have fixed it the next day (and with enough heat they would have).
With fortnightly builds that are only then released to engineering, if that is now what is happening, there will be massive instability every two weeks, until the final round of bug-fixing cycles stops all features being added.
Which can't finish in time, because the OS has to be released at an Apple event. Except for 10.0, when the date was known months in advance, the old OS cycles were released when ready (when showstoppers were at zero). Of course that did lead to some wrangling about what a showstopper was, but consider if iOS 8 had not been released with the HealthKit bugs and other major issues, and instead we got what was in iOS 8.1, or 10.1 instead of 10.0.
Still some issues but mostly robust. Apple needs to decouple software from hardware releases.
> But I've been using OS X all this time, and honestly I don't think it's any worse than before.
Agreed. Yosemite has been flat out the best OS X version I've used. Zero bugs except a few AirPlay issues. I'm even using beta builds for production, this thing is that stable.
I suspect Marco's point shows how a maturing part of the developer/consumer base now takes up a larger mindshare. Not a bad thing; it just doesn't tell us much about the product.
edit: Marco says this gets him 7.8m/s traffic, so there has to be something for _many_ here.
Yosemite has regressed, in my opinion. All Screen Sharing sessions cause my MacBook Pro to freeze for 5 minutes after they are closed. I had to reboot the laptop many times.
Have you reported this to http://bugreport.apple.com/ ? I know most of my Yosemite / Mavericks bugs were tagged as duplicates, but there might be some sort of a "wow, so many duplicates, let's fix this" type of ranking re: bugs reported to the Radar.
I, for one, wish Mr. Federighi would push out a Snow Yosemite. No new features, simply bugfixing. I have taken to simply posting screenshots on my blog about various Apple QA issues. I wish there was a way to really make a difference; god knows I'm fanatical enough about wanting Apple stuff to just work.
> Craig moved the organization onto a sprint system, where we would develop new features for 2 weeks and then spend a week fixing bugs. After 10 or 12 or 16 of these cycles, we would deem it ready and ship it out.
> I felt this produced more stable but more conservative software. It seemed like giant rewrites and massive features would be very difficult to introduce and if they did get done, wouldn't happen until two thirds or so into the release cycle.
I've noticed this problem too in Agile shops. Small iterations happen fast and on schedule. But big new efforts tend to be like swimming up-river. Whatever comes after Agile is probably going to formalize a long-track parallel cadence for a dedicated team to grind away on. Then once that's released, it goes into the normal sprint cadence.
It seems to usually be "solved" right now by lots of weeping, halted production, and the blowing up of everybody else's schedules.
> I've noticed this problem too in Agile shops. Small iterations happen fast and on schedule. But big new efforts tend to be like swimming up-river. Whatever comes after Agile is probably going to formalize a long-track parallel cadence for a dedicated team to grind away on.
That seems to be a not-that-uncommon practice in Agile/Lean environments with the resources to implement it already (you don't see a lot of coverage of it in books about "Agile" software methods specifically, though I've seen considerable coverage of it in "Lean" software development books -- which tend to address the kind of pragmatic metamethodology the Agile manifesto calls for, while "Agile" books, ironically, tend to focus more on narrow prescriptive methodologies, particularly Scrum and close variants).
Pursuing immediate and longer-term solutions in parallel (or multiple potential solutions with different risk profiles, without a strict "this one is for now, this one is the long-term objective" division) is also something Google has done in lots of areas for a long time (and which outsiders often question with "why is Google doing both X and Y when they have overlapping use cases?").
Ugh, no it isn't. That myth from the Windows ME era must die. For as much as the Windows 8 / Metro thing is an annoyance, under the hood it is a solid OS - I am running it heavily loaded (4/5 Hyper-V VMs, IDEs, SQL Server, etc.) and it is thoroughly reliable, even counting suspend / resume.
Edit: Elaborating a bit - I got a Win 8 Pro license for $39 when it initially went on sale. I run it on an HP Z-series workstation that I got refurbished for $1199 + $(disks + 16 GB ECC RAM). It allows me to run with 5 disks, an 8-core Xeon CPU, and 32 GB RAM. The OS came with a very good hypervisor that lets me run older versions of Windows in a cheap memory footprint, plus RHEL 7 and Server 2012 - all decently supported.
If I tried doing that on Mac hardware, even ignoring the considerable cost increase, getting a reliable hypervisor on a Mac is itself a challenge. Last time I tried Fusion and Parallels they were complete toys compared to Hyper-V.
So no, for techies Windows is still a darn attractive ecosystem - if you are just browsing and emailing any OS from 2013 onwards works fine, including Linux if you find the right hardware.
> MS has weird issues where you have to re-install the entire OS just to fix things.
That's a) too vague, and b) I know my org runs a 100,000+ install base of Win 7, and unless people do something stupid, no one ever reinstalls their OS. The reason you might not see as many Mac and Linux calls is that people aren't doing the same things they do with Windows in a business environment. Try installing a ton of 3rd party stuff, add a bunch of old software, expect it to work with any hardware you throw at it, and you'll find Mac and Linux fail miserably beyond your normal browse/email/code/publish workflows.
If you are talking Enterprise - there is nothing that even comes close to what Windows does - ton of hardware, ton of backwards compatible stuff, ton of manageability stuff and so on. Try doing all that with a Mac or Linux. I guarantee you won't get too far unless you run a startup with people doing just coding and deployment. Windows in Enterprise is an entirely different beast - that it works so well is in itself a miracle. Mac and Linux won't even compete beyond the basics.
> Try installing a ton of 3rd party stuff, add a bunch of old software, expect it to work with any hardware you throw at it
But that's the thing: one of the biggest value-adds of the Mac ecosystem is that the hardware combinations are all well tested (or, from the point of view of an app/driver developer, they are heavily constrained). While it's not Microsoft's fault that they have to deal with a combinatorial explosion of drivers, and it's not 3rd party vendors' fault that they have to deal with a combinatorial explosion of hardware configurations, it's still a problem that leads to instability in the Windows ecosystem, users still have to face the instability, and it's still an argument in favor of using the Mac ecosystem if you value stability (outside of enterprise contexts -- we're in complete agreement there, OS X server is a shitshow). My personal anecdata is in a sibling post to yours.
For home use, you can get an MS Signature edition laptop for little more money than you'd pay for a cheap PC. Everything is stock Windows, and you can expect that to work at least as well as OS X, if not better. The problems start when you buy dodgy hardware loaded with a ton of bloatware or crappy drivers - the Signature edition PCs address that.
If you need a UNIX like system and can't live with Cygwin or native Windows tools then yes, having OS X laptop and hardware is the way to go. Admittedly the Mac hardware is still a bit nicer than anything in PC space but the trouble is Windows support is not very great (look up trouble about System Interrupts taking a ton of CPU on both Windows and Linux running on Mac laptops for instance.)
Thanks! That's good to know! After 3 or 4 tries, looks like Cunningham's Law pays off again (sorry).
> If you need a UNIX like system...
Yep, that's a big plus for me. A UNIX-like system with a decent terminal emulator and an IDE like Xcode makes my workflow so much easier. Probably enough to keep me in the mac ecosystem regardless.
If Apple stuff is so well tested, why does my Mavericks machine hang with just the cursor on a black screen for five minutes if I happen to move the mouse or hit a key just as it starts to go to sleep?
Why does my daughter's MBP frequently get very hot while the 'genius' in the Apple store tells us there's nothing wrong with it (in the most condescending way possible)?
Apple stuff is just as buggy as Windows stuff (perhaps even more so).
Apple's quality has probably worsened in recent years, both software and hardware, but as long as M$ doesn't build the software and hardware themselves, the likelihood of a Mac giving a much better experience still holds. Try upgrading your laptop from Windows 7 to Windows 8. Worked? Try 8.1, and now Windows 10.
This is especially true on laptops, where no driver updates ever get released.
> expect it to work with any hardware you throw at it
I have an external USB3 1TB hard disk drive that boots into Ubuntu 14.04. I'm using it right now.
So far it boots on all my computers and several work laptops and desktops, and recognizes all the hardware I've thrown at it. It does this much better than Windows, because for some stuff, like add-on video and network cards, Windows needs the boring 'go to the manufacturer's website, download and install additional drivers' routine.
In fact, if Windows were installed on the external drive, it would fail to boot on a second computer. Linux can do it.
I also use and enjoy Windows and friends, like MSSQL Server and Visual Studio, but let's be realistic here:
Linux has come a long way from what it used to be a decade ago.
You can't confuse the OS with the shell, though. Metro is the most immediately visible part of Windows but it probably represents 1/10000 of the overall OS code base. I can't think of any reason why a Windows 7 user wouldn't be fine with 8.1 plus ClassicShell.
My fiancee has had her PC updated to Windows 8. Now she can't play solitaire on one side of the screen in a window with a video playing in a window on the other side of the screen.
She's forced to 'split' the screen in half, with solitaire taking up a full 50% of the monitor.
Now if she wants to bring up a web page, it only loads up in the 'desktop' half of the screen.
This type of interaction is infuriating for someone used to XP, 7, etc.
tldr: anecdata supporting "Windows is still worse"
I switched to the Windows ecosystem from the mac ecosystem in 2012 because two things happened: I kept hearing that windows had gotten better, and I finally grew up enough to admit that none of the windows complaints I made as a mac fanboy were actually based on experience. Windows did nothing but disappoint and embarrass for a year and a half before I finally gave up and moved back to mac. I've got a list of "small" complaints the length of my arm and a large list of big complaints that were "my fault" in the weak sense that with foreknowledge I could have avoided them. I won't post those. Here are the big complaints that happened after I adopted a "don't do anything my mother wouldn't do" policy in an attempt to improve stability:
* An update broke the dynamic linker or something: every time I would try to launch a .exe I would get a segfault in ntdll. Needless to say, rollback didn't fix the problem, all checksums appeared fine, and I had to reinstall.
* During the Win 7 generation, the shut-down process broke. All processes would appear to exit, but the computer would then spin up its fans and never actually shut off. Every time I restarted, the computer would want to spend the better part of a day on fsck. Naturally I habitually canceled it; day-long reboot times aren't acceptable. Eventually this led to blue-screens on reboot. Reinstall!
* After months of normal functionality, Windows 8 decided to forget my serial number. Upon entering the number again, it refused to re-activate. After 5 hours on the phone running remote diagnostics, they had me reinstall the OS.
* Most recently, sound and/or keyboard would stop working on wake from sleep. I installed fresh drivers from the laptop vendor (they mentioned a similar problem in the release notes) and it didn't work. I uninstalled those and installed fresh drivers from the sound card manufacturer; those didn't work either. What kind of a laptop is a laptop without sleep?
* The battery life got down to 20 minutes on a year old laptop. That was the last straw.
I've been told that this isn't normal, but I've caught the same people who told me that I simply purchased from the wrong manufacturer living with huge wake-from-sleep bugs so I don't know what to think. I'm sure that there exists hardware on which Windows is as stable or more stable than Mac OS is on my mac, but I don't know how to reliably find it.
So, this kind of list of complaints seems to be really common among folks who don't like Windows. Sometimes the list varies and wanders from one category of problems to another, but the core theme is still the same: The end user isn't capable of using the computer.
Now, it isn't fair to say, 'You must have 5 years in desktop support to use this product', but it harkens back to the problem that people are making judgements about an OS that aren't really caused by the OS. You can say, 'I had issues with drivers' and 'I felt that the UX here and there was poor', but to say, 'Windows is broken. Here's the problems I had that say it's broken' is dead wrong.
Furthermore, a lot of people do this kind of stuff on 500 dollar laptops and other equally crappy hardware and expect the same kind of quality as they got from a $2000 MBP. You want a rock solid PC? Buy a Dell Precision. Comparing the cheap hardware put in the budget-line laptops to the (mostly) decent hardware put in the line of portable Macs is pretty foolish, and again comes back to the idea that it's not Windows that's broken, it's your crappy hardware.
Windows is unbelievably stable, and while it isn't without its problems, 99% of the problems that people actually report as 'problems with Windows', aren't.
> people are making judgements about an OS that aren't really caused by the OS
That's why I add the word "ecosystem." I'm sure that most of the big problems I complained about can be traced to a 3rd party, but the fact that something isn't really Microsoft's fault doesn't mean it isn't their responsibility in the sense that it would be different in the mac ecosystem and therefore should affect my purchasing decisions.
* Broken activation: maybe a 3rd party program overwrote a registry key, maybe I reinstalled the damn OS too many times and their servers locked me out. Either way it's not Microsoft's "fault" per se, but Mac OS X doesn't have an activation mechanism to break, so it's a win for the Mac ecosystem.
* Broken fsck: there was really no excuse for this, it falls squarely in the MS's-fault bin.
* Broken major updates: I didn't mention the broken 8.0->8.1 update because it's evidently common knowledge in the Windows ecosystem that updating without doing a clean install approximately never works and it was therefore "my fault" that I tried it. Guess what? It approximately always works in the mac ecosystem, and that's a win for the mac ecosystem.
* Broken linker: Undetected virus? Registry overwrite? Clearly it didn't affect everyone, so it must have been something specific to my computer, and therefore probably wasn't MS's fault per-se. Still, during that install I had been extremely careful about being gentle by not installing non-default drivers, system tools, etc, so it must have been the fault of an app, possibly one I downloaded by fumbling while running the download-button gauntlet. Guess what? That kind of BS simply doesn't exist in the Mac ecosystem, at least not nearly at the levels that it exists in the Windows ecosystem. Microsoft's fault? Not really, but I'm not a charity or a judge, so I don't really care. If things are better on the mac side of the fence, that's reason enough for me to return.
> Comparing the cheap hardware put in the budget-line laptops to the (mostly) decent hardware put in the line of portable Macs is pretty foolish
Fair enough, but once you're buying the expensive PC hardware the Microsoft ecosystem no longer wins by default on price so it comes down to personal preference and things that are Microsoft's fault.
> 99% of the problems that people actually report as 'problems with Windows', aren't.
> So, this kind of list of complaints seems to be really common among folks who don't like Windows. Sometimes the list varies and wanders from one category of problems to another, but the core theme is still the same: The end user isn't capable of using the computer.
Inexperience with a computer doesn't cause ntdll to break. Inexperience with a computer doesn't cause Windows activation to break. Inexperience with a computer doesn't cause shutdowns to take hours instead of seconds. Inexperience with a computer doesn't cause suspend to spontaneously bork.
You're blaming inexperience for things that are fixed on pretty much every modern operating system. Yes, those problems are caused by the OS, because it's really the only OS that chronically has those kinds of issues in those quantities.
Let's go through and demonstrate exactly why blaming the user for broken software is inaccurate at best:
* According to jjoonathan, the ntdll segfaults started happening after an update - something which should be routine. Do you expect users not to update their software? Especially when Windows will happily do the update and reboot automatically (and annoyingly, I might add).
* Shutting down isn't that hard, yet somehow Windows pretends that it is. It's rather hilarious to watch Windows take twice as long as, say, openSUSE to do something as basic and fundamental as shutting down. And that's not even including Windows' hilariously-convoluted method of installing system updates; if my laptop's openSUSE installation can install updates without having to drop down into some sort of maintenance mode, why can't Windows? (For the record, though, OS X annoyingly has the same problem.) I shouldn't have to sit through 100 updates installing when all I want to do is shut off my laptop and continue with my life.
* Activation problems with Windows are very common. I've run into them repeatedly with clean installs and multiple Windows versions. I shouldn't have to open a command prompt and type in arcane commands in order to activate Windows so I can customize the homescreen. Hell, I shouldn't have to do something as silly as activating my rightfully-purchased copy of my operating system, period, but that's another story.
* Really? You're going to blame Windows' fragility when it comes to power management on the user?
These sorts of problems are common with Windows. They aren't common with modern operating systems like GNU/Linux and BSD (and even OS X, no matter how hard Apple tries to mangle it and force it to be as poor of a product as Windows).
Yeah, "unbelievably stable" and "Windows" don't belong in the same sentence (except for this one, obviously). You're delusional if you sincerely believe Microsoft's shoddy programming to be the fault of their customers* of all people.
You are the person I'm talking about when I say that people that blame Windows for problems are their own worst enemy, and it's way worse when Dunning-Kruger is in full effect as well...
> Inexperience with a computer doesn't cause ntdll to break. Inexperience with a computer doesn't cause Windows activation to break. Inexperience with a computer doesn't cause shutdowns to take hours instead of seconds. Inexperience with a computer doesn't cause suspend to spontaneously bork.
Inexperience causes all these things. Listen, my experience goes way, way beyond 'anecdotal' when it comes to managing end users running Windows. Never, ever, ever, with the exception of a small handful of updates, does Windows fundamentally break itself. When you see a bluescreen, 100% of the time it is your fault or it's your hardware's fault. 100% of the time. Sometimes it might look like it's not your fault. Sometimes it's McAfee removing important files that break the OS because it's being stupid, but that's not a Windows problem.
If shutdowns are taking a while, something is broken. Your inability to diagnose that isn't the fault of the OS. Ever.
If your computer 'suddenly borks', then your hardware failed, or you are bad at computer. 100% of the time.
> According to jjoonathan, the ntdll segfaults started happening after an update
It usually means that some third party software, usually legacy/dated drivers or your AV solution, has fucked up. The aforementioned McAfee bug is pretty famous for that, which you can read about here: http://www.theregister.co.uk/2010/04/21/mcafee_false_positiv...
Your inability to diagnose the problem isn't the fault of the OS.
The half dozen Windows computers I own and the close to 600 I manage never, ever have these problems that seem to plague the people who insist 'it's just Windows'. So, why is it that my customers never have these problems that you insist are ubiquitous to the platform? It'd be because I can manage them. I can leverage GPO to make sure my AV solutions at least report when they're doing something fishy. I monitor my computers, so when something bluescreens, I get a copy of the dmp file and I can peel through it immediately. I don't let my users have unrestricted access to absolutely everything. I can use and manage the OS and leverage the extremely potent tools Microsoft gives me to diagnose problems and use that knowledge to prevent problems in the future.
And here's the thing: I don't really enjoy white-knighting Windows, but I also really don't enjoy people in the field of IT acting like Zealots in regards to products and platforms they don't know anything about.
I'm not a huge fan of Linux. I'm not a big fan of how esoteric the OS is and how management is unintuitive and complex. I'm not very good at fixing the problems that show up on the platform because quite frankly, I'm just not experienced enough to take any problem and pave a path to a solution. I've sworn more at my SAN running Openfiler because of Openfiler than I care to admit and if I could do it all over again, I'd never have put Openfiler on that box, but here's the thing: I'll never say, 'Linux is bad' or 'Linux isn't stable' or 'Holy shit I'm so mad at OpenFiler because if you have an iSCSI implementation and your boot device fails, your arrays are fucked unless you can rebuild', because I'd be speaking from misunderstanding, inexperience, and an incomplete understanding of the tools I'm using and I might look like an idiot by saying as much.
My point is that there are plenty of operating systems out there that don't have these issues. That's a point you're missing (or perhaps deliberately ignoring). It's fine and dandy that you've found ways to work around Windows' awful design, but my point is that you shouldn't have to do so, seeing as there are plenty of operating systems which don't have these problems. My point is that there shouldn't be anything to cause shutdowns to slow down in the first place, because something as elementary and critical as halting execution shouldn't take long at all. My point is that upgrading one's operating system (and all the other software, for that matter) shouldn't be a convoluted ordeal with multiple reboots (and even more shutdown delays) and high risk of seemingly-minor updates breaking things irreparably. My point is that you shouldn't have to be an MCSE and manually prevent your system from imploding; my point is that your system shouldn't spontaneously implode in the first place.
These things are the fault of the operating system when other operating systems have already solved these problems. Blaming users for Microsoft's bastard child of DOS and VMS being poorly designed is, well, misguided, to say the least.
I don't particularly like white-knighting Unix, either, but after about a decade and a half of Windows support and administration - in environments ranging from ordinary households to healthcare facilities with hundreds of workstations and almost half as many virtualized servers - it eventually got to the point where I'd rather use something that doesn't require that level of babysitting - something like Unix, for example - and put my time and energy into better things than unclogging my Registry and sitting through 2-hour-long shutdowns due to Windows Update and such.
I don't even particularly like GNU/Linux, either, but it's certainly amusing to see someone like you compare it with Windows and call the former, of all things, "esoteric". Windows is the textbook definition of esoteric.
And nice ad hominem, by the way, assuming that the Dunning-Kruger effect is in play right now.
I've used linux since 2003, and windows before that, on PC hardware. In 2012, I was looking for a good laptop, and could not freaking find one that wasn't an Apple Macbook something. So I got a Macbook Air and installed linux on it.
Your case is extreme. But on every Windows install I've seen at least one of these kinds of inexplicable, un-fixable problems you just have to re-install to fix or live with.
I like linux because every file is owned by a package, or it's a plain text config file in /etc, or it's in my home folder and has absolutely no effect on another user. It's just generally more under control. Files are not just modified by patch installers etc. So I love OS X app-folders, but many things for OS X use pkg installers, including stuff from Apple, so it's kinda the windows situation again, just a bit more transparent because it's mostly unix.
My tips for Windows: don't do what your mother would do. First things first, go into Programs and Features in the control panel, and uninstall stuff, even stuff you don't recognize. Check the Device Manager to see if you've accidentally uninstalled a driver, and if so re-install it (get it from the laptop manufacturer's support website). Don't install or run Microsoft or Adobe (or Apple) software; stick to open-source Windows software where possible, like 7-zip, sumatra-pdf, LibreOffice, Firefox or Chrome, etc. Also go ahead and disable System Restore on all drives, to save disk space and performance, since if something goes wrong you'll have to re-install anyway.
"if something goes wrong you'll have to re-install anyway."
This is probably one of those things where experiences will vary, but for me, most of the time on my old Win 7 laptop, when a bad install broke stuff, System Restore worked fantastically.
System Restore's been a toss-up for me. Sometimes it's worked wonderfully. Other times, it makes the problem worse. Still other times, it just doesn't help, or something caused the restore points to be nuked (sometimes that something is the user him/herself).
I've been experimenting with NixOS lately, which has some very nice features along the lines of System Restore done right; being able to roll back any change to system configuration is pretty awesome.
Were the first two on the same hardware? Because the second one just screams "failing hardware" to me, and could easily be tied in with the first.
Startup "chkdsk" on a Windows machine that simply didn't finish shutting down should be very quick - basically the last thing Windows does is write a "clean shutdown" flag to the drive. On startup, if the disk doesn't have that "clean shutdown" flag set then Windows sees the drive as "dirty" and will prompt to check the disk. Chkdsk for this should take less than 5 minutes and will generally not find anything (since almost all processes had ended & almost all files should have been closed). If chkdsk in this scenario is taking a very long time or is making a lot of corrections, that's a very strong indicator of disk problems.
I have no particular guidance on sound issues.
There are a bunch of things that could impact battery life, but if you have hard drive issues and it's constantly remapping sectors, etc. then that would definitely have an impact.
> I'm sure that there exists hardware on which Windows is as stable or more stable than Mac OS is on my mac, but I don't know how to reliably find it.
IMO? You already own it. Boot Camp is the best way I know of to run Windows on a portable machine. (I still prefer to do Visual Studio work through Fusion on my 2012 15" rMBP, but I'll reboot to play games when away from home.)
I don't know if I'd call Windows "worse", I think you happen to be a serious edge case, but I like having both at hand for different things. There's no better environment for what I want to do with regards to game development than VS2015 on Windows, there's no better general-purpose dev environment for me than OS X.
Most of these anecdotes scream RAM issues. Broken memory creates weird issues in pretty much all operating systems; that's why you want ECC RAM in places where you don't want weird stuff.
Are there fewer support calls per capita for Mac & Linux users because the *nix users are more tech savvy? For example, I'd expect developers who manage their own machines to ask for help less often.
I switched from PC to Mac in 2014 and it has been shocking. In 20 years with Windows I've rarely had issues, but I've got a long list of problems with my new Mac. The MacPro hardware feels great, but the OS crashes every few days. I've already had the logic board replaced once. I'm seriously considering selling my mac and getting a new broadwell pc this year.
95/98 crashed all the time, xp crashed if you really tried.
Windows 7 hasn't crashed once after two years of daily usage, and/or keeping it running without a restart for months. Even when the system locks up due to user abuse (me), and would require a reboot on anything older, you just wait a minute and it recovers.
I'd like to throw in my experience with Linux here. My background is that I used Windows 2000 and XP from 2001 to 2003, then Linux for three years, and then OS X until 2013; now I'm back on Linux.
I bought a used ThinkPad because I heard that's the hardware that works best with Linux now, and I installed Ubuntu on it. Everything worked OK, but the OS alone filled up half of my small 128 GB SSD, and running just Firefox for browsing pushed the load up over 0.5, which made everything not so snappy.
A couple of months ago I moved to archlinux with Gnome 3.x. It was a bit more complicated to install, that I admit, but it was so worth it! After installation the system used less than 3 GB of my SSD, and now running just Firefox the load sits at about 0.01, which makes this ThinkPad from 2010 one of the snappiest computers I've ever used. All animations are smooth, not a single application has ever crashed or hung, and so on. But yeah, admittedly I mostly only do development, email and websurfing on it.
As an experiment, all of my daily work is done in a Virtualbox VM guest (Win 7) running on a Win 7 host. I'll spin up some CentOS VMs when I need something Linux-like and usually putty into them. I have a couple other machines on my network I RDP into (the lack of an RDP equivalent on OS X is one annoyance; VNC is just not as good).
It works pretty well. There are some occasional performance issues, but that's mostly due to the amount of RAM and cores I have configured for the VM. But snapshots and full-system backups mean that the next time I move machines, all I'll really have to do is move my image over and install Virtualbox again to get up and running.
I've had the opposite experience, typically. At an old job, I was trying to use a Windows 7 workstation for the same tasks as I'd use my GNU/Linux-based home workstations (and the Macbook I'm using at my current job): running VMs with VirtualBox, running local installations of PostgreSQL, etc. Windows was the more painful one, for me at least.
Granted, I'm not a fan of OS X by any means; far from it. However, the one redeeming quality is the fact that it's Unix underneath all the buggy Aqua and Finder and Cocoa and such, so at least I'm able to set it up with MacPorts (and/or Homebrew) and get some proper work done with it just like I would on any other Unix-like OS. You can't do that with Windows, at least not easily (and certainly not as painlessly).
That said, the presence of virtualization extensions in one's CPU would make a significant positive difference; if the Mac you used lacked VT-x (which could be checked by running "sysctl machdep.cpu.features" and seeing if there's a "VMX" entry somewhere in the output of that), it would certainly explain your difficulties with VMWare and Parallels.
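For the lazy, a grep makes that check binary (this just filters the same sysctl output; it prints VMX if VT-x is exposed, nothing otherwise):

    # empty output here means no VT-x
    sysctl machdep.cpu.features | grep -o VMX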
It's also worth mentioning that Hyper-V isn't exactly equivalent to VMWare Fusion or Parallels; it's more equivalent to VMWare ESXi, Xen, or KVM.
Agreed, Windows 8.1 isn't that bad. I hated it to death when I migrated to it due to the dreadful Metro shit, but thankfully you don't have to deal with that if you spend some time making sure it's out of your way, so that you boot to the desktop and have a quick-launch bar in place. The rest you get used to. I moved from XP to Windows 8 and I missed the start menu for a week or so, since I had already used a quick-launch bar for ages in XP.
Now, windows server 2012 R2 on the other hand... aargg...
Yes, the version that ships with Win 8 Pro. It is hands down better in most respects than VMWare Workstation. The integration is better, the footprint is smaller, performance and reliability are great, and you get it for free. VMWare might have an edge in the OS support department - it supports more host and guest OSes than Hyper-V. But I have had very good experience running DragonFly BSD, various Linux distros (RHEL7, Centos 6.x, Ubuntu LTS etc.) and Windows 7 of course. Sure, you don't get as many bells and whistles compared to VMWare, but if you don't need them then Hyper-V is great.
Really? Last I checked, Hyper-V couldn't do video acceleration, nor did it have UI integration (drag from one VM to another, or Unity). It didn't seem to have a way to flip from one full-screen VM to another. I'm not sure if it can completely redirect USB, either. Nor shared folders, without exposing a network share.
The video thing is probably the killer part though. If I could get essentially penalty-free GPU inside multiple VMs at a time and easily flip between them, then I'd have reason to dump VMware Workstation, which is essentially in basic maintenance mode - v11 literally has no improvements, just a few bug fixes.
The GPU acceleration ends up not mattering for Windows guests, which you can RDP to. RDP to localhost with all effects turned on is more than good enough for non-gaming, developer type of work. For Linux too it isn't a problem, since most work happens in ssh - for the occasional X11 program you can either use the local console (which sucks, agreed) or install Xming and run via SSH X forwarding.
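To make the localhost-RDP trick concrete, this is roughly it (the 13389 port is a made-up example; use whatever port your VM's RDP endpoint is NAT'd/forwarded to):

    REM connect to a guest whose RDP port is forwarded to the host
    mstsc /v:localhost:13389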
Hyper-V on Windows Server 2012 R2 supports GPU acceleration (need certified GPU), Full RDP-Over-VMBus, USB-Passthrough etc. though.
By better integration I meant my VMs are saved and auto-restored between shutdown/reboot cycles without me even noticing, dynamic memory assignment works great and keeps the total memory footprint down, I can mount VHDx files outside of the VMs, etc. They may be small things, but for my workload it feels pleasantly seamless. Workstation always felt heavy and in-your-way, and besides, as you noted, the last few updates have essentially been paying money for bug fixes.
Thanks for the response. I basically want to run a Linux desktop (XMonad... How'd I live before it?) but also have Windows available for VS/Office, an occasional game, and some hobbyist pro audio stuff. With the security of everything as a guest OS. If Windows had any sort of real containerization it'd be a lot less of an issue.
Unlikely, but if MS ever got serious about making the client great, I'm sure they'd leave Workstation behind right quick. But it's still pretty much a server-oriented product with a few bones tossed to client use.
Glad to hear it's working well. Have you tried storage features like reverting to snapshots/checkpoints (e.g. reset a disk to known state on each reboot), or shrinking of unused space in thin-provisioned disks?
Snapshots/Checkpoints yes. I have even tried the differencing disks feature and liked it quite a bit. Haven't tried the shrinking unused space part as I never needed it up until now.
The only issue I had relating to saved VM state was that after a BIOS upgrade, restoring VM state failed, I think maybe due to the microcode update. But other than that, all good.
Straight from HP. Many people don't know it, but the HP Business Outlet has really good quality hardware (Laptops, Desktops, Workstations and Options) for a really good price. You can browse the list of available hardware and prices here - https://h41183.www4.hp.com/pps-offers.php - it's updated daily, and if you keep an eye on it you will find what you need one day or another.
I second this. I'm still running an HP Workstation that I got as a refurb in 2008. Yes, it's really old, but it has been solid as a rock this whole time. I got that computer because I had really good experiences using HP workstations at work before that. The cases are (or at least were) really well engineered and very quiet. I haven't kept up with their latest models because this one has been so solid, but I would expect that the new ones would be similar quality.
Yep we have about 10 xw8600/8400 with HP P400 SAS controllers. Disks come in and out as they fail but the machines just never seem to die. I have never seen one crash either.
As someone who used to get paid to find bugs in low-level windows software, I must emphatically disagree.
At least on the hardware side of things (USB stack, included hardware drivers, etc.), the quality is really quite poor compared to the Linux equivalents. I can't speak to the quality of the OS X equivalents.
>Did you just espouse Linux hardware support as a paragon of quality?
Please re-read my comment. I simply said that Linux drivers tend to be better constructed than Windows drivers. That doesn't imply that Linux drivers are particularly good.
I'll say without too much conviction that the best drivers I've seen have come from the BSD family, but I don't have a lot of experience there.
Sorry, my snarky comment was more aimed at the linux problem of driver installation and configuration complexity, and lack of drivers entirely. I'm way unqualified to comment on the quality of individual drivers.
Perhaps you haven't used Linux in a while? It runs on almost any hardware I can throw at it. Just upgraded to a brand new laptop, and just swapped in my old SSD, and everything just worked.
To add something, I actually like Windows 8 and their Metro stuff. I have a 55" LED screen hooked to a PC that I use as my 'media center'. Metro looks gorgeous on it; most of the time I use it for playing music, but it has proven useful for other things like opening the allrecipes app and having the recipe displayed while I'm cooking [1].
Aside from that, that PC is always on and being used for days at a time, and it never gets sluggish, never drops off the internet, nothing. And it does it all while using ~2GB ram. I know that's not little, but come on, OS X barely makes it with 4GB.
That sounds cool, and I can really see that working well. But I'd argue that you're not really using Windows 8 strenuously in the way that people who complain about it do. It's like claiming Windows XP is stable because Solitaire never crashes.
I completely agree with your final OSX comment, it takes up way too much memory.
> I completely agree with your final OSX comment, it takes up way too much memory
To be fair it got a bit better with 10.10 - the compressed memory stuff seems to do well under memory load. But still not close to Win 8, which by the way also has same-page merging.
This discussion is just going around in circles with nobody adding real useful commentary to the discussion other than "I perceive Apple's software quality to be worse based on my own anecdotal experience." This opinion is being perpetuated by a few people and it's just going everywhere.
I don't think the software quality dropped, it's all about perception. Just a few years ago, everyone was moaning about software quality with Lion, but nobody remembers that now, because bad headlines are easier to create than good ones. Yosemite has some bugs, yes, but so does almost every other major release of an operating system.
Apple does have some bugs to iron out, but in six months' time when they're fixed, everyone will forget and start complaining about something else. Perhaps a few happened around the same time, but that's no indication that things are getting worse. People just like to complain.
Those who want to experience a lower "functional high ground" should switch to Ubuntu and discover how much further ahead OS X is.
To the everyday user, there is no drop in software quality. They wouldn't have even noticed unless articles like this continued to circulate. People are just noisier these days.
> Apple does have some bugs to iron out, but in six months' time when they're fixed, everyone will forget and start complaining about something else.
That's the whole point: we'll start complaining about something else because in 6 months, when the bugs are theoretically fixed, we'll have another OS X release waiting for us right around the corner (approx 4 months away) which will put us right back at square one!
Tiger's lifespan ran from April 2005 through October 2007. So, if we accept the premise that all the OSes were equally buggy and it also took Tiger 6 months to get the bugs worked out, then you still had another two years of a stable OS.
This reasonably allows customers to decide whether to be early adopters or waiters: it's perfectly fine to hold off those first 6 months, since you then still get 3/4 of the product's lifespan with some assurance that it's stable. That's just not the case anymore; if you wait those 6 months, now you've got another release right around the corner, so you're perpetually in upgrade mode. Not to mention that most of the bugs will probably ship in the major release anyway, since the (non-security) bug-fix releases have more or less merged with the new-feature releases that introduce new bugs.
My MacBook Air is running 10.8 Mountain Lion. I regularly receive security updates (including the pushed NTP fix), as well as application updates like Safari and iTunes, and 3rd party updates for apps like Chrome, Firefox, MS Office for Mac, Evernote, DropBox, Coda, etc.
I don't know what the "official" lifespan of 10.8 is, but it has every appearance of being fully supported by Apple today. And yes, it is very stable.
I loved Mountain Lion and also found it to be very stable. I stayed on it as long as I could, but the new versions of several of the apps I use required Mavericks, and after holding out for a while I upgraded. Mavericks was when I started noticing issues... Graphics driver issues, trackpad gestures randomly not working for periods of time, WindowServer crashing when opening the notification center (wat?), so on and so forth. Mac OS is still my favorite, but I've gotten a nagging feeling that in the interest of capturing the mainstream market and making things look pretty and full of features, Apple has skimped on stability and quality. Just my two cents.
One big question which I have not seen addressed is major release support. Has that changed with the quick release cycle? I think it used to be current plus two previous major versions -- which on a 2yr release cycle means six years of patches/bug fixes. But the current release cycle could mean shorter support, unless they support current plus three or four previous releases.
The problem I've seen is that it's not just geeks this time. Sitting around a New Year's party with mostly non-geeks, everyone was discussing how they were hesitant to install new versions of iOS now thanks to bugs et al (both existent ones and media trumped-up stuff). That's both a direct and a perception problem that Apple has got to fix, stat.
The only good thing for Apple is, the same discussions included frustrations over both Android and Windows 8. Maybe the thing is that now, everyone's an early-adopter geek.
Thing is, nobody complains about Android updates breaking everything (though there are a lot of complaints about updates not happening), because the upgrades are usually well tested (despite having many more hardware platforms). With Win8 there was Metro, but I really didn't hear much about Windows being broken.
iOS and OS X have stability issues, ones that other systems seem to deal with more gracefully now.
Good point, one that's solved in part by the general lack of Android updates on so many devices. Android devices (and sorry, I'm generalizing again based on what I see people around me say/do) are seen as a widget; buy this for exactly what it offers today and nothing more. iOS devices are, on the other hand, seen as future-proofed; buy this today and get everything new that's coming out over the next few years.
No complaints about Android? Lollipop is the worst Android update ever; there is a bug where the system process takes up RAM until the phone can barely use one app without killing the last one, and it slows everything down.
https://code.google.com/p/android/issues/detail?id=79729
> This discussion is just going around in circles with nobody adding real useful commentary to the discussion other than "I perceive Apple's software quality to be worse based on my own anecdotal experience." This opinion is being perpetuated by a few people and it's just going everywhere.
You quoted the unreliability of anecdotal experience and then went ahead and added your own anecdote immediately in the next paragraph:
I don't think the software quality dropped
> Those who want to experience a lower "functional high ground" should switch to Ubuntu and discover how much further ahead OS X is.
I am sure most would agree with you that OS X at its worst is way ahead of Ubuntu, except that's not really the benchmark the users who are going around in circles are using; it's quite evident in almost every one of these anecdotal commentaries that an older version of OS X is the benchmark.
I think the real complaint people have is that a lot of these people switched to Apple for superior, well-tested hardware and software, precisely to avoid problems they would then have to wait six months to see fixed.
The key takeaway I think is NOT that OS X recently has some bugs, but rather that the seemingly increasing occurrence of bugs that more advanced users are experiencing is perceived as a sign that Apple is discarding the Test Everything Well Before Shipping culture that is important to these people.
> To the everyday user, there is no drop in software quality.
The core market for Apple until quite recently was not the everyday user but developers, designers and the more tech savvy.
-----
I am unsure if your comment was designed to substantiate your thought that "This discussion is just going around in circles with nobody adding real useful commentary to the discussion"; if it was, you succeeded at that.
>The key takeaway I think is NOT that OS X recently has some bugs, but rather that the seemingly increasing occurrence of bugs that more advanced users are experiencing is perceived as a sign that Apple is discarding the Test Everything Well Before Shipping culture that is important to these people.
Apple products are clearly becoming less stable and usable, in ways that are very basic and obvious. Extreme computing skills are not required. I can think of the following bugs without trying too hard:
iCloud Notes fail to sync between my Mac, iPad and iPhone. (I've tried everything I can to fix this. Nothing worked.)
iPhoto makes two copies of all my photos when I download them from my iPhone/iPad.
Windows/OS X networked file sharing seems deliberately broken, possibly at the Apple end. (You have to install the previous version of Samba to get it to kind-of work some of the time.)
I can't use FaceTime on my Mac because some weird sample rate issue makes everyone sound like a chipmunk.
Incoming FaceTime calls don't always ring on all devices, and some are simply ignored on all devices. (This is particularly unhelpful, especially for business calls.)
iTunes is an outstanding example of terrible software design. (Why is it still impossible to access app content directories from iTunes in any useful way? Why is the app icon layout editor so crufty and clumsy? And so on...)
All these features Just Don't Work[tm]. And it's not as a result of god-mode tweaking. They've simply never worked.
tl;dr: Apple really needs to improve its software game.
I think the actual problem is complacency and a culture that favours style-over-substance marketing over solid UX engineering.
There's a lot of interest in trivia like flat icons, but clearly no one in or near the C-Suite cares about more basic usability issues.
I think one issue is that, as a general rule, most people in the C-Suite don't spend their time in the muck of their OS X desktop environment. They have people for doing whatever it is they would do there, and for everything else there's iOS.
FWIW I think writing server components is far, far better on Ubuntu Desktop than OS X. Package management is amazing and it matches my Ubuntu server deployments for production.
I agree with you on many points and it seems obvious that apple is spread too thin these days and tries to do too much.
I don't think this statement is true however:
> The core market for Apple until quite recently was not the everyday user but developers, designers and the more tech savvy.
Historically the core market of Apple Computer has been graphic-designers, journalists and the education market. Developers have only been on the platform post OS X because of unix.
If you're willing to live with a 512 GB SSD or swap out a 1 TB HDD for your own brand of SSD, you can consider last year's XPS 15 with a 3200x1800 display. http://www.itpro.co.uk/laptops/21797/macbook-pro-15in-v-dell... There are quite a few others; they don't require Quadro workstation graphics (and their associated high prices).
That said, I've a retina too. :D I'm just impartial. Microsoft Store, for Canadians (or students or both), at least, has some very nice deals. :)
I'm looking for something competitive with a 15" MacBook Pro. In the past, I have struggled to find screens and trackpads that are comparable to Apple's. Any recommendations?
Screens you can find, but trackpads as good as Mac are rare because of how much of a difference the software/drivers have made for things like palm rejection. It's not impossible, but it's not as easy as just picking the Retina model that fits your budget. Apparently, there's a half dozen ways to tweak palm rejection depending on which trackpad driver ships with your laptop, e.g. graphics in answers for http://superuser.com/questions/504571/use-touchpad-while-typ...
Until there's an answer to the question of finding comparable hardware that doesn't include forum posts with tons of competing advice, I think I'll be sticking with Apple's kit.
You don't think Apple's software quality has dropped? Guess you haven't had a hanging Safari or Chrome tab take down the entire OS.
Yosemite is riddled with blatantly broken issues like this. I'd love to enumerate them for you, but frankly, Apple should be doing that with a flood of automatic updates. Of which I've seen 1 since installing Yosemite.
As an everyday user, there has been a massive drop in quality. Far, far worse than previous OS X upgrades. This is just not a case of people wanting to complain for the sake of complaining. I've been saying it for weeks now: Yosemite is Apple's Windows Vista.
> Guess you haven't had a hanging Safari or Chrome tab take down the entire OS.
According to the contents of my ~/Library/Application Support/CrashReporter logs my last Safari crash was August 3rd 2014 which IIRC would have been a 10.10 DP release. I don't doubt you are having this problem but it's hard for me to relate considering my freshest CrashReporter log for any app is VLC from November 3rd 2014. In my experience using 10.10 on 3 different Macs (Mac Pro 4,1, MacMini late 2012, rMBP early 2013) the worst issue I've encountered is somehow when using multiple displays and setting SilverLight video to fullscreen my Dock occasionally gets set to auto-hide. Pretty sure that's some ugly hack being done by the SilverLight plugin since it has never happened with Flash or HTML5 full screen video.
My gut feeling is these different experiences might be due to migrating data between OSX releases. I had migrated/upgraded from 10.4 to 10.8 and finally did a fresh install / setup for Mavericks which was then upgraded/migrated to 10.10 DPs. Looking at an older backup from November I see some files ~/Library/Application Support/ renamed as .incompatible. so I wonder if perhaps the 10.10 DPs were doing something unique to dump old app data? It just seems odd to me that some of us are having little to no issues while others are reporting these major issues. It could of course also be machine specific (driver) causes but at least for the last 2-3 generations of machines there isn't really enough variation in hardware to justify that theory.
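For anyone who wants to run the same comparison on their own machine, listing the reports by date is enough; on my systems the per-user logs also land under ~/Library/Logs/DiagnosticReports, so it's worth checking both locations:

    # newest crash/diagnostic reports first
    ls -lt ~/Library/Logs/DiagnosticReports | head
    ls -lt ~/Library/Application\ Support/CrashReporter | head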
> Guess you haven't had a hanging Safari or Chrome tab take down the entire OS.
I haven't seen that happen on either of my Macs.
I've experienced exactly one major new issue in Yosemite, and it's only affecting one of my Macs: when my iMac Retina sleeps, it often doesn't wake correctly and ends up doing a system reset so all my terminal state is lost. It's really fucking annoying. I managed to work around it by changing something in the sleep settings.
I've experienced a similar issue with my 2012 iMac: after waking from sleep, I find that all of my terminal tabs are in a "Restored" state, so the history is correct, but anything running ends up detached. It's frustrating, and only started happening with Yosemite.
Well, good for you. But remember - there were plenty of people who had absolutely no problems with Vista. And plenty who did. Were those unhappy Vista users all imagining it?
No. But that's beside the point, unless there can be a quantifiable, specific, report on the number of those events, especially compared to other releases.
Apart from that it's just BS isolated impressions.
For Apple those are even magnified, because whereas for Vista people were running it in 10,000 different configurations (different PC vendors, cards, logic boards, etc.), each with a minuscule user base, OS X 10.9 runs in about 20 different configurations (the previous 5 years of MBP, Air, iMac etc. models), each of them selling in the multiple millions.
If 20% of the users with an MB Air 2013 and a particular router have a Wi-Fi issue, that doesn't mean OS X is not perfectly fine as far as an iMac user is concerned. Etc.
>You don't think Apple's software quality has dropped? Guess you haven't had a hanging Safari or Chrome tab take down the entire OS.
No, I haven't. I really don't know what all the fuss is about. I love my MBA: no problems, no bugs that I'm aware of. I develop on it, run Parallels. Battery life is fantastic.
I have not seen any of the issues with Safari or Chrome that you are reporting. The opposite, in fact: Safari is so much faster than Chrome under Yosemite that it has become my primary browser.
I'm a developer, by the way, and have both browsers open most of every working day.
If anything, I have seen an increase in quality with Yosemite.
I'll give you a more specific example: going to theverge.com and playing one of their videos in Chrome will cause OS X Yosemite to switch from integrated to discrete graphics. That is typically enough to crash the whole system, leaving you able only to move your mouse around.
By virtue of it happening, that's how I diagnosed it. Each time, OS came to a complete, absolute freeze, unrecoverable. Could not get the Force Quit menu to come up. Had to hard reset my MB Pro. Has happened several times.
Flash was not running on any of those hanging tabs, but honestly, does it matter? In what world is it acceptable for a process to freeze the entire OS?
>This opinion is being perpetuated by a few people and it's just going everywhere.
Well, you're attacking others' opinions for being opinions, and not facts, but your claims are nothing more than opinions too. Sorry, I don't know what your point is?
Just to add my own anecdote, my dad recently asked that his macbook air be reformatted to make it 'fast' again. Meanwhile, the '08 PC that I'm using now is running Win8.1 like a champ.
> Yosemite has some bugs, yes, but so does almost every other major release of an operating system.
Heh, so Apple is just .. like everyone else? No need to hold Apple to a higher standard if that is the case. The entire point of people complaining is that Apple shouldn't be shipping buggy products like the competition.
>Apple does have some bugs to iron out, but in six months' time when they're fixed, everyone will forget and start complaining about something else.
Well, why would Apple fix the bugs if nobody complained? They wouldn't even know about them. Of course that is presuming they fix the bugs they know about.
>To the everyday user, there is no drop in software quality.
So why are the everyday users complaining?
Some of the bugs that I've personally come across.
>Heh, so Apple is just .. like everyone else? No need to hold Apple to a higher standard if that is the case. The entire point of people complaining is that Apple shouldn't be shipping buggy products like the competition.
So, they would use magical unicorn dust to remove all bugs?
Of course they're gonna have bugs on OS X; every OS has bugs.
If Apple was held to a higher standard, it wasn't because OS X didn't have bugs, but because it had better usability, nicer design (where it matters), and a UNIX core to boot.
>So why are the everyday users complaining?
Because they were always complaining. I remember similar discussions after every OS version.
Wi-fi and graphics card issues in particular (like the ones you linked to) have been with us since 10.0.
For graphics card issues it's usually the driver (so Nvidia/ATI, not Apple), and models get updated, so it's not like some core OS X kernel component that can be fixed and stay fixed.
And for Wi-fi there are also tons of Wi-fi routers, repeaters, setups etc, there will always be some incompatibilities. I had my fair share of such in Windows and Linux too.
>So, they would use magical unicorn dust to remove all bugs?
Did the users use magical unicorn dust to find the bugs? Apparently Apple, the "richest tech company", is unable to afford a test team with wide enough coverage to find such simple bugs.
>Of course they're gonna have bugs on OS X; every OS has bugs.
Minor - maybe; major - no. I don't recall ever even hearing of iOS 4 rebooting after someone changed the wallpaper. Heck, the phone's OS crashing was a rare, one-in-a-million event. It isn't any more. And on top of that, Apple - after putting out OS updates that degrade the phone's performance - blocks their own customers from rolling the phone's OS back to what it was originally.
>For graphics card issues it's usually the driver (so Nvidia/ATI, not Apple),
Apple has like 3 models, with almost identical hardware in their laptop lineup, and they control the entire software and hardware stack. If they STILL can't make a higher quality product than their competition, it's because they're either incompetent or they don't care to.
Maybe. It's just odd to me that each new version of OSX seems to revisit basic problems like stable wifi. I would have assumed the way things like that were implemented wouldn't have changed much from version to version.
And "Back to my Mac" seems to get screwed up with each major OSX update. Still no solution for that one.
I may just not use whatever it is that has bugs in OSX, but I've got a 2009 Mac Book Pro that I've been using since it was new, and I always upgrade the OS after a month or so from release.
I've only ever noticed a bug in Lion or Mountain Lion (don't recall which) where time machine would get borked if I put the computer to sleep when it was in the middle of a backup. That was super annoying, but it was fixed eventually.
I've never had any issues with wifi or graphics or whatever. I suspect these issues come about from hardware changes and the related drivers in the various models. Windows has the exact same problem, exacerbated by the freedom users have to mix and match hardware. My work computer running windows 7 would crash once or twice a week, always due to video driver (or the underlying hardware).
While Apple's ecosystem is limited, they actually have a rather large diversity of hardware to support now, and I'm sure that is the source of many issues. Makes me afraid to upgrade actually since my computer works so well.
I don't doubt that there is some bias toward remembering more recent complaints more. However, I don't know if you can pull the "it's anecdotal" card on something like this, because unlike most scientific studies (like on the effectiveness of a medical treatment), personal experience is a crucial part of the definition of "software quality." Also, I think it's reasonable to assume that most software, especially in unjailbroken iOS devices, is deterministic, and thus conclude that the experience of a bug is the result of a software problem. I'm a big iOS user, and I definitely share the experience of a steady decrease in software quality.
I should add that while I use OS X and generally install the upgrades promptly, I spend the vast majority of my time in Emacs and Chrome, and thus probably don't really "use OS X" enough to notice any software quality issues. It seems fine to me.
> I think it's reasonable to assume that most software, especially in unjailbroken iOS devices, is deterministic, and thus conclude that the experience of a bug is the result of a software problem
One major factor people tend to forget is that software can end up executing different code paths on different hardware. In other words—software can be both bug-free on a recent Mac, and yet horribly bug-riddled on an older Mac. This produces a very divergent set of personal experiences where people tend to talk over one-another because everyone is seeing a different part of the elephant.
Absolutely so. That's why I was careful to limit my criticism to unjailbroken iOS devices, both because I have much more personal experience with them, and because the hardware and software combinations are much smaller than with OS X.
But even in the example you provide, the experiences of the person with the old bug-riddled Mac are completely valid, and point to real software quality issues (assuming the new OS X version is officially supported on the hardware). The bug-free experience of another person with a recent Mac does not cancel out or in any way diminish the experience of the person encountering bugs.
Isn't the converse statement just as valid? The buggy experience of a person with an older Mac doesn't provide any useful information about the code quality of OSX for someone considering buying a new Mac. People using a version of OSX on a Mac it could have shipped with, and people using that version of OSX on a Mac that shipped with something completely different in many ways, are separate groups that really only need to communicate their issues within themselves. Crosstalk between them is mostly mudslinging, rather than useful evaluation.
> The buggy experience of a person with an older Mac doesn't provide any useful information about the code quality of OSX for someone considering buying a new Mac.
It absolutely does, because it shows how quickly you will run into trouble if you don't upgrade your hardware. Macs used to be something the average user could buy every 4-5 years and it would just keep on running well. Those users back then would probably never upgrade, but nowadays, with free new OSX versions in your face every time you launch the Mac App Store, most non-technical users will at some point click the button and run into the trap. Meaning that, unlike with earlier software, users have to watch what they do, beyond just a few simple rules like don't delete anything from the trash if you aren't absolutely sure. There are gotchas here and there, so it's not as safe anymore and thus not as empowering (because before, non-technical users could be much more bold and try things out; it was harder to get things into a non-working state).
> because before, non-technical users could be much more bold and try things out; it was harder to get things into a non-working state
I'm still wishing any major OS shipped with effective partitioning between storage of OS/application data and user data by default, such that you could hit the "restore to factory settings" button (Cmd+R on OSX boot) and have a guarantee that the only thing that will be blown away is the OS.
It's kind of getting there via an orthogonal path—defaulting users to saving data to the cloud instead of the disk—but it's still not there all-the-way. If it was, I'd just teach my grandparents the "go back to the way it was" button and have much more calming holidays.
Well, at least OSX so far has had a consistent method of resetting applications by deleting the ~/Library/ files, but I'm sure someone at Apple will manage to screw this up via some new iCloud feature sooner or later.
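For what it's worth, the safe version of that reset is to move the files aside rather than delete them, so you can roll back (the bundle id below is hypothetical; substitute the app's real one):

    # park the app's settings on the Desktop instead of deleting them
    mv ~/Library/Preferences/com.example.SomeApp.plist ~/Desktop/
    mv ~/Library/Application\ Support/SomeApp ~/Desktop/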
Yes, bugs affecting only older hardware don't give direct useful information about the current code quality on new Macs. However, it does give useful information about the state of the software engineering department at Apple. It's also still relevant to potential buyers of new Macs, because it gives some useful information about the code quality on that hardware years down the road, after updates.
I agree. People said Snow Leopard was the last good OS X update but I couldn't tell a difference between that and any of the subsequent updates. The 'Apple software quality is declining' meme only seems to be coming from the Apple developer bubble.
The latter point releases of Snow Leopard represented a point in OS X history when there was great stability in user expectations of the Mac operating system. It wasn't perfect, but most things worked reliably. Performance was excellent. System stability was high, too.
The problem with Lion is that it started dicking around with previously stable user expectations -- a perfect example being the skeu Address Book that had inferior usability.
Mountain Lion improved things a bit.
Mavericks improved things further, and represents another high point in OS X history.
Yosemite once again dicks around with previously stable user expectations -- a perfect example being the new aesthetic which looks crisp in retina but looks fussy and unfinished in low resolution. Spotlight is no longer a drop down menu for no reason. Time Machine no longer spins. The green buttons now do different things. It's even more difficult to enable TRIM on third party SSDs.
I disagree; you did not see the same number of complaints with snow lion. Observational and anecdotal experiences are still experiences, and they are evidence that something is going on. If you owned a restaurant and your customers suddenly started complaining after you changed your recipes, would you still think it was nothing to worry about? And just because Ubuntu sucks worse than OSX does not mean that OSX is not starting to suck.
OT but... I had an argument with wait staff, then the manager, at a local restaurant I used to go to.
"Everything OK?"
"Well, no, not really. This chicken sandwich is... different than it used to be. It tastes different from 2 weeks ago, but the menu hasn't changed. Did you change the recipe?"
"No, nothing's changed. This is the same"
"No, really, it's changed."
Back and forth, slightly escalating - manager comes over.
Same dialog.
He comes back about 10 minutes later.
"The recipe hasn't changed at all. We just use different ingredients".
Well... it's... a bun, chicken patty, tomato, lettuce and mayo. The bun and chicken had changed. But not the recipe. But they didn't apologize for any misunderstanding or anything - they were quite irate that I was making a "big deal" out of it.
Back story was I'd been going there for about 2 years, 1-3x per month, getting the same thing. It's what I liked best there. Then it was changed. Except not. And it was bad. But there was nothing on the menu to indicate "new" on it. And I was 'wrong' for registering dissatisfaction with the new change that wasn't a change.
The problem with your response is that I don't know that anyone has ever actually provided reliable evidence that the number of complaints has gone up. GP's view, which I share, is that people always complain about new releases, forgetting that the previous release had its problems as well. As GP said, bad headlines are easier to make than good ones. Simply reiterating that you think the previous release had fewer problems, and that there are anecdotes out there about problems in the current release, is hardly a refutation of this. This is entirely consistent with GP's view.
I for one have experienced no significant change in reliability on the road from Snow Leopard to Yosemite (for what that's worth).
> people always complain about new releases, forgetting that the previous release had its problems as well
It might be true, but this attitude makes it very hard for developers not to discount all complaints, including ones grounded in reality. Just because people are paranoid about new releases being crap, it doesn't mean that they are not crap.
Personally, I've experienced a number of really bad bugs in Yosemite. One of the most annoying is four-finger swipes to bring up Launchpad: the "blur" animation starts and then just hangs midway, leaving a half-transparent Launchpad that works with keyboard controls but not mouse-clicks. Or the infamous "backbreak", where you do two-finger swipes to go back in a browser window and somehow it fails mid-way, leaving an unusable browser page and breaking all two-finger gestures until you close that window (not tab, the whole window). And of course the awful dark corners on the volume icon once you disable transparencies, something you have to do to maintain a decent framerate on a two-year-old retina MBP that was perfectly capable of handling all this up to and including Mavericks. And let's not even talk about the wifi DNS bug, which I've lost any hope of seeing permanently fixed in my lifetime.
These are bad because they affect very visible UI elements used hundreds or thousands of times per day; the sort of thing that used to be rock-solid and did not significantly change between releases. I would understand if a newish feature like Finder tabs had a few bugs, but not trackpad gestures that have been there, working fine, for years.
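For anyone else fighting the wifi DNS thing, the stopgap that has worked for me, assuming you're on one of the discoveryd-based 10.10 builds, is flushing the caches by hand (on pre-Yosemite releases the equivalent went through mDNSResponder instead):

    # flush discoveryd's multicast and unicast DNS caches
    sudo discoveryutil mdnsflushcache
    sudo discoveryutil udnsflushcaches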
> It might be true, but this attitude makes it very hard for developers not to discount all complaints, including ones grounded in reality.
I think this is a problem for outside observers, not competent developers. Unlike us, Apple developers have ready access to statistics about bug reports and other forms of support requests that we do not. This data, presumably, is not subject to the same biases that infect an anecdote-driven discussion of software quality by outsiders.
In my experience, you get a bug report for every hundred or thousand affected users, if that; so excuse me for not having much faith in such statistics.
> This data, presumably, is not subject to the same biases
... but it's likely subject to many other biases, e.g.
* "hey, this bug was reported by iLife devs, better prioritise ahead of that bug that has affected millions of users for almost two years -- them people are not going to shout at me in the canteen."
* "hey, this bug was very well-reported by very technical server people, let's prioritise it ahead of that bug affecting millions of semi-literate consumers"
* "hey, this bug blocks the release of $shiny-new-iPhone-feature, let's prioritise it ahead of that bug affecting the trackpad of penny-pinching laptop users"
I wasn't talking about the biases that affect a team's response to a given bug but, rather, the ability merely to know whether a given release has more bugs or fewer bugs than others, and the biases that can affect this count.
> In my experience, you get a bug report for every hundred or thousand affected users, if that; so excuse me for not having much faith in such statistics.
But the number of reports should still be roughly proportional to the number of bugs in the wild, shouldn't they? That's all that is necessary to compare one release to another, particularly if all you're looking for is significant quality degradation, on the scale discussed in the article. And is it really your position that these statistics are worse than a few users' anecdotal views about which release is better?
> But the number of reports should still be roughly proportional to the number of bugs in the wild, shouldn't they?
That's just an article of faith. Regardless, part of the problem is where those bugs are. Maybe file-tagging involves millions of LoCs and it's now completely bugfree, but if wifi connections keep having DNS problems because of one single bug, overall experience is much more affected than it would be in the opposite case (bugfree wifi and buggy file-tagging).
> And is it really your position that these statistics are worse than a few users' anecdotal views
When "a few users" are your most ardent evangelists (Arment, Gruber etc), I'd say you should worry regardless.
> Or the infamous "backbreak", where you do two-finger swipes to go back in a browser window and somehow it fails mid-way, leaving an unusable browser page and breaking all two-finger gestures until you close that window (not tab, the whole window).
This bug is driving me crazy on 10.9.5, along with a couple of other issues. Sadly not limited to Yosemite. (I think Mavericks is worse than its reputation, I really miss 10.6 and 10.8.)
Holy shit! Thanks for the tip! My MBP is only a little over one year old, but disabling transparencies gave a peppy speed-up. I have also noticed a more "sensitive" left-swipe on the trackpad since Yosemite -- it is infuriating to be constantly trying to scroll down a page and have it send me back a page in the browser.
Wifi DNS bug? Thanks for putting a name to a very persistent annoying problem in this household with--holy crap--12 Apple devices.
Also thanks for that Launchpad description. Keeps happening to my kids and I couldn't figure out why. I'm really disappointed in Apple. I'd really rather have reliable wifi than, say, Continuity.
Yes, I meant Snow Leopard. Sorry. I have experienced many problems since Yosemite and am a long time OSX user. Maybe because I use my computer extensively every day in the same ways, it is more obvious to me than to a casual user, but there are definitely new (and bad) issues with Yosemite, fwiw.
I will be specific about my complaints that started immediately after the Yosemite upgrade: graphics glitches in Google Voice and Gmail, bizarre CPU usage out of nowhere, the two-finger left swipe somehow becoming more "sensitive", wifi randomly disconnecting and reconnecting out of the blue, shorter battery life, a "slowness" that is hard to describe but noticeable to me because I use my computer the same way every day, two full "grey screens of death" (which I had never experienced in all my time using OSX), "slowness" in memory swapping to the point of beachballs; the list goes on. All of these started immediately after the Yosemite upgrade. I have been a long-time OSX user and have never had a full-blown system stop until now on any machine I have ever owned from Apple.
> They wouldn't have even noticed unless articles like this continued to circulate.
We should clearly find a way to squash the infidels and punish them for voicing their opinion.
I feel there is a bit of dissonance here between:
> should switch to Ubuntu and discover how much further ahead OS X is.
> People just like to complain.
I've used Ubuntu for some time. Now on 14.10. Can you point out how ahead OS X is? Not saying it isn't. It's just that there is nothing I lack or wish I had in Ubuntu 14.10. I like the interface. The ecosystem of packages. All the hardware I want to work works. Also, I paid $0 for the OS. (But I've donated to it before; you can too, here: http://www.ubuntu.com/download/desktop/contribute if you feel like it is lacking major functionality.)
I used Ubuntu exclusively from 2009 till Feb 2014, at which point I joined a startup, got a MBPr 15", and have been using it exclusively since. And I agree, OS X offers a few marginal benefits, and some drawbacks, but it's not much further ahead.
OS X advantages:
- Entertainment: OS X has better entertainment options and ecosystem, iTunes mainly, though Amazon MP3, Amazon Prime Video, and Netflix are negating this advantage.
- 3rd party desktop apps: I find little gems in both OS X's app store and in Ubuntu's "store" and repo's, but the best on OS X are generally better than the best on Ubuntu. Notepads like Quiver which integrate Markdown and LaTeX, Slack's only native app (or native-wrapped web app) is on OS X, Kindle native app for OS X, etc. That said, Ubuntu's good enough that I wouldn't miss OS X in this respect.
- Overall better for lay people and non-techies. Gestures, etc. Still complex though, just as difficult as Windows for non-techies to learn. For example, I'd love to recommend to my almost 70yr old parents they switch from Windows to Mac, but the learning curve now is too steep. This is one of the disadvantages of adding lots of features to an OS - higher learning curve.
OS X disadvantages:
- Stability:Price ratio: OS X is significantly worse. About the same stability as desktop linux but at a much higher price. I can crash both OS's by abusing memory (eg, too many apps open + too many browser tabs). A free OS I can forgive, an expensive one tailored to a single custom hardware configuration I can't.
- Lock in: little things like having to take your Mac to a service center to replace the battery, or the inability to copy your entire home partition over to a new computer, etc.
- FOSS: personal preference, like knowing my computer has the many eyeballs effect.
- Power users: desktop Linux is overall better for power users. Tiling window managers, and the ability to keybind everything to Vim keybinds (file manager, all browsers, even Emacs), mean you can easily dispense with mice and gestures altogether and use a much faster and more powerful interface.
I'm curious as to how you guys manage to crash the OS by using too much RAM. As an engineer, I semi-frequently use up all the RAM+swap on my desktop (running Arch Linux) with various CFD programs, blender, etc. When that happens, I go get a cup of coffee, and when I come back the system is OK, except for the offending program, which has segfaulted/crashed.
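When that happens it's normally the kernel's OOM killer picking a victim rather than the OS crashing, and you can confirm it after the fact from the kernel log (a minimal check, assuming a systemd-based box like Arch):

    # see which process the OOM killer reaped
    dmesg | grep -i "out of memory"
    journalctl -k | grep -i oom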
You should try Kubuntu (aka, install KDE in your Ubuntu).
Far better than Unity, and you can customize ANYTHING. You could even make the look & feel similar to OS X.
The only thing that I appreciated about OS X (I don't have it, but I thought about getting it, and I have some contact with it on a friend's mac) was my assumption that it is very polished in all aspects. Reading this article makes me think that isn't true any more.
I really don't expect OS X has it, but something I wish I had on Ubuntu was a notification daemon that 1) let me pull up a history, and 2) would batch lower priority notifications...
I think there is more surface area for people to notice bugs, and we're using the systems far more, so we feel problems more acutely.
More importantly, even if it's true that there is a slight increase in bugs now that they've moved to an annual schedule, there is no logic to the idea that they should now just abandon the schedule and go slower. They are in a highly competitive situation. The solution is not to go slower, but rather to improve their processes and practices.
Who are they competing against to go faster? In the desktop space, Microsoft's on (at least) a 2-year cadence, and that was in response to Apple speeding up their release cadence. Ubuntu's also on a two-year cadence (for LTS releases), but I'd be surprised if they're even on Apple's radar in any significant way.
On mobile, Android's also slower than Apple. Android 4.x lasted three years before being replaced by Android 5 this fall; the releases in between were all primarily Snow Leopard-style performance and stability updates.
If Apple's fast release cadence is having adverse consequences (which, FWIW, I'm not convinced that it is), there's no one to blame except Apple: they're the ones setting the expectation of rapid releases.
You're annoyed that 'no-one remembers Lion', because people either upgraded from it or stuck with Snow Leopard to avoid it? It's a bit disingenuous to complain that no-one remembers the issues with an OS that folks aren't really using anymore and has been out of support for several months now.
No-one in the Ubuntu world much remembers Ubuntu 11.04 either - so what? Does that mean that the currently-used operating systems shouldn't have their flaws discussed?
> Those who want to experience a lower "functional high ground" should switch to Ubuntu and discover how much further ahead OS X is.
I did exactly that (12.04 LTS), and am not going back. Configuring some things (I'm looking at you, Silverlight) is a pain, but most stuff is pretty painless. OS X looks much shinier, granted, but in terms of usability I prefer Ubuntu.
To most people (i.e. the majority of the world who doesn't read tech regurgitation) there has been no dramatic shift in software quality from Apple. There have been some hiccups with cloud services or OS updates that affected a lot of people, but these got resolved and Apple still ranks very highly in customer satisfaction.
Alternatively, perhaps people are responding because they are having (or have had) similar experiences. Personally I have been very frustrated with iOS in the past year or two due to crashes and other bugs. Granted, a good number have been resolved in recent patches, but I'm still getting way more crashes than I used to several years ago (weekly+), and shaking off the previous frustrations and regaining trust is a time-consuming process.
Perhaps it's also lack of QC for new releases. When iOS 7 and 8 were released, pretty much everyone I knew was experiencing frequent crashes and other issues.
As someone else pointed out, there is more surface area for bugs to occur. Hopefully QC will step it up.
One difference between 10.7 and 10.10 is that 10.7 cost money and 10.6 kept working as before. It was considered perfectly normal to stay on the old system and be productive. (The Windows world is still "sane" in that way.)
Now Apple and the internet army of Early Adopters have united, and it works like this:
1. iOS 8 is released and you have to update (if you want to receive security updates)
2. If you accidentally enable iCloud Drive or use a new version of iWork, you need the Yosemite Developer Preview (or wait for 10.10.0)
3. Now you are using iOS 8.0 and 10.10.0, living through six terrible months.
It's a lot harder to be a "slow adopter" nowadays, especially if you live deep in the Apple ecosystem (iCloud, iWork).
Ubuntu should not be the bar. Maybe Windows - but Windows has a significantly more difficult task. Apple controls all of the hardware & software. I would expect significantly fewer bugs than the other platforms.
This is also not just about Yosemite. The fact that the iOS update rates are significantly lower this year is a sign users were not happy with the previous update.
I agree 100%. Marco just posts random complaints these days. There are no specific examples of what is actually broken. An OS is about as big of a project as you can get. What evidence is there that Mac OS X is any more broken than any other widely used operating system?
I haven't had significant problem with OS X. iOS is trickier.
For a more concrete criticism of functional software, thoughtfulness of design, and reliability, though, I submit Apple ID, iTunes, App Store, etc. This has been the buggiest and most opaque set of services I've dealt with recently. I have 3 different accounts now, each connected to developer credentials and app store purchases. 1 original account seems to have turned into 3 by way of failed sign-ins via iTunes and multiple failed password reset attempts.
Each of these has 3 or 4 nonsense trivia questions associated with it. I have a folder in KeePass dedicated to managing this stuff. I have a spreadsheet that attempts to map these things to purchases. I can't automatically install updates because App Store downloads have associated themselves with different accounts.
I've given up in a way. I have years-old versions of software that I don't update just due to the complexity of signing in. I don't know how people deal with it.
Since Apple's bug database is private, what would be an objective criterion that you would accept? For my own part, I have more bugs reported on 10.9 and 10.10 than on any other release.
My everyday users hate iOS 8 and 10.10 due to network, Finder, and file share issues.
There's a super annoying issue with iMessages on Yosemite where the first time you click on the iMessages window, it doesn't gain focus properly (and you can't start typing a message), and the second time you click on it, the entire window changes position by the delta between the top-left corner and the spot where you click the second time.
It makes the whole experience of using iMessages very aggravating. The whole "use OS X for all messages including SMS" thing is one of their key marketing messages, and it's broken on my Macbook and on my iMac.
I logged a bug for it and was told it was a duplicate of another one already in the queue, which was well over a month ago. It seems like something that should be relatively easy to fix. That it hasn't been tells me that there is likely a very large list of bugs in a similar category (or worse), so they just haven't gotten to this one yet.
This doesn't even begin to address the countless ways in which Messages on the laptop falls out of sync with the messages on my iPhone sitting next to the laptop. I'm not even talking about 2 second delays, I'm talking about 10-15 minutes of the laptop not getting the message.
Similar annoyances exist with receiving calls on the laptop. Sometimes it works, sometimes it doesn't.
The troubling thing for me is that these aren't some esoteric features; these are the features that they choose to highlight, so I presume a higher bar for them.
As a person who uses OSX in technical environments that are not well-connected for long periods of time (at sea research) -- my strong perception is that a huge host of the 'not behaving as I would want or expect' issues have to do with the way cloud services are getting integrated 'transparently' into so many places across the various systems and applications. 'Transparent' integration of network services has always been a concept with oversold value -- take as an example: NFS, which, even when implemented in tightly controlled high-availability networks, comes with a lot of obscure failure modes that are basically unresolvable because the consumers of the service have essentially no visibility into the notion that there is a network involved.
Hiding the operation of the network completely from the user leads to application design where there is just no sensible way for the user to build a mental model of how things are going to behave -- or what the source of misbehaviour might be at any particular time. It makes these problems feel exceedingly random. I get to see all sorts of application quirks that pop up when the network is not behaving exactly as the application designer would've hoped when they decided that some cloud API call or another should transparently affect some aspect of their UI which wouldn't even necessarily seem like a cloud behavior to a user ... Just listening to music from your own iTunes library without a network or with a spotty network is an exercise in extreme annoyance.
I understand that the cloud's not going away -- but I would love to see Apple add standard UI somewhere in all their applications to indicate when network operations are in effect, and something about their status. Attempting to build this kind of 'awareness of network activity' into the UI might really help application designers avoid including codepaths that amount to 'if network is bad: goto random-ruin' throughout their applications ...
I think this observation is spot on. My parents are in a rural location and for years had horrible random and difficult to trace problems with their computers while they were on satellite internet. The mysterious problems went away when they were able to upgrade to cellular internet. I think the problems were due to the painfully high latency over the satellite connection, but in many cases it was difficult to prove.
Another personal frustration, which perhaps you might share, is the implicit assumption that everyone not only has high-speed internet, but uncapped internet service. My parents had a hefty internet bill the other month because my dad upgraded his iPhone, and there were a lot of occult downloads happening, I assume for Photo Stream or whatever it's called now.
Case in point, something as simple as ExpanDrive.app - it makes SFTP/S3/WebDAV simply accessible, but also causes huge failure modes as other apps try to access what they consider to be a local file (think auto-save).
"Transparency" requires reasonably complex caching and invalidation (see: your browser).
I think he was thinking of something to indicate that the current thing you were doing is using the network: an alternative wait cursor would be a nice touch.
> I fear that Apple’s leadership doesn’t realize quite how badly and deeply their software flaws have damaged their reputation, because if they realized it, they’d make serious changes that don’t appear to be happening. Instead, the opposite appears to be happening: the marketing-driven pace of rapid updates on multiple product lines seems to be expanding and accelerating.
I've come to realize (or believe, rather) that very often, leadership is actually more aware of flaws than journalists and commentators think they are. It's hard to imagine that tens of thousands of really smart people within an organization haven't thought about these things themselves.
If Apple or any other company isn't making serious changes, I don't think it's because their leadership is ignorant about something we know. More often than not, I think it's because they know something that we don't.
> "First, you'll be exhilarated by some of this new information, and by having it all — so much! incredible! — suddenly available to you. But second, almost as fast, you will feel like a fool for having studied, written, talked about these subjects, criticized and analyzed decisions made by presidents for years without having known of the existence of all this information, which presidents and others had and you didn't, and which must have influenced their decisions in ways you couldn't even guess. In particular, you'll feel foolish for having literally rubbed shoulders for over a decade with some officials and consultants who did have access to all this information you didn't know about and didn't know they had, and you'll be stunned that they kept that secret from you so well."
Can anyone give specific examples of OS X backsliding in the feature department? The article he links to talks about having to tweak some settings, and not liking Messages... not really egregious issues in my book. I've been pretty satisfied with how it's progressed (and I'm far from a fanboy. There were times in my life when I was using Linux, Windows, and Mac OS on a daily basis.)
I can think of several features that have improved my productivity... mission control, handoff, updated notifications, the new spotlight to name a few. They also finally fixed multi-monitor support in 10.9 which was a huge deal for me.
I do feel far more productive in OS X than other OS's, but perhaps that's just familiarity at this point. I'm curious to hear what other people's major complaints are.
Yosemite is much more memory/swap intensive than Mavericks. On my 4GB 'Air, with 3 apps open, it swaps about 40 times as much as Mavericks. So much so that SSD lifespan is a concern.
*Based on my own informal tests using vm_stat to measure pageout activity.
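For anyone who wants to reproduce this kind of informal test, a minimal shell sketch (assuming vm_stat's "Pageouts" line, which is a cumulative count since boot):

    # sample the cumulative pageout counter, work normally for a while,
    # then sample again; the delta is swap pressure over the interval
    before=$(vm_stat | awk '/Pageouts/ {gsub(/\./,""); print $2}')
    sleep 600   # ten minutes of normal use
    after=$(vm_stat | awk '/Pageouts/ {gsub(/\./,""); print $2}')
    echo "pageouts during interval: $((after - before))"

Run the same interval on Mavericks and Yosemite with the same three apps open; it's informal, but it makes the "40 times as much" figure checkable.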
10.6: No new consumer-facing features, just a big ironing-out of 10.5.
10.7: Replaced the 2D Spaces layout with a left/right one. Totally ruined my workflow. Also got rid of scroll-bars.
10.8: ?
10.9: Not exactly OS level, but iWork took a huge backslide feature wise.
10.10: Blurs everywhere. It can put a hit on overall performance of applications, especially visual ones like Photoshop+RAW images or video editing.
Nothing comes to mind for [10.8, 10.9], which might say a bit to your point. However, Messages has been in since 10.8 and items are still out of order? They need an ironing-out release just like they did with 10.6, otherwise the bugs and deep issues will compound year after year.
Yeah, I still haven't gotten over the layout of spaces to be honest, but overall I think mission control improves my productivity so I've accepted it at this point.
Yosemite seems to run fine on my old Macbook pro, but I don't do any heavy lifting with it.
Speaking of the OS not as the desktop, but as what an OS traditionally means: device support, memory and process management, application environment, file system and networking.
As a developer, I find OS X to be immature, undocumented, arcane and volatile. Online Core documentation is very sketchy, often no more than a listing of argument types. Googling for help results in only questions, rarely answers. Versions add new APIs as quickly as they obsolete the old standard ones, leaving everybody hopping to update, often several times per year.
Depending on what you're doing, it's still frequently better than Win32 (which has an extremely broad range of API and documentation quality), or Linux (which is just downright screwy at times).
Although all bets are off if you're including headers from within mach. You're better off finding what you need in POSIX at that point, or, failing that, an external library.
Mavericks' SMB file sharing implementation is completely broken. They shipped a new implementation (no longer using Samba, maybe?) and the new implementation does not work correctly. Many painful details at http://www.macwindows.com/mavericks-filesharing.html
I tend to look for the simplest explanation to a question, as quite often the simplest explanation is the correct one.
So in that spirit, if the question is "why does it seem like OS X isn't of the quality it used to be?", the simplest answer might be that Apple just doesn't care about OS X anymore.
1) The iPhone business generates 4x as much revenue for Apple as Macs do -- and if you add in iPad sales that figure goes up to 5x; and
2) The iPhone business has grown by more than 10% each year, while Mac sales have been essentially flat.
So if you're Apple, where does the Mac fit in your product line these days, exactly? People still buy them, so it's not like you completely don't care about them, but the importance of those sales to your bottom line is diminishing year by year as sales of mobile devices dwarf sales of Macs. And there's no indication that those trends are going to turn around anytime soon -- no new major piece of hardware coming from Intel or somewhere that's likely to light a fire under sales of conventional PCs, or software package so compelling it would drive the Mac into new vertical markets.
All of which means that maybe, if you're Apple, you see the Mac business as basically a cow to be milked at the lowest cost possible until it dies of natural causes. You don't cut the hardware build quality (at least, not yet), but you don't put a whole lot of effort into developing revolutionary new features for OS X or wringing every last bug out of new releases of OS X, either. You just do the minimum possible to keep those existing Mac customers from jumping ship, or at least to delay the moment when they do jump ship for as long as possible. And by the time that day eventually comes, you won't care because Macs will be a footnote to your real business, your device business.
I like the idea of following the money, but I would argue that iOS is actually in a worse state than OS X. UIKit used to be AppKit's clean and modern sibling; now we have to struggle with bullshit like automaticallyAdjustsScrollViewInsets, the n-th way of implementing view controller rotation, the duality of AutoLayout and struts & springs, and a UITextView that gets worse instead of better. If you want to see some examples, follow @steipete on Twitter and look at the radars he files (I think there were 3-4 in December).
Or just start using a custom keyboard full-time on iOS 8. It's still bordering on unusable for me on iOS 8.1.1. I love the empty screen the App Store gives me when I want to write a review.
Oh, and most importantly, you cannot stay on an old iOS version even for one day if you care about the security of your devices. So iOS releases should really be completely, 100% rock-solid, but they aren't.
You're comparing market share (in your link) to revenue figures (in the post you responded to). Apples and oranges.
Over a 2 year time-frame (end of 2012 to end of 2014), Mac revenue increased by a measly 3.6%. iPhone revenue increased by 30%, and was over 4x Mac revenue in 2014. iPads alone generated more revenue than Macs (though to be fair, iPad revenue actually decreased slightly over the period).
As for units, for every Mac that Apple sold in 2014, they sold 9 iPhones. (mental image of Kramer saying '9, Jerry!')
Personally, I think smacktoward nailed it. It's fairly obvious that iOS is top priority at Apple, and it follows that is where the bulk of the design and engineering work is going.
I suspect that inside of Apple, this quote from Jobs is still considered relevant. I can guarantee you that Apple execs don't want Apple to be a truck maker.
"When we were an agrarian nation, all cars were trucks. But as people moved more towards urban centers, people started to get into cars. I think PCs are going to be like trucks. Less people will need them. And this transformation is going to make some people uneasy... because the PC has taken us a long way. They were amazing. But it changes." - Steve Jobs
Well, I disagree with what is happening with OS X at any rate. Everyone's realized by now that the Post PC era stuff isn't really true and I don't think that truck analogy holds.
OS X has been getting much more attention the last two releases. They also redid the Mac Pro and have been steadily updating hardware. I think it's the focus on adding OS X features that has had some stability/performance issues, not lack of attention as you and GP are claiming.
iOS is having the same problems. Development is biased too much to new features and not enough to polishing the user experience. If what you are saying is true, iOS 8 wouldn't have been so ridden with glitches. I personally have had many more issues with iOS 8 than with Yosemite.
In order to build iOS apps, you need to run Xcode on Mac OS X. The latest releases of Xcode have been so unstable that I wonder how much longer before developers finally get discouraged from developing for Apple hardware.
I wouldn't be surprised if there is no Mac in a few years. Recall that one of Jobs's less-noticed, er, innovations was removing the word "Computer" from the company's name.
Yes, you need a Mac to write iOS apps. No, that is not some kind of immutable law of nature, as any game console developer who works in Windows or Linux can attest.
On a long-enough timeline, being in the "computer" business may turn out to be more trouble than it's worth to Apple.
I believe this has to do with the increased power consumption requirements of feeding the retina display. To maintain the same battery life with a retina display, they'd need to include a bigger battery, increasing weight/size, and res just isn't the priority for the MBA, compared to weight/size and battery life.
I'm pretty sure the Retina MBA doesn't exist because of battery concerns. It's not a coincidence that retina screens arrived at the same time as the departure of DVD drives (and so, more internal space for the battery).
This is striking, considering that Apple is a wealthy company with a small product line. They have the money to put resources behind getting their products fixed. They have total control over the hardware environment, so they don't have to worry about compatibility with external hardware.
Microsoft has put a lot of work into making Windows fixable. The two big developments for Windows 7 were 1) requiring all signed kernel drivers to pass the Static Driver Verifier, and 2) running incoming crash dumps through a classifier system which attempts to identify similar crashes and sends them to the same maintainer. Those two tools put a big dent in crash-for-unknown-reason problems. Is there any indication that Apple has developed similar tooling for their systems?
I just have this vision of OS X like a model of the Golden Gate Bridge made out of dried pasta. If you grab one end and point it in a different direction, a lot of noodles break and it's crappy until you get them fixed. If you turn it very slowly you can do so without breaking any noodles, but it takes a long time to get to a new orientation.
My assertion is that OS X (and Windows for that matter) are trying to serve two masters: one is the 'appliance user' who never puts anything on their device except for what came out of the App Store, and the other is the 'computer user' who uses their machine as a tool to develop software for themselves, or for appliance users. While not a mainstream idea, I think a Macbook with iOS would be a better "answer" for appliance users than a Macbook with OSX, and a Macbook with OSX and none of the appliancey features might be a better Macbook for developers. Developers would no doubt run iOS in a VM on their Macbook, which would provide them both a test platform when delivering new code and a place to use their own appliancy type apps, away from the core development world. Windows could do that as well, splitting into an 'end user' and a 'developer' mode.
I think that might work well in practice, but I think a lot of us hold out hope that a lot of the 'appliance users' will dig deeper and become proper 'computer users', and we worry that going the separate route will prevent that, and create a population with a low computer literacy rate. Unfortunately, that seems to be the way we're going, with many people adopting iPads as their main computing platform.
The decline of "It just works" is one side of the problem. The other, more painful side (driven by the iOS-ification of the OS, I suppose) is: "When it doesn't work, you're helpless".
I would be pretty happy if each release didn't fundamentally break key pieces of software I use every day. The last update broke Jabber and VPN for me. I'm one of three OS X machines in my group and was the first to upgrade to Yosemite. My canary-in-the-coal-mine experience has waved the rest of the team off.
Now I can't call into our daily VTCs or work remotely as easily as everybody else. The Windows users with Putty are running circles around the Mac users in these terms.
MS-Office windows keep disappearing when I go on-off my second monitor. Mice continue to work poorly (especially the trackpad), on computers built by the company that popularized them.
At least multi-monitor support is generally better at last. I honestly can't believe the software shipped in the state it was in the last few releases of the OS. I upgraded to Yosemite purely because of that. Still, it's sometimes buggy and I've had more than one experience where dragging windows over to another monitor causes all kinds of mayhem.
Now if only I could get an OS level window Maximize that worked right and maybe good tiling window support, and fix how broken Finder is, I'd be relatively happy.
I had a choice, Windows or a Mac, and after I saw the atrocious Dell hardware my company had on offer, I really had no choice but to go Mac. But sometimes....sometimes....
To take a simple example, faster OS updates combined with more aggressive OS version requirements for Apple apps (e.g. iPhoto) mean that getting the latest app bugfixes demands much more upheaval in your computing environment than it used to.
In the past, when you were having a problem with an Apple app you had a relatively large window during which you could just update that app without touching anything else to see if the upgrade fixed your problem. In trickier cases you might even try installing a specific version that wasn't the latest and so on.
Today, far more often, updating the app you're having a problem with requires an OS upgrade, bringing in tons of unrelated changes and triggering updates to other apps and ...
If you don't see how the first scenario involves wider options for self-help, plus greater agency and control than the second, I don't know what to say.
The only difference in the scenarios is the rate of change. If you are arguing that change itself reduces users' agency then I won't argue with you, but I'll say that it is a problem that is broader than Apple.
I feel like the software really hasn't changed enough for me to feel differently about Apple at all.
However, the fact that the hardware is GLUED together and non-upgradeable in all models is the thing that has me looking elsewhere. Is anyone else frustrated that their retina machines are completely obsoleted on purpose and that even the minis have memory chips soldered into them for no reason but wasteful planned obsolescence?
I think that Apple is going to hit a major backlash when people realize that their memory and hard drives are trapped and limited for no real technical reason.
I really doubt most consumers care about glue. They buy a new phone every two years, they won't bat an eyelash about buying a new computer every four years. I think the glue is less about planned obsolescence and more about selling more upgrades up-front, although I doubt even that makes a difference with most of their users. I bought a mini when they first came out and I upgraded the ram myself because it saved like $100; but I suspect this was happening to a single-digit percentage of their minis.
Unfortunately for tinkerers, these processes allow Apple to produce very small and lightweight hardware. These qualities are more valuable to consumers than tinkerability.
Except that they're not needed to get to small and lightweight. Those qualities are driven by the size of components decreasing due to die improvements. So they can stick smaller, better components onto boards every year.
Look at enough device tear downs, and you realize that lack of repairability is more about laziness than it is about legitimate tradeoffs.
There are physical limitations on socket sizes and construction that can allow owners to reasonably upgrade the equipment without damaging it. There are design limitations on arranging components in such a way that they can be accessed. Also the majority of users never open their machine; why should they have to pay more so the privileged few can pay a little less for storage and RAM?
As someone who has recently assembled a commodity PC (Mini-ITX) and someone who has fiddled with RPi hardware, I can see the disadvantages facing a consumer hardware manufacturer attempting to satisfy a tinkerer's urges. Notebooks, all-in-ones, and Mac Mini-sized desktops are just at the edge of serviceability, and clearly the mobile and tablet product lines are well beyond it.
It's not just tinkerers. If you crack your screen, the glass is generally glued onto the screen, necessitating replacement of the entire display assembly, which isn't all that much cheaper than just buying a new one. They do this on purpose to force people to buy more devices.
The only reason they get away with it because we don't force them to take repairability seriously. As someone noted, most customers prefer new devices.
You only need to purchase more devices if you make it a habit to break the ones you have. No one's being forced to do anything in the repairability case.
I've long said that the Apple hardware is second to none ... and I have a G3 Wallstreet that not only still runs, but its battery lasts for a couple hours.
My mid-2012 rMBP is pretty amazing too, but I quickly became irritated by OSX. It's now running Linux Mint with the XFCE WM and I couldn't be happier (note that with a kernel upgrade, it even supports my Thunderbolt Cinema display).
The one thing I wonder is: "Are we in the minority?" Do the "unschooled masses" simply accept that this is how computers have to be?
Two previous MacBooks died on me after a year or so (they were already used for a year or two when I bought them). One died due to a GPU solder problem. The other due to some unspecified damage to the HDD controller.
Hope our new Retina will live longer. If only my girlfriend, who actually uses it, would allow me to kill OS X and put Windows 10 on it, it would save us both a lot of frustration.
Mac OSX is hardly usable. I get how developers can get used to it. Once you start Terminal you have a fairly usable sort-of-Linux computer. I still don't get how graphic designers use it. Its UI is so buggy, glitchy, uncooperative towards hardware (Wacom tablets! or just ordinary mice).
As for the hardware, it's really powerful and the price is right, or better than right, for those components. The thing is that you wouldn't build a PC out of those components yourself, because you'd rather use a processor that's 8% weaker but costs half as much.
You cite anecdotes that you don't even pretend to justify with examples. Are you speaking from experience? I'd bet not. And it's interesting that you think putting Windows 10 on your girlfriend's machine would solve your problems - moving my entire family to a Mac running OS X has solved all of mine.
To list the problems I've had with Mac OSX over the last few months I'd need at least a few blog posts.
But that would be redundant, because a lot of them are not unique. When I'm infuriated by something I search for solutions, and I see lots of people experiencing the same glitches or "features" and sharing workarounds (which work if I'm lucky, and are often paid), or just piling up comments about the issue.
Are you sure your family still uses the computers you migrated to Mac OSX? Maybe after that stunt they are just afraid to ask you about anything else?
There was a joke where a guy who was sneezing went to the doctor, but the doctor made a mistake and prescribed him a laxative. When the guy shows up at the doctor's again a few days later, the doctor asks "Do you still sneeze?" and the guy replies "I don't dare..."
> Are you sure your family still uses the computers you migrated to Mac OSX? Maybe after that stunt they are just afraid to ask you about anything else?
Yeah, I'm positive, since they're not entirely without problems and I still get pinged periodically. I also live relatively near them and talk to them all the time.
It's worth pointing out that while you say your issues with OS X are numerous and well documented on the Internet already, you didn't link to any of them. So it's still unjustified anecdotes. Here's a practical one I just dealt with - there's one holdout in my family on Windows, and he wants to set up a syncing calendar between his computer and iPhone. Outlook can't natively support CalDAV - the backend his company's server runs uses IMAP and Cal/CardDAV - so I could either pay for a plugin or convert him to Thunderbird. If he had a Mac, it'd have been a two minute deal since the built-in tools have CalDAV support. But that's just one example, of course.
Don't even get me started on the iPhone. Putting an mp3 ringtone on it takes some serious coercing of iTunes, even on a Mac. The iOS web browser can't even download arbitrary files. You can't mount the iPhone as a thumb drive to move files around.
As for the Mac, the Magic Mouse is basically useless, as clicking causes unwanted scrolling, especially in Adobe apps and Google Maps. All praise to the good people who made the Mouse Prefs app and give it away for free. They should get a pot of gold from Apple.
Finder is such a piece of crap that basically the first advice given to Mac users is to buy a file manager. It can't display directories above files. You need to install XtraFinder for that.
The image browser is nowhere near the functionality of FastStone on Windows, and third-party browsers are far behind.
The network drive sometimes doesn't autoconnect and often disconnects. Remote folders visible in a Finder window don't show up in the save dialog of a web browser. Folders disappear from favorites in Finder for no apparent reason. Finder sometimes doesn't indicate whether the contents of a remote dir are still loading, so when seeing an empty remote dir you don't know if it's empty or just hasn't loaded yet.
Some high-DPI USB mice can't be used because they feed mouse-move events too fast for Mac OSX, and the events swamp the mouse-down event, so dragging a window becomes a very hard and deliberate operation.
Dropdown menus in Finder get damaged after some use. Clicking away from a dropdown menu onto some button locks that button, as if it didn't register the mouse-down event until it gets the mouse-up event that somehow got swallowed by the disappearing menu.
The computer sometimes shuts down very slowly. Finder just hangs and blocks the restart. Multiple desktops, multiple fullscreen apps, fullscreen apps at all just get in the way. Switching them with gestures is a source of additional confusion.
Additional monitors display weird lines when they are turning on and off.
I even installed a clean Yosemite in hopes that it would fix some obvious defects. It didn't help much.
So far my experience with a Mac that cost three times as much as any of my previous computers is that it's a pile of failing crap.
Yosemite is pretty much Win 98. Mavericks was the same but with a different skin. Even Linux wasn't that bad, at least not for the last 5 years.
Let me guess, NVIDIA GPU? The Windows laptop world got hit pretty hard by that one, too. It wasn't just Macs affected; it was a systemic failure (brought on by EU-mandated lead removal in solder, iirc).
I have the same problem on my Mac with an AMD GPU. It's fairly common for 2011 Macbook Pros and there was a recall on the 2010 MBPs for a similar issue iirc.
Personally, I've been using Yosemite and iOS 8 every day.
iOS 8 - no complaints whatsoever, at least on the iPhone 5 and 6. I've heard some people complain that it can be slow on the iPhone 4/4S, but that's hardware a few years old, and Apple probably doesn't spend much of their time optimizing for it.
Yosemite - the major bug is that my WiFi stops working once in a while (~once a day) and I have to turn it off and on.
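If anyone else is stuck with the same daily WiFi drop, a tiny shell workaround (assuming the WiFi device is en0; check with networksetup -listallhardwareports):

    # bounce the WiFi interface instead of clicking through the menu bar
    networksetup -setairportpower en0 off
    sleep 2
    networksetup -setairportpower en0 on

It doesn't fix the underlying bug, obviously; it just saves a few clicks each time it happens.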
Overall, I think the criticism is definitely a little bit overblown.
I'm going to suggest that the 'functional high ground' relied on a unique combination of marketing, market penetration and era.
Of course, we know Apple always pushed the 'it just works' mantra, but this was when Apple was supporting a small number of devices, and when they had forced everybody to update to new hardware in order to run OSX anyway, so what you had was an OS that worked VERY well for a small number of people on a small number of devices.
Those few (like myself) who claimed that OSX was not any better than windows (my first experience with OSX had me in a reboot loop trying to get Pages or Keynote working), were a minority of the OS users and a tiny percentage of overall computer users.
As OSX has grown in popularity, the 10% who dislike the newest versions has gone from a few tens of thousands (maybe a hundred thousand) to a few million. Some of them very vocal, like Marco.
Lastly, I mentioned the 'era'. When OSX was gaining in popularity, many people were moving from Windows '98 and XP. They had skipped Vista and didn't try Windows 7. Moving to OSX was a massive improvement, as it was a much more modern and cleaner OS. Those same people are now used to the bells and whistles of OSX, and each new upgrade shows minor improvements and a few odd little features that, from what people tell me, come with the recommendation of "turn off the new stuff".
Those who complain about Microsoft OS quality, didn't compare Apple OS quality of a similar era. If you compare OS9 to 98 or XP, I think you'd find that it's a much more even comparison, and Microsoft may even come out on top.
I don't know, I've been a Mac user off and on since Macs were 68k based. I had a 2007 MBP that I was really unhappy with (of my still-functioning machines from that era, it's easily the worst; my $300 netbook is easily a better machine at 1/10th the cost). For the last 3-4 years or so my daily driver at work has been an rMBP.
I use Windows 7 at home, but feel about as comfortable in front of either system.
If I compare any of the last 3-4 versions of OS X to Win 7, I'd say they're roughly comparable. There are pluses and minuses on both sides. I just spent a few days writing some Python code on Windows, and the experience was generally about as good as OS X. Win7 is the most rock-solid non-x-nix OS I've used, and once you learn your way around, it generally flows well.
I don't perceive a huge downgrade in quality on OS X. Things change, bugs get introduced. I don't aesthetically like where Apple is going with their flat design, but the chrome of most apps I use takes up so little real estate that I can ignore it. I think you have to do a bit more work to get an OS X machine into a really usable state than Windows. But once it's working it's relatively pain free. I don't think it's as stable as Win 7. But it's like comparing 4-9s uptime to 3-9s uptime. On a day-to-day level I don't really notice it as much.
I'd say uptime on my Win 7 machines is better, but Microsoft forces restarts and updates too frequently to make that kind of claim. It's nice that OS X is a x-nix in the sense that what I do on it is more easily translatable to other x-nix systems so deployment is easier. And that's great until it isn't, because some tool or library or something is different or not available. But then again, development on Windows forces you to assume all sorts of crazy non-standard stuff as a constant.
I don't really know where this perception that it's getting worse is coming from. I recently fired up my 2007-era MBP and it really feels like a worse experience, top-to-bottom. Modern OS X really is nicer.
I suspect that people are finally starting to look at Apple products more critically (at long last). Perhaps it's because the pace of innovation seems to have slowed down, perhaps it's because Jobs is gone, but I'm starting to notice Apple pundits are finally starting to see that things aren't perhaps as good as they dreamed, and looking for ways to push the fruit forward.
In mavericks, I could hit command-space and start typing, and knew that even though the spotlight box hadn't popped up yet, my text was being captured. Now in yosemite, it's more like... hit command-space, start typing "chrome", wind up with a box 1.5 seconds later filled in with "ome". This is only a problem on my i5 mac mini. My macbook pro (same year - late 2012) is just barely fast enough to pop open the window in time to capture my text input. But that just shows that their UI stack is too deep to provide the kind of snappy user interactions they are shooting for.
Maybe there is something odd about the mac mini that causes this; my 2009 mac book pro with mavericks has no problem with it. I tried, but couldn't hit a character fast enough to cause a miss.
It wouldn't surprise me to find out there was something hardware related with the mini that leads to this.
Couldn't agree more. I can't open my coworkers' Keynote presentations because my copy is a year newer. My two year old Retina MacBook Pro black screens half the time it tries to wake from sleep, and yes, I've wiped every startup program and kernel extension. I'm just glad it isn't my work MacBook Pro, which black screens on login half the time, and another quarter of the time insists I forgot my password and won't let me type it in. I suspect it has something to do with disk encryption, since it started after they turned on FileVault or whatever it is.
It's not just the functional high ground that Apple has lost, it's the look and feel battle.
To me, both Windows 8 and Windows 7 look way, way more modern than the latest Mac OS. There's animation, there are colors, consistent and solid keyboard support, strong consistency (why can't I still rename a file in the Mac OS file dialog but I can in Finder? They both look identical!).
For a little while, Mac OS had the UNIX foundation advantage over Windows but these days are gone. Today, I use git, ssh and bash seamlessly on Windows.
I've used OS X daily through every public release (including the public beta). I actually wrote a column about OS X during its early days. All of these articles are anecdotal, and I agree with other commenters here who say it's all about perception. The fact of the matter is that Apple has had some high profile "quality scandals" the last year (some would argue 2/3 were not deserved):
- iOS point release that bricked phones and was pulled
- Bendgate
- iCloud hacking scandal
None of them actually were OS X bugs, but you couple that with some minor OS X bugs and everything goes into a whirlwind of negative perception. What are the major OS X bugs that everyone is referring to though? I haven't experienced them, but that's anecdotal, so that opinion is just as worthless as everyone else's. I heard people are having some Wifi issues - that sucks - but where is the showstopper that's affecting everyone?
On the other hand, I do feel that iOS and Swift have been buggy enough for developers the last couple of years that it's hurt Apple's rep in a legitimate way amongst the intelligentsia - programmers who have a pedestal to preach from. But let's be real - in terms of day to day problems, things are much much better than they were during Mac OS X 10.0 to 10.2, and yes, they're still better than Windows/Linux (which will always be plagued by the huge number of hardware configurations they must support).
Lest we forget, interacting with SMB file shares has been a constant pain for years now, pretty much in every release.
I will spare the details from before, but in Mavericks, when things looked like they were being fixed, one could not delete an entire folder. It would delete the contents, but not the folder; one then had to delete the folder by itself after it was empty.
In Yosemite some folders are left "locked" without being able to do anything to them from OS X. This was similar and worse before.
And don't get me going on how it leaves .DS_Store, ._.DS_Store, .apdisk and ._.apdisk crapola laying around.
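For the network-share case at least, there's a commonly cited workaround; a one-line sketch, assuming the com.apple.desktopservices preference key behaves as documented:

    # stop Finder writing .DS_Store files on network volumes
    # (local volumes are unaffected; log out and back in for it to apply)
    defaults write com.apple.desktopservices DSDontWriteNetworkStores true

It doesn't clean up the files already littered around, and it does nothing for the ._* AppleDouble files, but it stops the worst of it.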
Oh, how about driving everyone to properly support case sensitivity? All sorts of things do not work, or crash, or simply refuse to install on case-sensitive volumes.
How about training the "geniuses" at the stores that there are other standards for video cabling? (And that one should not have to buy Thunderbolt-enabled monitors to be able to have more than one monitor on a MacBook?)
My MacBook Pro froze and rebooted out of the blue this past week. It was pretty much idle, and it has never done that before.
Marco's article seems a little hyperbolic since as I recall Leopard for instance was hugely unstable on release.
Conventional wisdom with OS X has always been to wait a few point releases before jumping in. With Microsoft it was to wait until at least SP1.
I don't think things have changed that drastically.
* SMB support still sucks.
* OS X Server is still a monumental turd and I wouldn't be surprised if Apple kills it a release or two down the line.
* Yanking Spaces in Mountain Lion and going to Mission Control followed by Yosemite finally getting (flaky) multi-monitor support is classic Apple focusing on the consumer before the pros.
* Active Directory in Yosemite is a bit buggy but anyone who's used Leopard will remember that the gold master wouldn't even allow AD logons if your domain was .local - I mean did anyone at Apple even test something as basic as that!?
It goes on, but I honestly don't think Apple have changed for the worse. They're not any better in my personal opinion either.
The biggest problem we face with Apple at my company, and it could certainly create the impression of increasing bugginess, is that Apple's release cycle is much shorter now.
No more 18 month releases with $100 upgrade fees.
Every developer in our shop can click 'Upgrade' on their machine and fuck their whole environment up.
It used to be much harder for them to do that in the past which goes back to my original point. Don't upgrade a production machine until you're at least a few point releases in! (that goes for your iOS devices too...)
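A rough shell sketch of one way to hold a machine back, assuming the softwareupdate CLI's -l, --schedule and --ignore options (the "OS X Upgrade" label below is a placeholder; use whatever softwareupdate -l actually prints):

    softwareupdate -l                    # review what's pending, install nothing
    sudo softwareupdate --schedule off   # stop automatic background checks
    sudo softwareupdate --ignore "OS X Upgrade"   # hide a specific item by label

None of this stops a determined developer from clicking 'Upgrade' in the App Store, but it at least makes the upgrade a deliberate act rather than an accident.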
I'm surprised that he's complaining about OS X and not iOS 8.
Coming from an iPhone 5 with iOS 7 to a 6 with iOS 8, it was a major regression in terms of minor annoyances.
Particularly:
- selecting text for cut and paste in a text edit field is often very broken. I hit this one daily. I used to be a prolific iOS typer, but I've gone back to using my MacBook Air 11 just for this reason.
- selecting text in a browser window brings out the weirdest bug, where the window gets stuck scrolling all the time. I have this OCD thing where I continuously select and deselect text for no reason whatsoever. As a result, this hits me many times per day. The only way to fix it is to kill Safari.
- when you have your phone open in landscape, power it down, and then switch it on again in portrait, iOS doesn't detect the change in orientation. You have to rotate to landscape and back before it notices.
These are just minor things, but when they hit time and again, they get annoying real quick. And they don't get fixed.
"having major new releases every year is clearly impossible for the engineering teams to keep up with while maintaining quality." ... "We don’t need major OS releases every year. We don’t need each OS release to have a huge list of new features."
I would argue everything we know about software engineering process says that more releases is better. Incrementalism lowers risk for engineers and for users. Once a year doesn't sound often enough to me, going the other way would just repeat Microsoft's failings.
However it is possible that the core of his point, that marketing trumps what engineering can accomplish while maintaining quality, may be true. There seemed to be a lot of bolt on features in the last iOS & Mac OS X releases that didn't necessarily need to be pushed out right now, and some of the regressions are painful.
"I would argue everything we know about software engineering process says that more releases is better. Incrementalism lowers risk for engineers and for users. Once a year doesn't sound often enough to me, going the other way would just repeat Microsoft's failings."
Yes, but that can be in the form of .X releases, not X. releases.
Which is to say, by all means release X.2 and X.3 and ... X.9 and so on ... keep polishing !
But OSX releases 5.X and 6.X and ... 9.X ... and so on. Those aren't the actual numbers of course, but it is major releases that keep coming rather than minor releases.
A friend of mine has just joined apple in a systems role and can't believe what a mess it is behind the scenes. He described it as being like "1000 startups all working on their own thing but owned by the same parent company" (with all the cross-group communication mayhem that implies).
That sounds like essentially a good thing. The idea 1000 startups are likely to be better at innovating than one monolithic bureaucracy seems like a familiar one here on Hacker News.
Sounds good but apparently the reality is a horrendous 'mish mash' of different tech stacks and fairly poor cross-group communication. Throw a marketing driven release cycle in to the mix (as the article implies) and it's no wonder a few cracks are appearing.
The age old problem of being a monster sized company I guess. No one is immune.
It sounds surprisingly like how I've traditionally viewed Microsoft. Microsoft has always had a lot of really great technologies, but they've traditionally struggled to leverage them in a cohesive way.
You'd see them completely reinventing the wheel between different groups while simultaneously turning around and shoehorning in MS tech places where it just didn't make any sense. Throw in all the in-fighting between different groups and you have a recipe for the company's malaise through the 2000s.
One of the clearest examples I can think of is Microsoft's acquisition of Danger and the disaster that became the Kin:
tl;dr - Microsoft buys out existing, successful mobile company, scraps their existing technology stack to rebuild with MS technology while building multiple, competing mobile platforms internally, internal turf wars ensue, and in the meantime their carrier partner basically loses patience with the whole thing and seals the fate of the entire project.
To me, the major difference between Apple's OS's and the Windows/Linux/Androids of the world is that Apple _fails smoother_.
A typical bug may result in an iOS app freezing and then quitting on the user. However frustrating this may be, Apple's "something went wrong" user experience is much better than the annoying pop-up alerts, blue-screens of death, and cryptic error messages of the other platforms.
These error messages are only relevant to a small, small subset of power-users, so why show them to everyone else? The other 99% of users are just going to quit and restart the application anyway (as we've been trained to do), so why not smoothly lead them down this path?
"My iPhone 6 rebooted after I changed the home screen wallpaper. Tapped a new image in the wallpaper settings, and poof, it rebooted. Worse, it never stopped rebooting. Endless reboot cycle."
> The other 99% of users are just going to quit and restart the application anyway (as we've been trained to do), so why not smoothly lead them down this path?
The fact that 99% of people want to be completely ignorant about the systems that their lives are centered around, and partially controlled by, does not mean they should be restricted from any opportunity to do otherwise. To answer your question,
> so why show them to everyone else?
It gives the opportunity for someone to be curious and search for a solution, or at least mention the error to someone who can help. Even if they don't remember the message completely, telling someone "it couldn't foo the bar" even if they don't know what that means is far, far better than "something went wrong". We are told to be as detailed as possible when asking questions or writing bug reports, but these opaque error messages encourage the exact opposite.
This is from someone who has been asked to troubleshoot others' Apple laptops.
That's a cool mantra to repeat, but it does not seem to match reality.
For instance, in over 10 years of daily usage, I do not think I have ever noticed any bugs in vim, grep or PostgreSQL. Just to take grep: if it says the string is not in the file, I have 100% confidence it is not in the file. So not only does grep not freeze my PC, it is also deterministic, which means perfectly reliable when one talks its language.
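That determinism is even grep's documented contract: exit status 0 if a match is found, 1 if not. A trivial shell illustration, with needle and haystack.txt as stand-in names:

    # grep -q prints nothing; the exit status alone is the answer
    if grep -q "needle" haystack.txt; then
        echo "present"
    else
        echo "absent"
    fi

You can script and chain on top of that answer without ever second-guessing it.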
OSes are more complex, you will tell me. Yep, but you said "all software has bugs".
You can't acknowledge that an operating system is more complex but then deny it the benefit of the doubt when a bug halts the system. If grep or vim fail, they just abort. If Postgres fails, you maybe get a corrupted database and it aborts. If the OS X kernel panics, the machine halts. All software has bugs.
Also, if Postgres' bugs were unnoticeable, why did 9.3 get to 9.3.5 before they went to 9.4? Just because you didn't notice a bug doesn't mean someone else wasn't affected by it, and just because you notice bugs in OS X (which I'd bet you didn't) doesn't make them big issues.
Well, if we're being pedantic, the fact that you have never noticed a bug in Vim, grep, and PostgreSQL does not mean they don't exist. If you google, e.g., "postgresql bug" you will find evidence of this.
The charitable reading of GP's comment is, of course, that virtually all software has bugs, or that all complex software has bugs. Or, even better, that it is extremely difficult to write bug-free code, and this difficulty rapidly increases with the complexity of the software.
I feel like Apple software quality has degraded because I always expected it to be top-notch and flawless. It has been, and it must be to keep its position.
As far as I remember, in earlier versions of OS X I just didn't pay much attention to the visuals, because everything was so flawless: it all felt smooth like a fluid and nothing bugged me.
With 10.10, I see visual/animation glitches very frequently on the same hardware. It is starting to bug me, and it feels flawed. I believe this is because of the move to a new visual style. But if this keeps up, it's just a matter of time before it hits bottom.
The thing I don't get in this discussion is that somehow the issue is that OS releases are too soon. If we assume that we were on a bi-yearly cycle instead, wouldn't that mean 2 times as many features would break in an instant?
Shouldn't Apple be doing updates faster (and smaller) instead of sooner? Whatever happened to agile?
Considering the amount of third party software now in the ecosystem, faster updates seem the only way forward. Unless you want your entire dev setup to be broken for 3 months every 2 years while all of your tools update to the 100 different incompatible changes.
I don't agree with owenwil. Apple is actually shipping bugs release after release. Upgrading to Yosemite really brought pain: waiting fucking hours copying files during the so-called "a few minutes". As I recall, upgrading from 10.8 to 10.9, Preview began to blur when scrolling and took more memory. Then in the Yosemite release, the MATLAB GUI wouldn't work at first, and then came the ugly color theme designed for mobile devices. Few would upgrade to 10.10 if it were not for Swift support.
I think iOS lulled us into forgetting that longstanding truth. For a few years, every iOS release meant my iPhone got better -- and occasionally even faster. That changed when iOS 5 came along, of course. Apple really had us conditioned to think that their OS releases were getting unequivocally, monotonically better... at least for a while.
I always reinstall, rsync my home directory back, and reinstall applications on OS X. I never have any of the problems people seem to have here. Just an anecdotal point.
Upgrades "work" but honestly I don't trust upgrades on any operating system. I'm working on finishing up ansible plays to get things sorted out entirely.
Upgrade if you want, but accept you probably have skeletons in your closet.
If I don't listen to those who provide the service, am I going to listen to you? If Apple cannot give me a good solution, why would I buy its product? I know you are good at backing up your files and reinstalling, but I don't want this stuff to bother me every release like this. I am just not that free, dude.
Fair enough, but then a followup question, and don't take it the wrong way. If there is a known, os agnostic, way to fix issues with upgrades, and you do not employ that solution, is arguing about upgrading without trying that option productive? I understand you don't listen to the os vendor, but in that case I can't see the logic you'll use as all os's have issues/quirks/edge cases and "trusting trust" of someone. If you're not getting value out of OS X, then honestly you should be moving off it regardless of what anyone says.
As a note, here is how I do the entire thing: I rsync my home dir on each home wifi connect (scripted for years with ControlPlane), and at midnight as well. I also rsync before upgrading, then I upgrade the OS, rsync again, reinstall the OS, rsync the home dir back, and reinstall VMware.
Sounds hard, right? It takes about overnight at most, mostly for the rsync back, and that just happens while I sleep. I do the same thing for Linux/FreeBSD too. As an example, the FreeBSD 10 upgrade had issues with bhyve not working afterwards. For Linux, things can be markedly worse, e.g. Ubuntu upgrades, which is/isn't my experience as a (thankfully now ex) devops admin. Nothing is perfect, especially upgrades. That said, I have kicked the tires on OS X upgrades and not really seen the issues that others do.
I understand having no time, but setting this up, which granted for me is trivial since VMware is about all I reinstall (the rest lives in ~/Applications), means it's trivial to not encounter dragons needlessly. I understand you might have the Adobe Creative Suite or whatever installed, and reinstalling might be worse than crossing into Mordor, but this wisdom is passed on not just by me but from those who came before me. The rsync bit could be improved, but I'm lazy and it works well enough; see the sketch below.
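A rough sketch of that cycle in shell, with illustrative paths (the backup volume name and layout are whatever you use; -E asks Apple's bundled rsync to carry extended attributes/resource forks):

    SRC="$HOME/"
    DEST="/Volumes/Backup/home-mirror/"

    # 1. mirror the home dir before touching the OS
    rsync -aE --delete "$SRC" "$DEST"
    # 2. upgrade or clean-install the OS
    # 3. pull the home dir back, then reinstall apps
    rsync -aE "$DEST" "$SRC"

The --delete flag keeps the mirror exact, so make very sure SRC and DEST are the right way around before running it.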
The first thing I have to think about if I want to reinstall instead of upgrade is how to back up MacTeX & Homebrew. I have to consider whether it will generate PDF documents correctly. (MacTeX is installed in /usr/texbin/.) Reinstalling MacTeX is not that hard compared with reinstalling Homebrew. Homebrew, though it bottles most packages, will take a long time compiling a version of gcc with OpenMP support, let alone my recalling what to install. And gcc is not the only thing that needs a compiling job. Maybe, years later, I will follow the path of reinstalling, or move back to Linux.
So I compile llvm+gcc often with Homebrew; the compile on my laptop takes about 5.5 hours total. But I set up Vagrant and VMware to let me automate this all away and create txz's of the install.
And as for MacTeX, it seems to work fine for what I use, though it installed to /usr/local/texlive when I tested it a month ago.
"having major new releases every year is clearly impossible for the engineering teams to keep up with while maintaining quality. Maybe it’s an engineering problem, but I suspect not — I doubt that any cohesive engineering team could keep up with these demands and maintain significantly higher quality."
And yet, OpenBSD springs to mind for not just the timeliness of their releases (every 6 months) but also the quality and features within each release.
he wasn't even using Apple's software... using the Mac as a Terminal...
I'm using two Macs (MacBook Air and quad-core Mac mini) all the time. There are a lot of annoying bugs in the Mac OS. But I can't say that there are more than before.
Some of the recent stuff is quite cool and very useful. There are some things I don't like (iCloud, iTunes speed and stability, ...), but in general the latest OS works quite well.
Having drops in software quality doesn't surprise me at all. Anybody who used Mac OS back in the System 7 days likely remembers seeing the bomb message and getting intimate with MacsBug (G^FINDER) to try to save their work.
I think the only way you'll see a dramatic rise in quality would be if software becomes their core business (instead of hardware)... which isn't happening.
By what measure is Apple losing reputation? Their sales are at an all time high in a market that is losing ground to tablets. This is the second post on HN today about how terrible OS X is, and I just don't see it. And I hardly consider the yearly updates major OS changes. Yosemite was pretty big, but that was the first big one in a few years it seems.
Just a couple weeks ago someone asked me how Yosemite was compared to Mavericks. My answer: "they are the same modulo a couple minor UI tweaks". I guess it's because I use very little Apple software, despite using almost exclusively Apple hardware.
Turns out Yosemite is perfectly fine to use Sublime Text, the Terminal and Firefox.
I haven't had any issues with Yosemite (the last buggy release for me was Mountain Lion), but iOS on my iPad and iPhone has been a mess since iOS 7.
Messages breaks in every way possible. Safari went from crashing twice every half hour to freezing and continuing to play videos after I've closed tabs, and getting stuck full screen.
I bet Jony Ive will go back to focusing on design and someone else will run engineering again. Apple tends to prefer internal promotions, but I wouldn't be surprised if, come summertime, an external hire becomes the new head of software. Maybe someone with a NeXT background.
From an enterprise standpoint, the wifi on 10.10 is basically a DDoS attack. I know they are trying to put in a feature to send files between devices, but it doesn't work on a college campus at all, which should be an environment they are testing in.
Also, before the Yosemite release Apple announced a public beta, which is kind of unusual for Apple, since they always wait until the party to release stuff. Does that mean some part of Apple's QA is now offloaded to the public?
Btw, Gnome 3 looks and feels almost like Mac OS. But it works and does not crash the way Mac OS does. Everyone who uses Mac OS should try a modern Linux distribution. They do not suck as much as they did 2 years ago.
I'm not running OSX as a server OS and don't plan to. But even with the latest 10.10.1, I still see uptime in terms of weeks including mandatory reboots due to OS updates. I think it's easy to say that other desktop OS's are years behind and falling farther behind with each release.
As has happened in prior OSX releases, this latest release may have been rushed. The next release will probably address current bugs.
"I still see uptime in terms of weeks including mandatory reboots due to OS updates."
This now applies to every OS out there. Windows, iOS, Linux, ChromeOS (aka Linux :). It's simply not a bragging point anymore.
"I think it's easy to say that other desktop OS's are years behind and falling farther behind with each release."
You're right that it's easy to say. That makes you technically correct - the best kind of correct :). I personally disagree, and think this XKCD strip is relevant:
I suspect Apple's software development practices have never really been all that great, it's just that they worked harder when Jobs was around to make sure the product worked. Now without his influence, quality has been steadily slipping.
If Apple's not careful, they could wake up in a few years and find they're just another tech company.
It would be interesting to find out what kinds of stories give the user who submits them more karma: negative or positive? It would be quite ironic if creating negativity on Hacker News is what builds karma.