Apple’s declining software quality (sudophilosophical.com)
1282 points by pljns on Feb 4, 2016 | 705 comments



This has to be a management problem. Apple has total control over the hardware, total control over third party developers, and $203 billion in cash. What are they doing wrong?

Apple has the resources to approach software like aerospace. Formal interface specs, heavy Q/A, tough regression tests, formal methods. It's expensive, but Apple ships so many copies that the cost per unit is insignificant.

Microsoft did that, starting with Windows 7. Two things made Windows 7 stable. The first was the Static Driver Verifier, which examines driver source code to check if there's any way it can crash the rest of the OS. This includes buffer overflows. The driver may not work, but it won't take the rest of the kernel down with it. All signed drivers have passed the Static Driver Verifier, which anyone can run. Driver-caused crashes stopped being a big problem.

With the driver problem out of the way, any remaining kernel crashes were clearly Microsoft's fault. (This has the nice property that kernel bugs could no longer be blamed on third-party drivers.) Microsoft developed a classifier system that tries to group similar crash reports together and send the whole group to the same developer. It's hard to ignore a bug when a thousand reports of crashes from the same bug have been grouped together.
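The bucketing idea is simple enough to sketch. Something like this (a toy in Python, not Microsoft's actual classifier; the sample reports are invented) groups crash reports by a hash of their top stack frames:

    import hashlib
    from collections import defaultdict

    # Toy crash bucketer: reports whose top few stack frames match land in one bucket.
    def bucket_id(stack, depth=5):
        return hashlib.sha1("|".join(stack[:depth]).encode()).hexdigest()[:12]

    # Invented sample reports standing in for incoming telemetry.
    crash_reports = [
        {"stack": ["memcpy", "ParsePacket", "RxHandler"]},
        {"stack": ["memcpy", "ParsePacket", "RxHandler"]},
        {"stack": ["free", "TeardownSession"]},
    ]

    buckets = defaultdict(list)
    for report in crash_reports:
        buckets[bucket_id(report["stack"])].append(report)

    # The biggest bucket gets routed to a single developer.
    worst = max(buckets.values(), key=len)

The real system is far more sophisticated about normalizing stacks, but the routing principle is the same.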

That's part of how Microsoft finally got a handle on their software products. Is Apple doing anything like this?


Nah, I think it's a perception problem.

As someone whose starry-eyed Mac obsession predated Windows 95 - Apple's software has always been buggy. It was buggy under Sculley, it was buggy under Amelio, and it was buggy under Jobs. I remember getting plenty of sad Macs under System 6 and 7, and early versions of OS X weren't any better.

We just didn't care because Steve Jobs was really good at distracting us with promises about how great things were going to be, really soon now.

The comparison with Microsoft is instructive. Microsoft software was even buggier than Apple's during their period of greatest dominance. Win95/Win98/WinME would crash all the time and were an open barn door for security. Early versions of IE were pieces of shit. Even later versions of IE (6-9) were pieces of shit. Microsoft finally got a handle on security & software quality just as the world ceased to care about them.

Apple's been driving change in the computer industry since the iPhone was introduced in 2007. New products are always buggy - the amount of work involved in building up a product category from scratch is massive, and you don't know how it will be received by the market, so frantic changes and dirty hacks are needed to adapt on the fly, and those often invalidate whole architectural assumptions. It's just that most of the time, this work goes on when nobody's paying attention, and so by the time people notice you, you've had a chance to iron out a lot of the kinks. Apple is in the unenviable position of trying to introduce new product categories while the whole world is looking.

The Apple Watch is buggy as hell, but I still find it useful, and pretty cool.


I think you've touched on something key: people's annoyance at bugs is often outweighed by the new features they get. In other words, if the "new functionality" is perceived to outweigh the costs of the lack of reliability, the product will be deemed "less buggy" because people "understand" why there are issues.

This made total sense in some of Apple's biggest products: OS X and iPhone. When OS X first came out it couldn't even burn CDs, but we all "understood" the magnitude of the project and thus gave it some slack. Similarly, the iPhone lacked a lot and was slow, but it was such a revolution that we let it slide -- in fact we let the rest of the products slide.

The problem today, I think, is that these decisions are being made for reasons that users don't deem "worthy". Introducing some new music services is not a good enough reason to break my iTunes. The fact that a new watch was released is not considered important enough to let other platforms languish. We "get" why less attention is being paid to other products, but unlike with the phone, it's not deemed a good trade-off.

In other words, I don't think Jobs was distracting us with promises, but with actual shiny things that made the bugginess worthwhile.


As a long-time Apple user and former employee, this is exactly how I feel about the current situation. I still think that my Mac/Apple devices are more solid than my Android/PC devices, but this is exactly what I'm noticing more regularly.

I forgive most faults because 99% of the time, everything is awesome. On Windows, that same forgiveness manifests itself as me not using my Windows machines as much as my Macs. I still love to use my PCs, but not for anything that I need to rely on the majority of the time.

Now, though, Apple is making changes to things (iPhoto/Aperture are a really good example) where it seems like the change is just to bring some sort of parity to OS X and iOS rather than introducing new features. iPhoto was buggy as hell when they added Faces and Places to it, but I totally forgave that because 99% of the time it was making my life way easier than before by detecting faces properly. If it crashed every now and then, it at least saved the data, so I was still better off than I was before the update. I still like Final Cut X (I know, I know... I'm an outlier), but convincing me that a switch like iPhoto/Aperture -> Photos is worthwhile is much harder, since there's nothing to distract me from those issues and I've somehow managed to actively lose features that they convinced me were necessities in the past.

I hope this is not an indicator of things to come. One thing that gives me some hope is that they've gone back to alternating between feature updates and stability updates. Leopard was cool, but Snow Leopard was incredible to me. If that pace comes back, I'll be happy again. Until then, Apple needs to get their software game back in line with the rest of the company.


If you think that it was bad when they removed features from iPhoto, then I'd hate to think what you thought when they removed features from Numbers!


Oh yes... That was a bad move, I think. Luckily, I rarely have to use Numbers so I didn't really care. It just annoyed me that they removed some of the features that I actually did use when I needed to use Numbers. If they added the features back as quickly as they did with other apps, I wouldn't care, but they didn't. :(


I (Apple) will one-up you with Final Cut Pro X http://arstechnica.com/apple/2012/01/more-fcpx-fallout-top-r...


I love the new FCP. As a long-time user of FC7, I'm OK with losing out on some of these features as long as they add them back over time, and they've done that, for the most part (at least for my uses). The old FCP really needed a facelift; it was trapped in such an old mindset, from when video was still stored mainly on tape and editing software needed to work like real-life video editing tools. FCP X is so fast for me and such a treat to use for 99% of things that I can deal with having to jump back to FCP 7 every once in a while. As long as Apple doesn't somehow prevent me from using FCP 7, I don't care, and I love the new direction of FCP X.


Reminds me of a discussion on another HN article a few weeks back where someone proudly stated that if a feature customers used didn't fit in with the company's strategic direction, they'd drop it, and tough luck for the customer.

Apple seem to have the same mentality. They used to get away with it, but mostly because they replaced what they dropped with something better. Now they just seem to drop features entirely. That's not a good way to go. As much as I despise Steve Jobs, he never let the quality of a product drop to the degree that big customers (or even smaller customers) left Apple without a major fight to keep them.

It's looking like Apple's obsession with making great, quality products is taking a bit of a back seat. I think they probably need to worry a little less about their schedule, and a little more about polish and feature completeness.

Rather remarkable that I'm actually saying this, to be honest! Apple would be the last people I would have guessed needed this advice...


I think that's an awesome sentiment if you're talking about something like an Arduino, where part of the experience is working around its quirks and limitations. If you've bought a device expecting it to basically be a transparent window into the internet (or your documents, etc.), having to deal with its quirks and limitations can put a really bad taste in your mouth. Especially if you paid top dollar for it.


"I think you've touched on something key: people's acceptance of bugs is often outweighed by the new features they get. In other words, if the "new functionality" is perceived to outweigh the costs of the lack of reliability, the product will be deemed "less buggy" because people "understand" why there are issues."

Right - which is why we have all of the snow leopard nostalgia: because none of the newer releases have given us anything substantive that we really needed to justify the hassle and the bugs.

I am trying to think of something - anything - that compels me to upgrade from SL on my mac pro, and all I can think of is that nifty take-a-picture-of-your-signature feature in the Preview app that you can then insert into PDF documents.

Ok, and maybe USB3 ?

That's all I can think of.


AirPlay; much better multi-display support; tags in Finder; Spotlight enhancements. More than anything, the iCloud/iOS continuity features were also big if you had an iPhone or iPad, everything is just much easier to keep in sync.

I'm a Safari user (better battery usage for the # of tabs I have open) and it too has improved with El Capitan, though that's irrelevant for Chrome/FF users.


ok, airplay I guess - although I've never used it, I do see people using it to good effect.

Worth mentioning that airplay is just userland software - nothing special, and no reason it couldn't have been added to SL.

I don't know about multi display, though - I've been under the impression that that is broken in new and fascinating ways with every single release...


Yeah absolutely. Snow Leopard's multi-display was great. As was Leopard's, Tiger's, and Panther's.

Then Apple broke it massively in Lion, and only finally resolved most of the (severe, productivity-destroying) issues with Mavericks.


Handoff is a really useful feature (when it works).

Also SL maimed Exposé (that weird non-proportional grid view), which was reverted to the Leopard style in Mission Control (of which Mavericks/Yosemite had the best implementation, and they've now broken its utility in El Cap thanks to hiding thumbnails by default. FFS.)

But apart from that... I think I preferred the Apple apps back in 2009-or-so.

To be honest, I think the latest Apple release cycles have been more about "remove a feature so that we can add it in again and sell it to our users again". Think multi-monitor support, something that worked perfectly in SL and earlier, and then broke fantastically with the full screen apps in .. Lion? ML? One of the two.


True, Apple software has always been buggy - Apple calls it "undocumented features".

Apple always makes up for bugs with newer devices with faster CPU and GPU units that make OS code run faster. That means buying a new Apple device to get better performance. The older Apple devices are left out of updates eventually and if they do update to a newer OS version it runs slower.

Apple is driven by an upgrade model: buy a new Apple device every three years or so. In the PC world, Windows 7 can still run on old Pentium 4 systems, and if I am not mistaken some of them can even upgrade to Windows 10 (the 32-bit version) and still work. For example, I used to have a MacBook that only ran up to 10.7; 10.8 needed a newer Intel CPU to install. Anyone with an iPhone 4 is going to find the latest iOS slow as well.

It is in Apple's business model to sell customers a new device every few years or so and phase out old Apple devices.

Apple doesn't care if their software isn't the best quality as long as it is easy to use and will keep people buying new Apple devices to run things faster.

I myself like GNU/Linux better than OSX, because it can run on older PC systems and it runs quite fast and has a good quality to it. GNU/Linux is virtually unknown to the average consumer and when people get tired of Microsoft they usually just buy an Apple device. Apple devices are easier to maintain and use. You even got toddlers using iPads, that is how easy they are to learn to use.

Apple has saved up billions just in case they have problems. Apple has done well financially in an uncertain economy where other companies are struggling.

Only Alphabet, Google's parent company, seems to be doing better for some reason. Google's Android needs better quality as well, and since Oracle sued them over the Java API they have to change the way the OS works. The Web Services seem to earn a lot of money, and Google's AI is very advanced.


> Apple is driven by an upgrade model to buy a new Apple device every three years or so.

I disagree.

My wife's iMac is 6 1/2 years old, and still gets the latest OS updates (for free!), though we're upgrading to a new iMac soon.

My iPad 2 is almost 5 years old, and still gets the latest OS updates (for free!), though we're upgrading to a new iPad soon.

It is precisely because our older Apple hardware is still working well, and Apple still supports us with the latest updates to that older hardware, that my family is not only sticking with Apple, but we've recently invested in new iPhones.

Apple has earned our trust.


Yeah, OS X and the iPhone were new products, not just upgrades of what came before (as you say, OS X had significant limitations compared to OS 9). You didn't have to upgrade right away, but if you did, you got an entirely new experience.

Compare this to today's Apple, where upgrades add "hundreds of features" but feel mostly the same (except everything runs a bit more slowly). There's no coherent vision of what the future of the software should be like.


wow, I couldn't put my finger on it, but this is why I am starting to hate apple. I spend a lot of time on the computer so I like a lot of things apple does, mostly the interface and the UI. To clarify what UI means to me, it is how simple and intuitive it is to use the device, and give it the instructions to do what I want. I am willing to forgive a lot because this is good.

Apple has always been a fairly closed system and it didn't bother me more than not having the features I wanted. In El Capitan, it was different. Things didn't work well and Apple took over my whole system. With SIP (System Integrity Protection) I had no control. It would seem to turn protection back on after being disabled, and it takes a nontrivial amount of time to turn it off because you have to reboot the entire system into recovery mode, wait for it to connect to the internet and download a bunch of apple shit, then select a language preference, type a command into bash, and reboot.

Deleting apps is difficult, changing settings is difficult, having Siri take up 10% of my iPhone is annoying, removing apps destabilizes the system, and restoring my system from Time Machine reinstalls their system and settings and overrides mine.

I disabled most of Apple's applications and processes, and the system is fairly stable, although I think I went too far with disabling Notification Center. But your point is correct.

tl;dr users are willing to accept a lot for revolutionary changes. Evolutionary changes with only marginal improvements are not going to make me forget that they unpredictably disallow me from using sudo and are fucking up all my devices doing things I don't want them doing in the first place.


A lot of people in this thread seem to think this is all about Apple not adding enough revolutionary features or something. But consider this alternate explanation: with years of experience comes a more sophisticated judgement. What used to seem good enough now seems to have obvious flaws, even if it is the same as it was before. Lack of control is an example: beginners often don't notice or care much, especially if it feels simpler, but as your needs deepen it becomes more important. Being able to set things up and then not keep touching it is one of those tastes that develop with experience.


that is a good point as well. I definitely agree with it. The one thing I would add is that I repeatedly get update notifications on my iPhone. Due to the increasing lockdown of all features, I am legitimately afraid to update, as the:

* provides security update

* increases iTunes performance

type descriptors do not provide enough information about how they will fundamentally change my system. Most notably, when I updated my iPhone I found out I had loaded in some horribly inefficient talking pseudo-AI that was not neutral, but a straight-up negative feature consuming system resources.

I think you are really correct though, as you gain more experience and skill with technology you have more needs and better judgement. You can evaluate things better because you are aware of what is possible. The biggest problem isn't that they make changes, it is that those changes are not predictable so they become difficult to mitigate.


I'm also concerned about updates. For instance, I'm currently having to route all my iPad web traffic via Charles Proxy so that any instances of style="overflow:hidden;" in a body tag are cleared out.

Why? Because Apple released iOS 9.2 with a bug that causes the viewport to zoom incorrectly on these web pages. This affects LibreOffice's OpenGrok, which I browsed regularly on my iPad.

They still haven't fixed this, and it's a major regression. iOS updates are few and far between. Consequently I'm seriously questioning what their updates actually do to my iPad and iPhone.


I wouldn't hold my breath. The iOS Mail app cannot negotiate any TLS version above 1.0 (for IMAP, possibly SMTP too), even though it obviously supports TLS 1.2: it sends a TLS version of 1.0 in the ClientHello message even though that same message contains TLS 1.2 cipher suites (AES-GCM).

I reported it in October and Apple's security team replied they're aware of it but it's still not fixed 2 releases later even though they probably need to fix like 1 line of code (the advertised version flag).
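For anyone who wants to verify this kind of thing themselves from a packet capture, here's a minimal sketch in Python (assuming you've already extracted the raw ClientHello bytes, e.g. from a tcpdump capture):

    # Toy parser: read the advertised client_version out of a raw TLS ClientHello.
    # Record header is 5 bytes (type, version, length); the handshake body follows.
    def client_hello_version(raw: bytes) -> str:
        assert raw[0] == 0x16, "not a TLS handshake record"
        assert raw[5] == 0x01, "not a ClientHello"
        major, minor = raw[9], raw[10]  # the client_version field
        names = {(3, 1): "TLS 1.0", (3, 2): "TLS 1.1", (3, 3): "TLS 1.2"}
        return names.get((major, minor), "unknown")

A broken client like the one described would report TLS 1.0 here while still listing TLS 1.2 cipher suites later in the same message.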


They have actually fixed it - if you get bit by it, you can reference rdar://problem/22242515

The WebKit bug is here:

https://bugs.webkit.org/show_bug.cgi?format=multiple&id=1528...

The patch to fix it is here:

https://bugs.webkit.org/attachment.cgi?id=268394&action=diff

The workaround, FWIW (thanks Simon!) is to add "shrink-to-fit=no" to the meta viewport tag.

For me, it was too much effort to get OpenGrok fixed, so I just did a rewrite rule in Charles Proxy that gets rid of the style attribute.
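If anyone wants to replicate that without Charles, the rewrite itself is a one-liner. A minimal sketch in Python (the regex and function name are mine; a real proxy plugin would hook this into the response body):

    import re

    # Strip the inline overflow style from the body tag, mirroring the rewrite rule.
    def strip_body_overflow(html: str) -> str:
        return re.sub(r'(<body[^>]*?)\s*style="overflow:\s*hidden;?"', r'\1', html)

    print(strip_body_overflow('<body class="src" style="overflow:hidden;">'))
    # -> <body class="src">

The alternative, per the workaround above, is injecting shrink-to-fit=no into the page's meta viewport tag instead.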


I agree with your summation. To add to this, there are many things going on under the hood that none of us asked for that are taking up system resources, dialing home and draining battery life.

Sometime, try this yourself:

    sudo opensnoop


The original iPhone wasn't slow at all. One of its main selling points was the speed of menus and apps (I forget what they actually called apps before the App Store).

I think you forget how crazy slow feature phones were. Opening a GPS app and finding your location could take 5-10 minutes in 2007 on a feature phone.


They called them apps too. It's easy to remember with the infamous (quoting from memory): "You can write apps in HTML".


Guess I completely forgot that. Thanks.


I think you're right, but people tend to take for granted the features that are worthwhile and underestimate the difficulty of making anything ever work correctly at all.


>> Apple's software has always been buggy. It was buggy under Sculley, it was buggy under Amelio...

Classic Mac OS was buggy by design - it didn't have multitasking and memory protection, it was single user... Windows had that same problem before 2000 (well, NT 4, but not many people used that).

I use OS X daily and I use Windows 7 daily. I have far fewer issues with OS X for whatever reason that may be. My computers don't magically reboot or bluescreen nearly as much. It might happen every 6 months at the most, where with Windows it probably happens every 2 months.


I agree that OS X is more reliable. I have both a Mac and a PC in my office, and my 5K iMac has never crashed. I had some weird issues with Windows 10 after I upgraded my laptop, where it would hang after an update.


I switched from using Windows for 20 years to OS X (excluding Linux for work related stuff.) This is the first time I've been able to work from a laptop with my productivity level as good or better than a desktop. The design and usability surpassed what I expected. I haven't noticed any bugs.


I can't imagine having a different brand of computer, but there are lots of OSX bugs that make my teeth grind. The Finder doesn't remember that I only ever want one layout, and it resets to a random alternative setting regularly. There are still progress bars that pop over and can't be hidden. The data display that shows used disk space as nearly all "other" is a bug as far as I'm concerned. My AppleTV(8) has stopped incrementing itself, but it's not ideal. And as a genuine question, has anyone not had strange Xcode behaviour or crashes at least once per day? I currently have a slow-motion simulator that changes views over 5-10ish seconds.


In terms of laptop OSes, OS X is by far the best. It's stable, usable and 100% desktop-OS focused. I can't stand the touchscreen features that Windows 10 still tries to force on you. Somehow my Windows 10 laptop got put into "tablet mode," and it was pretty unusable. I couldn't access the desktop anymore, it was slow, and it took a while to figure out the issue.

I think it was flipped on after an update, but why would I even want to be able to enable that mode on a laptop without a touchscreen?


If Windows 7 is "magically rebooting and bluescreening" often enough to comment on, then you have a hardware problem.


I don't use Windows very much, because I hate using Windows. But what I will give it is that I haven't seen a blue screen in the past decade for any reason short of bad memory, overheating, dead disk. I suspect that a lot of the Windows image problem is that people are free to buy really cheap hardware and fiddle with things they don't understand.


Apple merging MacOS with NextOS to make a Unix OS was the best move they could make at the time. It happened during the time of the $10,000 Unix workstation and Apple made OSX as an easier to use Unix. It was cheaper to buy a Macintosh than it was to buy a SparcStation or some other Unix workstation.

Because of Apple making OSX Unix based, it cut into sales of other Unix companies like SGI, and also GNU/Linux cut into sales of SGI and others as well.

But making OSX Unix based solved a lot of problems that Classic MacOS had that they couldn't solve.


"...magically reboot or bluescreen nearly as much." Mine never magically reboot. Ever. (OS X)


My MacBook Pro did, which reminds me the extended warranty period is ending.

https://www.apple.com/support/macbookpro-videoissues/


I regularly get asked to fix someone's Windows PC; no such problem with people who have Macs. With OS X, if I remember correctly, there are waves of releases (major versions) - some introduce swaths of new code/features/replacement code, while others are more speedup-and-bugfix versions. Maybe I'm wrong.


I go places on my Windows pc that I wouldn't dare take my Mac. I expect it to need repair.

My pc is my beater car, and it needs repair--regularly.

My Mac is the classic car in the garage, that only gets used for work, or safe places.


Speak for yourself. I used an old G4 PowerBook for grad school, and it travelled over 100,000 air miles, and into the various labs where I had to work, and also to far-off Asian countries for holiday. Plus, I didn't have to buy a developer kit: it came free with the machine.


I assumed that the travel referred to dangerous parts of the web.


Actually NT 3.51, which I used for dev and was great compared to my colleagues on plain windows.


Actually NT 3.1. 3.5 followed, then 3.51, then 4.0, then 5.0 (2000), and then the NT line ended as it was unified with the non-NT line.


Technically the 9x line ended, since Windows XP was NT-based and not 9x-based.


Yes, I suppose so. I didn't want to say the 9x line ended, because it's really the line of DOS-based OSes, and while the NT line is ongoing, it's no longer called NT. 2000 was the last version to mention NT, and it wasn't part of the name itself, just a tagline.


One of the things about OS X is that most of the time it's put on high quality, but non-exceptional hardware. So things like bad RAM, flakey power supplies, bad GPU drivers etc are almost never an issue with a stock machine.

Windows, not so much. The only stability issues I've had with windows have been related to poor drivers, almost exclusively from nVidia or ATI/AMD. The equivalent hardware for Apple machines either didn't exist at the time, or was running much less ambitious drivers.

I probably have more issues with my Macbook Air (relating to sleep, hibernate, and wake-up) than I do with my Windows machines these days.


To give you a counter-anecdote, I use macbooks in work. For the last five years, I've had two hardware failures and gray screens maybe once every four months. In addition to that, I have issues maybe once a month where the machine more or less locks up (from the logs it looks like windowserver/loginwindow has crashed and OS X is trying to do spindump to them).

Compare that with the _desktop_ Windows 7 machine. It first crashed intermittently (memory failures), but after I changed the motherboard, it has not crashed at all. But then again, I am not using, for example, the most cutting-edge graphic drivers.

I remember quite some crashes during the Windows XP times, but I've since taken a more conservative approach to hardware and drivers.


Jeez, you're talking about something designed and built in 1982-1983 (over 30 years ago) and meant to run on something with 128KB of RAM with no hard drive and a 400KB floppy disk.

You try fitting all that plus a GUI into those constraints.

What's amazing is that it had the features it had and that it worked at all.


You might like to check out MenuetOS/KolibriOS and the old QNX demo disk.

Both provide GUIs and rudimentary Web browsers. QNX was full POSIX, too, although the demo disk didn't include a terminal.


... and there goes my macbook (2008?) where windows 7 runs harder better faster than osx, and pretty much more reliably than the monster imac at the office.

Pity that nobody remembers Windows NT4; it was miles better than OS7. I stopped using Macs altogether after starting to use it.


> Microsoft finally got a handle on security & software quality just as the world ceased to care about them.

Not quite. They got a handle on security when Linux lit a fire under their ass.

Competition, true honest to market competition, spurs improvement.

The thing about Apple is that they may have competition on hardware, but they have no competition on Software.

If you buy a Mac or a iPhone, you have already thrown money at Apple. But you can easily assemble a PC without Windows and then install Linux on it.

Keep in mind that the latest US warship is not running Windows, but RHEL. That is a very big wake up call for Microsoft, where before we have seen the likes of Win2k (US ship) and XP (UK submarine) used around the world.


> They got a handle on security when Linux lit a fire under their ass.

I have my doubts that all 0.02% or whatever of PC users who are dedicated to using Linux as their desktop OS influenced Microsoft to do anything, but if you have data to show otherwise I'd be interested in it.


I'd assume parent was thinking less of PC users and more of other OS consumers: https://en.wikipedia.org/wiki/Usage_share_of_operating_syste...


And those are the public ones.

BTW, I seem to recall that the London Stock Exchange had a spectacular failure when they went with Windows. The result being that they switched to Linux within a year or two of bringing their brand new Windows system online.

Ah yes, found it: http://www.computerworld.com/article/2467082/data-center/lon...


Desktop smesktop. For MS the desktop has always been a means to an end. It's always been about "total cost of ownership", where they can claim people need less training before being productive at their new job.

But to manage all those desktops you need servers, and with MS that means billing per active user, etc.


"if you have data to show otherwise I'd be interested in it." ...

Some data says Linux desktop/laptop share is 1.5% (not counting chromebooks)

https://en.wikipedia.org/wiki/Usage_share_of_operating_syste...

Please note also that Android is Linux and iOS is Darwin BSD unix.

Linux and Unix based kernels are more numerous than Windows based units.


The fact that iOS and Android have a UNIX-like kernel doesn't count for much if the majority of userspace apps use non-UNIX APIs and tooling.

They could replace the kernel with something else and most devs wouldn't even notice.


> 0.02% or whatever of PC users who are dedicated to using Linux as their desktop OS

Wikipedia: 1.5% [1] NetMarketShare: 1.71% [2] W3Schools: 5.6% [3]

[1] https://en.wikipedia.org/wiki/Usage_share_of_operating_syste... [2] https://www.netmarketshare.com/operating-system-market-share... [3] http://www.w3schools.com/browsers/browsers_os.asp


I don't know about Linux influencing Microsoft on the desktop, but surely it has lit it up on the server. You are 100% trolling by claiming 0.02% of PC folks use Linux. It is at least two orders of magnitude higher than that, and then android... :)


The obvious example would be netbooks. MSFT moved very fast to counter Linux on that front.


Not exactly; OEM PCs (mostly) come with an OEM version of Windows on them, which is pretty cheap (I don't know what other incentives Microsoft has, though).

But you can install Linux on it for sure :)


I had to pay $100 extra for the Windows license that came with my Lenovo Ideapad.

And Lenovo wouldn't let me buy their laptop without the windows license.




Frankly I suspect MS would love to ignore the consumer world, except that then they would lose their beloved "total cost of ownership" argument for doing B2B sales.


OS X and iOS are both way more stable than any pre-OS X Apple OS. I can't believe people forget it.

When people talk about OS X having issues, they often mean some new feature is a little flaky. Classic Mac OS lacked basic stability and security features like preemptive multitasking and memory protection. Classic Mac OS was just like the pre-NT Windows: crash prone.


That's not a high bar though. The only other high profile desktop OS around at the time, Windows NT, was more stable than any pre-OS X Apple OS for a long, long time.


I agree with this sentiment. We can complain all day, but the fact that we have these devices and software that have been made accessible to us by Apple is astounding from a historical perspective. My parents are in awe of the calendar app, and apple maps, etc. As they should be!

OSX also is still the best development platform despite its flaws.


Are you missing an /s tag there or am I temporarily dumb?


Neither? Not sure which part you are referring to. I really believe that it's the best dev platform; I've tried all of them. Ubuntu can come close, but it's orders of magnitude less user-friendly. In my opinion and the opinions of folks I've discussed this with.


I would agree. I love my Macs for dev work. Web dev and app dev alike are an absolute treat with a Unix-style backend but a much more polished front-end. Ubuntu is probably the only Linux distro that comes close to giving me that terminal power without rubbing it in my face constantly when I'm just trying to manage my day to day stuff and, even then, it's not even close to OS X. Windows, on the other hand, is only usable for me with third-party software for everything and then I feel like I'm spending just as much time futzing with everything as I am doing anything productive.


Same with me; I basically do 3 things on my computer: develop code, edit pictures, and write stuff. Almost all my files are in the cloud, available through the browser, and the full-fledged terminal with lots of convenience tools just feels great.


I still can't work out how to get Xcode to load up the LibreOffice gbuild projects. When I do I think I'll probably be a convert. Till then, I guess I remain with vi.


OK, thanks for your polite answers.


Perception problem is the right description. Let us take a devil's advocate view and try to fit the facts into a narrative that inverts the public wisdom.

Microsoft is making over 4 times what it made in its glory days, growing year by year, across a wide range of products and services. Windows and office account for only half of that, making them a diversified company with plenty of potential for revenue growth. Windows 10 is by far the most successful windows release ever, with more active installs than os x (any version). Basically the only place microsoft is truly failing is phone.

Apple by contrast gets two thirds of their revenue from the iphone. They have nothing else that even comes close, and nothing that could replace it if iphone sales start dropping. Mac sales are down, ipad sales are down, and the apple watch is a dud. Since 1990 apple has basically had two hits: ipod and iphone. I did not mention ipad because it is just another iphone model, which you can tell by its sales slumping as iphone screen sizes moved up. Success for Apple is rare, and most of what they do isn't all that amazing. The apple tv isn't going anywhere, even after the refresh. The apple watch distinguishes itself from other smartwatches only by its price. Basically the only place that apple is truly succeeding is phone.

Perception is everything. How you choose to look at the facts determines which facts you see. Apple is perceived as strong and microsoft as weak, but the facts give you the option of going either way.

Regardless, apple has few excuses for any quality issues. They have the resources, and they have had enough time (given that aside from the watch everything else is half a decade old or more). Personally my mac and ipad anno 2015 have the same amount of glitches as my mac and ipod anno 2005. For me, Apple doesn't seem to be getting worse, but they don't seem to be getting any better either.


> We just didn't care because Steve Jobs was really good at distracting us with promises about how great things were going to be, really soon now.

I tend to characterize the "reality distortion field" as a magician-like talent for focusing an audience's attention on a particular subject.


Jobs was taught the RDF by his professor. This means it can be learned. The reader is encouraged to learn how to perform the field, so that they can defend themselves, and others, from its effects.

As the villain in the Incredibles said:

"When everyone's a superhero, no-one will be."


> Jobs was taught the RDF by his professor.

I didn't know that story. Who was the professor? What was the technique?

> This means it can be learned.

Oh definitely. Magicians learn all their tricks, and they are very useful for anyone performing in front of a crowd.

> The reader is encouraged to learn how to perform the field, so that they can defend themselves, and others, from its effects.

Which reader is that?


Robert Friedland, apparently. And he wasn't a professor, but rather a classmate (and Reed's student-body president).

https://en.wikipedia.org/wiki/Reality_distortion_field


We cared when we tried using System 7 Macs to control industrial machines, as I did.

If you treated 16- and 32-bit Windows nice -- typically running one program over long time periods -- they were quite stable on the plant floor.


We have two different theories here. We're smart people, right? We should be able to figure out if we just perceive software quality to be worse or if it really is.

So how do we measure this in some valid manner?


> I remember getting plenty of sad Macs under System 6 and 7

When System 7 or 8 crashed, it crashed hard. Complete system lockup. And it crashed rather often. No recoverable, progressive crash like Windows.


When you say the world doesn't care, you might be nearly right from a consumer perspective, but that's not really their target market. Outside of cool IT and design companies almost everyone's business machine is running Windows, a lot of servers are running Windows too, and SQL Server and Visual Studio are more prominent in business software development than ever.


As someone who regularly uses OS X and Windows, I'd say OS X is as reliable as Windows 7, 8.1 or 10, or even more reliable, particularly with regard to crashes. Apple did have a really annoying wifi bug that has been fixed, but that did take a while.

iOS is also in pretty good shape, but almost every time Apple releases a new version it's buggy. By now, iOS 9 is a very stable and robust OS, but it needed work up front.

The biggest places where Apple is having trouble are with new products. Watch OS was slow, buggy and limited at release. It's pretty much at a 1.0 state right now. The new Apple TV is by far the best version of the Apple TV, but the OS is buggy and still needs refinement.

My take on this is two-fold:

1) Apple is doing more and more products, causing there to be issues with newer products. They haven't been putting in the QA work on newer software. OS X is old and mature software, so it's pretty stable, but something like Watch OS is very new.

2) Apple's insistence on yearly OS upgrades is causing there to be a lot of 1.0 roughness each year. Just slowing down to a two-year cycle would allow for a lot more time to refine and more time where the OS has been patched and is the latest OS. iOS 10 will be announced in a few months, but iOS 9 still has at least one major point update to go.


This is more or less my take - other than some occasional high-profile bugs, I'm not sure Apple has a declining-quality problem; their software still seems to be well above industry norms for quality (and either in line with or better than Microsoft's, largely because of their strategy of abandoning backwards compatibility).

Apple, however, is held to a much higher standard than they've ever met, and much higher again than industry norms - and you raise very valid points that they're shipping shit before it's ready. I don't think yearly upgrades are terribly compelling anymore, or really needed - I want a computer that works really well and does the things that I want it to do, with a minimum amount of fuss.


Apple's R&D budget seems to be mostly focused on hardware. It's hard to say definitively, given how secretive they are.

Microsoft invests a massive amount of money into MSR, and creates tools out of the most useful results. The Static Driver Verifier depends on Z3, an SMT solver developed at MSR. Other verification tooling like SAL (C/C++ annotations to assert contracts for functions) has a similar history.
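For flavor, here's a toy of the kind of question such a verifier poses to Z3, using Z3's Python bindings (the buffer invariant is an invented example, not SDV's actual model):

    from z3 import Ints, Solver, sat

    idx, buf_len = Ints('idx buf_len')
    s = Solver()
    s.add(buf_len == 64)
    s.add(idx >= 0, idx <= buf_len)  # off-by-one "invariant" a buggy driver might establish
    s.add(idx >= buf_len)            # query: can an access land out of bounds?
    if s.check() == sat:
        print(s.model())             # counterexample: idx == 64

If the solver says sat, the tool has a concrete input that drives the access out of bounds; unsat would mean the property holds under the model.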


They should probably consider using that "static driver verifier" because Surface Pro 4 is crashy as hell.


Yeah, but that's because Skylake is buggy as hell. It's a bit of a bummer. MS cannot publicly blame Intel for the raft of Skylake bugs, but it's first-gen hardware, first-gen software, and first-gen firmware.

For what it's worth, my Surface Book (obviously a product they care a lot about) has basically gotten to a crash rate equal to my Mac's (which is maybe 1 actual problem every 2 weeks). I suspect the Surface Pro 4 will get that love next as the other Skylake power issues sort out.


If Microsoft was serious about enhancing the real and perceived quality of the brand they are establishing with the Surface Pro line (which seems to be very nice, based on my mother's earlier-generation model), they should have tested Skylake in the Surface Pro 4 prototypes, detected the problem, and delayed the launch. Intel's problem became theirs when they shipped it in their hardware.


What's interesting about this is that it appears they did! Very very early SB models exist and tested MUCH better than the first run of production hardware. You can find a lot of early reviews that praise the battery life, etc.

Then the first wave of consumer-facing SBs went out and it was a total disaster. This might be something that Microsoft can fix, because they have dramatically improved the product experience and been very receptive to trading defective hardware. Mine was traded up the instant they looked at it, with apologies and a customer care call.

> they should have tested Skylake in the Surface Pro 4 prototypes, detected the problem, and delayed the launch. Intel's problem became theirs when they shipped it in their hardware.

Did you say the same thing when Apple shipped a massive defect rate on their first gen retina MacBooks? Because they DID test those, and they still ended up shipping a truly phenomenal number of lemon screens with huge defect and failure rates.

Oh, and Apple refused to replace all but the most egregious failures. I still have a machine with such significant ghosting that it can be difficult to use. Ironically, DaringFireball is actually unreadable. I keep this machine around because it was part of a very special segment of my life, but also because I like showing people, "Yes even Apple's legendary hardware is rife with first gen bugs, and your iPhones and hypothetical new macbooks are no different."


Yep. That's likely why there are no Skylake MacBooks yet.


The fact that Skylake was totally out of cycle with Apple's usual release efforts might have something to do with it. MS was in an unusual position.


As an engineer, this is the correct response.


Well as an engineer, I think we all know that manufacturing defects can creep in even after prototypes pass.

It's also the case that it's incredibly hard to test things like battery life, wifi connectivity and the effects of heavy processor workloads in a systematic way. You hope that your vendors do a good job (and I bet Microsoft's contract with Intel involves penalties for these major defects to try and incentivize Intel to handle these).

Look at the first iPhone 4. How did they miss something as simple as skin contact causing significant antenna interference? Most of us hit it immediately. The answer: hardware in the real world is really hard.


Just a little bit of trivia which I found interesting -- Apple actually did not miss the antenna interference problem. They knew that it was an issue, but I guess they figured it was an acceptable tradeoff for the design they wanted.

http://www.bloomberg.com/news/articles/2010-07-16/jobs-says-...

I get the impression that Jobs did not think it would be received as negatively as it was.


I suspect if they knew how badly it hurt my reception (I totally lost signal and it took a long time to get back), they would be less surprised at the reaction.


That's not what I'd call a bug that "crept in". The Surface Book I bought crashed unprovoked _several times a day_. That's deliberately shipping a completely faulty product that any self-respecting customer will take right back to the store. Which I did.


You should take your Mac to the store and have them take a look. Or at least run hardware diagnostics. I've been using Macs for well over a decade and in that time I have maybe seen 4-5 crashes total, across multiple laptops and desktops. A crash every two weeks is not a normal situation in OS X.


Depends on what you are doing.

Just yesterday, OSX was convinced that I had an external monitor. I did, but that was 2 hours back when I was at the office. So I got to the preferences screen and.. the kernel crashed.


Do you have some kind of third-party display or window manager installed? I regularly (as in "every day") have my Macbook hooked up to dual monitors and I've never had the kernel crash due to a disconnect or a change on the preferences screen. Does that happen regularly for you or was that just a one-off occurrence?


I've got an otherwise unexceptional LG monitor that with one specific generation of MacBook causes all sorts of problems. My windows machines and newer macbooks don't have this problem, and connect to the monitor fine.

So it can be hardware issues. Often subtle ones.


Connected via HDMI? And goes into YPbPr mode because OS X thinks it's a TV? And has no override.


Some combination of Windows on VMWare, startup utilities, corporate virus/malware protection tools, and external monitors is my bane.

I've given up keeping VMWare open, and I experience very very few issues - even on an older OSX release (again, due to corp IT).


Lucky you, VMWare is getting rid of Fusion anyway.


Where did you read that?



If you use virtualization, it is.


I don't know if it's fair to compare Microsoft's efforts with Windows and Apple's with OSX. Windows runs on hardware from a variety of different vendors. OSX is pretty much commercially locked down to Apple's own hardware (unless I'm missing something?). It's actually a shame that it isn't damn near flawless.


OS X runs pretty well on non-Apple-certified hardware. I have a Hackintosh I've been running on desktop for a few years. There's a large community around these things and users' experiences are mostly positive. Of course, this is due to the dedicated efforts of a small group of hackers.

My hardware configuration matches no Apple product.


Microsoft indeed improved quality a lot, though Windows 7 occasionally grinds to a halt inexplicably and sometimes outright hangs on my desktop (one can blame my Wacom tablet, but that contradicts the thesis of driver verification working wonders), and Windows 8 periodically renders my laptop unusable, using near-100% of the disk bandwidth (I tried like 5 tweaks recommended on the web for this problem; nothing helped).

But that is nowhere near as bad as having to rely on software developed the way they do in the aerospace business! From http://blogs.law.harvard.edu/philg/2010/02/09/public-tv-figu...:

> Who crashed Colgan 3407? Actually the autopilot did. … The airplane had all of the information necessary to prevent this crash. The airspeed was available in digital form. The power setting was available in digital form. The status of the landing gear was available in digital form. …

> How come the autopilot software on this $27 million airplane wasn’t smart enough to fly basically sensible attitudes and airspeeds? Partly because FAA certification requirements make it prohibitively expensive to develop software or electronics that go into certified aircraft. It can literally cost $1 million to make a minor change. Sometimes the government protecting us from small risks exposes us to much bigger ones.

(I agree that Apple's cash hoard does not make $1M sound like a lot, however, they also have much more software to tend to.) Overall, it seems that today you have to trade correctness for features and development time, and the cost in features and development time cannot be borne by a market participant unless the market is regulated so that all competitors have to do it, in which case the user is going to get way, way less functionality. I believe that the cost of bulletproof correctness might drop significantly enough at some point to change the game - and I really, really hope formal methods will take off big time, without being sure they can - but it doesn't seem like we're there yet. (This is my opinion, not data, of course; the one thing that I think $millions buy that works very well without costing too much time or features is automated testing.)
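On that last point: property-based testing is one of the cheaper ways to buy a slice of that correctness. A minimal sketch with Python's hypothesis library (clamp is a made-up function under test, not anyone's real code):

    from hypothesis import given, strategies as st

    def clamp(x, lo, hi):
        return max(lo, min(x, hi))

    @given(st.integers(), st.integers(), st.integers())
    def test_clamp_stays_in_range(x, lo, hi):
        if lo <= hi:  # the property only makes sense for a valid range
            assert lo <= clamp(x, lo, hi) <= hi

    test_clamp_stays_in_range()  # hypothesis generates hundreds of random cases

Calling the decorated function directly runs the generated cases; a failing example gets shrunk to a minimal counterexample automatically.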


Tim Cook doesn't seem to care about anything other than the supply chain. He came from being a COO, so that's all he knows. Consider how many different models of iPad there are.[1] It's like he's bragging. "Look how good we are at managing suppliers. Look! LOOK!" Meanwhile, the watch, Apple Music, and everything else that reached a v1 release under his tenure so far has been buggy and broken. But hey, at least they have a "gold" Macbook now!

[1] http://www.apple.com/ipad/compare/


>>Apple has the resources to approach software like aerospace. Formal interface specs, heavy Q/A, tough regression tests, formal methods. It's expensive, but Apple ships so many copies that the cost per unit is insignificant.

They don't have the time, however.


They have the money to buy more resources and a lot of those tasks can be done in parallel.


There's this really great book you should read about myths and man-months


That's true but only if there's only one team working on software at Apple. There's no reason to assume that the iTunes team needs to be the Photos team; while there might be certain dependencies on something like iCloud or OS X, the areas everyone complains about tend to be clearly contained within a single app and there are well-practiced ways to deal with things like API changes.


9 women combined can't make a baby in 1 month


That just means involving more people _right now_ won't speed things up. But you can "make 9 babies in 9 months" if you plan early and involve enough people, i.e. know how many QA people you need and hire them upfront.


Yeah, and have them twiddling their thumbs while there's no product to test?


9 women can make a baby every month for 9 months


with an initial delay of 8 months.


Or rather, the marginal return isn't worth the investment.


>>> Apple has total control over the hardware, total control over third party developers, and $203 billion in cash. What are they doing wrong?

Nothing. And that is the problem. When you are, generally speaking, doing everything well, there is little room for massive improvement. The days of perpetually exponential improvement, and the resulting growth, are over. Apple is not a startup. Like Ford, Sony and GE, they now have to settle into the grind of incremental improvements for reasonable returns.

Or they can put their markers down on ever more grandiose schemes. They could branch into transportation by starting an airline, or a robot taxi service, but I doubt shareholders will tolerate such outflows for long. If doing so causes the neglect of the core business (iPhone) shareholders will revolt.


Apple is losing market share to Android. The gravy train may not go on forever. Apple today is in the position of GM in GM's glory days, the wonderful days of powerful V8 engines, HydraMatic transmissions, tailfins, and the "longer, lower, wider" wide-track Pontiac. GM didn't think they needed massive improvement in quality. They were wrong.

Watch this commercial for the 1967 Pontiac GTO.[1] Looks a little like one of Apple's teaser ads from the Jobs era, doesn't it?

[1] https://www.youtube.com/watch?v=tzF_CdKLTP0


I think that the Apple/Google/Microsoft/IBM quadrifecta perfectly illustrates one of the lesser-known points of The Innovator's Dilemma: customers care about different values in different points of the product's lifecycle, and that leads to differing companies becoming dominant.

When a new product category is introduced, customers primarily care about ease of use and relevance to their lives. Radical vertical integration is usually necessary to achieve this, because any friction in the product's interface is on top of the friction of getting consumers to use a product that they're completely unfamiliar with. Hence, the market is totally dominated by one company that makes everything from the chips to the hardware to the OS to the apps. This is Apple. This is the Apple II in 1977, the Mac in 1984, the iPhone in 2007, the iPad in 2010, and now the Apple Watch in 2015. (It's also Netscape in 1993, Yahoo in 1998, Amazon in the early 2000s, and AWS today.)

As the market matures, more competitors enter. An ecosystem of third-party apps develops. Hardware supplier prices drop as more hardware manufacturers develop expertise and enter the market. Customers start to value compatibility, options, customizability, and adherence to standards over raw ease of use. This is Google now and was Microsoft in the 80s & 90s. This is MS-DOS in 1981, and Windows 3.1 in 1992, and IE5 in 1999, Google Search in 2000, Chrome in 2008, and Android in 2011-present.

Eventually the technology moves up-market. Customers start to care more about security, stability, reliability, and performance. That's Microsoft now and IBM in the 70s & 80s. That's mainframes in the 80s, and MS Office and Win7 now. At this point, the technology is already being disrupted, but the disruptive technology isn't reliable enough for a segment of the market.

Finally, you get to the point where customers care about brand and compatibility with existing installations. This is maintenance work, where the company becomes a consulting outfit to keep all the technologies they invented a generation ago running. That's IBM now.


This seems like a good description of the product-maturity process in B2B markets. But consumers are motivated by different things (once products are good enough) - chief among them psychological/social value and perception, mostly created via marketing or by being first - and in general pretty hard to disrupt.


I think that the emotional-value aspect slows down the disruption cycle in consumer markets, but it doesn't stop it.

Emotions, after all, are just the brain's way of processing lots and lots of information that can't be compared on a rational basis. Part of that information is "What do my friends use?", part of it is "How does it fit into my life?", and part of it is "What does it say about me as a person and what I value?"

But all of those factors are still subject to reality: if a new product comes out that fits into your life better, eventually somebody's going to break ranks and adopt it, and they'll be able to explain to their friends, authentically, why they believe it's better. All of the catalysts I mentioned in the original post reflect changes in the ecosystem: the shift from ease-of-use to features & compatibility reflects more things you can do with the product, the shift from features to reliability reflects using the product in more consequential situations, and the shift from reliability to branding & maintenance reflects how you're perceived for choosing the product.

The Tipping Point describes the mechanism for this in consumer markets well. Product adoption starts off with Mavens, people who like trying & evaluating new technology on its own merits. It spreads through Connectors, who have a wide circle of friends and enjoy telling them about interesting new things that might benefit their life. Finally, the holdouts are convinced by Salespersons who explain, point-by-point the benefits and answer objections.


>> I think that the emotional-value aspect slows down the disruption cycle in consumer markets, but it doesn't stop it.

That may be true.

But in the context of iOS vs Android:

1. Most features come from apps - both have strong app ecosystems, and iOS probably has the stronger app ecosystem because it serves wealthier people. To a certain extent that applies to reliability.

2. Some features are native to the OS. So you see competition there, and Android is certainly faster, via the rooting community, competition between OEMs, etc. But Apple usually responds - at least when things appeal to the mainstream and don't negate their strategy. As for the question of reliability - I'm not sure Android is viewed as more reliable (think security vulnerabilities like Stagefright). But yes, maybe Google can lead Apple here, because they seem stronger technologically. The only question is whether they can do this permanently, or whether it will just buy them some time - and would that be enough?

Also, let's not forget the network effect embedded in iOS via iMessage (which many users say prevents them from moving to Android).

>> the shift from reliability to branding & maintenance reflects how you're perceived for choosing the product.

I'm not sure that's true. It all depends on how psychologically important that product is to you, versus how important the features/reliability differential is.


Thanks, that was a really interesting read. However, as my sibling comment states, consumer markets are subject to the whims of marketing, which may distort this somewhat.


> Apple is losing market share to Android

Who cares? They are siphoning profits from the market and selling more phones than they ever have before. They have peripheral businesses, e.g. Apple Pay and the App Store, that are doing very well, and I am sure more will come in the future. They are never going to be the company that goes for market share above all else.

> Apple today is in the position of GM in GM's glory days

I don't think so. Apple seems to be quite happy just to acquihire their way out of any innovation slump. There are a ridiculous number of companies, especially in the VR space, that they have acquired and that we have seen no evidence of in their products.

Pretty exciting times ahead I imagine.


They're losing market share in the phone market as a whole, but in the high-end, premium market they're doing quite well. And that's the market where the profit is, not in the low-end, free-on-contract devices.


Fantastic commercial!

"The Great One"

I can almost see Don Draper standing in the shadows behind the car.


2 possible causes for this:

1) Apple still develops the OS waterfall-style over the year. Sweeping changes are made only at x.0 releases and trickle down to teams that have to work around the instability all year long, and there's no other approved way to get significant changes in.

2) They keep adding more apps to the core OS image that can only be updated with a full software update now. This makes delivery of quick-fix updates nearly impossible, since they have to go through the OS release management teams.

It certainly sells better to have a huge list of changes at WWDC that then become reasons to upgrade, but software delivery has moved on from waterfall, so in that respect Apple's OS teams are behind.


>They keep adding more apps to the core OS image that can only be updated with a full software update now.

This is a huge drawback for Safari, both on desktop and mobile.


#2 is just not true. They deliver point releases that add new core functionality all the time. For example, Photos for OS X was delivered in 10.10.3, a point release that came mid-year and delivered a huge amount of new functionality, including photo streams shared between iOS and OS X.

They also deliver many bug fix releases throughout the year, on both platforms. The 9.x releases have seen them add support for WatchOS 2, and many other things.

It's a huge ecosystem, and many teams, that all have to line up their product release schedules, and now 4 operating systems (OS X, iOS, tvOS, WatchOS) that have features that all work together. This is not trivial.


I would be skeptical about agile on something like OS development, which is on a different scale from your average software project.

Not saying it won't work, but I would like to see some comparisons of OS-level projects that have gone agile versus the waterfall approach.


Well, Linux is run like that, right? Releases come very often.

Not to mention Facebook itself... We all know FB has fallen flat many times, but it's never been busted for weeks at a time, to my knowledge.


One possible answer is that near-perfection (the perception of Jobs-era products), including recognition (widespread adoption), is attainable at any given moment in time, but typically unsustainable long-term... given human and technological constraints...

It could be entropy, as some have suggested, or simply the difficulty of maintaining the level of quality one has become associated with producing...

Maintaining the (high) level of quality one has reached is difficult enough...

Incremental gains on a level already attained become much harder... the opportunities become vanishingly small...


>Apple has the resources to approach software like aerospace. Formal interface specs, heavy Q/A, tough regression tests, formal methods.

While it's true that they currently have these resources, I would think that approach would go directly against their product roadmap schedule, the consumption of devices driven by that schedule, and thus their bottom line ($203B in cash).


There's just no way in hell Steve Jobs would be putting up with this and I wish he was alive to tear some people a new one. I didn't like Jobs but respect his ability to achieve things.

Perhaps the years of oversimplifying applications have created an Apple that can't handle complex applications?

Or, there's a chicken and egg question: XCode and the surrounding tools are atrociously buggy and hostile to the developer and it seems to increase with each release. Is this a symptom of what's going on inside Apple or a cause - perhaps Apple's own developers are dealing with the same hellish development experience and are just happy when something can compile without crashing Xcode.

Or, perhaps, at some point, software becomes too complex for humans to deal with.

Windows got so much flak over the years. It wasn't the prettiest, but it worked and did what it said. Sure, it BSODed sometimes and had some memory problems, but it handles infinitely more hardware/software/driver situations than OS X. Visual Studio is a dream, if you're into that ecosystem. MS dev tools are actually very nice.


>There's just no way in hell Steve Jobs would be putting up with this and I wish he was alive to tear some people a new one. I didn't like Jobs but respect his ability to achieve things.

You probably weren't paying attention when Jobs was running things.

OS X 10.1 was a mess -- it took several updates for it to become somewhat usable. The Finder was half-arsed for a decade. Mail.app had several issues. The Windows version of iTunes was crappy. OS X releases that are now praised as "the best ever" etc. got tons of complaints for introducing bugs and instability. XCode has been historically crap (and it's much better now). And don't even get me started on the state of their Cloud offerings under Jobs.

Hardware-wise, the same. Every new release, from the original iPod to the iPad, was met with complaints and bickering ("No wifi? Less space than a Nomad? Lame") -- even though it actually took five more years for wifi and batteries to start making practical sense on such a device for syncing. Aside from the specs people complained about, there were also all kinds of other HW issues, from the overheating G4 Cube, to the logic boards dying on G3 iBooks, to cooling goo spilling from G5 towers, the crappy "round" mouse, and lots of other stuff besides.

That said, I don't buy the "Apple software went downhill as of late" thing. First, because as said there were always issues. Second, because in normal use I don't see any particular decline. If anything things have got better, to the point where we complain about trivial stuff. The thing is, the Apple of today puts out a heck of a lot more software and hardware products than the rosy Apple you remember.

I'd take iTunes out back and kill it, though -- the latest redesigns are absolutely crap from a UX perspective. Then again, I wouldn't call that a programming quality issue -- more of an "idiotic focus on shoving a BS online music platform in our faces" issue.

>Or, there's a chicken and egg question: XCode and the surrounding tools are atrociously buggy and hostile to the developer and it seems to increase with each release.

The opposite. XCode was "atrociously buggy" in the 3/4/5 era and before, and has gotten quite a bit better in the 6/7 series (despite having to support a whole new language).

In fact, a large list of early XCode 6 crashing bugs was squashed months ago -- (as reported) around 90% of them.


That said, I don't buy the "Apple software went downhill as of late" thing. First, because as said there were always issues.

I fully agree. I jumped on the Mac and OS X bandwagon in 2007. The first few releases of Leopard (10.5) were quite hellish, because I was constantly having Wifi problems. My HP Laserjet didn't work until 10.5.2 (in 10.5.0 and 10.5.1 it would just print a few pages and get stuck). Snow Leopard was a smoother ride, but only after a couple of dot releases.

I think there was a low point around Lion and Mountain Lion (lots of new features, but they were not polished enough). But since Mavericks it's been smooth sailing for me.

People also seem to have overly romantic recollections of past Apple software. Sure, Aperture and iPhoto were killed in favor of Photos. But Aperture was crap. Version 2 was still acceptable, but 3.x had constant hangs and slowness. And they were always trailing badly behind with RAW support. Like many people, I bought Aperture 2 & 3 and switched to Lightroom pretty soon after because of Aperture's lack of polish. A while ago I tried the new Photos. And although it does not have all of Aperture's features, it's a far better application in terms of speed and user experience.

tl;dr: I don't see this drop in quality.

Of course, App Store and iTunes need to be burned down to the ground and rewritten from scratch.


I agree that Photos is a lot better than iPhoto & co. The performance is blazing, and I migrated from Lightroom to it because of that. I can't find a faster and easier-to-use photo library management app. Lightroom is really slow compared to it.


You just convinced me to give Photos another chance. I absolutely hated it when they gave up on iPhoto and Aperture because I felt like I lost a lot of features if I moved to Photos. Ever since then, though, I haven't found a photo solution that works quite as well as any of the Apple products I've used in the past so I'm willing to give it another shot.


Photos.app is atrocious from a UI perspective. It's an iOS app shoehorned onto OS X. How to back up your library to, oh, I don't know, an external disk? Drink the kool aid or else.


What? The UI is totally fine. Just because it's a grid of photos doesn't mean it's just "an iOS app shoehorned onto OS X." The app is written from the ground up for OS X. It's not just some basic half assed port of the iOS version. The UI is different. The editing tools are more powerful. The performance is blazingly fast. Go give the Photos app on Windows 10 a shot if you want to see what the competition is up to. That is a pathetic Windows (mobile) 10 port.


> The Windows version of iTunes was crappy.

Oh joy. I remember those days, when people were angry because plugging your iPod into a new PC meant wiping your library!

> That said, I don't buy the "Apple software went downhill as of late" thing.

The perception right after people switch from Windows to Mac OSX is awesome. The perception is different if you use Mac OSX for a long time and suddenly see the computer freeze every few days, requiring a hard reboot. Much of that has to do with the software you are running at the time, but why a sudden OS freeze where the mouse can't even move? That can't be the fault of Office or Skype; it has to be either an OS issue or an underlying hardware issue.

Mac OSX is generally really great; it doesn't feel slow compared to a lot of the PCs out there. Polished UI and very responsive. In some releases you do see slowness opening Finder windows or moving from one settings view to another. Those are defects people will cry about and ask why they weren't caught by the performance team.


Maybe the grass is greener on the other side. I was using an iMac for a couple months on a job a while back after using only Ubuntu for 5+ years and I found the whole Mac UI extremely sluggish and confusing.


My main computer is a Mac, both at work and at home. Honestly, it isn't very responsive compared to recent Linux desktops or even the latest Windows 10. Or at least it mostly feels that way when using Eclipse, so maybe it's the JVM.


I usually find the UI in most Java-based applications very slow on Mac OSX, but that's just my limited experience with Java-based UIs on this platform.


> The opposite. XCode was "atrociously buggy" in the 3/4/5 era and before, and has gotten quite a bit better in the 6/7 series (despite having to support a whole new language).

Xcode 3/4 were a lot harder to work with (IB and Instruments separate, had to use Clang on command line etc), but I don't recall them being particularly buggy. Xcode 7 with Swift is the buggiest version I have used. It crashes on me 10+ times a day with all of the instant compiler checks it does. The progression from 3 to 7 is amazing, but I feel it has come at the cost of stability.


Xcode 3's code completion made everything run so slowly that I had to turn it off, and for a while I just copied and pasted method names out of the Cocoa header files.


I just wish one could tell xcode to stop using ALL THE CPU


> If anything things have got better

Much better, in my opinion. For example, they've fixed all the issues with iMessage interoperability on OSX/iOS. There was a time when, if you used Messages on the Mac, you had to suffer through missing messages, duplicate messages, out-of-order messages, etc. Now, as far as I can tell, it's working flawlessly. Even better that it now includes SMS messages.


> If anything things have got better, to the point where we complain about trivial stuff.

This is the key. We're definitely spoiled by how solid a lot of the basics are.

I can fault them for some of the cloud service issues nowadays, but even those work better than they did in years past.

Apple has always had strange hardware and software issues, just like everyone else. Just today I was trying to buy Windows 10 to install it on a PC I'm building, and the MS storefront and my Live account couldn't sync the deletion/addition of a credit card until I logged out of everything, closed the browser, and logged back in. No one is getting this stuff working flawlessly.


Microsoft's OAuth stuff is ridiculously buggy. I've tried half a dozen times to get my Office 365 Microsoft account properly linked with my (identical email address) MSDN Microsoft account, and it's always wrong. Every time I go to login to some Microsoft OAuth site, it picks the wrong cookie, so I have to logout whatever account it thinks I'm using and login repeatedly with the right one.


> OS X releases that are now praised as "the best ever" etc. got tons of complaints for introducing bugs and instability.

I bet it would be easy to find an "OS X is on a slippery slope..." article for each OS X release.


I remember the initial Windows release of Safari. Apple insisted on doing their own font rendering, and the result was blurry headache-inducing text.


Wasn't that a Windows issue, though? I feel like font-rendering on Windows, in general, is far inferior to font-rendering and aliasing on the Mac. Especially with the new HiDPI displays popping up, Apple's stuff looks far better to me than any display I've ever seen on Windows.


font-rendering on Windows, in general, is far inferior to font-rendering and aliasing on the Mac

That's because Windows has to deal with so many more differences in hardware. E.g. look at these pixel-level variations in LCD screens: http://www.digitalversus.com/tv-television/screen-technology... I don't understand how it would be possible to support subpixel rendering on some of those. IIRC, when I last used Windows (XP?) it let you choose from 8 different schemes. But that still wouldn't be nearly enough!

OS X and iOS can be tweaked to support a much smaller subset of that mess.


They actually went back on that decision because Windows users complained so much; eventually the text was antialiased properly.


That's right, the anti-aliasing didn't look good to me either. I'm sure it comes out very nicely in print.


That's interesting, because 'back in the day' the IT department accidentally installed Safari on my Win XP box, and as a long-time Mac user I thought 'finally, some font rendering done right'. I guess it's down to taste, or perhaps it didn't work well for screens with low pixel density or for people who like really small text and UI settings.


People like what they're used to. I'm not in the least surprised that you happened to be a former Mac user...


A lot of awful software shipped under Steve Jobs. Xcode has always been a buggy mess, iTunes bloated greatly under Steve (remember Ping?), iMovie '08, etc etc.


Yeah. The revisionist hero worship and sanctification of Steve Jobs has reached absurd levels since his death.


The Cult of Steve? I think it's actually declined a little.


That Isaacson book really leveled him for a lot of people.

OP is right about him not putting up with crap. Sure, not everything that came out of Cupertino was rock solid. But I think, as much of an aesthete as he appears to have been, it's also pretty evident that he was incredibly shrewd and knew he had to ship at some point.

The cruddy software might be coming from Apple repositioning itself from an innovation brand to a legacy brand. Acquiring Beats (can you imagine Steve championing this - lol), less focus on high-end workstations in favor of consumer electronics and watches, removing the grunt from high-end apps like Logic and FCP. Certainly no longer the underdog we root for, as the article points out.


> That Isaacson book really leveled him for a lot of people.

To me it actually had the opposite effect: after reading it I had a newfound respect for Apple and Jobs. The book was honest and fair; I don't know why people think it does a disservice to him, since everyone knew he was an asshole. This is coming from someone who avoids Apple products, FWIW.


That's the book that uncritically repeats Jobs' (completely bogus) claim that Apple invented the switching power supply.

http://www.righto.com/2012/02/apple-didnt-revolutionize-powe...


[ q.e.d. ... dang, c'mon dude. give it a rest. ]


Totally forgot about Ping. I love being reminded of old "features".


It's funny watching the announcements of these products now.


I just get sad seeing Steve with meat on his bones... We watched him wither away.


iTunes was hideous. I once tried to reverse engineer the XML that stored the library and concluded it must have been designed by a distracted intern.


Why reverse-engineer? The format (known as "Apple XML property list") has always been openly documented:

https://developer.apple.com/library/mac/documentation/Cocoa/...

There are also many libraries available to decode and encode these files in various languages.

The reason why XML property lists look this way is that they were a direct translation of the older NeXT "property list" format, which was sort of like binary JSON. Dumping an alternating list of keys and values isn't pretty XML, but it ensured minimum translation headaches from the old format.
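
For illustration, here's a minimal sketch of that alternating layout (in modern Swift; the track fields are invented for the example), plus the stock Foundation call that parses it -- no reverse engineering required:

    import Foundation

    // The "alternating keys and values" shape of an XML plist, roughly
    // what an iTunes library entry looks like (fields made up here):
    let xml = """
        <?xml version="1.0" encoding="UTF-8"?>
        <plist version="1.0">
        <dict>
            <key>Name</key><string>Example Song</string>
            <key>Track ID</key><integer>1234</integer>
        </dict>
        </plist>
        """

    // Foundation decodes it into an ordinary dictionary -- no custom
    // parser needed (try! is acceptable for a throwaway sketch).
    let data = xml.data(using: .utf8)!
    let library = try! PropertyListSerialization.propertyList(
        from: data, options: [], format: nil)
    print(library)  // e.g. ["Name": "Example Song", "Track ID": 1234]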


Right, so technical debt then.


It's still hideous. Case in point (although this is a combination of things, I guess): when I upgraded to an iPhone 6S+ two months ago, I backed up my 5s in iTunes and then restored on the 6S+. 22 apps could not be restored, among them 6 of Apple's own apps.


Yes, good software applications from Apple have been the historical exception rather than the rule. Great hardware, good OS, mediocre software for the most part. This reflects the company's priorities as a hardware vendor: they make very little money from software, and hence have little direct incentive to put a lot of resources into it.


I don't agree about great hardware, if we're talking about durability and fixability. At least the Macbook line seems designed to fail after a number of years, or to become too expensive to keep fixing in favor of buying a new one. Apple products, besides the high-end desktops, seem disposable, no matter how well made they look.


Years is still a long time when measuring the lifespan of a laptop. Plenty of less-expensive laptops fail at the one-year mark, some fail at two, and some laptops just as expensive as a Macbook will fail before five years are up. Some Macs fail before five years, some last beyond that.

And five years is a long time when it comes to computers.


Jobs didn't care about Xcode. iTunes was decent or even good. He hated social and reluctantly allowed Ping to go forward. iMovie is a tiny niche.

I agree with OP. There is no way Steve would have allowed some of the current stuff to see the light of day.


iLife was very important to Steve; he dedicated a lot of time to demoing the apps on stage. He was a huge fan of ripping out all the features of iMovie.

Same with iTunes (which has gotten a ton of criticism over the years); he was the one who kept cramming everything under the sun into it.

I don't know if he was as passionate about Xcode, but he sure liked to brag about having the best development environment. Interface Builder in particular seems right out of Steve Jobs' brain (even if it sucks).

Here he is introducing Xcode: https://youtu.be/Rh5spZrzu6c?t=59m3s


Interface Builder is superb once you get the hang of it.


It does a great job of getting you from zero to an app, but it breaks down terribly once you try to do anything custom with it. The fact that it hides code from the developer drives me nuts.


Well, it depends what you mean by custom. I create custom views all the time, and it's even better now that we can render them 'live' in IB:

https://developer.apple.com/library/ios/recipes/xcode_help-I...
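
For anyone who hasn't tried it: the live-rendering hooks are just attributes on the view class. A minimal sketch in modern Swift -- the view and its properties are made up for illustration:

    import UIKit

    // @IBDesignable tells Interface Builder to compile this class and
    // render instances of it live on the storyboard canvas.
    @IBDesignable
    final class BadgeView: UIView {

        // @IBInspectable surfaces the property in IB's Attributes
        // inspector, so it can be tweaked without touching code.
        @IBInspectable var cornerRadius: CGFloat = 8 {
            didSet { layer.cornerRadius = cornerRadius }
        }

        @IBInspectable var borderWidth: CGFloat = 1 {
            didSet { layer.borderWidth = borderWidth }
        }

        // Called only by Interface Builder (not at runtime) to set up
        // canvas-only state for the preview.
        override func prepareForInterfaceBuilder() {
            super.prepareForInterfaceBuilder()
            backgroundColor = .lightGray
        }
    }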


Ooh I forgot about that feature. It does at least answer my big problem with IB: you execute code you didn't write and can't read. I'll have to give live rendering another look.


I won't start a religious war here, but IB is awesome in demos and less awesome in the real world. Versioning alone is enough to make you go crazy (especially back in the .nib days!).


I understand both sides of the argument - it took me considerable time to wrap my head around IB, and for the longest time I just assumed I simply wouldn't "get it".

Interested in how you do versioning on UI with or without IB, though. Personally I just maintain branches until an agreed-upon design is in place.


There are old videos of him speaking about the development environment at NeXT (against Sun). Sir Steve did like allowing 'creation'.


> iTunes was decent or even good

You clearly never used iTunes for Windows.


| You clearly never used iTunes for Windows.

IME, every terrible thing about iTunes applies equally to both platforms.


It's worse on Windows. The Windows version would hash file names or something, probably to get the search features working. So once you imported your files, you didn't know what was what. The Mac version probably just uses Spotlight, and file names were always readable.


My iTunes library has made several round-trips across platforms and I've never encountered anything like that. My music, apps, books, and podcast files all have consistent, readable naming schemes.


iTunes sucks everywhere. And it's horrible UX that it's needed for syncing an iPhone. A better UX would be to just plug it in and move things around.


iTunes was awesome on Windows when it came out (until about 4.0?). Compare its contemporaries: MusicMatch, which was a bloated piece of junk, and Winamp 2.x, which, while awesome, had a steep learning curve to get really useful (J for the win) and an extremely unpleasant UI.


> iTunes was awesome on Windows when it came out

It forced a bundled QuickTime install (still does, I believe), and back then it tried hard to become the default media player on your system (even reverting your "no" choice after updates). This led many people to remove QuickTime, only to discover that their iTunes now refused to work. It would routinely fail halfway through syncs and upgrades of iDevices, and had a generally pretty buggy interface that wasn't very responsive most of the time. Its iDevice backup process was cumbersome for normal users, and often failed without the user knowing (leading to very upset individuals when they needed to restore but couldn't).

Now it seems every new version redesigns the UI in major ways, causing even long-term users to not know what to click, etc...

If you really just listen to music, maybe it's fine. For all other purposes, it was/is horrible. However, I can't complain, because it generated quite a lot of work for my side repair/contract business back then.


> iTunes was awesome on Windows when it came out

"Whips the llama's ass" WinAmp is still better.


I had whatever version was around when the video iPods came out (and a bodgy bit of hardware that was). I couldn't sort videos the way I wanted to - iTunes says that file extension means "TV episode" instead of "movie"? Sorry, it's a TV episode. Not to mention the terrible UI with tiny targets that doesn't blend in with the user's desktop theming. Maybe that was v4+, but all I remember is hating to use iTunes (including managing updates, as already mentioned).


The software that went with Microsoft's Zune media player was amazingly good. Sadly, even though it was available separate from the Zune hardware, almost nobody downloaded and tried it.


You clearly have never read his biography ;)

via an All Things D event:

> What’s more, thanks to the popularity of iTunes on PCs, Apple has become a major Windows software developer. “We’ve got cards and letters from lots of people who say that iTunes is their favorite app on Windows,” noted Jobs. “It’s like giving a glass of ice water to somebody in Hell.”

That quip apparently caused Bill Gates to become quite angry:

https://books.google.com/books?id=6e4cDvhrKhgC&lpg=PA463&ots...


Steve Jobs publicly mentioning only praise about things Steve Jobs made? Unbelievable!


I can only conclude that the people sending such letters had only previously used GTK+ apps on Windows or something.


That may have been by design.


Don't forget early OS X. So slow it was nearly unusable until performance improvements in 10.3.


Improvements to a new OS are happily received. However, we're talking about the reverse: deterioration of mature products.


What mature products deteriorated (apart from iTunes) and in what concrete ways?

Mail is much better than it was even a year ago, XCode too, the OS is stable...

And FCPX didn't deteriorate from 7 -- it was a written-from-scratch reboot of the platform that just happened to cut some features people used (most came back with a vengeance).


There are a number of examples in the surrounding threads. I find most software getting worse these days on all platforms. See CADT: https://www.jwz.org/doc/cadt.html


And most of that is simply looking at the past with rose tinted glasses.


> the years of oversimplifying applications have created an Apple that can't handle complex applications?

Ah, you've fallen for the illusion. Those "simple" Mac apps you love are fantastically complex to implement. It's the user experience that is simple, and it takes a ton of sophisticated engineering to pull that off.

You can think of it as there being a certain fixed amount of complexity in the user attaining some goal. You can make your software simpler by foisting the complexity onto the user: just make them do all of the nit-picky tasks.

If you want to make it simple for them to achieve their goal, your app is going to have to contain that complexity itself.


>>Or, perhaps, at some point, software becomes too complex for humans to deal with.

I'd say rather that things need to be simplified and features removed in order to improve quality. Every new Mac or iOS release touts XXX number of new features. If you want to offer the best products, having more features isn't necessarily a prerequisite.


I can't upvote this enough. So many products have been ruined by feature creep, it's sad to even think about it...


It's interesting to think about - is feature creep uniquely in the domain of software? I don't think so. There is plenty of software that has been immune to it (think everyday developer tools, command-line stuff, etc). Often the apps with the least feature creep are the ones bounded by the APIs they talk to, i.e. if the API (which they don't control) is static, they can't really add any more features.

Outside of software, we see a similar thing: a wrench (or spanner) hasn't gotten many new features, because the bolt that it turns isn't changing. The goal is roughly the same. The only appreciable changes have been in ergonomics, and even these are effectively static.

On the other hand, we have cars, which are suffering from so much feature creep it is unbelievable. Every year, car engines get a little bit more efficient, but the cars also get heavier, bloated with more systems, infotainment, seat adjustments, window adjustments, etc. As a result, the efficiency of the improved powertrain seldom makes an appreciable performance difference. (Yes, there are outliers.)

Cars then, might be the equivalent of "the ultimate app" which does everything for you, but loses sight of its purpose. Meanwhile, we have a long history of leaving single tools alone, and they tend to work great.

The trouble is that in the world of physical tools, the workflow changes/context switching between using one tool and then another is easy. Meanwhile, in software, feature creep ends up being the solution for poor context switching between apps. In an ideal world, working on an image in photoshop and then pixelmator, and then illustrator, and then publishing to wordpress would be as seamless as using a wrench, then a screwdriver, and then cleaning things up with a rag. Unfortunately, software interface constraints almost necessitate feature creep as the "simplest" way to add functionality, even when convoluted menus, hotkeys, and naming conventions obscure utility.


Awesome analogy about context switching. It's definitely true. The "unix philosophy" never really worked well for GUIs and that's been a big problem for composability.


Interestingly, the Unix philosophy works great in machine to machine communications. It is pretty excellent at handling the needs of the IoT and distributed systems, networks, sensors, etc. You hit the nail on the head about GUIs.

Ironically, it is the "handoff" features that Apple was traditionally best at. BUT, I have this feeling that most of what we hate can be described as "too little, too soon". After all, it could be argued that Apple is aiming for an environment which is "document focused" with their focus on standardized APIs (Adobe is working on this, too), but none of this works yet, because software is still written from an "Apps" POV. In the real world, we change the tool relative to the job, and the best tools are for specific single tasks. Given the experience so far, I'm far from convinced that a 2D GUI should ever try to duplicate that.


I think it boils down to asking what the product is actually for, or rather, what fundamental human need it solves. That's a rather tough answer for computers, but for cars it's easy - getting from point A to point B. Every modification to a car is in pursuit of either making the job more efficient or making the job easier/more comfortable for the customer.

I think the fact that the answer is so tough for computers is why feature creep is so common with software. When your tool can literally do anything you want it to, why say no? The constraints are actually in the mind of the customer.


This is a very good point. The constraints are in the mind of the customer, but also in the UI. I.e. a panorama-viewing app is not great in vertical orientation on an iPhone. There are physical as well as mental constraints in the UI of a computer.

On the topic of cars, it is true that modifications strive toward comfort, but they also strive for marketability. While a leather-wrapped dashboard may be comfortable to look at, it is primarily about adding something to market. This same phenomenon shows up in software, where features are added to increase marketability: "Look! This ERP software has a social network built in!" The utility of a given feature is often perspective-based, especially when the user is not well aware of their actual needs, or the user and purchaser are not the same person.


I agree, feature creep is not unique to software. Look at cars. Creep is just made worse now that they can run software.


Plenty of products are ruined because they don't allow you to do that one thing you really need to do.

Feature creep is necessary for closed applications.

The ideal is a set of simple tools around a simple and open file format. As soon as you lock folks into your application (as Apple loves to do), you're on the hook for being all things to all people, or getting panned because your software is only for the lightest casual use.


They (designers and product people) will one day learn that the single greatest feature and design, the most objective one, is raw speed.

Always go faster, for a long time. Meaning: tackle aging hardware, long uptimes, and just generally maintain the same performance - across everything, from device hardware and local speed, to the wifi/4G signal and the ISP, to the backend and its software and hardware. Performance - speed - is the greatest design and feature.

Unfortunately, investing time in it requires convincing design and product that integrating A/B tests for hearts vs. stars for favoriting isn't worth it in the long run.


> I'd say rather that things need to be simplified and features removed in order to improve quality.

I'd say they need to fix the features they've implemented, and only release new features when they're fully cooked.

Removing features is terrible: it is not an improvement to me when my stuff breaks or when I'm forced into a dumbed-down, lowest-common-denominator use-case.


It's funny -- on the one hand, people SAY they want this (only fully cooked new features). Yet everyone is really quick to jump up and down and complain and bash $company for lacking innovation when their new OS or the next version of their software comes out and the focus has been on stability/bug fixes.

Not just Wall-Street types either; I see this primarily from tech people. Examples: Android M, OS X Snow Leopard (I think, I may have that release wrong but there was one there that was definitely 'not many new features'), basically anything where the next version isn't predominantly new shiny.

Damned if you do, damned if you don't.


> There's just no way in hell Steve Jobs would be putting up with this

Maybe. I felt like the quality started going downhill shortly after iOS came out, starting with low-level APIs, then Xcode, then making it into user-level applications.

My theory is that a lot of the really experienced engineers (the ones who started with NeXT and OpenStep) left when they were rich after the iPhone stock jump.

So really there is nothing Steve Jobs could have done unless he had a developer education program or something.


>> "My theory is that a lot of the really experienced engineers (the one who started with NEXT and OpenStep) left when they were rich after the iPhone stock jump."

While possibly true, I think the real issue is resources being spread too thin. Remember they had to delay the release of OS X one year because all their engineers were working on iOS? Now they're doing yearly releases of both OSes. I'm sure they've hired more engineers, but pre-iOS there were probably lots of engineers who had been working exclusively on OS X for years and were able to maintain quality. New people take time to get up to speed.


I agree. I don't think this is a Steve Jobs thing at all. In fact, I think post Steve Jobs Apple actually pays a lot more lip service to better software than they did before.

I simply think the attempts to build stuff in common with iOS is taking its toll.


Aren't you contradicting yourself? If post-Jobs they only pay lip service to quality but with Jobs they actually took action, doesn't that suggest Jobs may have been a positive influence on quality?


I think the action they take is about the same. They pay more lip service these days. I wouldn't be surprised if they are actually putting more work into better quality these days, but stuff is just more complex (e.g., everything is cloud based. Imagine OSX today, except running MobileMe, .Mac, etc. as the cloud backend instead of iCloud, which while it has its issues, is way better than those disasters).


Ah ok... maybe you have a different usage, but "lip service" is used to indicate "all talk, no action". So paying more lip service to something means they're churning out more empty words without the activity to back it up.


> There's just no way in hell Steve Jobs would be putting up with this and I wish he was alive to tear some people a new one.

Steve Jobs, Steve Jobs ... that's the guy with the skeuomorphic preferences, right? The one who, at the first iPhone release, told developers that they don't need native apps because using HTML + WebView is enough, right?

Just to make sure we're on the same page here.


Not everybody hates skeuomorphism, especially when not taken to an extreme (okay, Apple under Jobs often did take it to an extreme... like the leather Calendar app). But skeuomorphism, when used properly, provides affordances and hints to the user.

> told developers that they don't need native apps because using HTML + WebView is enough

That's because the SDK wasn't ready for developers yet. You must remember that, first and foremost, Jobs was a great salesman. If you don't yet sell it, it's a piece of crap and unnecessary, right? And when you _do_ sell it, it's the greatest thing since sliced bread.


No, they weren't going to do a public SDK, until after the huge sustained public outcry.

I understand the plan was to work with select third parties on a case-by-case basis, sort of akin to standalone game consoles.


I hear that repeated a lot, but I haven't seen any real evidence to back it up.

It seems equally plausible, especially given how quickly they announced the SDK after the iPhone release, that Apple was preparing for it but not ready yet (and/or wanted to give themselves time to work out the early kinks in all of the other parts: working with AT&T, supporting the new device and new OS, etc, etc).


It was reported in the Isaacson biography:

"Apple board member Art Levinson told Isaacson that he phoned Jobs “half a dozen times to lobby for the potential of the apps,” but, according to Isaacson, “Jobs at first quashed the discussion, partly because he felt his team did not have the bandwidth to figure out all the complexities that would be involved in policing third-party app developers.”

Apple had already established a model for selling third party software on the iPod video line through iTunes -- with select partners only.

When WWDC 2007 rolled around, after the release of the iPhone, Jobs presented web apps as the solution for developers, without the need for an SDK. It was only after months of sustained outcry from the development community, and the nascent jailbreaking scene, that Jobs announced that they would prepare a public SDK for the next year, after they decided on a method of signing and sandboxing applications.


None of which contradicts my thesis: that it wasn't rejected, but that Apple wasn't ready at the time of release to undertake the task of opening it to all comers.

The supposition that it was only due to the outcry of the development community is exactly that, a supposition.


According to an Apple board member it was rejected internally by Jobs himself.


What was amazing was that the '07 jailbreak experience was like 1000% better than what I'd experienced with Palm or WinMobile... tap, tap, app appeared on the home screen with a "lickable" loading progress bar.

I seriously doubt that jailbreakers invented the smooth experience - they just likely unlocked the functionality that existed before Apple was ready to push it out (I'm guessing Steve wanted apps in the store at launch).


Yeah, the UI assets had to be there in the first place. Compare SBSettings, the jailbreak-era control center, which looked like someone taped the buttons together, and it becomes apparent that Jobs' statement was just salesmanship.


Oh really. That, I wasn't aware of. Well, at least they saw the cash cow that the App Store could become at some point, and acted. It's good to change your mind when new information comes to light.


I suspect people rooting and writing their own apps pushed them as well. If they didn't control the 3rd party apps, other people would.


I'm rather firmly convinced that "they don't need native apps" was Jobs merely stalling until the SDK was stable enough to release to the public.


"The one who, at the first iPhone release, told developers that they don't need native apps because using HTML + WebView is enough, right?"

Nowadays we have plenty of people complaining that they have to have "apps" for everything when they have a perfectly good web browser on their phone, so he wasn't that far off.


For OS X, the decline began under Jobs with 10.7. I'd always assumed it was simply that Apple no longer cared about computers; in Jobs' own words, “milk the Macintosh for all it's worth and get busy on the next great thing.”


>For OS X, the decline began under Jobs with 10.7

So what issues do you have with 10.11? Because I don't see anything to complain about. Then again, I also didn't see anything troubling in 10.10, 10.9 and 10.8, with the exception of their ill-fated transition to a new DNS backend, which they later reverted.

A few bugs here and there, yes. Nothing I haven't seen since 10.2, or that's not comparable to the kind of issues I have with Windows 10 or the Ubuntu box I use for development (actually that's far worse, but I digress).


I'll take a crack at that:

- UI non-responsiveness: it is extraordinarily frequent that I will chord a tab change in Safari or in iTerm2, and the system will not respond for sometimes multiple seconds. It's the same with creating tabs in Safari. ⌘-t or ⌘-{ do not respond.

- Application switching focus failures: ⌘-tab will raise another window, but window focus will not follow. This has caused me to lose work. ⌘-tab, ⌘-w will sometimes close an iTerm2 tab that's behind the Safari window I'm looking at.

- Mouse pointer lag: Probably related to the input lag above, the trackpad will not respond for multiple seconds after I begin touching it. If I "wake" it with a two-finger scroll, it will often lose half the input and instead click.

- AirPlay stuttering: even two ethernet-wired systems will still lose data between them. It's a crappy experience.

- discoveryd: My Apple TV's network name is currently "Apple TV (5)". Macs sometimes do this too.

- Slow laptop wakeup: I almost always have to tap a keyboard key to wake the display after opening my laptops (MBr and MBPr). Almost always. But not always.

That's just off the top of my head. Many of these have followed me between OS X revisions and different hardware. It amazes me that such bugs stick around.


As I read your comment I was thinking "Huh, I've never seen those focus failures on Mac, but they happen to me on Windows at work all the time."

Then I clicked Chrome in the dock, hit Cmd-Q, and watched Safari disappear while Chrome opened a new window. Guess it's not just you.


>UI non-responsiveness: it is extraordinarily frequent that I will chord a tab change in Safari or in iTerm2, and the system will not respond for sometimes multiple seconds. It's the same with creating tabs in Safari. ⌘-t or ⌘-{ do not respond.

Ok, for this I can't say much, because ever since 2010 or so I've used Chrome in place of Safari. As for iTerm2, I've tried to switch to it several times over the years (later attempts mostly because of Tmux integration) but always found it to be buggy and reverted to Terminal.

>Application switching focus failures: ⌘-tab will raise another window, but window focus will not follow. This has caused me to lose work. ⌘-tab, ⌘-w will sometimes close an iTerm2 tab that's behind the Safari window I'm looking at.

Hmm, haven't seen this -- and I use ⌘-tab and the ~ variant heavily.

I have seen lagginess in focus when switching full-screen apps, and I sometimes start typing before the switch happens. This got a little better in 10.11, though (either a faster focus switch or less transition time).

>- discoveryd: My Apple TV's network name is currently "Apple TV (5)". Macs sometimes do this too.

DNS issues I've had (and mentioned in another comment). They tried a transition to a new DNS backend which was buggy. They reverted back to the old one with 10.11 (or sometime in 10.10.x), though, and it has been OK since then.

>Slow laptop wakeup: I almost always have to tap a keyboard key to wake the display after opening my laptops (MBr and MBPr). Almost always. But not always.

Do see this from time to time (though it almost always works in my case).

Could be a sensor issue though -- not a software thing (the lid-open sensor not registering, but a key tap working ok).


I see the input focus lag behind the UI after cmd-tab almost every day, for at least the last two OS releases.


Just a quick nitpick: discoveryd was reverted on the last release... OSX is back using mDNSResponder again. You can't really say "It amazes me that such bugs stick around" if you're not actually on the newest version.


True, I conflated tvOS with OS X here, but it's still software by Apple.


Caveat: I haven't used it on my main machine yet, so a lot of this is impressionistic; feel free to correct.

Still a total disregard of Fitts' law. Horribly inconsistent keyboard support. Behaviour that should be trivially configurable seemingly set in stone. Still, I think, impossible to cut a file in Finder. However many shots they take, Apple can't get WiFi working properly. The transparency is an abomination.

If you want something more 'big picture', I think all the changes introduced over the lifetime of OSX have been a bit piecemeal with no overall, unifying process. For example, full-screen mode gets bolted on, rather than nicely integrated with other window actions. Notifications blossom into a side panel, but there's an overlap with bouncing icons in the dock. Etc. There are some great ideas there, but we could really do with an OS11 that picks the best ones and presents them together, in a clean interface, in which they all belong.


>Still a total disregard of Fitts' law. Horribly inconsistent keyboard support. Behaviour that should be trivially configurable seemingly set in stone. Still, I think, impossible to cut a file in Finder. The transparency is an abomination.

Well, those are not software quality issues. Some of those are design decisions, and have been with us forever, not random accidents: "Cut", for example, has never been on the Mac. Transparency in 10.11 is so lightweight you don't even notice it -- nothing Vista-like about it.

As for "total disregard of Fitt's law" that's not some decline either, as it's not worse or better than it has ever been in OS X.

>However many shots they take, Apple can't get WiFi working properly.

Well, that qualifies as buggy software. But I have to wonder.

I've had an iBook, 2 MacBook Pros (1 company-issued), an iMac, a MacBook Pro Retina (current), 2 iPads and 2 iPhones thus far. And I've travelled all over the US and Europe, and in several parts of Asia. I've never had any trouble with wifi, even in non-chain, el-cheapo motels.

The only offender has been my iPhone(s), which indeed I've been unable to connect at 3-4 places (restaurants etc) while traveling, out of many hundreds of locations over 8 years. And I can't even tell whether it was the iPhone crapping out or them using some crappy, third-party router.

So I wonder: what are all those wi-fi issues people mention in forums, etc.?


""Cut", for example, has never been on the Mac. "

Apple Notes would like a word with you. Specifically the word 'Cut' under the Edit menu.

Apple Calendar would also like the word with you.

Terminal. Script Editor.

Not sure how you come to some conclusion that Cut is not "on the Mac".

My Apple devices regularly need me to switch WiFi off and on, in my home which has an Airport Extreme and Express, both with latest firmware.


In context, they mean "cut" as a way to move a file from one place to another - not text, which seems to be what you are talking about.


>Not sure how you come to some conclusion that Cut is not "on the Mac".

We're talking about Cut for files (in the Finder), not inside apps.


>And I've travelled all over the US, Europe and in several parts of Asia. I've never had any trouble with wifi, even to non-chain, el-cheapo motels. So I wonder, what are all those wi-fi issues people mention in forums etc.

I agree with this 98%, with the exception of early releases of Yosemite, which really did seem to have a WiFi problem on the rMBP (even with Apple Airport Extreme base stations), in that it would disconnect a lot and you'd have to cycle your WiFi off/on. Annoying but not a deal breaker. And fixed within a month or two.

Otherwise I'd suspect there are some hardware + driver variations of Macbooks that may have had issues for others.


If you want "Cut" functionality in Finder, you can Copy (cmd-C) followed by Move Item Here (cmd-opt-V). It's non-obvious, but it's there.


So Cmd-C Copy can retroactively remove the file instead of copying it?

I think you're proving his case rather well.


Compare to Windows where it's called "Cut" but is really "Copy and mark for maybe deleting if you paste later". It doesn't remove files when they go to the clipboard (which is what cut does literally anywhere else).

And then Paste becomes a destructive operation that deletes your original files, or if you prefer, paste turns into "Move Item Here."

They've taken two different approaches to avoiding accidental data loss by overwriting the clipboard, but I wouldn't say one is inherently more right than the other. Windows makes new actions but has the interface pretend that it's doing the same thing as normal cut and paste. OS X makes the UI less standard, but describes what's being done more explicitly.


10.7 was a pretty low point, but OSX quality has been inconsistent throughout, with a few high points late in release cycles. And it's always panned in .0 releases (same historical issue with initial hardware revisions).


Context: that's a quote from 1996.


Seems like many people are not making enough of a distinction between choices and bugs. Sure there are bugs in lots of stuff, and that's an entirely valid conversation, but there's also a layer of confounding choices. Some make sense from a "corporate" perspective, some...

-iTunes bloat is a choice, and bugs.

-Mail.app is mostly bugs. (non-standard .mbox was a choice)

-Final Cut X was a choice.

-Eschewing strong AppleScript support in native applications is a choice.

-The app store(s) are a choice.

-Allowing core utilities like Contacts and iCal to stagnate and be outshone by 3rd parties (BusyMac) was a choice.

-Aperture+iPhoto=Photos was a choice. (So was selling Aperture after it was EOLed)

-FFF


> The app store(s) are a choice.

In that case, allow me to summarize some of the choices that went into building the Mac App Store.

- It's a web view. No local caching of layout or anything. If your network connection hiccups, instead of a reasonably rendered error message, you get this: http://i.imgur.com/5xNgwMH.png

- Text in the search box is blurry. I'm not 100% sure why, but I think someone made a choice to render it with its y-position at a half-pixel increment (see the sketch after this list). And then nudge it down so it comes into focus when you click on it. Because really, what sort of asshole would still be using a low DPI screen? http://i.imgur.com/gj3mqrz.png

- How many Install buttons does an update need? Eh, let's choose two. Two's a nice number. And have that "Update All" button stick around even though all updates were already installed. https://i.imgur.com/p9QoCqZ.png

- Earlier today, though it didn't lend itself to screenshots, my Xcode "Install" vs "Installing" button couldn't decide what mode it wanted to be in and just bounced back and forth instead. Why choose one when you can be both!

- The "Check for Unfinished Downloads" menu. Do you know what happens when a download doesn't finish? Or why it's not presented in a sane way in the normal UI? Me neither. Instead of gracefully handling errors, let's choose to bury them in a secret menu that people will happen upon via StackOverflow searches when their installation keeps failing with no indication of why. https://i.imgur.com/xkgP0lc.png

A+ design work right there. Excellent choices.
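
For the curious: fractional origins really do smear glyphs across two pixel rows on a 1x display. A minimal sketch of the usual AppKit fix, in Swift -- the view, rect, and string here are hypothetical:

    import AppKit

    final class SearchLabelView: NSView {
        override func draw(_ dirtyRect: NSRect) {
            // A y-origin of 10.5 lands between device pixels on a 1x
            // display -- exactly the kind of blur described above.
            let proposed = NSRect(x: 20, y: 10.5, width: 200, height: 17)

            // Snapping the rect to the backing store's pixel grid keeps
            // the baseline on a whole pixel, so the text stays sharp.
            let aligned = backingAlignedRect(proposed, options: .alignAllEdgesNearest)

            ("Search" as NSString).draw(
                in: aligned,
                withAttributes: [.font: NSFont.systemFont(ofSize: 13)])
        }
    }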


This is exactly my point. Apple has been making these choices for a while, but they're being swept into a convo about bugs, and thus capacity/engineering. I think atp.fm dips into this for a second in their most recent (yesterday) podcast episode.


> There's just no way in hell Steve Jobs would be putting up with this

Says everyone who disagrees with any decision Apple makes. "Steve would have had the same opinion about this that I do!" Statements like this are just you projecting your own opinion onto him.


OSX doesn't need to handle that many hardware and driver issues, so I don't see how that's relevant to a Windows comparison.

I've been using OSX since just after Panther. I generally agree with the idea that some things started getting worse after Snow Leopard, but I still don't think it's come close to a point where I'd actually move back to Windows or try out desktop Linux.

And I'd say Windows had far more issues than BSODs and memory problems. I've used Windows for music production for years (by the time I switched everyday stuff to OSX, I was locked into my music workflow and haven't cared to spend the time learning a new package like Logic, even after all these years). The way I survive Windows problems is pretty simple: never plug in an ethernet cable. I'm sure things are far better now, but for a large part of the past 15 years, doing so opened you up to a lot of problems and required utilizing software you simply should not have to install in order to have a functional system.

Also, a high percentage of OSX users have no idea what XCode is, let alone care if it's not as nice as VisualStudio.


It's sad that so many people, myself very much included, now stick with OS X merely because "it's not quite shitty enough to switch". When I switched to the Mac initially, I switched because it was a vastly better option than XP; now, I feel that lead has been eroded and that Windows 10, while different, isn't far behind OS X in most respects.


As far as workflow and window management goes, OS X is a good half decade behind Windows, and in a lot of system-level ways, it feels a decade behind. I use a MBP Retina at work, and it feels primitive compared to Windows 10, which I use everywhere else. Everywhere else, I'm doing audio, video, and photo editing though. If your day to day work is more CLI and *nix oriented, I can understand the appeal of OSX... although some of the variations of Vim that I see people running might as well be GUIs.


Windows 10 is - in my limited experience - awful. Admittedly I don't use it for much more than gaming, light browsing, and occasional ssh sessions, but it's painful to use, the apps aren't great, and it frequently craps out with weird error messages (e.g. "The required TCP protocols not installed on this machine" actually meant "The NAS isn't responding").


I've found the opposite; it's been rock solid for me.


I switched to OSX originally because the company I was working for was all-Apple. I probably wouldn't have thought XP was shitty enough to change otherwise, even though it obviously was.

Conversely, I'm now working at a company that's all-MS, but even after two years now (albeit, only a month on W10, previously on W7), I'm still not feeling much of a desire to switch to Windows at home. My Macbook is getting pretty old now and I'm going to want to replace it sometime soon and it's going to be another MBP.


The funny thing is I actually switched from OS X to Windows XP, primarily because Apple made me angry when they refused to do anything to continue Classic app support. (If I'm going to lose all of my favorite apps anyway and have to start over, might as well start over on the OS that has 10 times more apps, right?) (Also, I still hold a grudge over the "free forever" .Mac service.)

I wouldn't say Windows 10 is behind OS X at all. In some contexts, like a corporate workplace, it's at least a decade ahead, and always has been. (Then again, Apple and Mac fans generally discount that environment entirely.)

The biggest problems Windows 10 has are:
1. Crummy HighDPI support (and yes, they've been working on this, but the work is WAY too slow-- this should have been solved 5 years ago, guys)
2. Crummy third-party apps, made by developers who have no respect for the OS or its users
3. The new "constant updates, and occasional ads" philosophy Windows 10 is taking. I wouldn't even mind the ads much if they weren't so stupid. (Stop trying to sell me the copy of Office 365 I ALREADY OWN!)


> The way I survive Windows problems is pretty simple: never plug in an ethernet cable. I'm sure things are far better now, but for a large part of the past 15 years, doing so opened you up to a lot of problems and required utilizing software you simply should not have to install in order to have a functional system.

I don't think that's really true. From 2K onwards, Windows was pretty solid if you kept it up to date. I never used any firewall/antivirus/etc.; I just disabled unneeded services (admittedly the defaults in 2K and XP were poor), didn't run executables attached to suspicious emails, etc.


> Also, a high percentage of OSX users have no idea what Xcode is, let alone care whether it's as nice as Visual Studio.

I really liked Xcode, a lot more than Visual Studio, until they remade it to look like iTunes (around 2012).

So there is definitely some preference involved for those people who say Visual Studio is better (and of course, vice versa).


Xcode 4 was a definite backward step in UX terms, in my view, from which the product has yet to recover. Xcode 4 has the dubious distinction of being one of only a handful of products that I've used daily for a good period - in Xcode 4's case, 20 months - without ever finding a way that I could be happy using them.

(Sometimes I just throw my hands up and decide that something, whatever it is, is just never going to be my cup of tea, and that's that. I've done that with a few software packages and/or styles of working. But in Xcode's case, I'm pretty sure it's them, not me. Because Xcode 3 was fine...)


Regarding your last point about many people not knowing what Xcode is: I assume GP was theorizing that Apple's declining software quality is related to, or even partly caused by, their bad development environment and tools compared to competitors'.


Just curious: what software were you using for audio production under Windows?


> Windows got so much flack over the years. It wasn't the prettiest but it worked and did what it said.

No. Just No. Windows loves to "forget" things. Things like Bluetooth devices. Or wifi devices. Windows likes to update your laptop for you when you're trying to close it ("Don't turn off your computer..." Wait, what? I have a plane to catch!). Windows scatters files all over the place! And that registry. UGH!


> Windows scatters files all over the place!

Oh look, OSX has touched this USB stick and vomited its dotfiles onto it. Did it ever write to the stick? Nope, it just needs to throw those trash dotfiles onto every. single. thing. it. sees.
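
If you ever need to scrub a stick before sharing it, a brute-force sweep is straightforward. A minimal sketch in Swift, assuming the volume is mounted at /Volumes/USBSTICK (a placeholder path):

    import Foundation

    // Recursively sweep Finder's .DS_Store droppings off a mounted
    // volume. "/Volumes/USBSTICK" is a placeholder mount point.
    let root = URL(fileURLWithPath: "/Volumes/USBSTICK")
    let fm = FileManager.default

    if let walker = fm.enumerator(at: root, includingPropertiesForKeys: nil) {
        for case let item as URL in walker where item.lastPathComponent == ".DS_Store" {
            try? fm.removeItem(at: item)  // best-effort: skip anything we can't delete
        }
    }

Finder also litters non-HFS volumes with ._* AppleDouble files; extending the name check to that prefix catches those too.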


Grr. You can always tell when somebody has been on the NAS or done any development work on a branch checked out of source control on a MacBook...


The development work thing isn't the fault of Apple, but rather the stupid developer who didn't pay attention to their commit.


I think there's plenty of blame to go to both of them. There's no defensible reason to put dotfiles in every single folder, and there are many, many reasons, of varying vehemence, why it may not be desired.


When you close your laptop, Windows goes into standby. It should only install updates when shutting down, and it tells you that beforehand. So it "did what it said" ;)

> Windows scatters files all over the place!

Just have a look at your Library folder on OS X. Also, some programs don't use that but create a dot folder in your home directory instead. Maybe I'm missing something though? I haven't used OS X much lately.


Library is pretty well organized. Settings go in Library/Preferences, cache files in Library/Caches, persistent files that don't need to be exposed to the user go in Library/Application Support, and they're all organized by app name or bundle ID.

Apps which use dot folders in the home directory need to be smacked until they stop, but the OS can't really control where (non-sandboxed) apps put things, it can only establish conventions and encourage apps to follow them.
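
For what it's worth, the convention is also baked into the APIs, so an app never has to hard-code a path. A minimal sketch in Swift; the fallback bundle ID "com.example.MyApp" is a placeholder for code running outside an app bundle:

    import Foundation

    // Resolve ~/Library/Application Support/<bundle ID>, creating it if
    // needed. Force-try is for brevity in this sketch only.
    let fm = FileManager.default
    let appSupport = try! fm.url(for: .applicationSupportDirectory,
                                 in: .userDomainMask,
                                 appropriateFor: nil,
                                 create: true)
    let bundleID = Bundle.main.bundleIdentifier ?? "com.example.MyApp"
    let appDir = appSupport.appendingPathComponent(bundleID)
    try! fm.createDirectory(at: appDir, withIntermediateDirectories: true)

Preferences work the same way: NSUserDefaults persists a plist named after the bundle ID into Library/Preferences without the app choosing a location at all.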


I can't look it up right now, but I think it's called "Application Support" or something? Anyway: when I was writing a program for Mac OS, I had to choose the name of the folder myself. Using the app name or bundle ID seems to be just a convention; I don't see the difference from %AppData% on Windows or ~/.config on Linux.

Also, I remember a tool I used when I had a MacBook which helped you remove all of an app's files when uninstalling it. I think it was this one: http://www.macupdate.com/app/mac/25276/appcleaner I remember some programs even putting files in directories other than Application Support.


> There's just no way in hell Steve Jobs would be putting up with this and I wish he was alive to tear some people a new one.

Remember Antennagate? The iPhone 4 had a serious hardware fault that significantly degraded its signal. Jobs himself was the one who said to a customer, "You're holding it wrong."


The software that had "the focus of Steve" was generally very high quality. However there were a number of rotting products at Apple while he was still in charge.

The problem now is that none of the SVPs seem to have that attention to detail, or they're stretched too thin across too many things.


Give me a break. Steve Jobs put out plenty of crap products and buggy software over the years. Don't pretend like he had a perfect record just because he's dead.


I haven't read much about Jobs, but I wonder how he achieved such perfection in his products. Did he force the teams to work overtime to correct the bugs? Did he double the size of QA to catch them all? Did he demand that all checked-in code meet incredibly high standards, thus forcing the devs to QA more of their own code?


With Jobs, it wasn't so much what you have right now as it was where he was taking you. To the bright, beautiful future where everything was different and better. One could overlook the imperfections because the vision was so appealing.

Now that he's gone (and without anyone with his charisma to take over), the public is left to contemplate the ignoble reality of the current product sitting in front of them. The sense of wonder and possibility is absent.


They were far from perfect. For years I've seen people struggle to figure out something as basic as creating a playlist or syncing to a new device. Apple's software has perhaps always been better than its Windows counterparts, but it has always fallen short of the "intuitive" goal it aspired to (and, frankly, bragged about).


This is the best explanation I've ever seen. Just lots of focus, attention to detail, and iteration until they got it right:

http://inventor-labs.com/blog/2011/10/12/what-its-really-lik...


He went on stage and gave a slick presentation that made everyone forget about the bugs. Users judged the book by its well-kerned, prettily animated cover.

This is the same old tripe about the younger generation's moral decline, seen through a rose-tinted rearview mirror.


Imagine how many "this never would've happened if Steve were still alive" articles would be written if the iTunes 13 installer were to delete users' entire hard drives like iTunes 2's did (http://www.wired.com/2001/11/glitch-in-itunes-deletes-drives...).


Exactly this. Snow Leopard was a great release in terms of quality, but Leopard wasn't.

Many releases had critical flaws that needed a point release in less than 2 weeks.


The Steve Jobs method: hire really good people and inspire them to work to the best of their ability.

Apple UIs were better because they followed ESR's advice at a time when the rest of the world was smashing every new option into the right-click menu. Occasionally I'll still hear someone say, "Oh, just put that in the right-click menu." The right-click menu isn't a trash bin for things that don't fit elsewhere. In any case, here is ESR's UI advice: http://www.catb.org/esr/writings/cups-horror.html


I agree with a lot of the sentiment in this discussion and from the article. It feels like Apple software is getting worse from my perspective but I wonder if a change in the target market has a part to play. Perhaps the HN crowd is not a significant Apple demographic any more?

When I got my first Mac back in the PowerPC days it was definitely a step up. In the past I feel that Apple was targeting power users. Today I think they are going after casual users. Probably capitalizing on the general popularity of iOS.

I've still got a couple of old Macs but they're all running Windows now. Visual Studio still has the occasional lock-up whatever hardware it's on. At least you don't have to sign in to an app store to update it though.


Windows handles maybe an eighth of the hardware that works in Linux.

Visual Studio is nice, if you're not writing in Rust, Perl 6, Ruby, Objective-C, D, Scala, Smalltalk...


And what IDE on Linux handles all those, without installing 30+ plugins?


Certainly not Visual Studio. Maybe it'd be taken seriously again if it were ported to the platforms where development is happening nowadays.


The cult of Jobs, exemplified. There were absolutely issues under Jobs, but somehow his sheen made people ignore them.


> Visual Studio is a dream, if you're into that ecosystem. MS dev tools are actually very nice

Most likely because of dogfooding.


Huh? Apple devs use Apple dev tools, and they are still terrible.

Like it or not, MS tends to be very good at productivity software, be it Word, Excel, or Visual Studio.

