I keep seeing reports of stuff breaking as a result of upgrading to High Sierra. Does anyone know why this new version of the OS breaks so many different things?
This happens with every release of OS X; it's why most long-time OS X users are so cautious about upgrading when a new version arrives. Here are a few examples.
Yosemite: https://fieldguide.gizmodo.com/the-worst-bugs-in-os-x-yosemi...
Mavericks: https://www.wired.com/2013/10/mavericks-issues-and-fixes/
El Capitan: https://www.imobie.com/support/mac-os-x-probelms-and-solutio...
Personally, I remember this as far back as the first upgrade I participated in, from Leopard to Snow Leopard: http://www.ilounge.com/index.php/backstage/comments/problems...
I had trouble finding lists of Sierra's issues due to naming conflicts with High Sierra, but off the top of my head I think that was the version that broke Homebrew's permissions scheme, requiring manual intervention on the part of many users.
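For anyone who hits that one: the manual intervention was, if memory serves, a one-line ownership fix. A minimal sketch, assuming a default Homebrew install under /usr/local:

    # Sierra and earlier: take back ownership of the whole Homebrew prefix.
    sudo chown -R "$(whoami)" /usr/local

    # High Sierra won't let you chown /usr/local itself any more,
    # so fix up Homebrew's subdirectories instead.
    sudo chown -R "$(whoami)" "$(brew --prefix)"/*

(brew --prefix just prints the install prefix, /usr/local by default.)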
An alternative explanation, likely unsatisfactory but possibly correlated with the continued growth of social media such as this very site:
Stanford linguistics professor Arnold Zwicky coined [the term "frequency illusion"] in 2006 to describe the syndrome in which a concept or thing you just found out about suddenly seems to crop up everywhere. It’s caused, he wrote, by two psychological processes.
The first, selective attention, kicks in when you’re struck by a new word, thing, or idea; after that, you unconsciously keep an eye out for it, and as a result find it surprisingly often.
The second process, confirmation bias, reassures you that each sighting is further proof of your impression that the thing has gained overnight omnipresence.
–"There's a Name for That: The Baader-Meinhof Phenomenon..."
I think there is another psychological factor at work, which is that people feel more critical toward, or at least more free to express criticism of, very successful people or orgs.
When Apple was struggling for success against Microsoft a decade ago, it seemed like a lot of tech folks gave them a pretty big benefit of the doubt. OS releases had problems then, but the reaction was usually balanced by a sense of, "this is progress though."
Now that they are the most valuable company in the world, there's not really any benefit of the doubt anymore.
I think a similar effect was at work in the 2016 U.S. presidential election. The widespread assumption was that Clinton was by far the stronger candidate, so there was very little benefit of the doubt for anything.
In 2008, Apple wasn't exactly struggling. The iPod reached its peak in December 2008, the iPhone had just been released, and Apple's market cap had recently topped $150 billion.
I think the name for that is "Survivorship Bias" (https://en.wikipedia.org/wiki/Survivorship_bias), though it doesn't inherently have to do with being critical of noticeable/successful entities; just focusing on them.
I have a mid-2014 MacBook Pro (top of the line, discrete GPU and all) and some recent update has made it really slow (not the Meltdown updates, something before that). It now takes an age to switch workspaces, to load the task switcher, to change applications, etc. Stuff also seems to go wrong more: app crashes, iTunes refusing to open, Finder crashing, etc.
I feel like Apple are pushing me away. My mac is still better for me than a Linux machine, but it's getting close now.
The thing with this is that I don't really think Apple cares that much. Apple is a phone company now, and OS X and Macs feel like an afterthought.
It wouldn't surprise me if in the next few years Apple slowly winds down all things Mac (marketing, updates, upgrades, etc.) and, while doing that, makes Xcode Windows-compatible.
This is not normal behavior. Sounds like it may be a hardware issue. I'd first back everything up, create a High Sierra USB boot disk and wipe and reinstall the OS and apps. It's possible something went bad with the filesystem migration, and if that's the case, this should fix it. If not, then it's a hardware issue.
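If it helps, here's a minimal sketch of making that boot disk with Apple's createinstallmedia tool, assuming you've downloaded the High Sierra installer to /Applications and your USB stick is mounted at /Volumes/MyUSB (a placeholder name; the stick gets erased):

    # Write a bootable High Sierra installer onto the USB stick (this erases it).
    sudo /Applications/Install\ macOS\ High\ Sierra.app/Contents/Resources/createinstallmedia \
        --volume /Volumes/MyUSB

Then boot with the Option key held down, pick the installer volume, and use Disk Utility to wipe the internal drive before reinstalling.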
I have to admit to the same suspicions over the years, but then the question is: what supported/reliable solution would they implement for iOS development? Sure, there are options for developing for the iOS platform outside of Xcode/macOS, but something better would have to be introduced before EOLing macOS.
Do they simply introduce macOS as an independent operating system and/or license vendors to produce hardware to a certain specification?
Tragically I think this is the answer - all of Apple's talented developers have most likely shunted over to iOS and the quality of macOS has been left to rot.
What's infuriating is that High Sierra, as its naming scheme suggests, is supposed to be minor improvements on top of plain Sierra, but it breaks so many things that I am drawn to compare it to Vista: half-baked, unfinished, should never have been released. And yet Apple have managed this kind of incremental release well before; Snow Leopard (my all-time favourite OS) and Mountain Lion were both successful. High Sierra is a train wreck. I am holding all company laptops back from upgrading because I can't trust the thing.
I hear this all the time but I’m not sure it’s true.
iOS is in the exact same place. The number of weird bugs and subtly broken interactions I’ve seen has been growing exponentially.
Even as I type this my Safari address bar is pushed halfway into my status bar, making them overlap.
A couple of weeks back my phone became unusable in the middle of the Everglades while I was relying on it to get back to my hotel. A message notification got stuck on screen, and it blocked input to anything system-related.
I couldn’t even turn it off, because the strange new reset procedure isn’t exactly discoverable and the normal slider couldn’t be interacted with!
Those are just anecdotes, but it happens often enough that I’m sure most users of iOS 11 will recognize it as just par for the course on iOS these days.
I'm not saying the developers actually are moving, or that it would make any difference to the buggy software if they did, but Apple can shift iDevices by the shipload, faster than they can build them. They've become a phone company first and foremost, and it feels like their computer lineup is suffering as a result. As such, I can easily believe that developers see no future in macOS and are willingly moving over to iOS.
> all of Apple's talented developers have most likely shunted over to iOS
This gets trotted out a lot, but I have trouble believing it. I don't think Apple considers developers to be in short supply--and if they do, it's very stupid of them and they should reexamine that thinking.
I know that demand > supply for skilled developers right now, but Apple has enormous amounts of money in the bank. Can't they use a tiny fraction of that to get themselves more, or better, developers? I doubt that shareholders would notice or care if a few million dollars went towards a hiring blitz for, say, 100 really good engineers, with salaries so competitive that Apple could poach them from wherever else they work.
I'm not saying that "throw bodies at the problem" is always a good solution, but Apple can easily afford to throw really, really talented bodies around, and if short-staffing is their problem, it seems like the solution is obvious.
Am I missing something here? Would adding a bunch of beyond-competitive senior developer salaries dent Apple's numbers more than I think they would? Are there really just not that many developers willing and able to work on this at any price? If not, is the problem with "willing" or with "able"?
I'm not saying Apple themselves have shunted the developers; I'm thinking that the developers have moved jobs of their own accord, since they see no future in macOS. It's been obvious for a while now that Apple only considers iDevices their money-spinners. Their actual computers have long been neglected, and their updates for the last several years have mostly added more iDevice-like features, so I would think the macOS developers could readily believe Apple themselves aren't wholeheartedly behind maintaining their desktop OS. It just feels like there is very little willingness from Apple to build a good desktop OS any more, and no sense hiring on as a macOS developer as a result.
Money won't be enough to attract the best developers. The best developers want to work on the new and shiny. Apple is in maintenance mode as far as software goes; no judgement, but working on the next iOS release is nothing exciting. On the hardware side there is still some exciting stuff going on.
I suspect the answer is more that Apple refuses to grow its teams beyond a certain size. It's not a lack of money or talent; rather they value a certain way of working (look at all the stuff about collaboration in the new building). But the size of the company and product range has pushed those teams beyond their limits.
It doesn't matter as much when an iOS upgrade breaks stuff as when a macOS upgrade does. The iOS market is so strong that developers rush to make their software compatible. Within a month of the iPhone X's introduction I had dozens of apps whose update notes just said they were updated to support it better.
Mac app updates come a lot slower, if they come at all.
Fortunately, the only apps I really care about besides the browser are developer related tools. Those are usually updated relatively fast on Macs and Windows.
iOS 11 didn't break things because of API updates. It literally broke phones. I had three friends switch to Pixel because of iOS 11 bugs, like not being able to make calls. Apple told them all to upgrade to a new iPhone.
If that were a widespread problem, it would have been another "gate", like "BatteryGate", "AntennaGate", and "Bendgate". There hasn't been a "CallGate".
There also haven't been widespread reports that Apple told people who couldn't make calls on their phones to buy another phone; if that were true, there would be yet another class-action suit. The lawyers would be salivating.
I wrote this in another comment, but it's relevant here:
> It wouldn't surprise me if in the next few years Apple slowly winds down all things Mac (marketing, updates, upgrades, etc.) and, while doing that, makes Xcode Windows-compatible.
I'd like to see Linux support, but unfortunately it wouldn't surprise me if that never happened.
> The Mac line may not be Apple’s biggest seller, but it’s not like it’s losing them money. It’s a huge revenue stream.
Totally, though one that's only getting smaller as a % of their revenue. At some point it might start to seem like a distraction.
I've seen companies drop profitable products before, even those that were a larger % of overall business than Mac is to Apple, so it's not clear that it won't happen. I hope it doesn't, but it might :)
> Also don’t underestimate the impact of the fact that every employee at apple uses Mac.
It will be interesting to see what happens here. Could be an early warning sign of things to come.
When Steve Jobs returned to Apple in 1997, there was a huge backlog of bugs and deferred maintenance in system software, presumably because resources had been diverted to the failed Copland project. Jobs addressed the problem immediately, and the result was a series of very solid OS releases, starting with Mac OS 8 in July 1997. Ironically, Apple seems to have come full circle. If Apple has their priorities straight, this year they will invest a chunk of their repatriated cash in Mac software.
Jobs returned to Apple in July 1997, the same month that Mac OS 8 was introduced. Jobs had no real interest in classic Mac OS before OS X; he used a NeXT machine until OS X came out.
Been there, done that. A real solution would be to stop hiring new developers, to reduce the size of the current dev team, and to reduce the size and responsibility of the QA team in order to force developers to test their own code; to eliminate any new feature requirements, to prioritize regression testing (migration, updates, etc.), and to allocate a whole release to trimming the pile of bugs in Radar.
Why can’t Apple do that? 1. Pressure from investors: “we want to see new stuff”. 2. If they stop hiring new “talents”, that would benefit other companies, including competitors. 3. You can’t hire good developers and ask them to fix bugs for a year, because they would run away from you, which would compromise 1 and 2.
It is a tough problem for Apple; it ain’t easy to fix. In other words, Apple needs a Steve Jobs. It needs to focus and disconnect from the world for a little while, no matter the short-term consequences. If you’re confident enough that you’re doing the right thing for your customers, it’ll pay off in the long run.
> You can’t hire good developers and ask them to fix bugs for a year because they would run away from you
Really? I've never experienced that in the workplace. However, I'm not that widely travelled and not in SV, so I guess I'd be "unsurprised but depressed" to learn that this degree of idiotic entitlement is common. And idiotic is exactly what it is: truly quality software is the product of lots and lots of bugfixing. It's like the "10% inspiration/90% perspiration" thing, but for initial development and bugfix/gradual improvement work. Especially on huge (OS-sized) software, people hiring on should not be surprised that this is the bulk of their day-to-day.
I get that greenfield development is more fun up front. I just (perhaps naïvely) hope that most professionals understand that a) "fun" and "good for the product" aren't the same thing, and b) that it can also be very personally rewarding to spend a long time fixing bugs (yours or others') and see the quality of/user happiness with a product noticeably increase.
Also, I think being able to say "I fix bugs to maintain an operating system used by millions" is at least as rewarding as being able to say "I make buggy websites for startups whose marketing buzz reaches millions".
I think we're on the same page. I was simply highlighting things I witnessed from my past experience. Bugs (in Mac OS) get really nasty sometimes and frustrating for the most part. It is brain-consuming, and the reward isn't always worth the energy spent. Management doesn't even reward bug fixing the way it rewards the "new" stuff. Ask any dev from that specific team (or even the iOS team) to fix bugs for a whole year and you'll see the reaction. Some actually do, at the cost of a "lower" overall performance rating. You can easily get assigned to bugs over "new" stuff if you don't hustle 24/7, but that's a whole new topic :)
But to add to my previous comment: only a few people get to implement new stuff each year; they don't test their code and fall under very tight deadlines, so they produce buggy stuff. This generates tons of new bugs constantly tracked by QA (QA doesn't fix anything, they just report problems), bugs quickly pile up in Radar, and developers are already assigned onto something else. Most of the bugs reported by customers are duplicates... they're usually already tracked, either not assigned yet or simply lower priority than something else. So what is the problem? Well, everything I mentioned ;)
I think we agree as well; my reply was meant to speculate on whether anti-maintenance sentiment comes from developers or higher up (culture etc.).
> Management doesn't even reward bug fixing
I think that's the moral of the story. It's not that developers are childish, stamp their feet, and refuse to do bugfix work; it's that they're incentivized to not do it.
I find this all quite hilarious, seeing as I'm running High Sierra on a 2009 white MacBook, and it runs like a dream. Granted, I only use it for some media stuff, but it works great on the oldest device Apple supports for High Sierra. Yes, I had to bump the HDD to an SSD and the RAM to 8GB, but it is still usable.