Broken (tyler.io)
1354 points by bangonkeyboard 42 days ago | 545 comments



Things are so broken here at Apple. I joined about 4 years ago.

I am awed by the fact that we manage to release any software at all, let alone functional software.

The biggest problem is communication. No one fucking communicates.

- No communication between orgs. Tons of bureaucratic tape to cut through just to get a hand on someone working on a different product

- Barely any communication between teams. Literally every group of 4 people is in a little silo with no incentive to go outside it

- Broken management structure. I have had many managers (a red flag in itself) but even worse none of the managers take suggestions from engineers. Everything is purely top down. If an engineer realizes there is a problem on a macro scale they cannot fix it. It is literally impossible to unite more than 1.5 teams to get anything done.

- So what happens is that you’re working on a product that’s part of another product but you never talked to any other teams or orgs on how to make your product fit in

- 10 different teams working on the same products and services. Zero unification means you are literally wasting developers and internally fragmenting every tool. Even worse, these teams compete for internal market domination

- Culture of secrecy means nothing gets fucking done. You file a bug report and you can’t even see it any more for some orgs

This is only the tip of the iceberg. There are fundamental and serious problems at Apple that no one in management gives a shit about solving. Any time engineers try to congregate or work on anything constructive with another team, they are shot down.

The only time I have seen cross-team developers working together has been to deal with critical bugs.

Because of the lack of communication, none of management’s goals align. They are all out of sync and poorly thought out. So year after year your manager has something they want you to implement but the feature for the year is bullshit because it makes no sense and is just there to pad the manager’s resume.

And you can’t speak out about this. Apple doesn’t take well to employees complaining. Even then, because of the lack of organization there is no one you could raise these issues with.


It absolutely isn't the individual employee developer's job or responsibility to try to fix corporate culture. Almost everyone on here or reading this is a line-worker developer, and trying to take on the job responsibilities of C-level staff is setting yourself up for disappointment and failure.

Any company larger than a few dozen people is entrenched - there will be a hierarchy, and the top will dictate the order of things. If they are paying you to write pointless software, then you can either be content with the paycheck (and probably a lot of free time at work, if nothing really matters), or you can go somewhere else to find meaningful work.

But seriously, the larger the company, the less you should ever think of yourself as some engineering talent who can change the system. You change the system in those circumstances by recognizing the failure, networking with your peers, and starting your own company to do it better. Assuming you didn't sign a deal with the devil by noncompeting your way into being stuck. You were hired to write code, not fix corporate culture - largely because most large corporations have layers of management dedicated to ensuring it is not fixed.


In my first job I was put into a team of about 5 people in a company of roughly 10,000 employees. After a year or two, another junior person joined my team and he started making contact with senior people all over the place, not quite C-level but only one or two levels down. Not even just in the software department but product development and in sales (to get an idea of user requirements). He sat next to me so I heard all the phone conversations (and winced!).

As time passed, it gradually became apparent to us mortals that he wasn't just being humoured - the people he was contacting came to respect him and even consider him their man "on the inside". He ended up doing pretty well - not promoted instantly to senior management, but certainly promoted faster than me (and ended up moving to sales).

Now, admittedly, lots of things are different from what your comment is suggesting. That company was big, but not as big as Apple (?), and intra-company communication was nowhere near as bad, by the sounds of it. This guy also didn't try to turn over the whole company culture; for the most part he just spoke to the right people to progress things within our team's products. And finally, a key ingredient was him; if I made a conscious decision to act like he did, I'm sure it would not have gone well.

But my point is just that you shouldn't always write off fighting the corporate hierarchy. For the right person, in the right situation, it can actually make sense.


My argument is more that if you do "fight" the hierarchy, you are doing it as charity. People are paid exorbitant amounts to analyze and structure corporate cultures efficiently, way more than any of the grunt developers will be. Trying to take on that kind of job responsibility without the compensation for it, even if you are the one in a million who succeeds, just means you did for free an amount of work that would, in professional terms, be worth a lot to the company, and they have no obligation to compensate you for your effort because it's not your job to do.

And that's all predicated on you succeeding, of course. You weren't offered the responsibility to fix corporate culture, and trying to do so in the first place often just serves to piss off your higher-ups, who feel you are disrespecting their authority to do it themselves.


Yes, it isn't, but who else will do that for them? Culture is only possible when people talk to each other and/or exchange thoughts in other ways. If normal people can't do that, there's no single culture; there are many little cultures. Remember that organizations are people, not some magical beings from a different dimension.


Culture is one of those things where once it's broken, it's usually easier to let it die than to fix it. Go join some smaller organization that doesn't have a broken culture and help it succeed. Once enough people do that, the small company becomes a big company, the big company rots on the vine and eventually goes bankrupt, and the cycle starts again.


That assumes that the big company doesn't have enough inertia to keep going on and on, strangling the small business or buying it outright (thus incorporating it into the broken culture, Borg-like).


This. Culture is one of the only things that every employee can change by themselves, just by going out of their way to find and talk to people. It takes work, but also takes absolutely zero permission or red tape, and it makes a huge difference. Talk to a new stranger at work every day, and in a month, you'll be solving problems nobody knew about.


This isn’t consistent with my experience. Imo the fish rots from the head.


disappointing to see this grey, because i think there's a lot of truth in here.

i think a lot of developers are used to wielding great power with technology, getting immediate visceral feedback, shipping, and whatever else. this gives them an impression that fixing people problems is just as easy - the equivalent of opening up the ol' IDE and rocking out for 8 uninterrupted hours, getting an MVP up.

the differences, though, are crucial. the compiler doesn't lie to you - you missed a semicolon; that's not the case with people. the in memory database doesn't have another agenda; again, not the case with people. the UI doesn't aspire to be something greater, or protect itself; managers often do.

i really appreciate that individuals try - i just don't think it's really worth it, if the org is > 15 or so people.


People really hate to acknowledge that working for most corporations, especially big public tech giants, is not them being welcomed in to change the world and blaze trails but to write code for their boss.


> Assuming you didn't sign a deal with the devil by noncompeting your way into being stuck.

Well, in California it's basically impossible to enforce a non-compete, so there's that going for anyone who wants to do this as a current Apple employee.


I think the issue is that culture at Apple is very much not supposed to be this, and this probably isn’t what the C-suite is intending to push. So you’re not really going against them; rather, you’re going with them but against the current status quo.


If upper management wants the culture to change, they will take the action necessary to change it. The last people to be powerless to inflict change on a corporation are the executives running said business.


You should do some reading about the idea of the "frozen middle", which is pretty much what's being described as the problem. It's as resistant to edicts from the top as ground-up initiatives from the bottom.


Thanks for pointing out the proper term for this disease!


First upper management has to realize there's a problem. Often they're so insulated from the day-to-day operations that they're clueless.


Well, the GE story in the Business Adventures book clearly shows that, sometimes, they are utterly powerless


In the sense that everyone wants to have their cake and eat it too, sure. But really, everything anyone wants is stack-ranked, and their management prefers the Apple demo-to-the-world format long after the luster has gone, in no small part because of the secrecy, silos, and paranoia it requires. In a financial sense their management is right, and an employee wanting to be part of something less bland and Oracle-like is wrong.


> Any company larger than a few dozen people is entrenched - there will be a hierarchy and the top will dictate the order of things.

To a point, but have you ever tried changing something purely through a dictate from the top? People will just say ‘Yes boss!’ and then keep working exactly as they had been.



This is what unions are good for.


When I worked there under SJ, in the Mac OS org (then under Bertrand Serlet), it was sort of open amongst the org itself. It was really easy to walk to someone's office and strike up an interesting conversation. Many late nights were spent working through collaborative problems. Or randomly, I had a friend who would pop by my office and spend hours explaining how he figured out some complex JavaScript compiler bug of the day.

It always felt like we were on a mission to ship Mac OS together. What Apple did do back then was create these special versions of the OS that had a few key hidden/secret products that SJ was going to demo, like iTunes or iPhoto. So while I could install the latest internal developer build of the OS, it would have a feature or two missing. You would then get radars that mentioned the code-name and explained a bug that you had to fix for the feature, but you had to fix the bug blind and send the bug back to verify. (Radars could never be closed until the original creator verified them.) The secrecy didn't really get in the way and it made for an interesting culture.

Then it all started to change when Forstall was promoted to VP of the iPhone effort. He took what was probably meant to be a short-term, secret launch-team culture and expanded it into this massive secret island in the company. The program office, and by extension the original founding engineers, were all promoted to management that expanded on the secret culture. I think if management had meant to open the culture back up to the same level as Mac OS in 2009, they would have been burned by Samsung and Palm WebOS making exact copies of the software coming out at the time. So the hyper-locked-down culture persisted, and SJ passed away. Then Forstall was fired and Federighi was promoted to replace him and merge the Mac OS and iOS orgs, finally killing off any of the remaining openness that once existed.

Then came all the ridiculous tools such as checking someone's security clearance when you had a meeting with them. [Apple Confidential] :-P


I’m on an internal infra team, and even though we’re supposed to be making tooling for the rest of the company, it’s just as much of a clusterf*.

I’ve been interviewing with other teams, but with this disaster of a release and us demonstrating that our corporate values are just empty words in the face of opposition from China, it’s likely time for me to make my exit.

The overt sexism that I’ve been witness to in iCloud management certainly doesn’t help either.


> The overt sexism that I’ve been witness to in iCloud management certainly doesn’t help either.

Yikes! Is this something you could share the nature of?


There is shockingly little female representation, especially in management. I’m well aware software engineering skews male, and infrastructure even further so, but the numbers are absurd.

In one section of iCloud, there are zero female managers in a 200+ person org. In another, after recent re-orgs, there’s only one remaining female manager in a 600+ person org, and even then, she only has a small team of engineers. No representation in upper management whatsoever.

With multiple recent re-orgs caused by poor management, there has been ample opportunity to address this imbalance, but upper management has doubled down, cancelling nearly every project managed by a female manager before the changes, or giving their projects to male managers.

I raised this as a concern with HR months ago, but they have yet to take any action or even follow up.


I should note that most developers here really do care, and that’s probably why products can be released in the first place. You have to have really dedicated people willing to cut through the organizational bullshit to get things done.

All of the engineers I’ve met here are smart and innovative. If only we could organize, things would be much better.


What do you mean when you say "organize"?


Not OP, but I'm guessing they mean something like 'make sure the right hand knows what the left hand is doing'. Making sure the effort to produce something is done collaboratively by all teams instead of each team working in isolation without coordinating with others, with the risk of duplicated effort or one team tripping up others.


Or you know, things like unions.


I think they meant to say, ‘organise’.


This sounds largely as I feared. The organization was built to serve the will of an omniscient god-king who knew better than anyone else. And now he's gone, and the organization is still set up to make orders trickle down from on high (despite no-one qualified or interested being there to make those calls) and hasn't learned to make good decisions by communicating internally either. So the acolytes and high priests try to just keep doing what they were doing before god died.


If you read other insiders’ comments here, you can see that this is not the case at all. Under Jobs the Mac OS team was more open and collaborative, so probably this happened “by chance”, and the death of Jobs was probably a force toward closedness.


It sort of makes sense to me. The god-king likely demanded answers and accountability, that must make people cooperate somewhat. With stagnation, power plays and petty turf-wars take over.


Agreed. When SJ died, a power vacuum opened. His presence really was so big that there is no way a single person, or even a couple of people, could fill the charisma and power he wielded over the company. Anytime there is a vacuum, power plays and turf wars become the goal instead of delivering product.


> The biggest problem is communication. No one fucking communicates.

Elon Musk says something interesting about this here:

"Product errors reflect organizational errors."

https://www.youtube.com/watch?v=cIQ36Kt7UVg&feature=youtu.be...

He's specifically talking about how the product subsystems are effectively mapped out by the product departments and that they should try to interface with each other with minimal constraints.

But, my take was that there needs to be a LOT of communication between departments and an ongoing debate between them as well.

Edit: The more that I think about it...

This might be a big reason why Musk companies defy the odds, and why it is so difficult for incumbents to catch up.

The over the air updates of Tesla are a good example of hardware & software departments working together to make something very difficult to compete with (if you're a regular old school siloed company).


Not dismissing what Elon said, but it is commonly known as Conway's law: https://en.m.wikipedia.org/wiki/Conway%27s_law


Yeah, I’ve been trying to teach my organization that if the input to my team is crap, the chances of anything but crap coming out are fairly low.

  Crap >> Team >> Crap
We’re front-end, and basically the lowest part of the pipeline, so literally everyone’s crap gets dumped on us.


On two occasions I have been asked, — "Pray, Mr. Babbage, if you put into the machine wrong figures, will the right answers come out?" In one case a member of the Upper, and in the other a member of the Lower, House put this question. I am not able rightly to apprehend the kind of confusion of ideas that could provoke such a question.

    Passages from the Life of a Philosopher (1864), ch. 5 "Difference Engine No. 1"


When my friends at apple talk to me I feel any of them could write your post except...none of them would say "I am awed by the fact that we manage to release any software at all, let alone functional software" (well, maybe one would).

I'd like to work with these folks again,* but the incredible secrecy would bug me. I understand that some things have to be secret, and I don't feel I need to know what that group over there is doing, but I'd like to talk with my (NDA'd, same-company) friends about what I work on! So I don't even apply for a job there. But some people seem to consider it OK.

* The subset of friends there who used to be colleagues of mine, I mean.


I’m not surprised by that line. Honestly I think many employees would say the same of their employers if they could.


> - So what happens is that you’re working on a product that’s part of another product but you never talked to any other teams or orgs on how to make your product fit in

I'm surprised to read this, because I've always thought tight integration and clever synergies between product lines were precisely one of the things Apple excelled at.

And I was wondering how exactly they managed to make that happen in such a famously secretive organization, where very few people have the 10,000-foot view required to come up with these ideas.

How do you make Mac Catalyst or Sidecar happen with 4-person silos who hardly ever meet and adjust? How do you redesign the iOS Photos app around the capabilities of the new hardware, in a way that will make sense once software + hardware become a product? How do pictures of unreleased AirPods end up in recent iOS beta releases? I mean, at some point you've got to make these things work together, and one decision on one side that's oblivious to the other side's constraints might make things impossible for them, and they'll want to push back. This is how "normal" companies function.

One more example, not something especially clever but more something that would have been a huge bummer if it hadn't happened: it seems like the Pro Display XDR has charging capabilities way beyond what any current Apple device might require, and it's speculated that it's for the upcoming 16" MBP: https://www.macrumors.com/2019/10/04/16-inch-macbook-pro-96w...

Again, how do you even achieve that if people don't communicate? Through extremely well-defined internal interfaces and specs?


> How do you make Mac Catalyst or Sidecar happen with 4-person silos who hardly ever meet and adjust?

You don't. What happens is those features are broken or just barely work, and only once they become public can Apple finally get them working. Look at the new iCloud shared folders, which have now been pushed to next spring after early Catalina betas were wiping out people's iCloud.


The fact that GP is complaining about it may be an indication. Hiring good people who see the structural problems and work around them is what works: you can brute-force your way around poor management and a bad structure.

Edit: you can see this explained in one of snapple's sibling posts.


> How do pictures of unreleased AirPods end up in recent iOS beta releases?

By the OS Mastering team dropping the ball…


I feel like there are people who get around the secrecy/isolation veil; they just have long-standing friends in other projects and can rely on them to push features and bugs through the bureaucracy. If you don’t have that, well, expect your Radars to sit in “punted from this release” until the end of time.


Seems crazy to hire the best engineers and pay them top dollar, only to put up a bunch of organizational barriers they have to navigate if they want to get anything done.


The prestige from hiring very highly paid people is obviously higher than that attained when hiring lower-paid ones.

Managing 20 people on-site at Apple is different from managing 20 off-shore cheapos (even though they could potentially be just as skilled).


I've never heard Apple referred to as paying software engineers top dollar.


Apple absolutely does not pay top dollar for most engineering staff.


Or just escalate to your Product Manager.

It's not like cross-team communication doesn't happen; it's just via a formal process.


That's hilarious and cute. I'm pretty sure I'd be told off if I tried to push cross-project collaboration up the management ladder beyond making an introduction happen.

This thread is really helping me understand why I can never see eye to eye with anyone on Hacker News. What I had assumed would be engineering companies are companies with engineers but without engineering culture, and there is zero autonomy to be had without getting into management in an over-stratified org-chart.


And what do PMs tell engineers on their team once they're back from their super secret meeting with the PMs from the other team?

"Do exactly this, I can't give you context but it'll make sense on release day" ? Sounds like extreme Fordism (and basically a hellish way to work for creative minds) and a great way to prevent people from spotting problems early.


I'm on an internal tooling team @ PlayStation and man do I appreciate how much the company encourages cross-site and cross-team collaboration. Reading your comment made me cringe :/ so so sorry for you folks working at Apple.


I have a buddy who's an old Apple QA who tells me similar jaw-dropping stories about the silos and secrecy. When I marvel at how they can possibly ship all those wonderful products, he says it's done through grueling brute-force manual QA testing. Which also sounds insane, but somehow it has mostly worked through the years. But maybe they're reaching the level of complexity where brute-force manual testing is not scaling well enough.


yet, interestingly enough, Apple are never looking for manual QA testers, just Senior Automation Engineers. Or so it seems to me, a lone brute force manual QA tester who every once in a while wonders what kind of an effect he'd have at Apple.


From what I understand about Steve Jobs - he was the great integrator. He got lots of disparate folks to work together.

I mean, he got music folks talking to computer folks.

But I think he did this on a smaller scale every day, and that might have been his secret sauce for Apple.


I worked at Apple for a few years and he was not running around bringing teams together. At the senior leadership team level sure. But not at the engineering or middle management levels.

It sounds quite similar to what OP was experiencing i.e. lots of siloed, secretive teams across a very large, 100K employee organisation.


Hmm, Apple was nowhere near that big in Jobs' era. Everybody was on one or two floors of Infinite Loop 1 and 2.


Absolutely. Based on various anecdotes it seems Apple succeeded nearly in spite of him, but he surrounded himself with people like Wozniak, John Carmack, Tevanian, etc. - people whose confidence in what they knew could always win out against Steve's bluster. Steve brought people together, but also brought creative tension. He was also pretty good at spotting opportunities and trends. Apple really does feel rudderless without him. There's no coherence of vision between products. Quality doesn't even really seem to matter. Apple still operates in a very top-down manner, but it just doesn't seem as though there's anyone at the top with much in the way of interesting ideas. The Watch did okay, but was hardly a game changer. The earbuds are a success, but they can't be making that much on them... Steve's gone, and more's the pity for all concerned.


> No one fucking communicates.

We all laughed when Tim Cook said that Apple will double-down on secrecy[1]. Who's laughing now? Turns out he was dead serious.

[1] https://www.cnet.com/news/tim-cook-were-going-to-double-down...


I used to work at Dyson and it was a similar story. I think the secrecy thing is key. They went on and on about "protecting our secrets" (ha) and operating on a need-to-know basis. It just meant communication was really bad, and getting help from anyone who didn't know you was impossible.


Ever considered emailing Tim Cook?

Don’t throw your managers under the bus, stay optimistic, anchor your message in whole company success. You might be surprised at the impact you can have & the replies you’ll get.

Here’s a formula you can try...

Hi Tim,

At <company presentation> you talked about <goal>, which I’m excited about!

I/we’ve been experimenting with <new approach>. We’ve seen <results>. What other teams should I/we collaborate with to help Apple get to <Tim’s goal> faster?


Cook is a corporate bureaucrat. In the best-case scenario, he’d point you to your management chain, and in the worst, summon an inquiry into why you contacted him in the first place.


Two observations:

1) You seem credible.

2) There's not really a mad rush for questions.

It seems like the magic really is over. ( :/ )


Perhaps a silly question, but if it is so broken, then why do you keep working there? Is it purely a money-thing? Is there still hope that things can improve?

Not judging, but I am genuinely curious about what drives engineers to stay.


If things are so broken, as you describe, why not just pack up and go work somewhere else? Personally I love many of Apple's products but wouldn't be willing to put up with that kind of workplace just for the sake of it being Apple.


Over the years I've been surprised a number of times by seeing Apple hire a particular software engineer that I wouldn't hire for my own little company. You're still kinda junior @ 4 years there, but... have you seen any slippage in hiring standards?


Apple hiring is independent per team. You can literally interview forever at Apple, and the way each team interviews varies wildly. Had some pretty cool interview loops. I got rejected and then got an offer after interview 4 or 5 with a team on a high-profile app that I really liked.

Apple doesn't give good offers though, so I took a better offer at a company/team I also really liked.


Ah, interesting!


Yeah, this rings pretty true.


The best workaround I've found for this in large companies has been to sidestep the formal management structure as needed to actually get things done and working properly, and then give management credit for it.

The cleverest way to get rid of bad bosses (besides moving to another group yourself) is to get them hired by another company or promoted to irrelevance (though both strategies can backfire.)


Have you worked at a company of similar size before?


Was thinking... the complaints listed above sound very much like an aerospace manufacturer/defense contractor I worked for before. Little silos everywhere, further complicated by government security clearances. Getting anything done was like wading through mud.


Of all the elements you mention, the most deadly is the inability to speak out about the problems. The first step in improving anything is to acknowledge the areas that need work.


thank you. this really validates my decision to work as a backend Python engineer at a well-established startup.


What is a "well-established startup"?


It’s an enterprise full of hipsters that tell themselves they’re changing the world in order to be able to take the punishment (or to be ok with unethical work). :P


Probably a decent-sized one with a proven business model, or at least revenue stream.


That's Samsung.


And yet most of Apple's products are consistently best-in-class, so I guess it's working for them? And a bumper crop of bugs in a point zero macOS release doesn't count as a disaster. They've been shipping super buggy point zeros for two decades.


Fans like you are a major factor in why Apple won't change. They can do no wrong.


You do know that hardware and software teams are not the same thing, yeah?


The integration of hardware and software is and has always been one of the things that make Apple's products best-in-class. I'm not sure what your point is.


You are probably segregated away from where the real work is happening. This is by design, because you haven't proven yourself yet or they don't trust you yet. Good work is being done in Apple every day. You're just not a part of it and are not seeing it.


After four years? Seriously? Why would a bad hire not be laid off at that point? What's the difference between "real work" and "not real work"? How many engineers are on-site at HQ who don't do real work?


There might be teams working on "secret" things that are better, but as far as I can tell the "we don't trust you yet so you can't work on anything until you do" is largely a myth.


Yeah, like iOS 13, Catalyst, and Catalina. It’s all rather amazing. Ship on time, ship all features that were announced, heck, their hardware teams can barely keep up!


> Culture of secrecy means nothing gets fucking done. You file a bug report and you can’t even see it any more for some orgs

> And you can’t speak out about this. Apple doesn’t take well to employees complaining. Even then, because of the lack of organization there is no one you could raise these issues with.

Sounds similar to a cult.


No, it sounds like any large multinational. In the end, all commercial entities turn into the same thing where they only differ in branding, segment and origins.


Agreed. Any sufficiently large organisation is, by definition, staffed by average people.

Why would we expect that organisation, as a whole, to be above average?


Right. The problem is when you focus on and optimize for hiring excellent developers only. Then all the non-excellent people that contribute to the average are in middle-to-higher-level management.


> The final (well, first) Catalina release along with the outright awful public beta makes me think one thing. And that is Apple’s insistence on their annual, big-splash release cycle is fundamentally breaking engineering. I know I’m not privy to their internal decision making and that software features that depend on hardware releases and vice-versa are planned and timed years (if not half-decades) in advance, but I can think of no other explanation than that Marketing alone is purely in charge of when things ship.

I don't work at Apple, but this part hit home for me. In the past few years my jobs have revolved around shipping features at all costs with zero regard for engineering feasibility.

We all like to criticize CEOs for prioritizing short-term stock prices over long-term company goals, but I'm beginning to think the average employee or middle manager has even more perverse incentives to make poor short-term decisions. I've seen a lot of engineers and managers swing for the fences to deliver headline features that can't possibly be completed on time with any standard of quality, testing, or long-term support. It doesn't matter, though, if your goal is to add that accomplishment to your resume so you can pivot into the next higher-paying job elsewhere. After that, your mess becomes someone else's problem and you're off the hook. Up or out.


This reminds me of the fascinating NYT story about the lead up to the original iPhone announcement event: https://www.nytimes.com/2013/10/06/magazine/and-then-steve-s....

A choice excerpt:

> The iPhone could play a section of a song or a video, but it couldn’t play an entire clip reliably without crashing. It worked fine if you sent an e-mail and then surfed the Web. If you did those things in reverse, however, it might not. Hours of trial and error had helped the iPhone team develop what engineers called “the golden path,” a specific set of tasks, performed in a specific way and order, that made the phone look as if it worked.

I guess I post this here as a means to say, while what OP is talking about certainly sucks, Apple seems to have a long history of this. Doesn't make it better, and certainly what he's outlined seems extra bad, but not completely unexpected.


There’s a difference between a golden path for a demo of a product that will ship in 6 months and a public release that’s still very buggy.

And a public beta that will mess up your files on other devices (!?)


Great read! Thanks for sharing. Highly encourage anyone who is on the fence about reading the whole thing to go for it.

Steve, and others like him, do make me wonder. On the one hand, I work four days a week, never stay late at work, and live a good, steady life. But on the other hand, when I see these super-stars, these drive-people-to-the-edge, sleep-on-the-shop-floor types, and see how much change and drive they create, it makes me start to think that maybe I should work _much_ harder. But then again, I quite like all this time I have to think on things. And of course, we don't get all the details about how this style of work _really_ affects home life; I'm sure we'd have much less respect for these super-stars if we knew they _all_ had screwed-up lives away from work.


I know what you mean, the appeal of this total dedication. For myself, I've come to the conclusion that I could not do it for long, not as long as I'd have to. I think for those types like Jobs, Musk, it wasn't even a conscious choice they made to put all they had into their work, they are/were just driven. You'd know it if you were, too. They could never have done a four-day work week with zero overtime, it was never an option for them. So, enjoy your life as it is, this is yours, theirs is different, and don't think you're missing out on anything.


Reminds me of an anecdote I heard about starting your own company. It's great, you're the boss. You can work half days if you want. You even get to choose which 12 hours that will be.


Demos of new products always have golden paths, and even demos of production grade software have golden paths very frequently because maintaining complex software in a demo-able state is hard (i.e. historical reports need to show a plausible history without anyone actually using the system).


Indeed. Apple's MO for a LONG time has been, "The way things have always worked should always be rethought, and we've come up with a better way for you". Many Apple users on forums like this tend to be Apple apologists until they introduce that one change that's too much, and then it's all "Apple's lost their way!" when the reality is it's just Apple being Apple.


That’s not what the parent comment is saying at all.


I've been at Adobe at a time when they shipped Creative Suite, as boxes (even though downloads started to be more prevalent, it was still very much pre-packaged software). People don't give Adobe enough credit for what it was achieving back then - every 18 months, like clockwork, it shipped an entire suite of huge applications, that needed to work well with one another. And they couldn't rely on post-release fixes - because US accounting regulations made it impossible to ship fixes after the GM builds, if I remember correctly. And they did it. Reliably so - at least until CS6 (with the subscription, things changed, you can now ship fixes anytime). What's more impressive than the fact that they did it, is that from the engineering perspective, it was an "of course we'll do it" - there wasn't at any point any doubt that CSnext will be released, and will be released on time. And it had to be awesome - the entire company depended on it.

Thing is, there was nothing really special that Adobe did that I haven't witnessed elsewhere. Extensive testing, pre-releases/getting the most loyal users involved early. Waterfall, yes - but "waterfall done right" (I don't think the overhead was too onerous). What they probably did "specially" though, even though it sounds mundane and boring... was to relentlessly cut scope. It was no shame to do less than what you initially planned for - but it was a crime to not say ASAP that you might not be able to do what you promised. I know personally of a feature that didn't ship in CS5, at all, because it was deemed "not ready" (even though at the start it was deemed "required"); it almost brought down the development center that was supposed to ship it, and their biggest sin was not that they weren't ready, but that they didn't say so until it was too late.

The short version of all this, I guess, is that cutting scope can do miracles. I'm surprised that it's not used more widely - and somewhat saddened that even Adobe lost its skill at this (from my pov, the cloud has enticed the management to ship features that are not quite ready yet, and counter-intuitively, I think this actually slows them down now).


I work for a big software corp in SV with a yearly conference where we announce all of our products. Managers give zero shits about product quality as long as we deliver on time for the "big show". Bugs, future maintenance burden, and other shortcomings be damned, because they'll probably be long gone after they get their promotion (and probably at another company, rinsing and repeating), and by then these issues become someone else's. This has incentivized engineers to prioritize delivery over quality. Funny how this short-term growth mindset comes top-down from where it all starts: Wall Street. If we want to fix the system, we need to first fix Wall Street and bring back accountability.


Counterpoint: I've worked at SV unicorns that were not public, and did not intend to go public in the near term. And they had a similar "damn the torpedoes! full speed ahead!" attitude. So I think it is more pervasive than just Wall Street.


Regulation for publicly listed companies means that the bottom is actually higher than for VC-funded companies, where nothing of substance needs to be generally known. (wework?)


Everybody reads the same management books as Wall Street.


I don't think it comes from management books.

The problem is deeper, it's psychological. At the level where full time, professional management begins it starts to become very unclear how to judge a manager's success over short/medium periods of time. Yeah in the very long run "is the product a success" can be used to judge, but that captures a lot of people who aren't the manager and may take years to figure out. You need to evaluate their performance before then.

So people come up with heuristics, like "does this manager meet their commitments". But software is inherently unpredictable so the answer to this question is always random. This results in managers trying to look like they're doing a good job by forcing early releases: they are thus seen to be "meeting their commitments" and must be good managers, with the quality issues showing up much later at which point they can just shrug and say, well, all software has bugs. Unless a layer of management higher up digs into individual bug reports and investigates really deeply to conclude, no, you forced an early release before it was ready, they won't be held accountable.


It's the principal-agent problem coming home to roost in a world without principals. Everyone is effectively an agent because the principals are sufficiently "diversified" that they don't really have substantial direct investment in long-term organizational success. What matters most is the short-term boost that you can use to dazzle the next employer.

Every employee knows that they're going to stay for 5 years max because companies have made it impossible for there not to be a massive market disparity if you keep stable employment. There are no perks to long-term loyalty and thus no reason to consider an employer's long-term interests as identical to your own. The outcome of this is the predictable situation we see now.

Like it or not, the company's culture comes from the top down. If the boss doesn't know or care whether the product is substantial, people who do will eventually be replaced at least until the boss is duly insulated from them. No one wants to rain on the parade of the guy who holds hire/fire authority.

There's a wider thing here too, which is that the market doesn't know or care about any of the technical details either. You need the touted features to work correctly just enough to create sufficient ambiguity, probably about 10-15% of the time. If you invest enough engineering effort to get them working correctly 90%+ of the time, that's a great deal more money and time spent on engineering for no market benefit. A competitor who dumps that money into marketing and sales will come out far ahead because technical quality and reliability simply doesn't drive sales.


SV == Wall St. West.

Same short term thinking, same self-aggrandizement, same convincing yourself of how rational you are, when most actions are driven by emotions, but with more casual clothing.


Wall Street "incentivises" the people that are there to supposedly make them accountable. Personally, that's where I believe the problem begins.


"big-splash release cycle is fundamentally breaking engineering"

I strongly prefer Apple ship functionality incrementally. No more big bang.

Especially be more cautious with new kits (core libraries). Just one or two end user facing features on some fraction of hardware. Then expand over time.

With Apple's installed base, it's an engineering marvel there's so little drama. But I want no drama, which means waiting a bit longer. Which is fine.

Source: Ecstatically happy Apple fanatic.


> I'm beginning to think the average employee or middle manager has even more perverse incentives to make poor short-term decisions.

Quite a lot of that, in my experience, is driven by fear of losing their job in an at-will employment environment.


I was thinking on this the other day, and I think what ultimately boils down to is this: there’s no “why” anymore at Apple.

Go back and watch old keynotes with Steve. Almost always, whenever he’s talking about a new feature or piece of hardware, he starts with the “why.” Not everyone will agree with the “why,” of course, but it’s still given.

Why do we want to get rid of the CD drive on MacBook Airs? (We see OTA software updates and media downloads as the way of the future and it wastes space.)

Why do we want to migrate from PowerPC to Intel? (We need better performance-per-watt so we can build MacBooks that have better battery life and don’t overheat.)

Why do we want to not have a physical keyboard on the iPhone? (Because the buttons and controls can’t self-specialize for each application.)

There are obviously exceptions to this rule, but by and large it’s fairly accurate. Now, watch the most recent keynotes. Has there been even a single second dedicated to WHY we need Apple TV+? Apple News+? Apple Arcade? Thinner keyboard mechanisms?

No, and it’s because we all know what the answer is.


There's still a "why" to those product launches. It's just that the "why" is no longer design-driven and is instead driven by business needs - namely, "we need to diversify our revenue streams away from just hardware sales and into services." This started when Tim Cook (instead of Jony Ive) took over as CEO.


I think the actual turning point was when Scott Forstall was fired & Ive took over software design. Prior to that, Apple had been a software company first. Even Woz talks about making hardware so that he’d have the tools to make software.

Ive created some amazing hardware, but I’m excited to see where the next few product cycles go.


> making hardware so that he’d have the tools to make software

A very good point in the context of neglecting the Mac, the primary tool used to make software for all the other prioritized hardware devices.

If they lose the developer community, who do they expect will build software for iOS, or the iPad that Cook loves so much? Seems very shortsighted.


Yeah, this is exactly what I was alluding to when I said "we all know what the answer is." It's the needs of the business, not customers, that are being put first.


Brilliant summary on the state of invention at Apple.

I think Steve really did give them the edge with design and innovation. Right now there's just no leadership with that kind of bold intent on any one thing in particular. Apple has sort of blended in with other premium brands, and premium brands are copying a lot of what Apple does (Matebook X, every phone that copied the iPhone notch).

Are they re-architecting macOS to be more secure, as they've currently pushed for? Or are there middle managers at Apple who are measuring engineering by the number of commits their engineers do per sprint? Only time will tell - if they don't do this right (as with the butterfly keyboard), they really risk damaging their reputation as a premium brand with high quality.


I'm not sure if anyone has seen this yet, but Catalina is letting me login without entering my password!

I have two users on my machine.

1. I "lock screen" from the Apple menu and close the lid.
2. I reopen the lid and it does not ask for a password.
3. I start using the laptop and the lock screen suddenly pops up, but asks for the password of the wrong user.
4. I hit a random key, the screen goes away, and I can continue working.

Also, it looks like a lot of settings don't work on the lock screen / choose user screen. For instance, the pointer speed doesn't match what I have set, font sizes don't match either, and the resolution looks wrong.

All in all... it feels like Windows?


Despite having lower quality than it did a few years ago, Windows has login and locking features that actually work (and I don't really remember bugs there, like ever). So no, from what you describe it does not feel like Windows at all; it feels like some completely broken crap.


While on the subject, on Linux I've also noticed that Xscreensaver (or GNOME screensaver, can't remember which) sometimes goes straight into the desktop after wakeup for a few seconds before the lock screen prompt actually appears. You can even run programs. Really bizarre and feels like this issue has been present for a while now. Has anyone else noticed this or is it just me?


I think this is because xscreensaver confines (grabs) the mouse cursor to the area occupied by the "Please enter your password" window. If this area somehow suddenly happens to be out of your screen boundaries (e.g. when xrandr --offing external screens), the mouse is ungrabbed and can be used to interact with the desktop. It takes a while before xscreensaver notices and grabs it again. Someone should probably figure out the exact steps to reproduce this and report it.

As said elsewhere, locking X is really hard, and xscreensaver architecture doesn't help. This week I managed to crash xscreensaver login prompt twice, unlocking the desktop without entering my password, and that was the last straw, I switched to xsecurelock which separates the login entry into another process, making such bugs much less severe.

Unfortunately I can't reliably reproduce the crash. A hardware fuzzer (also known as a faulty ThinkPad keyboard) was involved, and I don't possess the device any more. I think what it did was press certain keys very often — the keyboard matrix is sampled at 125 Hz, so I'm guessing it was pressing the keys about 60 times per second, but I'm not sure which keys they were. If anyone manages to reproduce this, please do give me a shout. :-)
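For anyone who wants to try the same split-process approach, here's a minimal sketch of how xsecurelock is commonly wired up so it also covers suspend. It assumes a systemd/logind setup with xss-lock installed, and the startup file is just whatever your X session uses (e.g. ~/.xinitrc):

    # In your X startup, before launching the window manager:
    # xss-lock listens for logind's lock/sleep signals and runs the locker;
    # --transfer-sleep-lock holds the sleep inhibitor until xsecurelock has
    # actually grabbed the screen, so the desktop isn't flashed on resume.
    xss-lock --transfer-sleep-lock -- xsecurelock &

    # Locking on demand is then just a matter of asking logind to lock:
    loginctl lock-session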


Yeah, saw this the other day on an Ubuntu machine (running whatever the last release was, not LTS) I have hooked up to a TV. Flash of the desktop for a second or so before the login screen. Speaks to something fundamentally wrong with the way the whole thing works, I figure.


I had the same impression. That general feeling that the screensaver is like an app/overlay that is invoked only once the desktop is active. The speed of invocation also seemed related to how fast your machine is at launching general desktop programs...


>That general feeling that the screensaver is like an app/overlay that is invoked only once the desktop is active

That is how it is implemented (in X11)


It's REALLY important to distinguish between Xscreensaver and GNOME screensaver.

Warning: copy-paste this link, if jwz sees an HN referrer, you won't be happy.

www.jwz.org/blog/2015/04/i-told-you-so-again/


I run the real xscreensaver though (I was building it from the source until around this time last year, now I run the one from alpine but I don’t think they’ve done anything weird (I haven’t checked though))

I’m pretty sure I’ve seen what everyone is talking about and it’s bothered me a little too (granted, my machine is configured in such an undiscoverable way that just opening an xterm window is obscure enough to be nearly equivalent to a very short PIN, and then you need to know bash. Now that I’m not in college and don’t have anything important, that’s probably good enough even without xscreensaver.)


Computers were a mistake.


Indeed. I recommend a Butlerian Jihad followed by rigorous reliance on Mentats.

It's a joke but sometimes I really do wonder.


X locking is notoriously difficult.


I run xlock on suspend rather than on resume; it seems more reliable. In any case, locking is trivial to bypass by pressing Control-C a few times on resume to kill Xorg, hence why I also start X with "startx; exit" so that this drops to the login prompt instead of the shell.


I think this still isn't secure. Try "sleep 5; exit" and press C-Z. You won't be logged out. You probably want to use "exec startx" instead.
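To make that concrete, here's a rough sketch of what the two variants look like in a login shell profile (assuming ~/.bash_profile and a console login on tty1; the tty check is just illustrative):

    # Fragile: if the shell survives (e.g. the command is stopped with Ctrl-Z),
    # the trailing `exit` never runs and an unlocked shell is left behind.
    #   startx; exit

    # Sturdier: replace the login shell with the X session, so when Xorg dies
    # there is no shell to fall back into, only the login prompt.
    if [ -z "$DISPLAY" ] && [ "$(tty)" = "/dev/tty1" ]; then
        exec startx
    fi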


Thank you, I took a look at my shell man page and this is very helpful.


I'm interested to understand why. Is it because of the design of Xorg?


Yes, it's one of the reasons Wayland was created. Screensavers didn't exist when X was designed. https://www.phoronix.com/scan.php?page=news_item&px=OTI5MQ

> Right now with screensavers under X it's basically capturing the input and continually redrawing over the display.

> With Wayland, Kristian plans for the lock-screen to be part of the Wayland compositor. In having the compositor handle the screensaver role, it can ensure that no window can appear atop the screensaver surface, it can properly detect idling and grabs already, and has complete control over the screen. Unlike the X design, there wouldn't even need to be a screensaver "window" that's on top but the compositor could just keep painting a black screen. For those interested in a "fancy screensaver", a plug-in could be used or an out-of-process Wayland client for drawing whatever you desire.
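Not from the linked article, but as an illustration of how that compositor-side model looks in practice today, here's a minimal sketch assuming a wlroots compositor such as sway, with swayidle and swaylock installed:

    # swayidle asks the compositor about idle state; swaylock locks through the
    # compositor's lock protocol, so no client surface can be drawn above it and
    # there is no racy "grab the input and paint over the desktop" step.
    swayidle -w \
        timeout 300 'swaylock -f -c 000000' \
        before-sleep 'swaylock -f -c 000000' &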


That bug has been open for 6 years or more.


for the record I used to be able to see a bit of the desktop on windows after waking from sleep before the lock screen came on. can't remember if that's still true, though.


Ah yes, I remember that happening on Windows XP a lot when it was waking up, or between the screensaver and the login timeout (when they overlap); you'd even get the transition-to-welcome-screen sound, so you knew it had only just triggered the lock. I suspect if you are fast you can run something before it actually locks.


Recent Windows login issues I've had to deal with:

* Non-consensual insertion of Windows Update latency into my schedule. Often I don't mind. Sometimes, though, I really, really do.

* Said updates failing but giving no indication of failure other than taking infinitely long.

* Keyboard layout sometimes gets swapped back to QWERTY with no visual indication. This interacts especially poorly with stringent Active Directory 3-try-lockout policies.

* Network hiccups + active directory (or something) can cause login to spin indefinitely, requiring a restart.

* Login screen background occasionally changes to a random picture from my computer. Usually a wildly upscaled application resource. I haven't entirely ruled out my own clumsiness as a contributing factor, but I've also seen this in the wild, so it's at least a UX issue somewhere.

None of this is as bad, in a theoretical sense, as Apple's no-password fiasco, but it has resulted in a far larger footprint on my day-to-day activity.


There were some good bugs in the login part of Windows 2000/XP. Most of the ones I know of involve opening a browser while still on the login screen (my favorite was in the help for the screen reader), which runs as the System user that has complete control over the system.


IIRC this was only possible on Windows 95 and maybe 98.


They fixed most of them somewhere around XP SP2 or SP3, where they pretty much disabled help functionality on everything on the lock screen.

When Cortana was introduced, they had some issues with being able to launch the browser while the machine was still locked (though as the user that the computer was locked by). You couldn't do as much as with the previous bugs, as the lock screen still covered everything.


You could do this on XP too.


Wouldn't know about the Windows lock screen particularly, and I haven't been a Windows user since Vista, but such low-quality software and UX is what I remember being accustomed to during my time with Windows.

To be fair, it was also vista...


Windows 7 was the peak. Stable, simple to use, performant.

Of course it didn't have all the features that were introduced since, but it also didn't have as bad a UX as the more recent versions.


I've never seen a bug like that on Windows. Honestly, it feels like a lot of Mac users live in a bubble where Windows is a buggy pile of crap. Meanwhile most Windows users around the world are getting on with life on a stable OS with a great choice of hardware. While nothing's perfect, Windows is in a really good place at the moment.


As a Linux user I feel like Windows users live in a bubble where computers are more or less expected to behave like diseased wild animals rather than machines.


Good joke. Sounds like you haven't used Windows since ME.


I came up with that idea (Windows makes computers behave like organisms and not machines) during an internship where everyone was required to use corporate computers running Windows 10.

I haven’t used it on a personal computer since XP though so I guess you’re not far off.


Diseased animal fits Windows 8.0 quite nicely I think.


On the other hand, I could scale my external monitor perfectly with W10, while on Linux the answer is still xrandr. Which makes everything blurry. Unless you scale Gnome/KDE to 2x and downscale.. which makes everything slow.

Gah... i hate computers.


I spent 15 years on a Mac before switching (back) to Windows.*

I must say that Windows has... changed... In a good way. But that's not even the point. The point everyone keeps forgetting is that OSX is tightly coupled with native Apple hardware, while Windows has to work on a zoo of devices.

[*] I actually blogged about this here, sorry for a shameless plug: https://www.jitbit.com/alexblog/277-back-to-pc-after-14-year...


Except that I cannot open "Windows Search settings", since it crashes a few seconds after loading. And all the other shit. It might not be as bad as this Mac update - I don't feel confident doing a comparison - but I certainly take issue with "Windows is in a really good place at the moment".


Not OP but there will always be counterexamples for any software with a sufficient number of users. I've never experienced the bug you're referring to.


> While nothing's perfect, Windows is in a really good place at the moment.

Sure, minus the fact that my Windows box would be _spying on me_ if I wasn't a very technically capable person. That's a non-trivial driver for a lot of folks when they decide to pick Apple products. Or, at least, it was until recently with the iCloud / China debacle.


My brand new PC running Windows 10 will simply not wake up from sleep, requiring a hard restart. A google search turns up thousands of similar complaints (and no solution).


I had the same issue. A bios update was what fixed it for me.


You must not have ever used Windows 98.


It was released 21 years ago, and its extended support ended 13 years ago.


Since Mojave, waking my 2012 rMBP results in a flash of the desktop and its contents before the lock screen is shown. This sounds similar, if not a continued degradation.


This has been a thing for longer than that.


The huge spam of confirmation popups reminds me of the "debacle" when Windows Vista introduced UAC. It's kind of inexcusable that, a decade later, Apple has repeated exactly the same UI fiasco.


Report it to Apple and get the bug bounty.


Apple fanboy and software developer here. Also disappointed with the Catalina release. I had to do an NVRAM reset, boot into safe mode, talk to Apple support to solve iCloud issues, and also click away dozens of privacy notifications.

But I still really don’t consider Windows or Linux legitimate alternatives, because of the ecosystem lock-in.

Yes, macOS has its downsides. But what keeps me from even thinking about Linux are all these little niceties:

- Copy something on my iPhone, paste it on the Mac (still regularly leaves people speechless when they watch me do it: “What!? You can do that?”)

- My watch unlocks my Mac

- My desktop is always in my pocket, all files synced by default, zero config

- Start reading something in Safari, hand it over to my iPhone in a second and walk out the door

- All my browser tabs synced across my Mac, iPhone and iPad

- I can curate my TV watchlist on my Mac and it’s automatically available on my Apple TV when I get home that night

I could go on with dozens more of these little things that I can’t imagine living without anymore. I know that it’s probably possible to achieve most or all of this with a Linux/Android/Chromecast Setup. But every time I watch some of my friends do it, it just looks like so much work to set up and maintain.

The only argument I understand here is the joy of tinkering and that feeling of achievement when you get it to work in the end. Apple is a bit boring in that regard. A lot of the integration stuff just works (yeah, yeah, sure not all, but still more so than on any other platform I know).

I’m not 20 anymore and I just prefer to spend my time with other things these days than tweaking my OS.

So although I wish Apple would invest more into quality on macOS again, for me the walled garden just totally works, and I’ll stick with the lesser evil.


Those features are in no way exclusive. The clipboard sync? KDE connect does that for me between my phone, my laptop and my desktop (even though I use gtk desktops). All my files are in my pocket with syncthing, updated in real time through the filesystem watcher. Tab sync? Just log into Firefox, boom, done. All those things are set and forget - enable once, never think of them again. And they just work and keep on working.


The thing is that non-technical people aren't using those features on Windows/Linux desktops even when they are possible, whereas they are using them on macOS and iOS.

It's the barrier to entry and knowledge. I agree about tab sync, but most things take a degree of setup, and you also have to know the feature even exists before you can set it up. Yet even casual Apple users seem to know about and use these features.


I don't understand why people update straight away. Leave it a few weeks/months. Especially as a developer.


Early adopters are the reason bugs are reported :)


iOS 13 comes with a load of security fixes and new features - people are encouraged to update ASAP.

But iOS 13 also comes with a load of breaking changes in iCloud: iOS 12 devices and macOS apps (e.g. Reminders) no longer sync with iPhones until you upgrade.


Can I just say how happy I am with Xfce? It's a Linux desktop environment that looks like it's from the early 2000s. It's fast. There are no unnecessary frills. It just gets the job done. Its release cycles are measured in years, and they keep it minimal. I used to be on Windows, then macOS, then Ubuntu, and now this. As a developer with soon to be 20 years of experience, it's the best environment I've had.


So was I, until I switched to a 4K monitor and realized that there's no real, stable, functional fractional scaling on Linux, even less so on XFCE, unfortunately.


I main Linux on virtually every one of my computers, and I love it. I wouldn't switch off Linux if you paid me. But this complaint is spot on. It drives me crazy that we still don't have good support for higher resolutions.

And there's nobody else to blame for that -- we've known for ages that 4K and fractional scaling was going to be a thing, just like we've known for ages that touchscreens were going to be a thing.

But nope, let's just measure everything in pixels. It's like the majority of native developers on Linux all looked at responsive design on the web and thought, "I'm pretty sure that's just a fad." Everybody just dug in their heels almost on principle or something and refused to make it a priority, and now we're behind both Windows and Mac when it comes to high-resolution touch devices.

And I still run into people who argue that we should just scale the physical size of a pixel for the entire desktop by a percentage, just so we can keep building fixed layouts that absolutely position all of their elements. At a certain point, it feels more like a cultural problem than a technical one.

Everybody else is doing responsive design. Qt already supports `em` units (well, sort of[0]). We could be using them on Linux.

[0]: https://doc.qt.io/qt-5/stylesheet-reference.html#length


Try using Linux on a laptop with an hdpi internal monitor, and a regular old external. It's a guaranteed way to generate a daily urge to throttle yourself. (Using Gnome in my case rather than XFCE, but it's still an unparalleled shitshow)


I've used a lot of mixed screens with KDE, from 150 DPI laptops to 30 DPI TVs, to my main desk setup with DPIs of 105, 125, 155, and 115 respectively.

GTK3 and Qt5 software works great; anything else is a crapshoot.


My work laptop is in this situation, and it got to the point where I just run the eDP display at 1080p instead. Honestly, on a 15-inch screen where I'm only writing code and not editing photos, there's no point in running it at UHD.


I dismissed this initially out of a felt need to max out the use of my hardware. But actually it does solve some problems. I prefer native resolution + 200% scaling when using laptop (also a 15") on its own. But given the laptop screen is further away when I'm plugged in at the desk, 1080p is more than adequate. It also means I can go back to Xorg, which obviates some Wayland bugginess.

So thanks for the idea - it truly had never occurred to me (oddly).


Depends on your distro, but on Debian+XFCE I use ARandR (a front-end to the xrandr CLI tool, which is itself a front-end to Xorg's RandR). Laptop plus external monitor works great, with auto-config on connect and other stuff.


Protip: ignore "scaling".

Step 1: put 'xrandr --dpi <your actual DPI>' in .xinitrc

Step 2: Use Qt applications (Plasma is a fantastic Qt desktop)

Step 3: Enjoy your reasonably sized everything.

"Scaling" is a broken concept to work around applications assuming 96 DPI (which is considered scale=1). You don't need it if you use programs that actually respect your real DPI. Unfortunately X11 doesn't properly compute DPI settings, even though EDID information generally contains the screen size - I imagine, for fear of breaking stuff.

(You can correct GTK3/GDK applications by setting GDK_DPI_SCALE=<actual dpi / 96>, but in my view it's a sin that you need to do that. A minimal sketch of the whole setup is below.)
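To make that concrete, here's a minimal sketch, assuming (hypothetically) a panel whose real density is about 138 DPI - substitute your own value, and the GDK line is only needed for GTK3 apps that ignore the X DPI:

# ~/.xinitrc - 138 is an example DPI; work out your real one from the panel size
xrandr --dpi 138
# per the note above: actual dpi / 96, so 138 / 96 ≈ 1.44
export GDK_DPI_SCALE=1.44
# ...then start your window manager / session as usual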


Thank you. I struggle to understand people who claim Linux does not do HiDPI.


This is the biggest thing keeping me from leaving the Mac for Linux, and it hasn’t really improved much in the last 4 years. I don’t think most people care about HighDPI (or whatever you want to call it), particularly Linux devs. But I can’t go back to lower DPI screens.


I do care about high DPI, but not in the way most people seem to mean: I want as much readable text on screen as possible. One of the many reasons for not using Gnome is that it insists on making everything SO BIG, with enormous amounts of wasted space around it. Those 1600x1200 pixels on my laptop or 1920x1280 on the monitor are there to be used, not wasted. Firefox used to be another offender in this respect, albeit one that is easier to tame - just change layout.css.devPixelsPerPx to something sensible (0.7 works quite well), move all controls to the navigation bar and close all toolbars.


I have used GNOME with HiDPI for two years on Wayland and it is great. Even Tk apps seem to support HiDPI these days.

However, this is all assuming that 2x scaling is fine for you, support for fractional scaling is not great yet.


Ubuntu 18.04 using 2x scaling is broken in lots of little ways for me.

From the simple (the boot screen runs at 1x, so the text is unreadable), to the weird (a VirtualBox tangle of bugs), to the frustrating (the Dolphin file manager, which I use to avoid UI bugs in the GNOME file manager).

I found workarounds for some issues, and maybe they are fixed in the next LTS, but there is no way I could recommend 18.04 with 4k to a non-professional.


Nah, it's too big on a 28" 4K. Trust me, I tried everything before going back to Windows, even using GNOME Shell (which I despise) and experimental fractional scaling (which barely works, freezes, and randomly scales back to 100%).


> I don’t think most people care about HighDPI (or whatever you want to call it), particularly Linux devs. But I can’t go back to lower DPI screens.

I find this surprising. I'd say anyone who does any serious amount of multitasking (whether a Linux dev or not) would easily want one. I think people do care but they are just waiting on better pricing/availability.


> I find this surprising. I'd say anyone who does any serious amount of multitasking (whether a Linux dev or not) would easily want one.

I would have thought so too, but this isn't my experience. The smartest developers I know at my current job still develop with old laptops with awful 720p screens (I mean, awful beyond just the low res) and can't be bothered to attach an external Full HD monitor, let alone ask for a 4K one. And still they are brilliant, produce great software, and are key when planning profound software changes in the company. These are people who I deeply admire and from whom I always learn something valuable when they speak. Keep in mind they are also developers, not "whiteboard engineers".

My conclusion is that we nerds tend to overestimate ergonomics, because they are easier to see ("pff, I can only type with a mechanical keyboard!", "how can people code with fewer than three 4K monitors!"), but the actual bottlenecks and difficulties of building complex software lie elsewhere.

I still can't stand 720p screens though.


My experience with HiDPI on Linux has been pretty varied. On GNOME, everything just works, and I recommend GNOME to most first-time Linux users. I use i3, and the scaling situation there involves setting some environment variables globally (GDK_SCALE or something like that), but is otherwise fine. The only issue I usually run into is when Firefox opens my file browser, which breaks the scaling somehow. Other than that, HiDPI is as good as it is on Windows. macOS definitely leads in that regard, but things have certainly gotten better, to the point of not really worrying about it on Linux these days.
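In case it saves someone a search, the variables being gestured at are roughly these - a sketch for a simple 2x display, placed somewhere that gets sourced before i3 starts (the values are examples, not gospel):

# ~/.profile - example values for a 2x HiDPI panel
export GDK_SCALE=2          # integer widget scaling for GTK3 apps
export GDK_DPI_SCALE=0.5    # compensate the fonts so text isn't scaled twice
export QT_SCALE_FACTOR=2    # the rough Qt5 equivalent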


Seconding i3. I bought into vim when the evangelists came for me early in my career and it has paid wonderful dividends. I can't recommend i3wm for everyone, but for those who like keeping their hands on the keyboard and want the WM to just get out of their way, i3 is perfect.


I think things will improve as more Linux devs get HiDPI monitors (ones with good IPS panels are still quite expensive). Ubuntu has some beta-level support for fractional scaling but it looked awful when I tried it. But I live in hope that it'll all be sorted out soon!


> On GNOME, everything just works

This is far from true. GNOME Shell itself is mostly OK (if buggy), but only a minority of real apps work properly. The standard GNOME apps (Nautilus, etc.) are fine, but that's where the support pretty much stops.


I prefer low-DPI monitors; I just like the pixels :-P.

(writing this from a 23" 1366x768 desktop monitor)

And honestly, a lot of things are so broken in HiDPI anyway that I don't see a reason to bother.


Qt 5.14 is now in beta and has a toggle to do per-display scaling based on DPI, on all platforms.

I've had a 4K 27" monitor mixed with 1080p and 1440p monitors of varying sizes for years and have managed to get 90% of software working great; whenever I dip into Wayland and get to use the fractional scaling there, that goes up to 99% of software.


Interesting. I've never really cared too much beyond a color calibrator (x-rite). What is an example of a HighDPI screen that will only work with Mac?


A 2.5K monitor with slightly bigger fonts and a few tweaks here and there is the best of both worlds on Linux right now, sadly.


I used Mac and Windows before; they are in no way better, they all come with their own shortcomings.

At least XFCE just works


I've run Linux (i3 window manager) on a UHD screen since around 2015, when I bought this laptop:

https://battlepenguin.com/tech/msi-ws60-running-linux/

Is it simple? No. There are different env variables for Qt and GTK, and xrandr can set its own DPI as well. It's kind of a mess, but I use i3, so I figure I'm in for whatever pain I put myself into.

I've been running Linux on HiDPI devices for a couple of years now and I've got things sorted to where it's not an issue on any new device. I need to give Wayland/Sway another shot at some point. I hope it has better native zoom support than X11.


By using a combo of xrandr and Xfce's HiDPI setting, I've managed to make my Linux desktop more stable, functional and consistent across a 4K screen and the 1920x1200 screen next to it than Windows can manage.

Sure, I had to put in a bit of effort, but now that I have the results are excellent, there's no scale jumping or even the weird rendering MacOS does when moving windows around.


Until you get new hardware and have to do the whole dance again.

Now that I'm older, it's just a waste of time.


It's not much of a dance now I've figured it out.

It's worth it to me and, like I say, it works better than Windows or even OS X on the same screen setup.

If that's of no value to you, that's up to you I guess. I'm in my 40s so no spring chicken...


I would really appreciate if you could share some of your findings there.


Honestly I just had to switch window scaling to 2x in the Xfce appearance settings, then I run this to fix my layout on each login, giving me a consistent size and no weird rendering -

#!/bin/bash

# Render the 1920x1200 panel at 1.5x (an effective 2880x1800) so its content density
# matches the 4K panel, then place the unscaled 4K panel to its right at x=2880.
xrandr --output HDMI-0 --mode 1920x1200 --pos 0x100 --scale 1.5x1.5 \
       --output DP-4 --mode 3840x2160 --primary --scale 1x1 --pos 2880x-150

AFAICT this effectively renders the smaller screen as if it had a much larger resolution (2880x1800), then downscales, giving a good quality image, and then positions the 4k screen to the right of it. The key is the scale factor compensating for different DPI on the different screens.

I tried scaling the other way first, by giving a sub-1.0 scaling factor to the 4k screen, but that meant rendering a lower res and then upscaling, which looked terrible!

The only annoyance now that I have this set up is that occasionally a driver update changes which output is which, and I have to update the script so it works again.


Yes, please share! How did you deal with the window title bars?


Not sure I did anything special, I'm using the clearlooks theme with some customisation of the fonts. See the other answer for how I sorted out the res/DPI stuff.


KDE has excellent support for fractional scaling, to one decimal place. It is also a very lightweight and fast desktop environment, but it has plenty of useful features.


I did try with Plasma on both Kubuntu and Arch at 1.2 and in both cases some controls had very ugly and/or blurry fonts and some were fine. The inconsistencies were too annoying.


4K laptop monitor and 4K external monitor: XFCE4 at 2x works perfectly.

I also use 2 different resolutions without problems

KDE plasma gives you fractional scaling

I've heard Wayland supports different scalings on different monitors, but I'm on Xorg so I can't say if it works


Typing this from a Linux desktop on a 4K screen - I don't understand what doesn't work? I can set whatever DPI I want via the Xft.dpi key in my .Xresources and it looks fine.
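For anyone who hasn't done this, it really is just a couple of lines - a sketch, where 144 is only an example (1.5x the 96 DPI baseline); reload with xrdb -merge ~/.Xresources or restart X afterwards:

! ~/.Xresources - example DPI, adjust to your panel
Xft.dpi: 144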


Until you plug in a second plain HD monitor and everything looks huge on it.


This, lol. And the tiny title bars that can't be resized in XFCE without editing the XFWM theme images (unlike Mutter, where the title bar size can be changed in a config file).

Sorry but I don't have time for this anymore. I would have on my SUSE box in 2003.


Not if you use Wayland. GNOME with Wayland works fine with screens with different DPIs.


GNOME does, but a WM that works isn't much use without apps that do likewise. Linux desktop software support for HiDPI is an absurd mess. Some apps can switch resolution when you move them between screens. Others can't. Combine this with GNOME's inability to deterministically start an app on the main monitor, and it's sometimes literally impossible to get an app at the resolution and on the screen you want. Some apps start up at an apparently random resolution (Calibre, I'm looking at you!). What a horror show it is.


Wayland sort of handles this. That is to say, you can set scaling per screen without xrandr hacks (with GNOME anyway, not sure about Xfce). But there's such variation in support from apps that it's still a mess.


It's actually quite good now using Gnome with Wayland on Arch Linux (latest releases of the various components).

Only a few apps have given me trouble in recent times.


Therefore I got a 4K monitor at 24 inches so that it would be usable at 200% scaling. For 27-30 inches I would aim for 5K.


I feel the same way about i3 and not too far over the 20 year mark myself :)


I like and use XFCE as well, but how is that relevant to this post? haha


What languages and dev tools do you use that allowed you to move freely from one OS to another, again and again? Thanks.


Yes, it is so solid and also easy to customize for keyboard navigation, etc.

Printing, wifi/vpn, keyring, graphics drivers... everything just works.


What Linux distro do you run Xfce on?


Just Debian.


I'm posting in the Linux thread because it's probably annoying to some to see 'but I moved to Linux' replies scattered throughout these comments.

But the original post is very sobering.

I left OS X a few years back when the new laptop came out. Partly the whole USB-C thing, I suppose, but I also felt there was pretty good hardware around and the new MBP didn't really shine (and was fiercely expensive). So I made the move to Ubuntu on new Lenovo hardware (a Carbon X1).

I really look forward to the twice-yearly Ubuntu releases. More of the same, rarely any huge new surprises and just generally more polish. I upgraded to Ubuntu 19.10 immediately once the 19.10beta came out and it seems speedier.

I'm sorry to see Mac users get so badly burnt but I'm very glad I made the decision to leave when I did.


An X1 with Ubuntu is butter-smooth


I think the Apple touchpads are the best in the industry, along with the displays as well.


The displays are good, but you get similarly good displays on similarly priced machines.

https://www.notebookcheck.net/The-Best-Notebook-Displays-As-...

For the touchpad I'm undecided how much is software and how much is hardware. E.g. install Ubuntu on a MacBook: is the touchpad still great, or just average?


Does systemd not bother you?


Xubuntu all day every day (seriously, I've been on Xubuntu for the past 5 years and I love it)


I've been running fluxbox on Debian since ~2002. Works great; does everything I need.


> Can I just say how happy I am with Xfce.

Yes.

> There are no unnecessary frills.

My macOs machine doesn't have them either.

> It just gets the job done.

Mine does as well.

Do you want to talk about frills?


1) After upgrading I had the avalanche of permission requests. I clicked through all of them on day 1. Needless to say, it's running fine as long as you ignore the new features.

2) For the first time in my 10+ years supporting Macs at an enterprise level, I’m holding off on upgrading my company to 10.15.

I’m genuinely pissed that there’s such an overabundance of annoyances when upgrading that it’s no longer the Steve Jobs experience. Yes, he’s gone, but his spirit and ideals are what brought Apple back from near-bankruptcy 20+ years ago.

I’ll veer off Catalina into a Steve Jobs decision that made sense in 1997. Apple had too many products and it was too confusing. Steve said there were two types of users, consumer and pro, and two types of computers for them, portable and desktop. Now we have five different iPad lines and multiple iPhone lines. What made things simple to focus on at Apple has given way to college grads with no experience.


>What made things simple to focus on at Apple has given way to college grads with no experience.

Rather, the decisions went to the business people, and that's why Apple nearly went bankrupt the first time around.


Catalina is definitely the most badly broken release in the last 10 years.

I can't get Chrome desktop notifications to work anymore, but the worst thing is that all my Apple Music playlists have disappeared from my iPhone.

They do exist in the Catalina Music app, but not on the phone where I need them. I suspect it's because my iPhone 6 isn't supported by the latest iOS release and there appears to be some iCloud version incompatibility that was triggered by Catalina. The iPhone Music app has crashed a few times as well.

They do warn me of just such an incompatibility when I open the Reminders app (which now has a far more convoluted user interface for no apparent benefit). I get the feeling that my decision to keep using my iPhone 6 until it's no longer good enough will be difficult to sustain.

Catalina also randomly stalls for a second or two and some mouse clicks just don't register for a long time until they eventually do. This seems to have gotten a bit better now so I'm hopeful that it was related to uncached data.


I can add something to the list: Sidecar, i.e. using an iPad as a secondary screen. It was a real nightmare to set up, as it would not detect the iPad despite my meeting all the requirements. I found no fixes online, so I had the idea of logging out of and back into iCloud so that it might detect it.

Well, it detected that I had an iPad, but the connection would time out every single time. After rebooting both the Mac and the iPad (another wild guess) it kind of worked - very laggy, and I couldn’t use the Mac properly, because clicking in e.g. System Preferences would open Finder to show me the app under Applications instead of the actual System Preferences panel, because of reasons.

It’s truly broken and felt like an alpha version. But this has been the trend with macOS and guess it’ll stay like this unless someone high enough at Apple decides to wake up.


SideCar was the reason I was excited for Catalina. But alas, the feature is not supported on 2015 Retina Macbook Pros, only the newer ones with the horrible keyboard, pointless touchbar, only one kind of port, etc.


I'm in the same boat. Installed the betas on both iPad and MBP about a month ago. Turns out my 2015 pre-horrible-keyboard MBP is too old. And none of the published hacks around the restriction work nowadays.


I wish your experience had been better than this. For me, Sidecar is a total game changer and the best!


Same, it worked perfectly first try for me.


Had the same issue initially. Called Apple support, who debugged it via remote desktop. It turns out Sidecar uses iCloud for the handshake between the two devices (god knows why), and for some reason requires iCloud 2FA to be turned on for both the iPad and the laptop. I didn't have the iCloud 2FA setup completed on the laptop, so it didn't work until that was resolved.


Not helpful but I had the opposite experience. Sidecar Just Worked with my MBP and 2016 iPad Pro, and drawing on it was extremely smooth. A better experience than Astropad so far, personally.

Did you try connecting to it with a cable?


All the nightmares are nightmares until they're not nightmares. Weird launch day bugs like the I bug on iOS 11 are distant memories.

Yeah, these bugs are bad, but to me articles like this really feel a lot like a broken record. Every September and October, when the major releases come out, there are bugs. And then by the time you're on 10.xx.2 or iOS xx.1.3 a month later they're just a bad memory.

If you don't want bugs, don't upgrade right away. That's not an Apple-only phenomenon - remember the Windows 10 October 2018 update that had to be entirely retracted?

10.14 is still supported. 10.13 is still supported. 10.12 is still supported. Keep using those if you want.


Someone get Apple marketing on the phone. They need to adjust their keynote messaging to: “It’s our best desktop OS ever! Oh and one more thing, don’t use it yet, it’s pretty fucked and we’ll hopefully fix it in a few months. It’s on you if you want to do our unpaid QA. And now, it’s my pleasure to welcome Bono to the stage to demonstrate the new animoji!”


Except that the rest of the ecosystem is constantly egging you on to get on with it. I upgraded my phone to iOS 13 and right away it wanted to update the format of my Reminders, which would require that I also update my Mac.

I will “resist” Catalina as long as I can, for the simple reason that it will kill tons of much-loved software (from major games like XCOM to little open-source apps that may or may not ever get an update). But I already know that at some point Apple will tell me I have to move on just to keep some things working as before.


> 10.14 is still supported. 10.13 is still supported. 10.12 is still supported. Keep using those if you want.

Unfortunately, not well enough. Catalina contains security fixes that apparently [0] haven't yet been backported to Mojave, let alone the others. I wonder how early or late into the beta those bugs were fixed; potentially, they've been patched for months.

[0]: https://support.apple.com/en-gb/HT201222

https://www.theregister.co.uk/2019/10/07/apple_catalina_secu...


In recent years, Apple has only shipped security updates for previous versions of macOS when the .1 release of the new OS came out. So you either dive into a buggy .0 release, or you live with well-known security issues for a month. It's ridiculous.


> And then by the time you're on 10.xx.2 or iOS xx.1.3 a month later they're just a bad memory.

As are your files. The author states that some of these bugs are causing file corruption.


The author stated that the beta software did that over the summer. The author also stated that they were using their own personal account instead of a test account.


I’ve gotten calls from my mom after she’s done a software update and had a bunch of things break. She’s more careful now, but I think the expectation set for the general public is that when someone announces an update is available, that it’s not going to be Beta.


> If you don't want bugs, don't upgrade right away.

And what about people who buy new laptops? They're just screwed?

And, your new laptop hoses your iCloud so you lose functionality on your old laptop.

And 10.12 claims to be supported, but you can't run the latest versions of some software (like Xcode). So it's "supported (don't read the fine print)".


That cavalier attitude is fine for NEW features, but not for basic apps and foundations of an OS.

iOS 13 shipped with the Mail app broken. That's almost as bad as not being able to make calls.

Would a BlackBerry ever ship with a broken email app?


BlackBerry's first and only tablet (the PlayBook) actually shipped without an email app. Was it courage or a company falling apart, you decide :)


Yeah. I don't use any 32-bit software, so I'm good there. Catalina has actually been the smoothest macOS upgrade I've had in a while when it comes to all my 3rd-party apps working. One annoyance was with some of my CLI programs (like Terraform), where I had to open them from Finder via right click -> Open to get them marked as safe to run. That was expected, though.
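For what it's worth, the other common way around that (assuming the cause is Gatekeeper's quarantine attribute on the downloaded binary, which is my guess for this situation) is to strip the flag from the terminal. The path here is hypothetical - point it at wherever your binary actually lives:

# example path only - adjust to wherever the terraform binary is installed
xattr -d com.apple.quarantine /usr/local/bin/terraform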


But the second iOS X.0 is released, support for iOS X-1.* is dropped immediately. The massive batch of security patches saved up for the X.0 release is not backported.


I mean Windows 10 May 2019 release was "deemed safe" only at the end of September. It is no surprise. Companies push to meet deadlines which in turn results in buggy releases. It happens in video games quite a bit too where the full release is nothing more than a buggy beta for a few months.


Couldn't agree more. These updates just aren't compelling. It's been a long time since a newer version had speed improvements. I stayed on Mavericks and El Capitan until support ran out. And I'll be waiting again until all the necessary 32-bit software has been ported.


Okay, so you're saying that a dozen other games that I like to play sometimes are going to magically start working after I wait for six months before updating to the new macOS version?

These games are not even that old; they were released between 2010 and 2016 or so. And Catalina won't let me play them, because some asshole decided that I'll be fine without them and removed 32-bit support from the OS.

Meanwhile, I can still play games from 1995 on Windows 10 without any problems.

