Apple avoids job cuts because it didn’t overhire like Google and Amazon (bloomberg.com)
529 points by mfiguiere on Feb 10, 2023 | 426 comments



> Apple, meanwhile, was more cautious. Its headcount increased just 20% from 2020 to 2022

At Apple's size this is still pretty huge. At least, I thought it was until I saw that Amazon, Meta and Salesforce basically DOUBLED in 2 years.

If this data is correct, isn't that insane? How can you effectively double orgs of this size over a 2-year period? It's not like a startup with 50 people or something. I guess I was out of the loop on just how big the hiring was over the pandemic. Not that layoffs are good for employees and workers like me, but these companies must still have retained tons of the hires, as RIFs in tech seem to be about 7-20%.


Growing too fast is damaging to both culture and productivity. You have a bunch of people starting who are still learning. Depending on the job and the person it can take a full year to be fully competent in a job.

Apple was smart to move slower. They are probably one of the best run companies in the world.


"Depending on the job and the person it can take a full year to be fully competent in a job."

Some of us never become competent.


And if you do become competent, you're soon promoted to another job you're not competent for.


And if you aren't promoted out of your competent zone, you will need to job hop out of your competent zone in order to obtain an actually competitive salary.


This is why I turn down promotions. A promotion (especially to a management position) means another step toward the sort of work I don't enjoy and don't want to do.


You have to be careful how you do it. I was punished for turning it down.

I was told that all I needed to do was work one extra hour per day (in addition to normal support, elevation, etc). Why would I take a position with higher expectations for a 7% raise and a 13% increase in hours - a rate cut? Plus, if I'm a high performer in my current role I should be getting bigger bonuses.
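To put numbers on the "rate cut" claim, a quick sketch (the 8-hour baseline day and pay figure are arbitrary assumptions; only the two percentages matter):

```python
# Effective hourly rate after a 7% raise paired with a 13% increase in hours.
# Baseline figures are arbitrary; only the ratios matter.
base_hours = 8.0   # assumed baseline hours per day
base_pay = 100.0   # arbitrary pay units

new_pay = base_pay * 1.07      # 7% raise
new_hours = base_hours * 1.13  # 13% more hours

change = (new_pay / new_hours) / (base_pay / base_hours) - 1
print(f"hourly rate change: {change:+.1%}")  # roughly -5.3%
```

So in per-hour terms the "promotion" is about a 5% pay cut, before even counting the higher expectations.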

I had a skip-level where the department head asked what my career goals were. And my answer was to stay in my role. After all, I had worked as a tech lead for a year and a senior dev the year after, with no talk of advancement until what I previously mentioned (which would be a rate cut).

My manager got me from my desk the next work day, and you could tell he was pissed. He asked me why I would say that sort of stuff and said it was "stupid". I generally tell the truth and am pretty honest - I guess I'm stupid that way.

I heard through a friend that they wanted to get rid of me for those comments, and I managed to switch teams in time. My career has been in a downward spiral since.


I can empathise. Telling them that you didn’t want a management-style position was probably the right move given all the reasons you laid out, but it’s also an implicit rejection of everything that the people around you are striving for. They equate management with productivity and respect. You’re thinking of it like the difference between a pilot and an air traffic controller. Jobs with vastly different purposes. But still, your decision called their value system into question in a very profound way. It’s not like saying that you don’t want that kind of job, it’s more like you didn’t want to work with them more closely. And most people, myself included, don’t have the emotional intelligence to handle rejection maturely. It all gets very petty.


That all makes sense, but this was for a promotion to senior dev.


I think (would hope?) that you can say this kind of thing in most companies, in the right way, if you first ensure that your current level is the one at which "up or out" stops. In many companies that may be at Senior or above (or whatever they call the equivalent).

So if you say you want to stay at "intermediate", that's an "out" you're voting for. If you say it (in the politically correct way) at senior (or whatever the correct level is at your company), you can stay.


True, you don't want to burn bridges.

But taking the description of your situation at face value, I would have quit that company. Its values are so different from mine that I'd feel I have no business being there.


I didn't feel like I was burning bridges. I obviously worded it more gently and stuff.

It's consistently ranked as a best place to work. The written policies are great, but they don't matter because there are unwritten backroom policies. They are also one of the largest employers in the area and I can't move due to family issues. I was screwed, but sure, if one has other options, one should take them.


Yeah, that's a tough spot.


> You have to be careful how you do it. I was punished for turning it down.

Not only that - you may be perceived as a threat to anyone who takes the post that was offered to you and, since you turned it down, you have no actual power to control what happens.


I think I have the same type of brain as you. I say things all the time which to me are just obvious logic. And the reaction sometimes has been extremely bad.

I realized my “logic” brain scares the crap out of normies. I learned to ask other people for help with complex negotiations involving politics and emotion.

I realized I don’t process information the same as most people and this has been at the root of numerous career mistakes. Some of those mistakes were extremely damaging.

I recommend learning to run your decision making past people who are better in these areas than you. When it is a political issue especially. If you know you are not great at politics, you need to improve your decision making and avoid this problem in the future.

I learned after YEARS of stepping on landmines accidentally to be far more careful navigating internal politics. The costs of getting it wrong are devastating.

For example: if you indeed did the math and calculated (logically and truthfully) that management was a +7% increase in pay for a +13% increase in work, that's fine.

But if you used those exact words and numbers with a superior, they will view that as arrogant, selfish and entitled. Because their view is that you should want to be the best you can be for the company and make a personal sacrifice to show you want the company to do well. That is basic “salary man 101”, and doing the opposite of that (analyzing the exact math) is putting a target on your back as a “free thinking non team player individualist.” It's career-lethal to get that label.

I suspect you may have a lot more Aspergers than you realize. Or maybe it’s a generational culture thing. And the most irritating thing about people who are on the spectrum is the refusal to introspect and self reflect about it.

I tried to coach a co worker who was similar. Dude was angering everyone. All the time. He didn’t want to hear that he had a problem.

I strongly recommend developing friends you can run these decisions by. It was literally a life saver for me. I realized I was like a blind person wandering around with a cane when it came to navigating politics and needed watch dogs who could analyze the situation for me so I didn’t blow my leg off.

In terms of this situation. It sounds like you did not prepare well for a long term career discussion. That is an extreme red flag. And didn’t take it seriously that a higher level authority was asking you for your plan.

You were being checked on whether you have a future at the organization. You failed.

I have failed these tests myself. I was late to a critical call with a VP. It ended my career. It showed lack of care or preparation and a lack of respect.

One late meeting ended my career.

I can’t blame you. If you are hyper-logical in your mind, you were probably just telling them that “2+2=4.”

The entire political career metagame that normies play does not have anything to do with logic. I personally don’t like it either.

It may also be generational. A GenZ who plans to coast and just get a paycheck while turning in a solid 30% effort might have given the same answer you did.

Some Aspergers dude wouldn’t know that this is career ending as an impression to give: “my plan is to do the minimum ser.” Is what your skip level hears.

It wasn’t what you said it was what they heard.

“I love the company and support you and want to grow into doing more, I’m so optimistic of the future.” That’s what they want to hear.

If you want to coast, at least say what I just said and give that impression. And you can probably carve out a way to mostly get left alone and negotiate your wants and workload. Or negotiate a project that allows you to coast. But holy shit don’t start doing comparative math on working hours versus pay.

Your company might not be doing well. If you don’t know that (they might not even tell you), they may really have needed you to do more.

I recommend working on your likability also.

Suspect your delivery and demeanor are just labeling you as an Aspergers robot.

Companies will tolerate incompetent, lazy, shiftless, unambitious workers who are positive, pleasant, non threatening, supportive, not “too smart,” keep their mouths shut, not crafty or aggressive. They will tolerate that forever. In fact they love it.

I suspect your demeanor and likability are bad. It starts with how you enter every discussion. Keep things light and funny. Never criticize, complain, bicker over hours or workloads.

Negotiate for what you want and have a clear plan and do it lightly and in a gentle way and watch the demeanor of those around you.

Doing Aspergers math on how much work they want you to do versus how much work you feel like doing: holy shit, don't make that mistake again. That was me. Get a clue; learn to be softer and likable. Don't be logical. They hate that and they will hate you.

Be optimistic and clueless and get your job done.


What would you do if this came up again? Would you still demur, but perhaps give a different reason (family needs right now, etc.)?


Possibly the family needs thing (which is true for me now, but wasn't then). I'm not sure it would have worked though. Once they bring up the prospect of a promotion, they've already used a bunch of political capital to open that spot for you, and it's an insult to say no. I get it's different for other companies or even just other departments, but that's how it was for that one.

The other thing is, my company measures your engagement and potential by how ambitious you are. The reason they wanted to get rid of me is because they thought with an answer like that I would be disengaged and didn't have any potential.


I don't know if it helps, but what I've done when turning down promotions is to be sure they know that I'm pleased and honored by the offer and value it. And that the reason I'm turning it down is because I feel I can provide the greatest value to the company in the position I'm currently in.

All of which is true, and phrases things so they know you're not turning your nose up at anything and that you have the company's best interests in mind.


Seems like a good idea, but with the risk that if you say you're choosing to stay "for the company" then the company can insist that actually it's better for them if you go to the new role. If you say it's about your family situation, then it's harder for them to insist.


Well, that's the sort of risk assessment that only you can make. My experience has been that a company won't do that, but you know yours better than I do.

But if I were to work at a place where I felt similarly, I would absolutely be looking for a job elsewhere.


I haven't necessarily turned down promotions, but I've been at my job for over 2.5 years and have no desire to make the jump from "developer" to "senior developer" like most do. While I'm sure you get a nice pay bump, I know firsthand it just means you're in way more meetings (and I already think I'm in too many) and writing less code.

I don't even know what the extra pay would be, but it's probably not enough to be worth making my job less enjoyable.


It depends on what "senior developer" means, though. With my last 4 jobs, I was hired with the title "senior developer" -- but what it indicated was my pay grade, not my responsibilities. I was just a regular developer in them all.


You don't become a true senior developer until you find a way to get out of those meetings.


Until eventually you are promoted to a role where you never become competent and there you stay.


For example, teaching new hires to do the job you just stopped doing in order to teach them - only for them to take over teaching the next batch as soon as they become somewhat competent.


Ideally it takes a full year for them to realize that you're not competent, at which point you hit your 1 year cliff and can snag 25% of your options or RSUs before you get PIP'ed out.


I've been at my company for 10 years. It went: competent -> high performer -> competent (promotion) -> high performer -> competent -> low performer


This is known as the Peter Principle.

https://en.wikipedia.org/wiki/Peter_principle


I mean, not quite. My performance at the same level went from high performer to low performer. So I was doing the role very well for a while, even getting good reviews for filling the role above mine (intermediate dev filling tech lead role). Performance gradually atrophied following my disenchantment and mistreatment.


Did you go from high performer to low performer in the same role? What changed?


Things that changed: subdivisions/teams, tech stacks, managers, areas of the business, enterprise procedures/processes. Basically everything. They even outsourced my job, then laid off my entire old department to ship that entire business unit to another contract company.

Then at home, kids and multiple family health issues. I guess my age plays a role too.


If you're at Amazon you'll walk away with 5% after one year, and only 20% after two years.


True, but misleading. Amazon gives big cash bonuses to new hires in years 1 and 2 in order to compensate for the back-weighted vesting schedule.


As big as 20% of their RSU grant + the bonus they would get at another company would be? If not, not misleading at all.


Comp is meant to stay stable at the agreed-upon target for the first two years. There is a cash bonus for years 1 and 2 used to offset the vesting schedule. The final 2 years are really dependent on stock performance. For example, I basically need AMZN stock to increase by 25% from today's value by Jan 2025 to be making what I did for the first two years.


From what I've heard, yes they size their Y1 and Y2 cash bonuses such that your TC will be stable over the first 4 years as long as AMZN stock goes up 15%/year.

eg one of the first Google results for "teamblind Amazon offer eval" turns up someone who was offered >$100k of cash bonus in each of their first two years: https://www.teamblind.com/post/Offer-Comparison---Amazon-vs-...
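A rough model of the scheme described in this subthread (the 5/15/40/40 vesting split and the year-1/2 cash top-ups come from the comments above; the salary, target, and stock path are illustrative assumptions, not real offer numbers):

```python
# Back-weighted RSU vesting (5/15/40/40) with fixed cash bonuses in years
# 1-2 sized to hit a flat total-comp target at the grant-date stock price.
# All dollar figures are illustrative assumptions.
base_salary = 175_000
target = 300_000                      # assumed annual total-comp target
equity_target = target - base_salary

# Grant sized so a 40% vest year hits the equity target at the grant price.
grant = equity_target / 0.40
vest = [0.05, 0.15, 0.40, 0.40]

def yearly_comp(stock_multiplier):
    """Total comp per year; bonuses are fixed at offer time, equity floats."""
    comp = []
    for year, frac in enumerate(vest, start=1):
        bonus = equity_target - grant * frac if year <= 2 else 0
        equity = grant * frac * stock_multiplier
        comp.append(base_salary + equity + bonus)
    return comp

print(yearly_comp(1.0))  # flat stock: comp stays at target all four years
print(yearly_comp(0.7))  # stock down 30%: years 1-2 are cushioned by the
                         # fixed cash, years 3-4 take the full hit
```

This is why a flat or falling stock makes the cash-heavy first two years look like the better deal, and why years 3-4 ride almost entirely on the share price.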


This is misleading. You get a prorated sign-up bonus for the first two years that makes up for the back-heavy vesting schedule. With the whole tech sector's stock prices dropping, this has been a preferable outcome over the last two years.


Plus the cash sign-on bonus, worth as much as 50% of your stock. You can actually come out ahead if you work at Amazon for only two years; that's why a lot of people duck out after two years.


And you hope the stock doesn't go down by 30 percent after 6 months.


Those are probably the ones being let go.

That has always been a concern of mine about being laid off in one of these massive layoff efforts. If it is pretty much understood that in these moves the companies are removing dead weight, does that stigma get attached to you as you search for a new job?


It’s pretty hard to be choosy at that scale. I don’t get the impression that these large scale layoffs were targeting individual low performers, but are rather slashing entire business units. (I might be mistaken, but that’s the impression I’ve got)


Yeah, from what others have said, entire teams are being cut.


Yep. This wasn't about low performers, at least not in my company.


If they can't hire competently, why do you think they can fire competently?


I think it's easier to identify a low performer after a year of work than it is to identify a high performer after an hour of whiteboarding.


Low-performers are not relevant to this discussion.

The accusation is that the companies hired more people than they knew what to do with / could apply productively in their business.

If such a basic task is beyond the capability of decisionmakers, who is to say that they are capable of collecting the feedback from rank-and-file employees required to assess who needs firing?

Managers have many concerns apart from performance. Suppose we have Steve and Bob, and Bob is 50% more productive than Steve, but Steve is in the middle of delivering a critical project with millions of dollars on the line, while Bob is between projects. If today they decide they must fire some staff, who do you think will get fired?

Also, is performance one metric? Suppose Steve writes worse code, but the business folks love his presentations; he is really good at explaining the issues facing the team. They can put Steve in front of a corporate client and win business. They never see much of Bob, even though he is doing most of the 'real' work. So who is more valuable?

Unless you are dealing with someone totally incompetent, these questions are usually 'which skillset do we need most' and they are not so simple.


> who do you think will get fired?

Ah, hold on, I know this one.

Steve will get fired, because the manager will think they can just drop-in replace him with the high-performing Bob.

Bob will then reach burnout due to the high demands and short deadlines, and he'll rage quit.

Then they'll hire Chad in a panic, who performed well at a whiteboarding interview.

Six months in, they're thinking of firing him, not just because he bro fists everyone in the office, but there's no one to replace him with.

So they promote him to manager, because of his obvious leader qualities, and assign him to hire his replacement.


While you're generally right, you'd be surprised. You can go pretty far by throwing out a few buzzwords and sitting through youtube how-tos.

IME, it's when crunch time hits, or an outage, or some other nasty surprise or fire, that you see the competency show itself (or not).


This is not as true as you think, but it does depend on the specific company. When I've been part of the hiring process at various companies, I've never seen anyone penalized for having been laid off.


My experience is in cloud, and I can give some insight into what we were thinking at the time. In cloud, infrastructure is key, and our theory was that we needed to beat the other companies in terms of scale and capability. So we were aggressively trying to scale to get ahead of the competition. The environment was highly competitive all the way down to the PCBA manufacturers. As an example, at one company, to get their "A teams" you needed to be their top customer in terms of volume.

So while it looks odd by comparison, cloud providers were in a bit of a different environment than Apple.


there is a cultural mindset especially among startups that it's better to be alive and having to make layoffs than to move conservatively, have a faster mover overtake you and push you out of the market, and be dead. "The graveyards are full of businesses that moved appropriately for the market conditions", to warp a saying.

apple is an enormously stable company in ways that (to be very honest) even most other tech blue-chips are not. even microsoft or google have the sword of damocles hanging over them - given a sufficiently long and severe position of mismanagement it is possible that even google or MS could be unseated or even go under. arguably that is the trajectory MS was on in the 00s, and google could be similarly unseated by AI.

intel would be a great real-world example of that. Even 10 years ago the thought of intel losing control was unthinkable, un-thinkable. And here we are: they're circling the drain right now and need to make very deep cuts and refocus on the essentials or they're going to be in bankruptcy in 5-10 years, during a down market in general and a specific market that they glutted during the pandemic (a lot of customers have enough PCs/laptops/servers for quite a while). And in contrast Apple is still making money hand over fist despite the market conditions - now that's stability.

I'm not going to say that google or MS doubling their size on a short-term timescale sounds like a good idea but it's a ride-or-die industry, you're pretty much either getting bigger or getting smaller, and mere homeostasis is a rare luxury.


This is kind of a west coast SV/social/adtech type of startup mentality.

Working east coast in more traditional tech companies this has never really been a thing. It's always been conservative, staying profitable, etc.. It is the older way.

Apple, MS, etc.. were built this way too of course.


Well, of course, how many of the east coast minicomputer companies are still around? That said, I agree in general, there's probably a general difference in approach between east coast and west coast. (And the mix of companies--and government--is generally different and reflects that.)


"I think Apple is one of the strong companies and will stay a strong company, I think it's ungodly well-managed." - Charlie Munger


> I think it's ungodly well-managed.

So satan is involved? That might explain how Jobs came back after getting boot #1.


I use "Eternal September" for the cultural effect on businesses that suddenly gorge themselves on new hires that have no exposure to the existing culture or context. The negative effects are loosely parallel to what happens when a social medium breaks out of being a subculture and goes mainstream.


Apple does a lot right. However, I find their insistence on keeping their teams in an office to be strange. My experience with Apple internally was an overly positive and optimistic culture, to the point where they had a hard time identifying, or discussing, what did not work well.


Aside from all the talk about the crazy ones, the misfits, the rebels, etc. I found Apple to be very conservative and buttoned up, which I found very reassuring. I've never worked anywhere else in the FAANG-sphere, but it seemed like the polar opposite of "Googley".


Apple built one of the most expensive office complexes in the world. They are going to want people in the office.

Additionally they deal a lot with hardware design. Being in the office is going to be easier for certain things, particularly when you are as secretive as Apple.


I predict that Apple's in-office culture is going to be key to their success.


This comes directly from Steve Jobs' legacy. He talked a lot about the value of small teams and how they self-regulate but can break up if they grow too quickly.

https://youtu.be/wTgQ2PBiz-g


However, if you hire twice as many people, potentially twice as many people will be competent the following year.


> Growing too fast is damaging to both culture and productivity.

This seems to be the general sentiment here lately. Do you have any evidence to support this, or is it purely conjecture?


Not that I have a rigorous source, but I feel we can generally agree on the following:

New hires take anywhere from 3 to 8 months or more, depending on the position, to ramp up

During the ramp-up period, there's a productivity toll taken on the team via teaching the new hire

Thus, during this time, productivity is hampered

——

For culture, I think it's a bit more subjective. But if you have a team of 5 who then gets 1 new hire to teach over 6 months, that generally allows them to "mesh" better in the culture.

Both in terms of teammates learning about the new hire, and the new hire having room to fit in within the culture and bring their own value to the team (assuming an already existing good culture).

Now say you have 3 new hires on a team of 5. If we further presume productivity is also hampered more as more people are onboarded at once, that creates additional stress on the team to teach them.

Further, it could be harder to have the new hires feel part of the team if less time is spent getting to know each one. But assume this isn’t the case:

Business will most likely expect at least the same productivity from the team. Now they're stressed from teaching, facing higher expectations on their output (onboarding a team member is tiring work), and the business is soon going to expect productivity to increase further after headcount is upped by over 50% in this example.

Now take that, add on new teams to interact with and additional communication layers, and it's pretty straightforward to see how growing too fast can be negative.
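The argument above can be turned into a toy model (the 6-month ramp and the 25% mentoring drag per un-ramped hire are made-up parameters, chosen only to illustrate the shape of the effect):

```python
# Toy model: each new hire ramps linearly to full productivity over
# `ramp_months`, and while ramping costs a veteran some mentoring time.
# The ramp length and mentoring cost are made-up illustrative parameters.
def monthly_output(veterans, new_hires, month, ramp_months=6, mentor_cost=0.25):
    ramp = min(1.0, month / ramp_months)              # new-hire productivity, 0..1
    mentoring = new_hires * mentor_cost * (1 - ramp)  # drag on the veterans
    return max(0.0, veterans - mentoring) + new_hires * ramp

# A team of 5 in month 1: adding 3 hires at once yields less output than
# adding 1, and both are below the original team of 5 for a while.
print(monthly_output(5, 1, month=1))
print(monthly_output(5, 3, month=1))
print(monthly_output(5, 3, month=6))  # fully ramped: 8.0
```

Under these assumptions the bigger batch digs a deeper and longer productivity hole before it pays off, which is the "growing too fast" cost in miniature.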


Okay cool. But, none of this addresses the opportunity cost. What happens if you don't hire fast and you miss revenue opportunities.

As the saying goes "revenue solves everything".


"Cash flow solves everything" or "profit solves everything" are perhaps better phrases, since apparently people forget that the purpose of a company is to make a profit, not just revenue.


You can run a company for zero profit and it can still be a benefit to society, so no, "the purpose of a company is to make profit" is not necessarily true.

"If we want to know what a business is, we have to start with its purpose. And the purpose must lie outside the business itself. In fact, it must lie in society, since a business enterprise is an organ of society. There is only one valid definition of business purpose: to create a customer. The customer is a foundation of a business and keeps it in existence. The customer alone gives employment. And it is to supply the customer that society entrusts wealth-producing resources to the business enterprise.

Because it is the purpose to create a customer, any business enterprise has two – and only two – basic functions: marketing and innovation. These are the entrepreneurial functions. Marketing is the distinguishing, the unique function of the business."

- Peter Drucker


> You can have run a company for zero profit and it can still be a benefit to society, so no the "purpose of a company is to make profit" is not necessarily true.

On one hand, fair. Technically correct.

On the other hand basically all laymen use the word "company" to mean for-profit corporation, and pretty much everyone who wants to talk about corporations that are designed not to make a profit use the term "non-profit".

I think you've ignored the context in which the comment was made.


> I think you've ignored the context in which the comment was made.

I would argue you did. The parent was being pedantic about the saying "revenue solves everything" and tried to make it seem as if profit is the be-all and end-all. My overall point is still consistent with Drucker's "the purpose of a company is to create customers": if you're in a bull market and you hire like crazy to achieve the creation of customers, then hiring like crazy is NOT WRONG.

> On the other hand basically all laymen use the word "company" to mean for-profit corporation

P.S. - Amazon ran without profit for 10 years. Did we magically call them a non-profit for 10 years and then a company thereafter?


> and tried to make seem as if profit is the be-end-all

That statement was paired with a teleological statement about the purpose of a company. When paired with the common usage of the word "company", the statement becomes tautological. Admittedly, this isn't the most insightful realization ever, but the point from the comment was that people forget that a profit needs to be extracted.

> My overall point is still consistent with Drucker's "the purpose of a company is to create customers".

> And, pray tell, why do you think companies want customers, if not for eventual profit?

> Amazon ran without profit for 10 years. Did we magically call them a non-profit for 10 years and then a company thereafter?

Amazon made an intentional decision to re-invest profits, as a gamble that it would lead to greater profits later. That's not refraining from profit, but just operating on a time-scale that isn't quarter to quarter.


> [Apple is] probably one of the best run companies in the world.

One could argue that Apple's business side of things is well-run. I personally disagree; I think that since Tim Cook took over, Apple's business focus has been more on short-term gains than on long-term quality.

But I would also strongly argue that their software side of things has been trending downward (especially in terms of quality) since Tim Cook took over.


>...more on short-term gains instead of long-term quality...

Considering that Tim has been CEO for 11 1/2 years now, and things are still going incredibly well for Apple...it seems pretty hard to argue that he is too focused on the short term.

(FWIW, I mostly agree about their software quality.)


I think people tend to have a bit of rose-tinted glasses on about Apple's software quality of yore. There have always been big stability issues with new features and apps. And they've simply never been good at the UX for anything that involves a network connection or has to sync across multiple systems (besides Safari). The big difference is that more and more computing requires syncing and networking, which is their traditional weak point.


> There have always been big stability issues with new features and apps.

But they got fixed. At times there were 2 full years between major Mac OS X releases. 2 years of bug fix updates. And that was when Apple only had 1 major operating system! Now with the relentless yearly schedule, there's never time to fix bugs before they start working on the next big thing and adding brand new bugs. The problems just keep piling up year after year. The software is now underwater in technical debt.

Part of OS stability is simply not shipping new major versions too often. A new major version is always buggier than the previous major version with its many minor patches. Not to mention, Apple now pushes everyone to install the latest OS immediately. In the old days, you had to go out to the store and buy a new Mac OS X version on disc, so not everyone was upgrading to 10.N.0 on day one. Slower adoption means that fewer consumers have to suffer the early growing pains of new software.


Been on macOS since 10.3. In general, the .1 or .2 patches were solid, and if one release seemed like a stinker the next one tick-tocked its way into stability/quality pretty reliably, and staying on N-1 version while N was out was a pretty good choice if you wanted to keep your setup as-is without falling TOO far behind.

Also, Safari was really bad, early on. It was rendered unusable for me when a runaway bookmark sync problem with iCloud turned the default set of bookmarks into 30+ thousand copies.


You're thinking of recent Apple. You have to consider the first half of Apple's lifetime too.

Especially in the second half of the 1990s Apple was in very very tough shape due to poor software and uncompetitive hardware. Their fans wouldn't admit it but the company definitely knew what they were up against.

There's pretty much been no point from 2000 to the present where Apple quality has been worse than it was pre-2000, and their prices have never been more competitive either.


Speaking of software quality, anyone notice in Apple Music how the categories all start with Apple? Apple Dance music, Apple Country, Apple Pop... like are they trying to make it intentionally difficult for me to use my eyes to find music? I know that I'm using Apple Music... why does every category need to remind me of that while simultaneously impeding my ability to locate things alphabetically? Of course, when I use Siri I can just say "play Dance music" and it works. Hmm.


It could be worse. If you were on Android, about half of Google's 1P apps start with "Google", and about half don't, even ones that probably should (like Google Home).


> anyone notice in Apple Music how the categories all start with Apple?

No-- where are you seeing this?

Almost nothing in the Apple Music app on iOS or MacOS starts with "Apple Music" for me. Only really the "Apple Music" radio stations, which is necessary to distinguish them from the generic playlists and third-party radio stations.


Soon in supermarkets near you: the Apple Apple. Beautiful, healthy, tasty; no buttons!


I hear they even named one varietal of apple the "Macintosh"... someone must be a fan I guess.

(/s)


Cutting it into quarters requires a special knife that costs $500


But it’s the nicest knife I own… and I’m saving the box.


And what's the last revolutionary consumer technology that has come out of Apple? The original successful Macs, the iPod, the iPhone, iTunes, those were all under Jobs.


I don't know about revolutionary, but Apple's service revenue alone for 2022 was larger than the revenue of McDonald’s and Nike combined. Things can be wild runaway successes without being revolutionary.

https://finbold.com/apple-services-2022-statistics/


You might dismiss it, but for me AirPods are the biggest QOL improvement from tech since smartphones. Bluetooth headphones were atrociously hard to use before them.


apple's bluetooth experience is so refined it's practically revolutionary in general. I used to fight bluetooth mouse/keyboard problems all the damn time, I'd have keyboards that suddenly stopped responding or would repeat a key over and over again, mice that would suddenly scroll like an inch at a time, etc. The best experiences were on an Intel NUC that might only have an issue once a day or a couple times a week, but it still was occasionally temperamental and barring that every single other platform I've used ranged somewhere between "highly flaky" and "completely unusable". When it's happening multiple times per day it's just not acceptable.

even a dell latitude business laptop is something else I'd consider that should probably be close to the intel nuc level of "mostly works with minimal flakiness" but it flatly was not... and the intel NUC was extremely flaky compared to everything apple I've used. And everything besides the NUC that used Intel wireless chipsets was a complete disaster that was basically unusable. When I have to put a USB bluetooth adapter in my B550 motherboard to get something that works... yeah.

know what I haven't had to fuck with since getting an intel macbook for work, and a M1 MBA for around the home? bluetooth problems. somehow apple can figure it out, even when using that same flaky intel hardware. nowadays I can just reliably assume that any connection problem means my peripheral has a dead battery for some reason.


Watch and airpods are both rather successful products that capture about 30% industry market share every year.


What revolutionary consumer technology has emerged since the iPhone, from any company?


The iPad


M chips, airpods, watch


When the iPhone was introduced, there were already 1 billion phones being sold each year. SJ said he wanted Apple to sell 10 million to capture 1% of the market.

There can’t really be anything bigger than the phone market that has 90%+ world wide penetration.


Quite the standard you’ve set there!


M1, M2, AirPods


I don't know, people have been lamenting the poor quality of Apple software and how much better it was in the old days since I got my first Mac in 2007, at least that's when I became aware of the complaints. If it was really on a substantial continual down trend from then, by now nothing would work at all. Apple continues to invest massively in software, that means a ton of new code, which means a ton of new bugs. You can't develop new software and not have quality issues.

Apple continues to make extremely long term highly strategic investments. You don't get to develop a highly original, industry leading SOC architecture without long term investment in technology. Bankrolling new process node development and thereby locking in capacity in strategic contracts doesn't happen by accident or on a whim. Development and implementation of the Apple watch happened entirely in the post-Jobs era.

So I just don't see it, they continue to commit to heavy long term investment in capital intensive technology projects and innovative new products. Not everything works out. The touch bar and butterfly keyboards were notable misses. But then that's always been true. I look at the phones, desktops, laptops, watch, Airpods, iPads, etc. They're the most Apple-y products Apple has ever produced, and not a single one of them is a me-too product slapped together to ride a trend.


Yeah - people have been saying the mac software has been in decline for as long as there's been mac software. If you have enough users some fraction of them are going to get hit by some rare bug and declare the software is crap, even if it is 0.001% of the users.

My m1 mbp is the best laptop I've ever owned, and I've owned a lot. I don't really use any mac software though - just a terminal, emacs, and a web browser. Macos just stays out of the way and causes no problems. I routinely get 1 or 2 months of uptime - only ever needing to reboot to apply system updates. Maybe if I used more mac software I would see this supposed decline.


It was a very dark era for Apple, but the Power Computing years were the best time to be an Apple fanboy IMO. You had commodity firebreathing PC hardware running OS 8 better than Macs did, more ports and connectivity, the software was best-in-class, and this was the era with no spyware, telemetry and all the BS we face now.

And yes, I was still buying legit Macs and software, and a lot of stock when it was at $11. It was a blast, I tell you.


Uhh, speak for yourself. As a fanboy myself back then, it was kinda terrifying. Sure, Power Computing was able to bump the bus speed beyond what Apple seemed to be able to do, but they also cannibalized Apple’s hardware market, which Apple needed to stay alive. Also it became clear that the wheels were falling off the “new” OS rewrite (Rhapsody) project with no real plan for remedying it. It was scary times when Microsoft had to help bail them out. Honestly, buying NeXT was accidentally the smartest move Apple executives made. I remember how cool BeOS was but was glad they chose NeXTStep over it. Mostly because I was a Unix fanboy by that time too.


You can pry my Motorola StarMax out of my cold dead hands.

... the fact that the StarMax product hasn't been powered on for over two decades is beside the point. I want my nostalgia!


Rhapsody came about after Jobs came back, and after he ended the OS license debacle. The wayward OS was Copland.


> the Power Computing years were the best time

No, it really wasn't.

I know people who were techs for retail stores at the time.

The sheer volume of shit cheap-ass clones that crossed their workbenches during that era....


> no spyware

What spyware is on your modern Mac?


Ehh, Apple has some big misses. Their software objectively sucks. OS X has gotten buggier and slower over the years. I traded “it just works” for the speed and customization of linux years ago and would _really_ like the benefits of Apple hardware but can’t bite the bullet on Apple software (Asahi isn’t in a good enough spot yet to be my primary computer imo).

The iPad Pro line is a travesty. iPadOS is horrible, Apple need to take it out back and shoot it. The hardware fits a great niche but Apple have ruined it with their gimmicky software.

iPhones have been getting bigger, heavier, and lasting less time with each generation. I recently upgraded from the 11 to the 14 Pro, and the hardware feels strictly worse. It’s snappier and has the Dynamic Island, but it’s significantly heavier and is the first iPhone I’ve ever really been worried about dying during the day (even after turning off the stupid always-on display).

AirPods have probably been their biggest win in recent years. The originals were amazing, and the Pro and Pro 2 were significant upgrades; they’re a big reason (along with Apple ecosystem integration) I won’t jump ship for Android. The AirPods Max are horrible, heavy, and I just don’t understand why people like them at all (Bose and Sony both make better headphones for cheaper here).


> OS X has gotten buggier and slower over the years. I traded “it just works” for the speed and customization of linux years ago

If you have been using Linux for years, how come you've been experiencing an increasing number of OS X bugs and performance degradation? I thought you'd been running Linux for years?


For my use case (education) iPad Pro is a godsend. I don’t see anything wrong with it.


What feature of the pro makes it worth $150 ($650 if you have the larger screen) more than the comparable iPad?


The basic iPads don’t have laminated displays, are smaller, heavier and come with older CPUs and less memory. So the decision for me is between iPad Air and Pro. Frankly which makes the best buy varies. My current iPad is an M1 Pro that’s just over a year old, but my brother bought one last summer and got an Air, and if I’d been buying one then I’d have done the same. Sometimes it comes down to which was refreshed most recently. I just look at the size, weight, screen features, processor, memory, storage tiers and price then see what is the most compelling package. 14 months ago it was a Pro. 6 months later it was an Air.


Yeah, I mean you’ve more or less reiterated what I said. The hardware is very nice. And if you play a game where you calculate the hardware/$, the iPad pro makes sense.

When you really ask yourself what workload you’re doing on an iPad that requires anything more than an A16… I think you’ll find there isn’t much. You’re limited to iPad apps. You can’t even install a browser that isn’t a Safari skin. Multitasking is horrible. The mouse/touch interface has weird janky things (like text selection).

You don’t have an escape key or a function row, so even many web apps are broken.

The iPad pro is about the same price as an M2 Macbook Air, and significantly more expensive than an M1 Macbook Air. While the hardware looks comparable, the software makes the hardware more or less useless.


For something that’s useless, I get an incredible amount of use out of it. Honestly a pro is probably overkill for me, but I use my iPad a ton every day. I use multitasking a fair bit. I’m actually typing this in a floating browser panel on top of Teams. iPads last seemingly forever, my hand me down iPad 2 was being used by a relative and got retired a few months ago. So I go for the best experience because multiplied by the use I get out of it, it’s worth it.


120Hz + Apple Pencil.


So, 120Hz is only on the 12.9. If it’s worth twice the price to you, that’s your decision.

The apple pencil difference is magnetic charging and magnetic storage on the side of the ipad. Which is nice, but it also means you have the camera in the absolutely most annoying spot in the world. Good luck using faceID while you’re browsing and you’re always going to have the worst angle/look like you’re staring into space during video calls.

iPadOS just doesn’t cut it for multi-tasking/any pro workload that would justify an M2 chip


The only claim I think is objective here is iPadOS. And even then to say it's a "travesty" is hyperbole.


People have been lamenting about MacOS software quality since I got my first Mac in 1992.


Usually people decrying short-termism mean focuses on quarters rather than years. Are you saying Cook’s 10+ years that took the company from $100B/year to $400B/year are the short term, and it is at the expense of the next 90 years?


> Are you saying Cook’s 10+ years that took the company from $100B/year to $400B/year are the short term

Yes, however it's clearly successful.

> Are you saying Cook’s 10+ years that took the company is at the expense of the next 90 years?

Apple is slowly pulling away consumer ownership of their devices and is turning into an advertisement company. I honestly believe that is at the expense of the next 90 years. And this is rather more the point that I tried to make: short term profits at the expense of long-term customer satisfaction.


> turning into an advertisement company

Apple 2022 revenue: $398B[0]

Apple 2022 ad revenue: $4B[1]

It is possible, with a lot of effort, that Apple could go from advertising representing 1% of revenue to maybe 10% over the next 10 years. I don't think that's realistic, but maybe. "Turning into an advertising company" is maybe overstating?

Besides, the whole reason Apple is in advertising is that free-with-ads is an important business model for media and developers, and the other players are pretty horrible. Me, I hate ads and will always pay directly for apps/media, and I don't like what Apple's doing in ads. But I think you're being a little over-dramatic there.

0. https://www.macrotrends.net/stocks/charts/AAPL/apple/revenue

1. https://www.vox.com/recode/2022/12/22/23513061/apple-iphone-...


“Apple is doomed any day now”. That’s been the narrative for over 40 years.

So which customers aren’t satisfied? The people on HN that want to run Linux on the Apple Watch?


Look, if Apple just made the watch open source with fully-user-upgradable parts and modular CPU + display + battery, and a command line interface, they would reach dozens of new customers.


> dozens

Even if there are 7 dozen extra customers, that's maximum (looks up price of most expensive Apple Watch) of 7 * 12 * $799 = $67k.


And the cost to acquire those customers including the development costs?

Apple made a profit of $99 billion in 2022. $67K is the amount Apple made about every 20 seconds.


> “Apple is doomed any day now”. That’s been the narrative for over 40 years.

And they almost went out of business at least twice, once they survived because MS bailed them out.

Apple pre-iPod was a source of amusing commercials. Pre iPhone they were the catalyst for change in the music industry, but they were still a consumer goods company with one product that had pretty wide, but still overall limited appeal.

Apple still doesn't have a diversified revenue stream. If Microsoft had to stop selling Windows today, they'd still make a fortune from Office. If Microsoft had to stop selling Office and Windows today, they'd still make a (much smaller) fortune from Azure.


Just Mac revenue by itself would be the envy of most companies.

But you can easily look at Apple’s revenue breakdown between Mac, iPads, services, wearables, etc and see that it is well diversified.

And Microsoft never “bailed them out”. Before the MS deal was signed, Apple had already gotten a line of credit for a couple of billion. The $250 million that Microsoft invested in Apple was nothing.

Apple turned around and used $100 million to buy out Power Computing's Mac license. It didn't become profitable for another two years. The little money that Microsoft invested didn't make a difference.


Microsoft’s financial contribution was nothing, but Office on Mac was huge. Had there been some third OS that was thriving so Microsoft didn’t feel the need to prop Apple up for antitrust reasons, the world would be different.

But so what? I can’t believe someone is seriously arguing that the failures of the Sculley era are somehow damning to Apple under Cook. Sure, every company can fail. Apple is the most profitable company in the world right now, so it’s an odd time to sing doom songs.


Office had been on the Mac since the 80s. Microsoft didn’t just wake up one day in 1997 and decide to port Office to the Mac.

They promised to keep doing what they were already doing.


What % of those wearables and services are tied to devices?

If iPhones vanished tomorrow, how much trouble would Apple be in? Wearables are 100% attach rate to iPhone, and I'd presume so are the vast majority of service subscribers.

> Before the MS deal was signed, Apple had already gotten a line of credit for a couple of billion. The $250 million that Microsoft invested in Apple was nothing.

As others said, the software support, browser + office were huge.


They would still have Mac revenue that is more than they ever had pre-iPhone.


No one picks Azure unless they are a Windows shop.


Are you saying that "any day now" and "next 90 years" are the same narrative?


Seeing that Apple and Microsoft both have a history of staying relevant for over 45 years, Microsoft has been one of the most valuable companies in the world for two decades and Apple for over a decade, I would bet on these two companies before any other company.

But it’s not good enough just to be right, you have to be right at the right time. How many products categories failed during the dot com bust that are doing well now?

It’s like me predicting the heat death of the sun. There is a big difference between saying it will happen eventually and that it will happen tomorrow.


You mean “short term” of a decade? Apple created two hugely profitable segments since Cook - wearables, services.

The Macs are better than ever.

What else should he be doing?


> one of the best run companies

They off-source all of their factory efforts to other companies. The other companies are doing the hiring and firing.

Probably the best-run company is Qualcomm. I have never heard of them doing layoffs. And they are based in San Diego.

edit: Just googled it and found https://www.thelayoff.com/t/1l5vLinG

edit2: also, https://www.thelayoff.com/apple


>The other companies are doing the hiring and firing.

And this is evidence they're not so well run? On the contrary...


> They are probably one of the best run companies in the world.

I would disagree with this. Although their risk is spread much better than it was in the early 00s, a huge amount of their recurring revenue right now teeters on the brink of destruction in the form of their 30% app store cut and the walled-garden nature of their app store. All it would take is a morally principled, non-lobbied regulator to step in and regulate that down to something reasonable like 5% or 3% and require allowing third party app stores (both actions would be equivalent economically because their current fee only works because of their monopoly), and I don't see something like this not happening in the next decade.

It will be a huge and deserved adjustment when this does happen, and stock speculation will only amplify the blow in the form of lost stock value. There are many, many long-time holders of apple stock who might see the threat of such an event as a time to finally sell. Then we'll see how much value is really there.

Additionally, they have sunk billions of dollars in a brand new campus at a time when WFH is here to stay. That investment will also likely turn out to be worthless in the coming decades as the value of CRE races to $0.


So you're saying Apple isn't a well run company because they haven't properly managed their regulatory risk around the app store? What should they do hire more lobbyists? Not have built a $100 billion/year business because of a vague threat of future regulatory change?

We've seen pushback on the 30% but mostly for a select group of major companies who want to offer alternative payments. The majority of apps aren't going to be asking customers for their credit cards separately vs a single "subscribe" button, nor do they all have CCs already in their existing DBs. Even worst case if 25% of the revenue gets lost due to secondary subscription payment support (assuming they don't somehow still take a cut), that 75% is still a monster that could support Apple for a long time.

Not to mention the rest of their businesses.


I'm saying they have over-extended to the point where they are begging for regulation. Much wiser to not over-extend to the point where regulators are pissed off. Now they could lose it all instead of having a more reasonable rate.

At this point they should self-regulate, like any misbehaving child who realizes they are in the wrong should before they get punished. It's not a risk and it's not going to hurt their stock if it's self-imposed. People always praise anything that comes from within with Apple so it would actually be a pretty savvy play, given what's on the horizon.


Have you ever sold anything in a retail channel such as Walmart or Target? Do you know how much they charge to put your product on their shelves?

Welcome to the world of retail!

If Apple tried to make it so that you couldn't make an app and offer it both on Android and Apple, i.e. it had to be an Apple exclusive, then I'd say you have a point. The regulators aren't concerned with the 30%. Their concern is with the fact you can't go to another store and shop for a better price. THAT is what they're trying to fix. Watch out though - deregulation and market competition all-too-often leads to higher prices. People always act surprised by this.


So what should they have done? Conceded to a group of Epic type companies on a VIP-type basis so regulators don't eat their lucrative subscription model? Without bigco lobbying the gov I doubt there'd be as much regulatory pressure. So the best bet might have been to work with the major companies hiring lobbyists and getting PR.

Rumour is Apple is opening up side-loading so the "walled garden" thing might be neutered.

Not sure what other regulatory risks they have.


No no they should have never had it above 10% to begin with. The case with Epic never should have needed to happen. They've been over-extended all along.


I think you lack a lot of historic perspective of how software was sold and distributed, and the cuts involved.


As a solo developer at the time, I was astounded by the 30% cut when it first came out in the early 2000s and I'm still just as astounded now.


I agree that the cut is too big, but it's entirely reasonable to think that setting and keeping the cut at 30% is the rational profit-maximizing course, even in a long run that includes new laws throttling them to some lower number.


Playing the game according to the current rules seems preferable than playing the game according to rules that may or may not come into existence.

Also, they will still be one of the, if not most, profitable businesses in the world even if they lose all the App Store revenue, so it does not seem reasonable to expect more.


Companies of this size need to play the long game, but quarterly OKRs and a bureaucratic structure ensure they only ever play the short game. The 30% is great this quarter and next quarter, and every quarter until the quarter where it goes away completely. Keeping it this high is extremely risky behavior, practically inviting regulation.


If you look at Apple’s results and products and think they are playing for next quarter, I do not know what to tell you.

The App Store model has served them well for years. If it stops serving them well next quarter, so what?


> this quarter and next quarter

Apple Silicon was a decade-long play, largely by Cook. Even if the App Store is threatened, Apple remains plump.


This old myth will never die.

If it was true, these short sighted companies would fade away in the long run.

Empirically, top companies stay on top for decades.


If Facebook was so well run, it wouldn't need to buy up all the competition to stay relevant, like WhatsApp and Instagram.

Once you are that huge and can buy the competition, you don't fade away unless you go full-on Lehman Brothers.


Commenting on the campus bit.... Apple makes devices, not just software. Good luck developing device from home and dealing with the constant headache of shipping samples around the world. The campus was not a mistake.


Trusted engineers get to take home prototypes. They're usually also loaned a maxed-out Mac Pro so they have something with enough memory to unpack macOS's built-in system process recording mode. (Digging through a 5-minute recording of macOS took ~200GB of system memory to unpack and evaluate.)


I assume by your answer that you work at Apple or know someone who works there so let me clarify that I'm talking about the messy work that happens before you have a take home prototype for a device. I'm talking about building the prototype with in-house machinery before asking the manufacturer to make samples at the factory. If you do work at Apple, do you hand assemble prototypes at the office or do you just ask the manufacturer to build them and overnight them to your house? If so, do you have lab equipment at home to inspect and test it?


> regulate that down to something reasonable like 5% or 3%

How is 5% "reasonable?" Who determines that? That's not something up for a vote -- that's up to the market.

Let's say you have an iOS app.

If you sell it for $5 on the App Store, you keep $3.5. And you sell 100,000.

App Store nets you $350k.

And let's say you sold the app direct to users and did everything yourself:

$5 sales price, you keep $5. However, you also have to collect and remit sales tax for jurisdictions in which you sell. Your street price is $5 + tax, and you keep the $5, but you incur some expense for sales tax compliance.

You have to use a payment processor. That's 2.9% + $0.30 per transaction. So if you sold 100k apps at $5, that's $455,500 in revenue after credit card fees are paid.

However, there's also chargebacks. Chargeback ratio averages roughly 0.6% across all industries (it's much higher for digital goods and CNP transactions -- so let's call it 1.5%). So from 100k sales, you really sell 98,500, and your revenue is $448,667 (after cc fees). However, you have 1,500 chargebacks that cost $20 each (per Stripe). So that's $30k in chargeback fees. So now your revenue is $418,667. Apple rarely loses chargebacks and refunds are extremely rare. So that sales loss is negligible.

You also have to set up a server to serve the downloads and pay hosting, storage, and bandwidth. As well as shipping updates to all of your customers. As well as ensuring updates are compatible for any iOS version your customers might be using. There's going to be some level of customer support required. That's a non-zero cost.

Finally, how are people going to discover your app? You aren't on any app stores, so you have to advertise. You'll need to market. And -- convince people that you aren't going to infect their device with malware.

Let's just assume that you sell just as many outside the App Store as you would doing it yourself. (That won't happen, but let's assume your marketing is so good that it's the case.)

The difference is $419k vs. $350k. (That's assuming zero expense for hosting, marketing, customer support, etc.) So you're "losing" $69k per year by staying in the App Store. After tax, however -- $280k from the App Store, $335k doing it yourself. So $55k additional. Yet you still have hosting, customer support, marketing -- etc. $55k per year as a salary for doing those tasks is tiny, and that doesn't include the cost of the hosting and bandwidth.
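The back-of-envelope math above can be sketched in a few lines. All figures here are the assumptions stated in the comment (100k sales, $5 price, 2.9% + $0.30 card fee, 1.5% chargebacks at $20 each), not real App Store or processor data:

```python
# Sketch of the comment's arithmetic: App Store vs. self-distribution.
UNITS = 100_000   # assumed unit sales
PRICE = 5.00      # assumed price per sale

# Route 1: App Store with a 30% cut
app_store_net = UNITS * PRICE * 0.70            # $350,000

# Route 2: self-distribution with a card processor (2.9% + $0.30/txn)
per_sale_net = PRICE - (PRICE * 0.029 + 0.30)   # $4.555 kept per sale
chargebacks = round(UNITS * 0.015)              # assumed 1.5% chargeback rate
good_sales = UNITS - chargebacks                # 98,500 sales that stick
self_net = good_sales * per_sale_net            # ~$448,668 after card fees
self_net -= chargebacks * 20                    # $20 fee per chargeback

print(app_store_net, self_net)
```

Under these assumptions self-distribution nets roughly $418.7k before hosting, support, and marketing costs, versus $350k via the App Store, which is where the "$69k per year" figure comes from.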


> If you sell it for $5 on the App Store, you keep $3.5.

Try $0. Look at the App Store charts. The 200 top grossing apps are ironically all "Free": https://appfigures.com/top-apps/ios-app-store/united-states/...

This is the real problem, the crApp Store race to the bottom. Apple's cut was never really the problem. The cut is obscene, but only because the prices are already obscenely low. I'd be happy to give Apple 50% if I could sell my software for higher prices.

> And you sell 100,000.

Good luck with that.

> You also have to set up a server to serve the downloads and pay hosting, storage, and bandwidth.

Cheap.

> As well as shipping updates to all of your customers. As well as ensuring updates are compatible for any iOS version your customers might be using. There's going to be some level of customer support required.

You have to do these things anyway.

> Finally, how are people going to discover your app? You aren't on any app stores, so you have to advertise. You'll need to market. And -- convince people that you aren't going to infect their device with malware.

You have to do all of this stuff regardless of whether you're in the App Store. As an App Store developer myself, I can tell you that Apple absolutely does not do these things for you.

Notice how many Mac developers, given the choice, choose to distribute themselves rather than via the Mac App Store. They don't see the App Store as a win like you do.


> How is 5% "reasonable?" Who determines that? That's not something up for a vote -- that's up to the market.

If you ask Apple themselves to split the payment processing from the hosting and 'marketing', they claim it's 3% and 27%.

So if you could use their payment processing by itself, that would put you at $485k before server costs, not $419k. So you can have very nice servers and over a third more revenue.

You'll have to decide whether being in the main app store is important enough to pay a big percentage. And advertising and updates and customer support are costs either way, so please don't imply you only pay them if you avoid the app store.
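The parent's point can be checked with the same assumed numbers (100k sales at $5), splitting the 30% fee into the 3% payment-processing and 27% store/distribution components that Apple has claimed:

```python
# Sketch: what changes if only the claimed 3% payment-processing fee applied.
gross = 100_000 * 5.00                # $500,000 in assumed sales

full_cut_net = gross * (1 - 0.30)     # $350,000 net via the full App Store cut
payments_only = gross * (1 - 0.03)    # $485,000 net paying only 3% processing

print(payments_only / full_cut_net - 1)   # ~0.386, i.e. over a third more
```

That ~38.6% uplift is the "over a third more revenue" claim, before subtracting whatever you'd spend on servers and distribution yourself.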


> So if you could use their payment processing by itself, that would put you at $485k before server costs, not $419k. So you can have very nice servers and over a third more revenue.

Most companies don't do their own payment processing. They have accounts with companies that provide merchant services (e.g. WorldPay, Adyen, various investment banks) under specially negotiated contracts. These contracts would cover contingencies that wouldn't apply to "normal" customers, and there's obviously a premium to pay for that. What you're demanding is the financial equivalent of adding millions of people to an individual's credit card account.

Why would you think a for-profit company (or for that matter, any financially sensible individual) would share the financial benefits it's negotiated for itself with millions of others? Not only would Apple be exposed to additional legal and financial liability, but so would the payment processor.


You're ignoring my point almost entirely. I'm sorry for not adding "the equivalent of their payment processing". If Apple can do it for 3%, I'm sure other companies could do it for 4.5%, and the rest of my argument doesn't change at all.

Though I do have an answer to "why"! It's because Apple claims they want to be in charge of payments to protect their users. Even if we ignore the "charge slightly more and make tons of money" plan.


> Though I do have an answer to "why"! It's because Apple claims they want to be in charge of payments to protect their users. Even if we ignore the "charge slightly more and make tons of money" plan.

There's a difference between charging for payments as a service and acting as a financial fiduciary on behalf of millions of developers. I don't think you understand the legal gravity of the latter.


I'm merely suggesting they do the same thing they already do with payments, but skip the "downloadable in the app store" part.

Since the app store is apparently nine tenths of their fee.


I mean it is public information how much Apple makes from services. It doesn’t take much research to realize you are wrong.


It was all totally insane. The actual narrative of the layoffs is this:

High performing public tech companies went totally insane and doubled their head counts. Everyone else was a cargo cult with an open checkbook so they insanity-squared their head counts. No one could possibly manage such a gigantic influx of knowledge workers so resources were obviously being poorly allocated industry-wide. This first had an impact on low performing public companies, followed by the higher performing public companies, and then the cargo cults.

The places that didn't go insane were not only unaffected but stand to benefit immensely.


If someone is insane, is their next move sane?

Are they a good investment?


This remains my chief concern - part of this admittedly is anger that those responsible for the sackings have generally not been sacked, but the rest of it is whether or not anyone in charge truly knows what they did wrong in order to avoid the same situation again.

When one authorizes a 40,000-person increase in headcount, what did they think they were paying for? What projects were these people allocated to, and why are these projects now expendable where they weren't before?

My suspicion is that a lot of the hiring over the past two years was for work and projects that simply wouldn't pass even a cursory smell test. I also strongly suspect that there really should be some replacements in leadership positions - not because of some sense of retribution for the layoffs but because I honestly don't think many people have internalized what went wrong in a way that would make them more reliable in the future.

Many of these layoffs seem like slash-and-burn tactics in response to criticism and investor panic rather than a real, sober assessment of the excesses of the past few years and why they happened.


Your last sentence is actually why I might -- might -- be a bit more forgiving here -- in some cases (most?) the layoffs are being driven by pressure from activist investors. I have firsthand knowledge that at least one of the major companies that did a layoff recently had no internal desire or financial need to do it, but is being heavily pressured to lay off staff as a cost-cutting measure, even though the business is perfectly viable without it.

A lot of the world's ills today seem to be boiling down to sociopathic trust fund billionaires treating literally everything in life as a game to be won.


I can't speak to Meta, but Salesforce and Amazon probably needed to hire during the pandemic to keep up with demand.

Amazon saw a massive spike in their ecommerce business due to lockdowns so they had to hire to ensure they had the labour to deal with all the extra orders and deliveries. And even with this extra hiring I remember my deliveries being delayed often in 2020.

Salesforce is a weird software company because my understanding is a lot of their staff are there to help clients with integrations, manage accounts and to upsell. Again, assuming Salesforce saw a spike in demand during the pandemic, they probably needed more employees because of the nature of their business.

My guess is that Apple also had to ramp up manufacturing significantly during the pandemic to meet demand, but obviously they don't employ people directly to do their manufacturing, which probably gave them more flexibility when it came to meeting pandemic demand without increasing head count.

For the most part I don't understand why people act like tech companies were hiring recklessly during the pandemic. Are people forgetting how much demand was pulled forward by stimulus and lockdowns? These companies were forced to hire or lose market share. Those were really the only two options they had.

Personally I'd be blaming politicians and the Fed for creating an impossible economic environment to operate in. If you were a company in 2020 your demand was either skyrocketing from lockdowns or at zero because you were forced to close your business. There was no in-between. Tech employees, as initial beneficiaries of lockdowns, are only now seeing the impact of job losses, but these were happening en masse elsewhere in the economy while lockdowns were in place.


I definitely blame the Fed.

It's pretty hard to be in charge of a public company and say no to the shareholders who want you to take advantage of the sudden bubble in demand.

Companies would have been punished for not growing, and now they are being punished if they don't shrink.

Who created that perverse system of incentives? The Fed, by making a number of mistakes in the past decade that kept the economy overboosted, by boosting it even more during 2020, and then by taking drastic action to kill the monster they created.


Where this logic irritates me the most is in the housing market.

Similarly to those who blame companies for over-hiring to meet pandemic demand, I've seen some people suggest the average home buyer was being reckless for buying a home during the pandemic, because "they bought into the bubble!"

I guess it's strange to me how the Fed can pump every asset class to the moon, force companies to over hire to meet demand, and cause a huge inflationary problem for consumers – all while promising not to raise interest rates – then rug pull everyone but get no blame for any of it.

Instead I find I'm invited to blame the average business or home owner for being "reckless" for hiring to meet demand for buying a home for their family, while praising the Fed for "doing what's necessary"? Give me a break.


> I guess it's strange to me how the Fed can pump every asset class to the moon, force companies to over hire to meet demand, and cause a huge inflationary problem for consumers

There were no consumer inflation issues for the past decade. It took a once-in-a-lifetime pandemic supply shock to cause those issues. You're being unfair.

It's easy to blame the Fed for all your problems. The fact is inflation was stable and the economy was just fine prior to the COVID19 pandemic. I don't think any amount of rate raises prior to 2020 would have had any appreciable impact on what we're facing now. Most of these layoffs are happening in the tech sector. Other sectors are healthy. This isn't an economy-wide problem yet. It's hard not to blame the individual companies for poor management.

> Instead I find I'm invited to blame the average business or home owner for being "reckless" for hiring to meet demand for buying a home for their family, while praising the Fed for "doing what's necessary"? Give me a break.

The fact is if you want what you want and want it now, and everyone else does, you have to pay. It's easy to say "oh they should've raised rates in 2017 or 2014" in 2022. It was much harder to say back then.


> It's easy to say "oh they should've raised rates in 2017 or 2014" in 2022. It was much harder to say back then.

it's funny that you brought this up unprompted because this is the obvious counterargument - the fed really really should have been doing moderate tightening of the interest rate in 2017-2018-2019 and everybody knows it, even you. Everyone said it too, I completely do not get your "it was much harder to say it back then", this was specifically something that everyone said was a bad idea except for one individual (let's call him Individual Number One) who wanted to goose the economy to get numbers up for his re-election the following year, and instead leaned on the fed to cut interest rates instead, throwing gas onto an already-roaring fire. And then when the pandemic hit the fed had absolutely no room to maneuver.

https://www.washingtonpost.com/business/2019/10/30/federal-r...

this was all extremely predictable, for literally everyone except Individual #1, who didn't really care.


There were plenty of times to raise rates even before the Trump administration. But there were no major negative impacts until the COVID19 pandemic.

The point is it's easy for us to say this now. Had the pandemic not occurred we'd maybe still be in the position we were in 2019. You can argue for resiliency but you can't time the market. You could argue rates should've been raised in 12, 13, 14, 15, 16, 17, 18, 19. If you optimize too much for resiliency, then you overcorrect and cause recessions, which no one wants. And there'd be people just like you parroting "I told you they shouldn't have raised rates!"

Now let's look at the facts. Despite sharp rate increases, the economy still grew last quarter, inflation is normalizing, and the job market outside of tech is still robust. Frankly, it's hard to think of a better outcome. The asset inflation prior to COVID19 was slightly problematic. But the inflation we face now has less to do with that and more to do with demographic shifts and larger societal changes that needed a spark to begin.


Inflation "normalizing" is actually likely to be due to a change in the way it is measured: https://twitter.com/jasonfurman/status/1624093631929020416


> There were no consumer inflation issues for the past decade. It took a once in a life time pandemic supply shock to cause those issues. You're being unfair.

Not really... Inflation began spiking in late 2020 / early 2021. It was completely foreseeable, given how strong economic demand was and how quickly CPI was rising, that inflation would be high in 2021. When CPI shot past 2%, the Fed basically told businesses and investors not to worry about rates and that they were going to allow inflation to run hot and stimulate demand.

What you're saying suggests you don't understand what happened. Inflation surprised no one in 2021; the Fed was actively encouraging it for months while it was well beyond 2%. The Fed was basically telling people that if they had cash in the bank, it would continue to erode its value through inflation.

This is why businesses were scrambling to spend and there was high demand for hard assets like real estate. It's simple asset allocation: if you believe your cash is going to have negative real yields for the foreseeable future, then you need to get out of cash and buy assets that can benefit from the economic demand ASAP.

> The fact is inflation was stable and the economy was just fine prior to the COVID19 pandemic. I don't think any amount of rate raises prior to 2020 would have had any appreciable impact on what we're facing now.

I'm actually of the opinion that monetary policy was appropriate prior to 2020. I'm not someone who believes interest rates have been too low since the GFC although I'm aware this is a common talking point of people critical of the Fed.

The way I see it the Fed has a duty to keep inflation at their 2% target. The supportive monetary policy prior to 2020 was in pursuit of this goal and therefore appropriate imo so long as we care about that 2%. I disagree with what they did in 2021 and beyond precisely because they didn't do their job and disregarded their 2% inflation target in pursuit of keeping demand high.

> Most of these layoffs are happening in the tech sector. Others are healthy. This isn't an economy wide problem yet. It's hard to not blame the individual companies for poor management.

It's a tech problem for a reason though. Tech is naturally more interest rate sensitive than other sectors due to the funding and growth dynamics, but other rate sensitive sectors like real estate are also suffering. But the largest reason tech over invested in 2021 was because demand that was being pumped into the economy was largely being pushed into tech because of lockdowns.

I'm also of the opinion this will spread into the broader economy later this year and that the full cost of the Fed's mistake has not yet been felt.

> The fact is if you want what you want and want it now, and everyone else does, you have to pay.

I guess I don't really understand what you're saying. I mean, do you think it was just all a coincidence or something? Why do you think people and businesses all suddenly wanted to buy things at the same time? Can you think of any reason why that might have happened?

Maybe people's cash was being eroded by negative real yields? Maybe the government was stimulating demand via QE and fiscal stimulus? Maybe the Fed was buying hundreds of billions of dollars of assets while inflation was running above 2%? Maybe the central bank was explicitly telling people that it would continue to erode their purchasing power and couldn't give a damn about its inflation target?

Honestly it's hard for me to even understand how someone can even blame stock investors, crypto investors, real estate investors, tech companies, bond investors, consumers, etc all for being too greedy without even questioning why they were all so suddenly so greedy at exactly the same time.

> It's easy to say "oh they should've raised rates in 2017 or 2014" in 2022. It was much harder to say back then.

As I mentioned, this is opposite to the position I hold. Rates prior to 2020 were mostly appropriate imo and if anything I believe monetary policy should have been more accommodative.


It's easy to say now that they should have changed policy in the throes of the COVID pandemic. But in reality this move would've created an economic shock on top of a supply shock, and during a public health crisis. That simply isn't sound and is extremely reactive. And it's hard to say it would've had any impact at all on inflation, which was tied to factors external to Fed policy.

The Fed should be cautious about making any changes. I believe the fact they didn't rush to judgement has led to a much better than normal situation.

> Inflation began spiking in late 2020 / early 2021.

If by late 2020 / early 2021 you mean April (Q2), then sure. Pretty broad range there when there weren't spikes in inflation until then.

20/20 in hindsight. A lot of people saying they should've done X or Y. It's easy to say that from behind the desk commenting on HN, all due respect.

> I'm also of the opinion this will spread into the broader economy later this year and the full cost of the Feds mistake has not yet been felt.

I've been hearing this all year from certain political entities. Maybe your blind squirrel will find a nut this year, but so far all the doomsday predictions have continually gotten pushed back. The job market is extremely strong, growth was still strong. If we go into a slight recession, I have to ask, so what?


> The fed should be cautious by making any changes. I believe the fact they didn't rush to judgement has led to a much better than normal situation.

The Fed was not cautious during the pandemic. Not at all! They dropped interest rates to near zero, massively increased QE, and shot the stock market over the moon. Then they sat there watching inflation spiral out of control in the misguided hope that it was "transient" before finally clamping down harder than they would have needed to if they had done the right thing in the first place.

It was a massive overreaction, and a significant change in policy (at least in degree if not in character, especially given the lack of emphasis on inflation policy), and they did it in a hurry before they had a chance to really figure out what was happening. Now we all have to suffer for it -- and the impact of inflation on American families should not be handwaved away or minimized.

Back in 2021, people like Larry Summers were also dismissed for making "doomsday predictions" that turned out to be true. He wasn't the only one who noticed -- but he was the only left-leaning economist brave enough to challenge the new administration by saying what a lot of them already knew. This isn't hindsight -- people knew the Fed was wrong years ago.


When Google announced 12,000 layoffs, for anyone reading past the headline this came on the back of hiring 41,000 people over the last two years. 41,000 is about 1/2 the total employees of Salesforce as of October 2022.


I joined Amazon in 2017, and even back then it was doubling every 1-2 years. Which still seemed absolutely nuts to me.

It did later mean the cult-like indoctrination of the leadership principles made sense. When you’re growing at that rate (even not considering the attrition that happens alongside it) you can easily end up in a place where _most_ of the employees have been with the company for 12mo or less. If you’ve any hope of maintaining a consistent culture over time you need to get it embedded into people quickly. Because the new hires are looking at people who’ve only been there 9 months as though they’re the OGs of the company.


I know that everyone is looking at this as "by the numbers" and there is truth to be had there, I offer a different take though.

Apple's culture is not all that amenable to high growth of headcount[0]. It's not in their DNA. Steve Jobs, on his return to Apple, set up a cultural legacy around hiring smart and being intentional: that it's better to invest in hiring the best you can, and people who fit. When I was there, it was another huge boom time for most of the industry, and Apple still wasn't hiring fast like Meta (then Facebook) or Google was. Some departments in Apple do scale quickly, but typically there is a targeted plan behind that scaling.

In essence, they don't like to take shots on scaling the people aspect of the business, because they value culture fit extremely highly and intentionally put it ahead of many other metrics internally.

That was, and I believe still is, the case.

[0]: one can argue no company can take rapid headcount rises, but setting that aside, clearly a lot of companies thought it could work


It was a symptom of the bubble economy, particularly in e-commerce and online services, that materialized during the pandemic and was egged on by a series of mistakes made by the Fed.

Demand looked like it was growing massively with no end in sight, valuations were insane, and money was easy to get.

A lot of companies didn't stop to think that maybe it was all just a bubble. So they hired to avoid missing out.

Basic things like running a sustainable business with a sane balance sheet were not scrutinized by investors. Now they are, and in the layoff emails to employees we see CEOs talking about how that is important, as if they suddenly realized it for the first time.

To be fair, some of those companies probably did make a lot more money than they would have if they hadn't taken advantage of the situation -- and they probably would have been punished by their shareholders if they didn't.


I wonder what it tells us about employee selection and their level. Was it harder to get into Google/Meta 5-10 years ago than during the hiring boom?


Working and then getting severance seems better than not working...


> How can you effectively double orgs of this size over a 2 year period?

Apparently you can't.


Given the number of orgs that have done this quite successfully in the BC (before Covid) era, it seems like you can.


I dunno about that. I was at Facebook for five years, it doubled in headcount basically every year. The culture when I joined was very, very different to the culture when I left. And even after the layoffs they're still double the size they were when I left.


Which ones for example?


Correct on all counts. The hiring was insane and even with the cuts the number of employees is still higher than 2020


> How can you effectively double orgs of this size over a 2 year period?

That's the cool part - you can't.


If your attitude is "adjust your workforce at will", then who cares? If I can hire twice as many people and get X done twice as fast, then do it. At some point X is likely done and I don't need the people. Dump them on the streets. Unethical maybe, but not irrational.


So, like everything, that’s not the whole truth. Apple was much more restrained, but that’s because it’s always been restrained. I commented this before here, but not all tech companies substantially changed behavior wrt hiring…

We forget that many of these companies have been doubling every few years already. Google and meta were startups in the 2000s, apple and Microsoft are 20+y older.

Basically, most of these tech companies have been hiring at a constant rate for years, and also experiencing constant attrition. Once the economy soured, and then hiring freezes started, attrition rates had crashed, and the employee count wasn't as affected by freeze as desired. The layoffs have been roughly a reset towards the headcount before the freeze for many companies.

While the whole article is paywalled, I'll quote an excerpt from ben Thompson:

> The popular narrative right now about these layoffs is that tech companies dramatically over-hired during the pandemic, but while that seems to have happened with Amazon — and for arguably very good reasons given the way that e-commerce shot up during lockdowns in particular — the reality is that the rest of the tech companies largely increased at the same rate they always had. Sure, the number of employees they added was large, but that was a function of keeping the same hiring rate off of an ever increasing base.

> In short, no one was giving up a job at one of the big five tech companies this year as fear spread about a broad-based slowdown in hiring... These companies, though, adjusted more slowly to the slower rate of attrition, which means they accidentally increased their headcount... the relatively limited size of the layoffs to date actually reflects that: these companies are not returning to their pre-pandemic levels of employees, but rather to where they would be had they kept up roughly the same rates of hiring this year that they have over the last ten

[0] https://stratechery.com/2023/tech-layoffs-big-techs-hiring-r...


Do you have a link to data supporting the doubling? I would love to see the numbers!


It was in TFA. Unfortunately I can't get past the paywall again. There was a graph that showed employee growth. I left out a couple of companies, but Meta and Google were in the mid-90% range and SF was something like 104% from 2020-2022. The source in the article was "Bloomberg", so presumably the numbers are from public filings.


The numbers are in the article...


Ok, thanks. I admit I am lazy and mostly read just the comments :/


Was there any federal program that encouraged hiring at these large corporations as part of the COVID bills?


companies borrowed billions (startups via equity raise, publics via debt offering at almost zero % rates) and quickly increased headcount in the name of chasing growth


Yes. Apple's size is still large but isn't ten times as large as Amazon, nor has it grown as much in a short space of time.

But surely someone must have questioned the industry's mass over-hiring, with the Covid bubble and years of cheap VC money fuelling it, before it resulted in these mass layoffs?


Apple's underlying economics haven't changed, so they have not had any reason to over-hire in the first place.

Amazon, Google, Meta, Shopify and others are all e-commerce based. There was a clear trend that e-commerce had shifted up due to the pandemic, and looked to be permanent. That bet was wrong, so they all had layoffs.

Other companies were willing to hire lots of developers because the ROI on devs was good compared to low inflation and low interest rates. Then both of those things changed, so the ROI on a dev compared to investing in other things became much worse- so there's less money for developers.
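The interest-rate mechanism described here can be sketched with a toy discounted-cash-flow calculation (all figures below are made up for illustration, not from the article): when the discount rate rises, the present value of a developer's future output falls, so marginal hires stop clearing the bar.

```python
# Toy sketch (hypothetical numbers): present value of hiring a developer who
# costs $200k/yr and is expected to produce $260k/yr of value over 5 years,
# discounted at a low rate (cheap-money era) vs a higher rate (after hikes).
def npv(annual_value, annual_cost, years, rate):
    """Net present value of the yearly surplus, discounted at `rate`."""
    return sum((annual_value - annual_cost) / (1 + rate) ** t
               for t in range(1, years + 1))

low_rate_value = npv(260_000, 200_000, 5, 0.01)
high_rate_value = npv(260_000, 200_000, 5, 0.05)
print(round(low_rate_value), round(high_rate_value))  # → 291206 259769
```

Same developer, same output; the only change is the discount rate, and the hire is worth roughly $31k less. Scale that across thousands of marginal roles and a rate shock alone can flip hiring plans.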

Meanwhile, Apple has a mostly-stable massive chunk of the consumer device market and has always had a profit margin that is startlingly high. Not much has changed as far as they're concerned.


I still can't believe major corporations thought the shift to ecommerce from the pandemic "looked to be permanent". More people than not spent 2 years complaining they couldn't leave the house and rushing out to the pub/shops/holidays the minute restrictions dropped.

It was certainly a possibility, but it staggers my mind that companies of this size all decided it was a certainty.

If I were to be uncharitable, this is the effect of senior staff at these companies having been completely detached from the pandemic experience of the majority of the population. Lots of people still had to go to work F2F, lots of people spent it cooped up in 30m2 apartments with flatmates they hate.


That's not exactly what they bet, because anyone knows that it would be dumb, including them.

What you have to look at is e-commerce adoption as a % of retail. This % is increasing every year since the end of the 90s. It was about 10% pre-pandemic.

Of course they knew that the huge jump from COVID wouldn't be permanent. What they bet, however, is that once things re-opened, SOME of that conversion to e-commerce would be permanent and that the trend of e-commerce as a % of retail would keep increasing at a faster pace than if COVID had not happened. Which wasn't the worst bet to take: you'd think that if people get used to a certain way of shopping and we invest in all this ecom infrastructure, some people would prefer the convenience of it in the long term.

Tobi explained it well in his announcement for layoffs last year at Shopify (which came much earlier than other tech companies, likely because their revenue is much more strongly related to ecom revenue) https://news.shopify.com/changes-to-shopifys-team


I don't think they thought it was permanent as much as no one really knew when it would stop. They had to make a choice whether to hire enough to sustain the current growth rates or scale back growth in which case competitors might be able to capture that market share. No one had a crystal ball regarding when exactly the growth would stop and what the post-growth period would look like, so from that perspective using all available evidence, the play was to capture market share and worry about the future when it happens.

It's a reasonable strategy, the risk being their cost structure gets unbalanced and they might have to lay off people. Contrary to what people here seem to think, laying off people isn't the end of the world, and many of these companies are very comfortable doing it once their growth calculus changes. It was a calculated risk and if we are being honest, it paid off very well for most of the companies which are currently doing layoffs. In many cases the alternative would be to forfeit growth just to potentially save jobs down the line - but what would that look like for companies like Amazon? I don't know if people remember but when the pandemic hit Amazon was scrambling to meet the demands of customers and prime shipping times shot up from 1 day delivery to sometimes more than a week. Those situations would give competitors like WalMart an opening to capitalize on taking market share.

At the end of the day, no one had a crystal ball, and while companies probably shouldn't have assumed whatever growth rates of the quarter were permanent, to ignore the growth and not hire in that environment carried its own risks. And besides, are the current growth rates permanent with all the macro-economic factors at play? Of course not; most likely the economy will pick up at some point, but companies don't know when exactly that will be, so the prudent thing is to prioritize their workforce on high-priority revenue-generating products and balance their cost structure around the current economic realities.


It doesn't matter if the management team thought the shift was permanent or not.

What matters is not getting fired.

It's hard to get fired if your actions are in line with the consensus.


> Lots of people still had to go to work F2F, lots of people spent it cooped up in 30m2 apartments with flatmates they hate.

I assume by `F2F` you mean "face-to-face"?

I think that's a matter of perspective. In my echo chamber, people live with flatmates that they get along with just fine and are happy that they don't have to go to an office to do work that could be done at home.

> More people than not spent 2 years complaining they couldn't leave the house

So... perhaps they do indeed want to "leave the house". But I think they still don't want to go to the office when they can work from home.

I think e-commerce is indeed shifted-up relative to the start of the pandemic. I think things that e-commerce can't do well are still where people want to leave the house. As you said, pubs/window shopping (not quite shopping itself)/holidays.


> I still can't believe major corporations thought the shift to ecommerce from the pandemic "looked to be permanent".

My theory is that it had less to do with e-commerce overall than more to do with all the big spenders in VC and crypto that essentially vanished once QE and ZIRP regime changed completely (over a very short period).

Not unlike the dotcom boom where large companies at the time like Sun/Cisco took huge losses because they were selling shovels for the gold miners (startups) who suddenly all went out of business. It's no surprise that so many of the Super Bowl ads were crypto - there was an immense amount of such cash swishing around.

Apple didn't dip into that market at all (naturally they are B2C) so they neither hired for that gold rush, nor fired as the gold rush faded.


A more concrete uncharitable explanation: if you're one of those execs who only cares about next quarter, convincing people that a temporary situation is permanent would be one way to leverage your ideas.


The question isn't "is it permanent" it's "what are the chances it's permanent". I can imagine that if they come to the conclusion that there's even a 10% chance then it makes sense to invest resources as if it will happen because they can always just lay people off, but it's much harder to catch up.


Not only that, but you have to recall there was the chip shortage before and during the pandemic. Given Apple's reliance on HW sales, it would be foolish to ramp up hiring when you can't obtain the materials for your products. While at the same time people moved to buying everything online and having Teams/Zoom calls for all work and school. The situation was just very different for Apple vs Amazon/Google/Microsoft/Zoom.


There was a chip shortage for older parts on bigger fabrication nodes, partly because car manufacturers and others stupidly canceled their contracts with chip makers, anticipating a downturn when instead there was a huge boom in demand.

Apple was largely unaffected by this because they are first in line for new fabrication processes, did not cancel their existing chip orders, and use mostly modern fabs anyway, not bargain-basement 40nm+ stuff.

The M1 and other 5nm and smaller chips largely kept pace with demand. It’s never been difficult to find a Mac throughout the pandemic.


I don’t work at Apple, but Cook at least portrayed there was impact to Apple publicly.

See https://www.businessinsider.com/apple-imac-and-ipad-producti....


> Meanwhile, Apple has a mostly-stable massive chunk of the consumer device market and has always had a profit margin that is startlingly high. Not much has changed as far as they're concerned.

Apple had a big bump in pandemic-related revenue too, inspired by mass WFH hardware purchases. And now Apple revenue has gone down as well: https://sixcolors.com/post/2023/02/apple-results-and-charts-...


But they were selling the same thing, just because more people buy laptops doesn't mean you have to hire more hardware engineers. The design is the same, you scale your outsourced manufacturing instead. With e-commerce you absolutely need more engineers to maintain the software and you need to buy more servers and hire more sysadmins etc.


Your assertion makes no sense. If anything, physical goods should take more people than does cloud software.

These dotcoms hired more people because that's what dotcoms do when they can.


Is software supposed to be a low marginal cost business? Why would I need more engineers if my site has 50% more traffic?


Strictly speaking, software as a low-marginal-cost business originally referred to shrink-wrapped application software. If your website is basically a buy button with a download link, you fall into that category. But now your distribution needs have scaled, so either you hire more people to deal with shipping out CDs, or you make sure your website scales. And if your software is the distribution (as is the case for websites, SaaS, cloud apps) you might need more engineers.


> you absolutely need more engineers to maintain the software and you need to buy more servers and hire more sysadmins etc.

Apple makes software, you know. ;-)

They also have many internet services, such as iCloud, App Store, Apple Music, TV+, etc.


None of which scaled significantly due to the pandemic. Everyone at companies that use Macs had Macs already, so if they're not using the one at work anymore there's not a net increase of Mac users.


Um, Mac sales skyrocketed during the pandemic. Anyway, Apple makes other products that use internet services too, like iPad, and... have you heard of iPhone? ;-)

App Store downloads also increased a lot during the pandemic.


> looked to be permanent

I don't believe that any CEO/CFO with half a brain actually thought this. Sure, things wouldn't return to pre-pandemic levels necessarily, but the COVID-induced state was not going to permanently change human behavior.

> ROI on devs was good compared to low inflation and low interest rates. Then both of those things changed

this is more the reason; the "mea culpa, how could we have known" excuse is just to cover a regression to the mean they knew was eventually going to happen.


For CEOs that thought this would be permanent, see Shopify and Amazon. Both have said so publicly, and Amazon spent billions on transportation infra. They are not stupid.


Amazon is a different case altogether (see my other post about them) as they deal in physical goods; that transportation infrastructure allows them to capture greater market share even if online sales drop.


Amazon, Google, and Meta all seem to make riskier bets, too.

Both strategies make sense—make big, risky investments, or grind out more profits in existing, highly lucrative markets.


I get putting Meta on that list because of its bet that a worse Second Life can be a hyper-lucrative business model, but the other two?


> Amazon, Google, and Meta all seem to make riskier bets, too.

What are those risky bets I wonder?

Meanwhile Apple rolled out a completely custom CPU and transitioned all its hardware to it.


> Meanwhile Apple rolled out a completely custom CPU and transitioned all its hardware to it.

Amazon literally did the same with Graviton (to be fair they acquired a company doing this and used their work as the basis, but still). Google literally developed TPUs in the same timeframe, while also trying a few other things (Stadia, rip, comes to mind).


Neither of those were risky in the sense of "if it fails, it takes the company with them".


Apple making their own CPUs would also not have taken down the company. They took small incremental steps; at any point, a failure of their own CPU program would at worst have meant using a commercially available SOC (with probably great terms), or alternatively skipping the SOC refresh for a year.

In fact, by your criteria, I doubt that Apple has made a single risky bet this millennium. Even the iPhone failing would not have taken down the company at that stage.


Look, if you want some new hardware using the latest node, it's expensive. You must sign a nice expensive contract with a foundry and pay, pay, pay before anything comes out. It's risky; there is no guarantee that all the produced chips work as expected. And it's done in secret for years, with insane stress for the engineers. Things can leak, or the chip can be a failure. There is no way to get any SoC on the market in the quantities Apple needs with short lead times. Changing processors was risky for them.


> Changing processor was risky for them.

Which specific Apple-made SOC do you claim was a company-ending risk? The A4? Surely not, it was just commodity components they integrated. The A6? Clearly not. That's the first SOC where they included their own CPU design, but if it had failed they could just have used the A5 as a backup, given they were already manufacturing it on the same process as the A6 was released on. And so on.

Why do you think Apple didn't recognize whatever risks you think were there, and mitigate for them? Are they really so badly managed that they'd just risk a trillion dollar company on a roll of the dice rather than have the appropriate contingencies?

Why doesn't every single chip maker run exactly those same risks? Heck, why doesn't every user of commodity CPUs run those risks? The things you claim that could happen to Apple's SOCs could also happen to Qualcomm. Why isn't every user of Qualcomm's chips up the shit creek if a Qualcomm CPU fails validation?


Riiiight. iPhone is not risky, but small scale Graviton production by Amazon is.

Gotcha.


I don't know where you got that from. You defined the criteria. By that criteria, neither Apple nor Amazon ran a risky CPU project.

Now, the reason is that your criteria for what is risky are pretty stupid, as evidenced by not even the iPhone meeting that bar. But don't blame me for it: again, you are the one who set the bar at the failure of the entire company.


> Other companies were willing to hire lots of developers because the ROI on devs was good compared to low inflation and low interest rates

Was the ROI good though? Did any of these companies seem to deliver new products or features faster? Do their financials suggest that these devs somehow improved operational efficiency (e.g. improved gross margins)?

Or was it just that they hired because they could afford to?


I think it was more of a tragedy of the commons where some companies started aggressively hiring, and other companies followed suit to avoid losing access to talent. The pandemic was a huge societal event, but nothing indicated it was going to be the new normal and CEOs who grew headcount expecting it to go on indefinitely made a huge, and arguably an easily avoidable, misstep.


Most orgs claim that they need X% more money to serve X% more users, and they use that money to hire X% more devs. Since the users and money are both there, they get it in the short term, and nobody looks at the big picture until money tightens up


> Apple's underlying economics haven't changed, so they have not had any reason to over-hire in the first place.

The bloomberg article has a visual showing revenue-per-employee going up from 1.17M/employee (2017-2019) to 2.51M/employee (2020-2022), so ... some of the underlying economics changed somehow?
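As a back-of-envelope check, the jump implied by those two figures works out to roughly a doubling. Only the two per-employee numbers come from the article; the ratio below is computed purely for illustration:

```python
# Back-of-envelope check of the revenue-per-employee jump quoted from
# the Bloomberg chart. Only the two per-employee figures come from the
# article; the growth ratio is derived here.
rev_per_emp_before = 1.17  # $M per employee, 2017-2019
rev_per_emp_after = 2.51   # $M per employee, 2020-2022

ratio = rev_per_emp_after / rev_per_emp_before
print(f"Revenue per employee grew {ratio:.2f}x")  # -> ~2.15x
```

So yes, by that metric Apple's underlying economics more than doubled over that window, which cuts against the claim that nothing changed.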


revenue-per-employee going up from 1.17M/employee (2017-2019) to 2.51M/employee (2020-2022)

The pandemic probably contributed to that, but also during that period they went from their worst line of Macs in recent years to their best.


exactly. E-commerce companies hired because they were seeing a boost in the amount of consumption, literally tripling overnight in some cases. This has been well documented as "the quickening", where decades of growth were observed in a matter of weeks.

https://www.mckinsey.com/capabilities/strategy-and-corporate...

CEOs that didn't meet those needs would be fired for ceding market share instead. This demand explosion did not manifest in hardware companies such as Apple.


Was it even mostly devs that were hired/fired? Anecdotally on the firing side it’s mostly recruiters and business ops even for these companies


I can only speak to my experience in the engineering org at Meta and past experience at Apple.

One key difference between Apple and Meta is that each individual (on the ground) team at Apple is responsible for its own hiring. Whereas at Meta, they have a general company-wide engineering pool that is mainly driven by recruiters to fill.

A manager at Apple will get a batch of resumes and start hiring with a recruiter and the team members will then interview each candidate and decide to move forward or not with a candidate.

A manager at Meta, on the other hand, is constantly trying to sell their team to potential engineers looking to either switch from another team internally, or come from the general pool (bootcamp).

Make no mistake, both companies get headcount budgets driven by org-wide budgets set by the very top and HR uses that budget with very complicated formulas for determining how fast they need to hire to maintain or grow headcount to stay ahead of people leaving. Also, engineering managers absolutely have endless work to assign to new hires. So given the headcount, they will definitely make use of it.

All that to say, I do think Meta's internal team structure made it far easier to over-hire and Apple's intentionally inefficient structure prevented doing the same.


Reaction: Sounds like Apple's method encourages managers to select and hire one person, who is a very good fit for his team's actual current need. Vs. Meta's encourages managers to cast a net, haul in a load of fish, and hope that one of 'em is a passable fit for his team's current need. But many keep a few extra fish regardless - since he'll have more needs in the future, and fishing is a crap shoot.


Or the reverse take, at Meta, ICs find the projects on teams that excite them and they are the ones empowered to decide what is a good fit for themselves.


That's likely still true at Apple, but you gotta start somewhere.


Anyone know what internal mobility is like at Apple? At Meta it's pretty easy to move teams once you hit the 1 year mark.


Apple is the same.


> "Also, engineering managers absolutely have endless work to assign to new hires. So given the headcount, they will definitely make use of it."

This seems like the crux of the problem. Of course engineering managers always want more headcount - I am one and I would be very happy to get more headcount. My backlog is ten miles long and keeps growing.

But I also understand that if I had enough engineers to actually take care of all the things I want, I would likely have overhired and be chasing increasingly marginal returns on investment. Some things should not make it off the backlog, because honestly the ROI likely isn't worth it.

"Optimal" staffing at the team level is almost certainly not optimal for either the company or the product. A team that doesn't have to constantly drop things because of lack of resources is likely building a lot of stuff that is poorly-justified.

In a healthy company the desire to do everything is tempered by financial reality and sound judgment - it sounds like both got tossed out the airlock at many companies.


LOL, this is also very true.

I do question how you manage your backlog though, if there are things on it that should never be implemented, maybe you should clean it out? :-p


Could make an internal stakeholder unhappy. Much easier just to let it sit in the backlog forever.

Sometimes I come back to a very detailed bug report I wrote about something I found in another team's system. It has been fun to see it grow to almost 4 years old now. Although I am not happy it gets pushed back every release, I would certainly be less happy if it was just closed as "this will never be worth it".


I had the impression Google and Amazon use the same model as Meta. But having worked in the field since the 1990s, I have never encountered or worked at any company that used their model. I think their model is likely superior in terms of hiring the most people in the shortest time, with the shortcoming that it might make it easier to overhire, and it might not fill the job with the best person as often.

Lots of very successful companies, from small to massive, don't need to use their model of hiring a pool of people and then only later figuring out which role they will fill. If you only hire someone when you have identified a specific role and the correct person for that role, it might take longer to hire, but you won't overhire as easily and you're probably more likely to have a higher hit rate in terms of getting the right person.


The funny thing is my personal experience was the opposite. I have a lot of experience in embedded software. The recruiters I interacted with at Google knew about my specialization and targeted positions that were embedded-related (like, at waymo or their other projects). However, the actual phone screen interview was with a web programmer that had very little C++ background and seemed pretty uncomfortable with me using that to solve their quiz. I ended up being weirded out by that and not moving forward. I figured I'd have to go back and practice a lot of stuff I hadn't touched in a while to actually impress their interview panel.

Meta was different. I interviewed with someone from the reality labs team and it was super specific. Our discussion and the software quizzing they did was right up my alley. Like, they specifically had a reason to want to hire me, and given what they described, I had a reason to be very interested in the project they were hiring for. It was a big shock, because the last time I spoke to someone hiring for FB they seemed to have no idea what they were doing lol. I still didn't move forward! But I was really tempted to.

I had a similar experience when I talked to someone at Apple. Unfortunately that time they wanted someone even more specialized than I was (they wanted someone to work on things to manage multiple JTAG devices - that was already innately familiar with doing that sorta thing at a hardware level). Still, 10/10, I'd apply to Apple again. I'd also apply to Meta again... although I think they basically killed the entire reality labs thing. I was so impressed by Meta. Still am.


I wasn't trying to make it about me.. I am not in embedded software, though I've dabbled in it and that would have been a nice path to take instead.

But we all specialize in something and I don't really think I've gone down a path where any of those big companies would really ever be a good place for me now. I think whatever path you go down in terms of specialization would certainly influence your interactions with their recruiting.

I have been working/interviewing since the late 90s.. my interviews with various companies span a 25-year period at this point, so they were not the same companies they are today. I interviewed at Apple in the late 90s when they were a mess, and I thought they were a mess. I didn't own a Mac, they were too expensive for me as a student, so I'm sure that didn't help. I turned down a job at MS out of college because they put a delay in the process and I'd gone elsewhere by the time they contacted me. MS would have been a better place for me to start. Google & Amazon I interviewed at >10 years ago, when public/industry opinion of them was much higher. By the time FB/Meta really got going I declined.. I have not had a strong opinion of them. However, of all my friends who have worked at the FAANG companies, the ones who work at Meta seem the happiest, the ones who worked at Amazon had the most unhappy experience, and Google is seemingly in the middle.


Ahhh I remember when Google was the shining light of the software world.

I'm not sure I'll ever get over my anti-MS bias. I respect them a lot for some things, but I had a couple of real shitty experiences with their university recruiting team about 10 years ago, and I don't like the way they make everything to be managed by others. That they sell consumer software feels like a bit of an anomaly sometimes.


Have to question your life choices if you are the "manage multiple JTAG devices in hardware" expert. (In HW, we are all that person.)


Google is very similar. You go through a pipeline that sends you to the hiring committee; then, once you are cleared and leveled, you can interview with managers who can read your scores etc.


> Also, engineering managers absolutely have endless work to assign to new hires

Assuming there isn't a sharp dropoff of the ROI on the work juuuuust past the point where the current team can tackle it, that means that the work in the backlog is valuable and worth paying for. By chopping people they are saying actually, this work doesn't need to be done. Whether that work needs to be done or is valuable has not magically changed. That means the company was wrong all along about whether the work in the backlog was worth doing, and had all along had too many people on the team -- despite painfully fighting for headcount to get that work done, and despite careful calculus about which teams to spend budget on, all those managers were basically categorically wrong all along. How can we reconcile all that?


This doesn't seem a plausible hypothesis. There are tons of companies which adopt the Apple model, mine included. In this model you actually have to argue the case for why your role is important, which makes qualifying for headcount even more unlikely.

