Programmers should plan for lower pay (2019) (jefftk.com)
145 points by luu on Nov 5, 2022 | 307 comments



> What makes me nervous, though, is that we don't really understand why programmers are paid this well

I understand why developers are paid this well. The marginal cost of software is (close enough to) zero [0]. This means that once you build software, you can sell it infinitely - minus some scaling costs, hence "close enough."
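
To put a rough number on "close enough to zero", here is a toy back-of-the-envelope sketch (a minimal illustration; every figure is invented, not taken from the article):

    # Toy numbers, purely illustrative: a one-time build cost vs. a tiny per-copy cost.
    build_cost = 2_000_000     # engineering cost to build the product once (assumed)
    per_copy_cost = 0.05       # hosting/bandwidth per additional copy (assumed)

    for copies in (1_000, 100_000, 10_000_000):
        avg_cost = build_cost / copies + per_copy_cost
        print(f"{copies:>10,} copies -> average cost per copy ${avg_cost:,.2f}")
    # 1,000 -> $2,000.05; 100,000 -> $20.05; 10,000,000 -> $0.25

Past a certain volume, the build cost is noise in the per-copy price; that is the mechanism being pointed at here.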

We don't pay people based on how hard they work. If we did, stay at home parents would be filthy rich.

Sure the very very top end pay will fluctuate depending on stock market conditions since those include a significant portion of equity, but the reality is programmers deliver expensive products that businesses can sell easily that pretty much everyone wants. Trends will come and go, but making high quality tech products will continue to be profitable in my lifetime (barring WW3) and thus programmer salaries will stay high.

[0] https://ckluis.com/the-marginal-cost-of-software-approaches-...


>We don't pay people based on how hard they work.

This.

Something being difficult to achieve does not make it valuable.

In practice, it is often the opposite: the most difficult jobs are also the least productive and the worst paid.

The ability to write code is the closest thing we ever had to magic. There are many levels of mastery, but it is undeniably powerful.

The strange thing is that not everyone seems to enjoy acquiring the skills needed to do that.


> Something being difficult to achieve does not make it valuable.

And valuable to whom? As an ex-physicist I can confidently say that what I did then was substantially harder than the ML I do now. But I also think that what I did provided more value to society as a whole. Just my opinion though. It didn't pay well (<$100k/yr), and I'm not surprised I saw a mass exodus of people trying to get more money out of their math skills (mostly in DS and CS). If pay were strongly correlated with difficulty, mathematicians and physicists would be paid much more than Hollywood actors. But for some reason these conversations always revolve around the (barely a) millionaire class and never include the billionaire class. It feels surreal, as if we're fighting for crumbs and arguing over who deserves them more.


There’s a difference between valuable to the market and valuable for society. For all we know ecological research will become an existential issue but that doesn’t mean consumers or businesses give a damn. Meanwhile advertising is worth trillions.


Be careful. You’re approaching communist thought crime.

https://en.m.wikipedia.org/wiki/Use_value


Both concepts have their problems; it does not have to be an either-or choice.

The problem with market value is that it downplays the needs of the poor, and is sensitive to all kinds of regulation (intuitively, if regulation makes the same thing more expensive, it doesn't make it more valuable).

The problem with use value, as defined by Marx, is that it ignores the differences between individuals, and technically the needs of individuals at all. It simply means "what the one who controls the society considers valuable". (If you think this is a strawman, just read the Wikipedia article carefully.) Of course, it is assumed that the one who controls the society is omniscient and can determine the value correctly, but that's just answering the question by passing the buck to a hypothetical omniscient entity. (And if you look at actual communist societies, it turns out, perhaps surprisingly to many, that the ones who control them are neither omniscient nor benevolent.)

What seems to work best, so far, is a hybrid system where:

* the value of million little things (often not useful per se, but as a raw material for something else) is determined by the market;

* some form of social security ensures that anyone's needs have a non-zero weight;

* the "socially useful but not market profitable" projects are paid for by taxes and charities.

In other words, the "use value" can be implemented on top of "market value" by financing things by taxes and charities. In the opposite direction, the "market value" can also be implemented on top of "use value" by trading the government-assigned things on black market; the only problem is that this is typically considered illegal. The fact that both things occur naturally in different kinds of societies suggests that both are necessary. (And the proposal of "free market with unconditional basic income" is how they could exist side by side without having to implement one of them as a hack on top of the other. The last component you need is some kind of Kickstarter.)


Nah, the difference between value to the market and society is because we broke price discovery by holding rates low for decades. The current paradigm is literally central planning, which, last I checked, is closer to communism than capitalism. If the central feature of capitalism is the invisible hand we shattered the wrist.


In this context I was talking about something being valuable from an economic standpoint.

That is the only way I know to put a price on something, and this is clearly not a reliable process (see all the scams) but is there anything better?


But if you look at it with all else being equal, difficulty certainly helps increase pay far more often than not. Difficulty constrains the supply side of the equation, though there are additional factors on both sides.


There's no shortage of people who enjoy acquiring the skills to write code? I'd say it's more the tension of "finding ways to get paid for it in this ever-churning environment."


> There's no shortage of people who enjoy acquiring the skills to write code?

There certainly is a shortage, or salaries would be lower. Anecdotally, at least 90% of the people I talk to say something along the lines of “ooh I could never do that” when they learn what I do for a living. Most people seem scared of coding, and I’m not entirely sure why.


From what I have seen, most people simply do not like coding. It's a very difficult mode of work for most people and I'm not surprised.

It's very time consuming, requires you to think in logic and the work is relatively solitary. It's a nightmare job for a lot of people.


> There certainly is a shortage, or salaries would be lower.

Sounds backwards. A shortage is a situation where price is unable to rise. If there is a shortage, salaries should be higher than they are.

Price gouging laws are a good example of where shortages can occur. We experienced a shortage of toilet paper at the beginning of the pandemic because the law (in many jurisdictions) prevented vendors from charging the fair market value. If vendors were legally able to sell them for $1,000 a roll, the market would have been easily satisfied as the average person unable to pay $1,000 a roll would have bowed out of the market, leaving just the right amount for those who still were willing to pay up. Instead, because of the law toilet paper remained as affordable as it ever was, so it became a race to see who could get it first instead of letting a higher price scare some people away.

Within the scope of labor, healthcare can be prone to shortages as governments have been known to impose price ceilings on practitioners to try and make healthcare more accessible to all, so that the rich don't buy up all the doctors' time with a better offer, leaving no service available to the poor. This is quite uncommon generally, though, and not something I've ever seen within the software industry. We're quite happy to let Google outbid Mom and Pop's Used Book Emporium for developers.


I mostly agree with you (and made a similar comment), but there is something more going on I think... since developers are so valuable, shouldn't more and more people become developers until the cost for paying a developer goes way down?

Like you said, we don't pay people based on how hard they work. However, we also don't pay people based on how much value they create... we pay people the lowest amount we can that is enough to keep them from going to another job. The value they create just sets a ceiling on the wages (you can't pay someone more than the value they create or you will soon go out of business)

So there are two possibilities as I see it... either there are simply so many possible business opportunities that the number of developer jobs is practically infinite (new jobs are created as soon as a new developer is ready), or there is something that limits the number of new developers.

I think it is a bit of both. I think there are fewer people who have the capability to be developers than we like to admit, and there are a lot more possible jobs than we have currently.


There are limitless free resources available on the web such that pretty much anyone who wants to be a software developer, COULD be a software developer.

So why aren't there more people for these highly compensated jobs?

I remember seeing someone say 'Running is free, but not everyone can be Usain Bolt'.

A bit elitist to be sure, but being a good developer requires a certain type of thinking that simply isn't common in the general population.


Completely agree. Most people are just not cut out to be developers. Not in a "you're not smart enough" sense, but in a "your mind does not operate this way" / "you won't enjoy the work" sense.


> There are limitless free resources available on the web such that pretty much anyone who wants to be a software developer, COULD be a software developer.

Mostly correct, with one objection: time is also a necessary resource for becoming a software developer. Some people who may have the talent never get the opportunity to learn, simply because they have had to hold a full-time job (or two) since they were 18 (after getting a shitty education before that). Most of us have enough free time to learn things (otherwise we would not have enough time to discuss on HN), but it is not universally true.

That said, although giving everyone a realistic opportunity to become a software developer would be fair to the unlucky individuals, I do not assume that it would change the balance in general.


I’d disagree — I think most folks are too busy and stressed to learn to code.

Access to quality, free information isn't enough, sadly. The average person the world over isn't in a position to make the jump due to economic factors rather than intellectual limits.


But there are many many people at least in the western world who absolutely would be in a position to make the jump. And yet they choose not to.


I'd argue that having the confidence and ambition to make the jump is also a function of privilege to some degree.


Well, you could argue that having any kind of talent or ability (inherent or not) is a privilege then.


> since developers are so valuable, shouldn't more and more people become developers until the cost for paying a developer goes way down?

There is a natural ceiling of the number of SWEs due to aptitude for programming. Some people simply don't have the grit, tenacity, persistence, or even intelligence for being able to learn to program, much less do it in a professional environment.

I've taught many people, and of those, perhaps 80% just can't handle it. They see the money, start trying to learn programming, and give up after a few lessons because they couldn't understand for-loops, for example, or they just get bored and conclude it's not worth the effort. I've seen this firsthand.
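
For a sense of scale, the kind of exercise in question is something like this (a hypothetical beginner problem, not from any particular curriculum):

    # Classic first-week exercise: sum the even numbers from 1 to 100.
    total = 0
    for n in range(1, 101):    # 1, 2, ..., 100
        if n % 2 == 0:         # keep only the even numbers
            total += n
    print(total)               # 2550

Trivial once it clicks, but juggling the loop variable, the bounds, and the accumulator at the same time is exactly where many people stall.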


I think math and programming are similar in terms of the logical thinking they demand. The hardest part isn't actually intelligence; it is tenacity and endurance. You must be prepared to sit a whole eight hours in front of a computer solving a nerve-wracking problem, no matter how trivial its end result is.


> we pay people the lowest amount we can that is enough to keep them from going to another job.

First off, what a terrible system. Why would we want to support that? Second, this clearly isn't happening when you look at those taking in 8+ figures.


If you have a better idea than supply and demand to determine price and hence allocation of resources, the world is all ears.

Until then, it will be like democracy. The worst system, except for all the others.


>If you have a better idea...

Start using supply and demand to determine the price of money. Supply and demand elsewhere is a sham if you set ceilings and floors on the interest rate.


The problem with this comment is that you're making it a binary option. That we keep the cronyism we have or nothing. It is like how people divide things into socialism vs capitalism. There's a lot of nuance and complexity here. Government regulation matters because we want high competition. There's ways to have supply and demand price determinations without them being wildly out of control and without plutocrats.

There's a continuous spectrum of things you can do, not discrete, and definitely not binary. So my suggestion, and I'm far from alone in this, is to rein in the plutocrats. No one cares about them while the average person's life is getting better, but boy do we care when that stops happening. Especially when we see their lives continue to get better while the quality of our lives decreases.


I do not see how

>Government regulation matters because we want high competition. There's ways to have supply and demand price determinations without them being wildly out of control and without plutocrats.

is mutually exclusive to

>we pay people the lowest amount we can that is enough to keep them from going to another job.

This system is the same as shopping at a store that sells you the same goods at a lower price than another store. I do not see what this has to do with a “plutocrat”. Buyers have been buying goods and services at the lowest prices they can find for at least thousands of years.


> mutually exclusive

Governments often set floors for wages. In fact, this is called the minimum wage.

> This system is the same as shopping at a store that sells you the same goods at a lower price than another store.

That's how it works in a perfect world, but not how it works in reality. In a perfect world, monopolies wouldn't be an issue. The reason they are an issue is that with a monopoly it is very easy to manipulate the market that you control. The reason we generally break up monopolies is that they are actively manipulating said market and stifling innovation. We tend not to go after monopolies that aren't doing this.

> I do not see what this has to do with a “plutocrat”.

A plutocrat is someone whose power is derived from wealth. A plutocracy is a form of government where that power, derived from wealth, is instrumental to political power. You may think that this is universal to all governmental systems, but it isn't. Often we see the reverse (wealth derived from political power rather than political power derived from wealth).

Anti-trust laws, IP laws, the SEC, etc are all designed to prevent this from happening. I think the problem you're running into is that you're taking perfect models (spherical cow in a vacuum) and assuming that they work identically in the real world. This is naive just as it is naive to assume a spherical cow in a vacuum falls out of a plane at the same rate as a living cow. There's an approximation and you can get some useful planning from the simple model, but it is going to have significant errors in the real world experiment.

What it comes down to is a lack of sufficient complexity and nuance in the model you're using. An econ 101 course or even a B.A. in economics will not be enough to sufficiently understand the economy in high detail. It is really easy to overestimate the utility of our knowledge and mental models because we tend to not measure likelihood of events based on random samples but instead have a large selection bias (getting random samples is also quite difficult).


I agree with all of what you wrote, but I still do not understand what that has to do with

>we pay people the lowest amount we can that is enough to keep them from going to another job.

Minimum wages, labor laws, safety standards, anti trust laws, and all of that can exist, but I am still going to expect buyers to pay the least a seller is willing to accept. And conversely, a seller is going to demand the most a buyer is willing to pay.


> Minimum wages, labor laws, safety standards, anti trust laws, and all of that can exist, but I am still going to expect buyers to pay the least a seller is willing to accept. And conversely, a seller is going to demand the most a buyer is willing to pay.

What you're not getting is that this is theory, but not the world in practice.


I am not making a moral argument, just explaining how things work currently.


A lot of potential developers opt out of CS for social reasons. Large swaths of women won't want to do a 4 year major with a 5-9 to 1 ratio of men to women, especially when the men skew so unkempt. This is just one example.

In CS, you trade social skills and social value to escape the worst of capital exploitation. Go to a bar, tell someone you make 300K, and watch their face the moment you mention the tech worker part.

Where and how you got your money matters. People don't respect us computer touchers.


Luckily tech is one of the few fields where having the corresponding degree is not necessary and you can make a good life for yourself without following a traditional path to upper middle class (like law, medicine, finance etc)

> go to a bar, tell someone you make 300K

you must be having way different bar conversations than I've ever had - maybe they just don't like you as a person lol


I have no idea where you live, but in my 15 years as a professional software developer, I have never experienced a single instance of someone not respecting my career.


In the UK it isn't respected at all. It's about the same as being a taxi driver.


Well, reporters do disrespect us.

When I had TV appearances they called me a "Pulli Traeger" (German for someone wearing a casual outfit). In public appearances you are respected for how you look, not for what you can do. Experts are frequently looked down on, as they can rarely express themselves to a TV audience. On smaller private TV channels it was fine though.

Also upper management in Europe used to disrespect us as overpaid "coders" or code monkeys.


Perhaps you get no extra respect for being a programmer, but you also do not get extra disrespect. People who disrespect a bad looking programmer would also disrespect a bad looking non-programmer. The upper management probably disrespects all their employees.


In KZ I feel like people think that I'm getting money unfairly.

Sometimes I think the same. I can make $2k per month sitting in an office, while a road worker wakes up at 3 AM and works 12 hours in rain and winter, trading his health for $300 per month.

Anyway, I avoid mentioning my profession unless asked directly. It's not a question of respect, but rather of envy, which I don't like to be the target of.


Agreed. I've been a software engineer for 20 years, and not once in my professional career have I felt like the industry was disrespected. This is in the United States.


I'm not sure it's so much "respecting the career", as the stereotype of the nerd with poor social skills is still strong enough to raise red flags for many. Not all of course, and it probably depends a lot on how you interact with others before they learn what you do. I can't say if that's fair or not, but it doesn't seem any less so than any of the other ways people choose partners or friends.


I love how you found a way to blame men for womens choices.

Those darned smelly CS grads with their non-matching clothes, if not for them more women would be in CS!


I urge you to go speak (in depth) with women who did CS and then worked 5+ years in tech.

IME doing this, many are happy with no regrets, but all have truly horrific stories to tell.


> all have truly horrific stories to tell.

Unless the situation is different for women in law, or women in medicine, it still does not explain the lack of women in CS.


My impression from people in medicine and law is that they are much better.


Except that's exactly what it is. BO in CS classrooms is the worst. Every other department had less of this problem at my large public university.


Not just the BO; a big part of it is the personalities. Aggressive know-it-all nerds are the worst. This is anecdotal, but my now-wife has always said she felt like she was taking a chance on me because of my job in software. And judging from the people I've worked with over the years, I have to agree. One of the worst things about software development is other software developers.


What's hilarious is what you're actually saying (and don't realize it).

You're describing many of those who are what we would call "non-normative", if you will: Asperger's, autism, and the like. Yes, it's statistically understood that those who choose this career are more likely to fall into that category.

So what's really being said here is that women are being chased out of a well-paying STEM field because they're too judgemental to deal with those who are non-normative.

Is that really the fault of men, or maybe we should blame these self-same non-normative folks?

----

Can you imagine if someone were to claim the reason so many men choose not to go into nursing is because of the smell that happens roughly once/month?

Think about why one is acceptable and the other is not.


What's hilarious is what you're actually saying (and don't realize it). I didn’t mention gender, you did. GP didn’t either.


Well, if we're trading anecdotal counterpoints, I would argue that software development has the smallest number of "Type A personalities", especially when compared to other engineering fields such as mechanical and electrical, just by virtue of the fact that it encompasses such a wide swath of careers.

For example, lots of people who fall under the umbrella of software development mostly just do web design and WordPress; they're about the least engineer-type people you can have.


This must be a parody. If this was really the only barrier to women being in CS then a university could just sanction individuals with heavy body odor and tell them to take a shower and voila you have your women in CS utopia.


This is a pretty cynical view of things, and definitely isn’t nearly as true anymore. Software engineers, depending on where in the country you are, are often extremely well respected now.


My university actually has a near 1:1 ratio in its CS course. It helps that we are in the middle of a large metropolis...


I am genuinely curious. Where and when did you go to school?


>>>> The marginal cost of software is (close enough to) zero

I call this "the magic of software." The reason is that it triggers magical thinking on the part of managers. But I think the magic can be fleeting, depending on the business use for the software, for instance:

* Software that's used internally within a business has a ceiling on volume.

* Platform obsolescence sets a ceiling on the volume that can be sold before a product has to be maintained or re-written.

* A bigger user base is more diverse, thus involving more "difficult" users, exposing more bugs, and discovering hidden features that become future maintenance requirements according to Hyrum's Law.

* A software product that's attached to hardware sales rides on the sales volume of the hardware.

* Higher volume attracts competitors, who force you to add features.

If any business could get rich cranking out copies of software, they wouldn't be prioritizing every other possible revenue model such as data harvesting, advertising, subscriptions, and so forth.


The way I look at it is that each piece of software, let's call it a feature, has a non-zero operating cost associated with it. Maybe that's a team of engineers who build and maintain it, a team of marketers/salespeople who get it out to customers, a team of designers/product managers who decide what should get built and how, and a team of customer support that then supports this feature. AWS bills, etc.

I think with good management, this feature can be sold at close enough to a marginal cost of zero. There are costs, yes; you can't just sell it to 500 users without support. But over time, as the feature matures, there should be less support and less engineering power required to power that feature. You should be working towards being able to sell this feature without incurring (much) additional cost per unit sold.

Now in reality, we rarely sell individual features, and with every additional feature we add, the overall complexity of the product increases, so development and support get more expensive.
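
As a rough sketch of that cost model (a minimal illustration with invented numbers; only the shape of the curve matters):

    # Hypothetical feature economics: a fixed yearly team/operating cost amortized
    # over customers, plus a small true per-customer cost.
    team_cost_per_year = 1_500_000   # engineers, PM, design, support (assumed)
    per_customer_cost = 2.00         # hosting + support load per customer (assumed)

    def cost_per_customer(customers: int) -> float:
        return team_cost_per_year / customers + per_customer_cost

    for n in (500, 5_000, 50_000):
        print(n, round(cost_per_customer(n), 2))
    # 500 -> 3002.0, 5000 -> 302.0, 50000 -> 32.0

The per-customer cost never literally reaches zero, but at volume it is dominated by the fixed team cost, which is the sense in which the marginal cost is "close enough" to zero.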


I can probably afford to pay 200% of what I currently pay to go out to dinner, but I don't because there is sufficient supply and competition. The lack of supply is why you are paid more, not because startups are rich.


> The lack of supply is why you are paid more, not because startups are rich.

It's both. If software companies were poorer across the board, without changing the supply/demand ratio, salaries would be lower.


If the supply/demand ratio stayed the same, salaries couldn’t go lower. But you are right that if software companies were poorer that there would be a necessary reduction in demand.


I’m not totally convinced. It seems that sometimes when profits fall across an entire industry, the best thing to do is to operate at the tighter margin instead of firing workers to maintain the previous margin. I concede that might be rare, but the restaurant industry comes to mind. If rice gets more expensive, you could fire workers, pass the increase to the customer, or eat the cost. Often the best choice is to concede and eat the cost and say goodbye to the “cheap rice bubble”.


I expect a lot of software companies don't make any money, only living on investment dollars in some hope of making money someday, so margins may not matter much.

But if the companies were to become sufficiently poorer, the amount they are able to pay has to come down at some point, and once that happens you've got a reduction in demand. That doesn't necessarily mean firing anyone; demand simply measures the desire and ability to pay a given price to attain what is wanted.


Silicon Valley can only exist because a software business can be located anywhere but investors and workers must utilize agglomeration.

What this means is that software companies are closer to the money than other businesses.


Investment and workers can be located anywhere. Even Silicon Valley companies list themselves on New York stock exchanges to access capital from around the entire country and the globe.

What's unique about Silicon Valley is the density of software-minded people willing to adopt products made by their friends. It sets up a "if you build it, they will come" situation. That promotes the early adoption needed to grow into something larger. If you were located in Iowa and your friend group was made up of farmers, they likely won't care about the software you are building, and you'll stall out quickly in trying to find users.

On the flip side, if you are building something that would be of interest to Iowan farmers, you will have a greater chance of success having that farmer friend group than a bunch of Silicon Valley tech bros to share with.


It is both. A buyer needs to have the funds to be able to pay.


While I agree with this in principle, it should be mentioned that a great deal of software, perhaps even the majority of software written depending on who you ask, is used internally and is never sold directly. In fact, there are accounting rules in the US that deal directly with internal use software that classifies it as a long-term tangible asset[0] (software is normally classified as an intangible asset).

So along with "selling it infinitely", you can also think of it in terms of using it "infinitely" even though it's considered a tangible asset. It's kind of weird when you think about it, like a piece of equipment that in theory never wears out. Obviously in real life that's not entirely true due to changing business rules and so forth.

[0]: http://files.fasab.gov/pdffiles/handbook_sffas_10.pdf


I think that is definitely true, but it doesn't explain the wild geographic variation in pay. Why are SV programmers paid maybe 5-10x what programmers in Europe are? They definitely aren't 5-10x better, and the cost of living isn't 5-10x as much.

I definitely think we'll see movement of work outside SV and outside the US to save costs. Musk could have made Twitter profitable simply by moving all engineering work outside the US.


This isn't really true in the sense you think it is because the corollary of the marginal cost of software being close to zero is that someone only needs to code a quality opensource competitor once to materially impair the value of some software.

What is really driving how well programmers are paid is that several companies (Google, Meta, Microsoft, Apple, et al.) managed to get big enough (probably temporarily) to gain vastly outsized profits from the mechanism you outlined above, and they all had a policy of drastically overpaying for talent in order to keep said talent from going to a startup that creates the next idea that destroys their outsized profit machine. This was supercharged by way-too-low interest rates driving up private valuations of companies that have no hope of actually achieving profits commensurate with those valuations; these companies were also desperately overpaying for talent in the hope that they could create the next big thing fast enough to catch up to their valuations. They weren't that worried about their burn rates because the low interest rates made it easy to raise more at very favorable terms at a higher valuation, so everyone was happy.

The latter trend is coming to a hard stop right now with higher interest rates (likely higher and for longer than most people think) and the former is also starting to change, so expect broad material weakness in programming salaries going forward.


Everything digital has marginal costs near 0. Books, TV, movies, art, newspapers, photos, etc. (obviously, talking about the digital version of all these). Yet a lot of these fields are known for low pay.


I don't think this fully explains the phenomenon, but I do think there are a couple of factors you're overlooking here:

1. Control over the "distribution channel" is a major factor at least part of the time. The fields most notorious for low pay (e.g., books, music, photography) are the ones where the producers of the content are disconnected from the customers and instead sell via a distributor. At least some of these middlemen are reaping the rewards of the low marginal cost (e.g. Spotify?), but I'm not sure if it applies to all of them (e.g., book publishers?). More to the point, software developers tend to have more direct control over the distribution channel (notably SaaS), and when they don't, the distributor tends to take a hefty chunk of the revenue (e.g. mobile app stores today, big-box retailers 20-30 years ago).

2. The "make it once, sell it infinitely" model doesn't really apply to newspapers (and maybe some of the other categories) in quite the same way. Yes, the marginal cost per view is essentially 0, so I can "sell" today's paper an infinite number of times. But the value of and interest in that content rapidly depreciates to 0 as soon as it becomes yesterday's news. The newspaper model isn't "write once, sell forever". You need to continue to "feed the beast". It's the ongoing production cost to continuously generate new content that dominates.

3. There may also be a correlation with "content" versus "services". Whether due to distribution mechanics or popular perception, it seems like application providers (e.g., Adobe, Microsoft) are able to enforce a pay-as-you-go subscription model but "pure" content providers are often trapped in a (lower-margin) advertising model because "information wants to be free" (or more to the point, is easily plagiarized by competitors or pirated by consumers). Digital or not, most of the "low pay" examples you cite fall into the content rather than service category.


I agree. Also, observe that companies that employ programmers for internal use (not part of the product) don’t pay nearly as much.


Add to that something like "the fog of development," namely that, in the beginning, software can be sold as "magic" but one can expect that this is likely to have a strong tendency to drop off as open source etc. alternatives become more widely known, especially by "regular" users.


Not only is the work extraordinarily financially valuable in many cases, software is actually pretty hard to do _well_. It's based on highly specialized knowledge and experience. It requires sustained, high levels of discipline and effort. And it is often tedious and/or frustrating work in practice.

It's a corollary to the Dunning-Kruger effect: don't assume programming is easy for _everyone_ just because you happen to be good at it.

I think "Chapter 1: The Tar Pit" of the venerable _Mythical Man Month_ (https://archive.org/details/MythicalManMonth) speaks to this in terms that are still applicable even after half a century. Here are some of the challenges Brooks describes:

1. "Most have emerged with running systems — few have met goals, schedules, and budgets. Large and small, massive or wiry, team after team has become entangled in the tar. No one thing seems to cause the difficulty — any particular paw can be pulled away. But the accumulation of simultaneous and interacting factors brings slower and slower motion. Everyone seems to have been surprised by the stickiness of the problem"

2. "One must perform perfectly. [...] If one character, one pause, of the incantation is not strictly in proper form, the magic doesn't work. Human beings are not accustomed to being perfect, and few areas of human activity demand it."

3. "Other people set one's objectives, provide one's resources, and furnish one's information. One rarely controls the circumstances of his work, or even its goal. In management terms, one's authority is not sufficient for his responsibility."

4. "The dependence upon others has a particular case that is especially painful [...] He depends upon other people's programs. These are often maldesigned, poorly implemented, incompletely delivered, and poorly documented. So he must spend hours studying and fixing things that in an ideal world would be complete, available, and usable."

5. "[D]esigning grand concepts is fun; finding nitty little bugs is just work. With any creative activity comes dreary hours of tedious, painstaking labor [...] [D]ebugging has a linear convergence, or worse, where one somehow expects a quadratic sort of approach to the end. So testing drags on and on, the last difficult bugs taking more time to find than the first."

6. "The last woe, and sometimes the last straw, is that the product over which one has labored so long appears to be obsolete upon (or before) completion. Already colleagues and competitors are in hot pursuit of new and better ideas. Already the displacement of one's thought-child is not only conceived, but scheduled."

I think it's notable that even with five decades of global-scale investment in this field we haven't really _fixed_ any of these challenges. Our tools and practices have improved, certainly, but at best in proportion to the increasing demands of scope/scale/complexity. Context-aware, cloud-connected, toolchain-integrated IDEs with features like programmatic refactoring and REPL-like interactivity (or whatever other modern tooling you'd like to point to) are a giant step up in both usability and productivity compared to the punch-card batch-processing workflow that Brooks based his observations on. But I suspect those "woes" still resonate with most people involved with software development today. Projects still regularly fail to hit scope, schedule or budget. Small, obscure errors still sometimes lead to catastrophic failures. Reliability continues to be expensive to achieve and is often fleeting.

The global system of systems that comprise the modern "technology infrastructure" is an ever-expanding, ever-shifting, frequently-unreliable and increasingly-critical house of cards. It's almost certainly the most sophisticated and complex artifact that humanity has ever produced. It's easy for those of us with significant interest, aptitude or experience with it to forget that. But this shit is _hard_. It's a wonder that it works at all.

Toward that end, while you can wave your hands around "how [little] work it takes to become [a programmer]", the fact is it takes a substantial amount of time and effort (and a moderately uncommon degree of aptitude) to become a competent one.

Are some software engineers over-compensated? Probably. But the difference between software engineering and medicine (for example) has less to do with the degree of specialized knowledge or level-of-effort required and more to do with credentialing and our ability to evaluate competence. You can take a 6-week "coding bootcamp" and call yourself a "programmer". You can't take a 6-week first-aid course and call yourself a "doctor". But they are both pretty far removed (in terms of experience and learning) from even a competent "journeyman" in their respective fields (say, a "principal engineer" and "resident", respectively - i.e., someone you could trust to take on a non-trivial level of critical responsibility with relatively little oversight).

To be fair the stakes are generally higher for a typical doctor than for even the most "senior" software engineers, but there's plenty of literally life-or-death software applications, both direct (e.g., medical diagnostics, 911 calling infrastructure, GPS, computer vision) and indirect (even something as mundane as network availability or the reliability of data search and retrieval). Not to mention the enormous number of systems for which failures in delivery or at run-time can have a substantial financial impact. I can't really speak to relative compensation: I imagine many doctors should probably be paid more and many programmers could probably be paid less (and the med-school/residency pipeline should almost certainly have a more sustainable work/life balance). But it's foolish to think that the upper-ish tier of software engineers (say, 40th+ percentile maybe?) doesn't represent a skilled, specialized, and therefore valuable labor force.

It's no surprise that there's a large number of highly compensated software engineers. There's a lot of extraordinarily valuable (from a financial perspective) work to be done and it takes substantial knowledge, experience, and aptitude to be able to do it well.


I wish articles like this would reference the antitrust lawsuit over collusion to artificially keep software engineer salaries low in the 2000s.

https://www.theguardian.com/technology/2014/apr/24/apple-goo...

I also don't see any references to scale. Your labor can impact the lives of x people as a lawyer, where you're constrained by the amount of hours in a day and geographical location. As a software engineer, your labor can easily impact x^6, or even more if you work for a technology market leader.


What I find weird about these conversations is that they never discuss what the higher-ups and the companies are making. If CEOs were taking cuts and big corps were losing money, then yes, this conversation about paying employees less would make sense. But when CEOs are still making record incomes (at least according to Forbes) and companies are increasing (or holding constant) their valuations (barring some noise), it doesn't make sense. It just comes off as "We're going to take a bigger part of the cake now and you're going to like it." But for some reason the conversation is about who deserves more of a shrinking slice of a cake that's growing. Something's not right here.


FWIW, a lawyer can achieve scale in some circumstances. I paid $52 for an “NFA Gun Trust” from National Gun Trusts. It’s literally just a templatized PDF with “Settlor Name Here” and “Trust Name There” across a few pages.


I can't understand why the above would be down-voted...

There exist fields of law where the income-earning capacity goes beyond a function of (HOURLY_RATE * HOURS_WORKED) and can approach something more like a SaaS function. Another example is all those drive-time radio ads you hear: these lawyers often do not work in your area or state but act as a "Name you can trust" and refer cases on to local attorneys of variable quality. Franchise operations... not all law is a billable hour.
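
To make the contrast concrete, here is a toy comparison of the two income functions being described (the rates and volumes are invented for illustration):

    # Billable-hours income is capped by hours worked; product-like income scales
    # with the number of customers served.
    def billable(hourly_rate: float, hours_worked: float) -> float:
        return hourly_rate * hours_worked

    def product_like(price: float, customers: int, marginal_cost: float = 0.0) -> float:
        return (price - marginal_cost) * customers

    print(billable(400, 2_000))        # ~2,000 billable hours/year -> 800,000
    print(product_like(52, 100_000))   # a $52 template sold 100k times -> 5,200,000

The hourly function has a hard ceiling set by the hours in a year; the product-like function does not.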


> Programmers are paid surprisingly well given how much work it is to become one

This is really only true in parts of the US and within specific domains of programming. Parts of the US and domains that Jeff has continually worked in. Anecdotally, when I was working as a programmer in Dallas my take home wasn't near what it is in the Valley, but I do the exact same thing out here and make 3-4x what I did there.

There's some class bias going on, in that companies in the Valley somehow justify subsidizing home prices with salary and stock despite Valley programmers being no better than any other programmers. Some other part of it has to do with companies' inability to embrace remote work, and with satellite offices being expensive to build and expensive to coordinate work across.

There's also some unfairness that has to do with intentionally putting lots of programmers in one spot and having a lower order of them that can accomplish the hard things. That's to say, not all programming jobs are overpaid, but if you got paid $600k+ (I know, an arbitrary number), then yeah, you probably shouldn't plan for the skies to be so sunny forever with the particular domain that you got that pay in.

> Specifically, I'd recommend living on a small portion of your income and saving a multiple of your living expenses.

This I think is just very good general life advice. Never plan for eternal sunny skies.


> There's some class-bias going on, in that companies in the Valley somehow justify subsidizing home prices with salary and stock despite Valley programmers being no better than any other programmer.

This isn’t true. My first job out of college was working for an old tech company that had decades of experience writing decent software. The vast majority of programmers there were mediocre at best. There were a few stars that were pulling most of the weight of the teams I knew about. The original programmers that wrote the older software had long since moved on.

At my job now, working for a similarly sized company whose headquarters are in the Valley, the average programmer that I work with is quite good. They are much better than programmers I’ve worked with in other parts of the country over the last several decades. The few exceptions now also work at the company I’m at or our competitors. This idea that all programmers are equally good is nonsense. Can you find good programmers outside of Silicon Valley? Yes, it’s possible, but it’s much harder than in the Valley. Talent flocks there.


Because it's about the type of company rather than the size of the company.

A tech-oriented company is going to avoid mediocre developers, a non tech-oriented company just wants to get something delivered regardless of quality.


I don't think what you're saying is true. I worked with some really skilled programmers where I'm from. I've never seen a skill disparity between where I'm from and the Valley. There is a massive, unexplainable pay disparity though.


Yup, and it's probably convex - something like 1 + 1 + 3 = 5, but 3 + 3 + 3 > 12. That would also explain concentration of tech in a single area, or a few areas of the world.
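
One toy way to read that convexity claim is to model team output as growing faster than linearly in individual skill (a sketch with an entirely arbitrary exponent; the exact numbers will not match the comment, only the shape does):

    # Toy convex output model: output grows superlinearly in skill.
    def team_output(skills, exponent=1.3):
        return sum(s ** exponent for s in skills)

    print(round(team_output([1, 1, 3]), 1))   # ~6.2
    print(round(team_output([3, 3, 3]), 1))   # ~12.5, clearing the "> 12" bar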


It makes the most sense to me as multiple distinct labor markets. It’s easy to find skilled programmers in the Silicon Valley labor market, so that’s where a lot of hiring happens even if the prices are higher. Equally talented programmers can be found in other markets, but it’s more difficult to locate them.

Part of why it persists is because the gravitational pull of Silicon Valley became self sustaining. Programmers move to that market because that is where the hiring happens. The hiring happens there because all the programmers move there.

I don’t live there, but that’s how I make sense of it all. It’s the same reason I shop at Target instead of Amazon despite higher prices.


> Specifically, I'd recommend living on a small portion of your income and saving a multiple of your living expenses.

General life advice? This implies everybody should save at least 66.6% of their post-tax salary.


The idea is to have, e.g. three to six months of living expenses held in reserve cash. Once you’ve built the emergency fund, spend all you want. Ladder your emergency fund into I-bonds or something if worried about it eroding from inflation.

It’s very solid general life advice. Prepare.


Well, and save for retirement.


Child support for a single child alone, which is what society thinks a single child needs to live, is about 30% of salary post tax. So you'd be spending 2-3% of your salary on yourself if you had even a single child. Sounds pretty rough.
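
For what it's worth, the arithmetic behind that works out as claimed if you take the thread's own numbers at face value (these percentages come from the comments above, not from any authoritative source):

    # Save ~2/3 of post-tax pay (per the "66.6%" reading above), minus ~30% child support.
    post_tax = 1.00
    saved = 2 / 3
    child_support = 0.30
    left_for_self = post_tax - saved - child_support
    print(f"{left_for_self:.1%}")   # ~3.3%, roughly the 2-3% claimed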


It's pretty weird that "what a child needs to live" is determined as a function of the parent's income. As if kids of rich people needed more money than kids of poor people to survive.


> This is really only true in parts of the US and within specific domains of programming. Parts of the US and domains that Jeff has continually worked in. Anecdotally, when I was working as a programmer in Dallas my take home wasn't near what it is in the Valley, but I do the exact same thing out here and make 3-4x what I did there.

But isn't it only a matter of time until this changes due to the increase in remote work? Either high-paying markets will decrease pay or everywhere else will have to raise it. It is incredibly weird to look for a job and have the option to work remotely for a San Francisco based tech company for literally twice the salary I have seen offered locally for essentially the same work.


I doubt it. Getting a job at those Bay Area companies is still harder. Like it's widely accepted you'll be studying leetcode for a while and know your data structures & algorithms like the back of your hand and be able to solve any leetcode problem impromptu within 30 minutes on a whiteboard.

It costs a lot more to run a company headquarters in the bay area than most other places. That alone is a bit of a flex. It's like how NYC banking firms pay better than banking firms elsewhere. Being able to have a HQ in an expensive city is a flex and you have higher standards because of it.


One of the things that I think is unique about programming is unlike being a lawyer or a doctor, the difficulty of programming is not "front loaded" and lies beneath the surface, not easily visible to the average person.

It's easy to see why a doctor gets paid so much after you see the brutal hours they put into residency; the difficulty of becoming a doctor is out there for everyone to see in the form of med school and residency programs. It's different with programming because, on the surface, it just looks like guys playing on the computer all day; to the average onlooker it doesn't seem that difficult.

But most of that difficulty is hidden under the surface. It might be easy to become a programmer, but getting to the level where you are a Senior SWE making what doctors make will take you quite some time and effort.

While I wouldn't put programming on the same difficulty scale as becoming a doctor, I think it would be fair to say that becoming a Senior SWE is about as hard (in terms of years of experience required) as getting through law school.


Not to mention that, with exceptions, a good programmer in my particular niche is someone who probably spent YEARS fucking around with embedded devices and Linux machines. Most of the hardcore nerdy types had at least 10 years of experience with the technology they're working on before they even graduated college.

It's not impossible to do the same in medicine or law....


> unlike being a lawyer or a doctor, the difficulty of programming is not "front loaded" and lies beneath the surface, not easily visible to the average person.

I think this is very much like those professions, especially law. Lots of people think they can be their own medical professional or lawyer, to inconsistent (at best) results.


As the article said it's a cartel limiting entry into medicine. The cartel places an artificial barrier in terms of a giant hurdle requiring uncompromising quality. Much like the FAANG interview process limiting who they hire, (and thereby inflicting the need to pay high salaries upon themselves).

Just south of the border doctors make a fraction as much. Perhaps the quality can't compete in certain areas like the ICU, but I'd also bet their care is fine for most things.

But overall there's no cartel that can stop salary arbitrage for programming. Variants of IT/Engineering are most of the top majors in India, and they don't need to immigrate here to be hired, though whatever cartels do exist in tech are working hard to help programmers do so.


Doctors and lawyers have to sacrifice more for similar pay because their work doesn’t scale. A lawyer and doctor serves one customer at a time.

A programmer at your average SaaS app produces work that serves many, many customers at once. And for long after they performed the labor.

This effect is also very apparent when you look at the avg cost of goods and services by category over time. Things that don’t scale — legal, healthcare, education — have skyrocketed in price relative to everything else.

So long as this state of affairs doesn’t change, programmers and anyone else whose work can be replicated and distributed effectively infinitely for no cost will continue to make more, more easily.


Then why are journalists not paid well?


Unlike programmers, their work doesn't generate revenues as easily, otherwise every blogger would be rich.


^ this is the best explanation I've seen so far in the thread.

Software translates directly into money. The company I work for makes physical devices - they have all kinds of engineers, but what we're making goes directly into making the company money. Like I can straight up say, they make millions off the stuff we produce for them.

If your work makes the company a lot of money, it's not frivolous to ask for a small chunk of it lol.


one could also say that tech companies get an unjustified amount of the money that should go to the journalists through their capture of ad networks.


It's hard to say whether it _should_ go to journalists. Even before the digital era, newspapers largely made money through advertisements, not subscriptions. They were monetizing eyeballs just as much as the Facebooks or Googles of the world do. In their case they brought in the eyeballs through their content (whether responsible journalism or tabloid-trash) and monetized them through also showing ads to the same eyeballs.

The difference is that journalism no longer has a pseudo-monopoly on the kinds of things they historically did (content, distribution, eyeballs etc.).

My grandfather read the whole newspaper every morning. In one day I think I read a LOT more content than him but it is spread over a wide variety of surfaces, print, websites (no single author) etc.


And we’re arguably far more tolerant of broken journalism than we are of broken code.


Maybe we could try labeling broken code "opinion".


They generate value to society, not profit. Any amount of profit extraction comes at the expense of value creation.


Collusion. Good luck breaking that up, though.


A lucky lawyer can serve a large group in a class action suit. Or serve one client, but one that is a huge corporation.


Programmers are incredibly productive. An hour of a doctor's time, an hour of a lawyer's time, or an hour of an investment banker's time only helps one person. An hour of a programmer's time impacts tens of thousands of people.

Programming at the high level that gets good money is hard. It's hard to reach that level; it takes lots of time learning, reading and practicing, and the work itself is hard. Also, many programmers work more than 40-hour weeks.

If a good programmer helps a company earn a few million per year, or helps it save a few million per year, then he deserves a share of that money.

The demand is larger than the supply and it is likely to stay so since all sectors of the economy demand more automation and not many people like to code or are willing to become programmers.

The "winter of programming" might never come. Even if it comes it is likely that won't happen during our lifetimes, even if we live a lot.

People said a long time ago that software was a fad or a bubble that would burst. Still, software has continued to rise year after year.


> Programmers are incredibly productive.

Some developers are. Most aren't. If you work in a startup, or on some new project within some big company, you are very likely to be producing vaporware that will impact zero users, either because the software is bad or because of some issues in marketing or management beyond your control. In either case, your real economic productivity is negative.


This argument rings hollow when you look a level down.

You couldn't do your work without your keyboard. Why doesn't your keyboard's designer deserve a share, too? What about the EE who designed the CPU in your machine? The physicist who came up with the process to manufacture the CPU? The mechanical engineer who built the machine that implements that process? The chemist who invented the solder that connects it all together?

Every single one of those people gets paid less than the programmer, their work affects millions, and without them, the programmer could not affect the same number of people or indeed could not work at all. So why should the programmer get all the cash?

(This is kind of a strawman argument, and is meant to be a little provocative. There are some pretty obvious holes in it. But I do think it is a real argument and you cannot dismiss it entirely. There is something here.)


A 16-year-old child can work in a factory putting a keyboard together. Face it, there is a pyramid, and some jobs require the smartest people humanity has, and that job is programming. Computer science is becoming established and is the most important area in the progress of civilization now. All other human work exists only to support us programmers at a species level of thinking, and we make the real contributions that matter, that live on for centuries. The doctors exist to keep programmers, and all the people below us that we depend on for support, healthy. The farmers exist to keep programmers fed. But have you wondered why we programmers exist? We exist to automate and manage all these cattle below us and direct humanity's endeavors with the systems we imagine and bring into reality.


Although a lot of people have made fun of your post, and it sounds a bit funny in a macabre way, I actually think it has more than a ring of truth to it. Programmers are essentially doing this, except that I would say that the people on top are the CEOs that know how to put a team of programmers together. But regardless, I think your description is actually eerily accurate because people are so dependent on technology, that they basically work like crazy to afford it and be immersed in it.


Excuse me, what? This is extreme even for sarcasm


This is a bad troll post.


Lmaooooooooo


Software developers don't need expensive hardware most of the time. They don't have to pay the cost of borrowed capital unlike a factory that costs millions and takes years to pay off the business loans.

The companies producing chips do make a lot of money but nothing compared to e.g. a $5000 lathe that only does one job.


I say this as someone who has cultivated a very low cost of living (very nearly 0.)

For every 100k I have been paid by organizations to create & program out marketing automation code I have made those organizations 1mil in recurring revenue.

It's no mystery. Code is compounding. Why do authors and musicians with a proven financial impact get huge advances? Until any one company has a monopoly on a given business niche/vertical/what-have-you, your code has compounding value.

If anything engineers need to band together as small groups and start cloning out business models for any company with a relatively traditional business model that tries to have a time wasting faang level interviewing process. The moment the salaries drop there will be an army of software devs moving back in with family and cloning any business they can aquire basic knowledge of...


Curious about your cost-of-living statement being zero. How did you do that? What kind of lifestyle is that?


I have several properties. The lot with my primary residence has another structure on it that is rented out, so that property is almost net neutral (barring the cost of my own utilities). The other properties are rented and net positive.


So technically your cost of living isn't really zero; you just have plenty of passive income that covers it.


Yes, that would be technically correct. Which is the best form of correct :-). I guess a more accurate way of saying this would be that I have a 0-burn-rate life based on owning a single modest property I share with others (the other properties don't factor into that equation), by embracing a more communal way of living that, in N.A., is generally looked down on for independent adults.

I think a lot of people with a single-family lot don't want to share a house or share their property. With a slightly more community-based approach you can get your burn rate to 0 or even into negative numbers, which opens up all kinds of flexibility. After the age of 15 I didn't have a familial safety net, so I effectively built my own through modest and shared living long after the age at which it's frowned upon culturally.


Relatively easy if you're single and employed. Buy a van, park it in the company parking lot. Use the kitchen at work, drive to a remote area during the weekend and use a solar shower, etc. Might spend $300-500 a month that way.


That’s cool if you want people knowing you live in a van in the company parking lot… that would seriously bust your reputation.


It’s also an easy way to ensure you remain single. Self-reinforcing.

“Remote area” = down by the river.


It's certainly possible that you're just more frugal than most, but I'm not sure if those numbers add up right. Even if you (1) ignore the cost to purchase and furnish the van in the first place and (2) assume you obtain 100% of your meals for free via "the kitchen at work", ~$400/month seems very low.

Even with an incredibly spartan lifestyle, you'll have maintenance, insurance, inspection and registration fees on the van; health insurance; fuel costs (power when parked, gas when moving); phone service; gym membership (showers, bathroom); hygiene supplies/toiletries; laundry; clothing; household consumables; etc. $100/week seems like a stretch unless you're compromising on personal welfare and/or safety.

At some point you might as well just do away with the van and sleep in the office.


> Might spend $300-500 a month that way.

$6,000 per year is a long way from zero. That's ~20% of the median income.


[flagged]


Fortran 77 programmers live with the desiccated corpses of parents who died decades ago, while collecting their pensions.


Ah yes, I'm going to make software for trillion-dollar corporations with record profits and record salaries for CEOs and fat cats, and take a pay cut because "I" am being overpaid.

Five of the top 10 companies with the highest profits are tech; most of the others are banks (which rely heavily on tech). What a joke. This feels like propaganda to make me happy to just get paid while the fat cats still get fat.


Almost all of the highly profitable companies have one monopoly product and a massive associated bureaucracy doing nothing...and that is a handful of companies. The rest are setting fire to money. Google is the worst example: they have one product that probably accounts for 150-200% of profit, 99% of staff do nothing, and they believe they have a right to harvest income from a search product they don't work on.

The economics of this system make no sense at all. Part of it is VC culture, part of it is massive corporate governance breaches, part of it is QE...it is just multiple things that defy logic. I don't think anyone should be surprised that shareholders ask questions when they see individual companies with thousands of staff who do nothing; that is normal. The bezzle of 2021 was the exception.


I had the same thoughts recently in regard to the Twitter layoffs. I mean, Twitter still looks like a pretty trivial product that some nerds could hack together in a garage. So why more than 7,000 employees? What were all these people doing?


I have some unpopular opinions about developers, and I'm likely to be downvoted on this site for them, but anyway:

* Most developers have negative productivity: they cost the company more money than the company derives from their work

* Thanks to the bubble in the tech sector and lavish startup funding, a lot of developers are producing software that brings in no value

* Thanks to the bubble in the tech sector, and great pay, a lot of developers have distorted ideas about the value they bring to their companies

* Thanks to the bubble in the tech sector, a lot of developers overestimate their productivity and importance

* Even if you work really hard, and have great skills, it does not mean you are producing something valuable and demanded by the market

* If your company makes more money from your work than it pays you, your job is likely safe

* If the company does not make money from your work, you should be thinking about plan B

* Sometimes it is very difficult to estimate the impact of your individual work on company performance. Your job is not safe, and you should be thinking about plan B even if you are doing something really important.

In conclusion: always be thinking about the real $$$ value your employer derives from your work. If you are paid a lot only because there is a shortage of developers, you might be in a bubble, which can pop.


A lot of these points would be right if it wasn’t for the fact that software is so damn valuable.

In investment banking, for instance, you are lucky to get 10 lines of code into production per week, but those lines of code can soon add up to replace a trader making $1 million a year or support $X million in electronic trading commissions.

In Silicon Valley the salaries are breathtaking, but how many $billion and soon $trillion companies are being minted as a result of their efforts?

I think that’s the dynamic going on. Many developers are average and overpaid, but they are a scarce resource and look at what they can create in the right circumstances.


Some software brings in a lot of value, that's true, as evidenced by all the wealthy tech companies out there.

But I think I wouldn't be too far off if I estimated that only about 0.1% of developers create software that results in seven-figure (or more) economic gains, 9.9% have moderate productivity that pays back their salary with some surplus, and the other 90% of developers actually have negative productivity and are only paid well thanks to the success of the first 0.1%.


I expect similar numbers are found throughout the entire economy. Most businesses fail. The trouble is that we don't have a good sense of who the winners will be until after the costs are sunk, and since software has one of the highest potential upsides, we're willing to sink more into trying.


Let's say you have software that sells for $80 a month; you only have to sell 200 subscriptions to cover a $200k salary. Let's deduct some expenses and say $130k.

The question is, can a developer with 2000 hours of time to spend working create a feature or improve the existing software to convert an additional 200 customers? That is 10 hours per customer. The software developer could interview every single customer for 5 hours and gather and analyze the requirements and still have a thousand hours left to actually do his work.
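Spelling out that arithmetic (a minimal sketch using only the numbers already quoted above; the split between gross revenue, expenses, and salary is the rough one given in the comment):

    # Revenue and effort per customer, using the figures quoted above.
    price_per_month = 80            # subscription price
    subscribers_needed = 200        # new subscriptions attributed to one developer
    annual_revenue = price_per_month * 12 * subscribers_needed   # $192,000/year gross
    salary_after_expenses = 130_000                              # the "$200k minus expenses" figure

    dev_hours_per_year = 2000
    hours_per_subscriber = dev_hours_per_year / subscribers_needed
    print(annual_revenue, hours_per_subscriber)   # 192000 10.0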

If anything, the moment there is a robotics revolution the value of software development will go up even further. There will be plenty of demand to spend on developing a harvesting program for a specific crop for example.


How much developer time do you need to invest to create software that someone will be willing to buy for $80/month? What are the chances that nobody will buy it for whatever reason (wrong idea, bad marketing, bad quality, bad design, better competing product beat you to market, ...) ?


All measured data points suggest developers should expect wages to remain high.

Wage growth continues to rise: https://www.atlantafed.org/chcs/wage-growth-tracker
Unemployment is still low: https://tradingeconomics.com/united-states/unemployment-rate

While it's true that many tech companies are doing layoffs right now or having some form of a hiring freeze in place, it is likely to be temporary as the excess hiring from the past two years unwinds a bit. Many public tech companies with high P/E ratios are just finally having to face the reality that profitable business models must emerge from high growth markets eventually.

In general the US economy is still doing well, despite persistent inflation, and historically, high-demand jobs that require a college education have weathered negative economic trends better than most professions.

If you are an experienced, competent software engineer, I would expect there will be more jobs than candidates for quite some time, and the career will continue to be quite lucrative.


I suspect in 5 years we will look back and see a trend that all of these tech layoffs supercharged the current crop of tech startups. Seed and series A stage companies have been starved of talent for a long time now.


Thing is, I don't really support the idea that the supply of good, functional, productive devs has increased that much. I'm over here throwing out more resumes and giving more no-gos after interviews than ever.

It's apparently pretty difficult to be a good developer. Barrier to entry is low, but just because you can tap some code out doesn't make you competent.

My friend is still telling the story from two years ago when he went to interview some devs and even though their resumes looked good they couldn't even write a for loop to parse some lines from a file.

I'm as of this moment pretty secure that my pay will remain high for a while.

Edit: I want to clarify that the low barrier to entry is good (we do actually need more devs), but plenty of people don't take advantage of the tons of resources out there (online tutorials, bootcamps, hacker communities, etc.) to skill up and become productive, and then expect to make lots of the monies.


> What makes me nervous, though, is that we don't really understand why programmers are paid this well

Are you kidding? Computer programmers have extremely high burnout and surprisingly low job satisfaction. People are eager to rush into the field, but they slowly realize how tiring the work becomes. I chose industrial engineering in college because my computer science friends would constantly tell me about just how much programming they had to do as part of their internships. I can't even recall a person who said they loved programming for 8 hours a day and wanted to do it for the rest of their life. The pay is deserved. Case closed.


Do you have data to back this up? It seems like an absurd claim based on the developers I know.


As anecdotal evidence: my wife told me that she first considered my complaints about work a red flag, but that happened when I was the only programmer she knew. As her social circle expanded to contain more programmers, she realized that all of them, without exception, were complaining about their jobs.

(To compare, the people happy about their jobs are mostly teachers. Possibly a selection effect, because teachers who are not happy about their jobs simply quit; they can get a better-paying job anywhere.)

The part about "programming for 8 hours a day" only makes sense to me if I interpret it as doing the stuff the programmers do at their jobs: the dark Scrum, Jira tickets, endless meetings, open spaces, etc. Most programmers hate that with a passion; that is why many of us were so happy about Covid allowing us to work from home. Actual coding is quite fun, if it comes without the baggage.


Have people ever stopped saying this, since the first tech bubble?


Nope. But I suppose if you say it enough, people think we'll believe it. Not sure what it is today, but a company like Microsoft was making somewhere around a million dollars a year per employee, and obviously those aren't all software engineers, but it put into stark relief how much those software engineers are actually making vs. the value they create.


Tech bloggers and pundits have predicted 12 of the last 3 bubbles.


I think this is the year this finally comes to pass. The US demand for devs has grown so much faster than our own ability to supply that demand, even with bootcamps and exploding CS dept sizes, that this hasn't been an issue. But if that demand is met somehow, things will change. 10 years ago Asia tried to supply that demand but for a variety of reasons it didn't really work. 5 years ago it was eastern Europe and it worked a lot better, but still companies that wanted the best team environment possible wanted people in the US and since many had the money to pay for it, demand remained above supply.

Now South America is taking its turn, and I think they can actually finally pull off being an outsourcing or remote-office option that has very few downsides and the enormous upside of being 1/5th the cost. Combine that with everyone going remote, removing the last of the two big differentiators between US and offshore employees (timezone and being on site), and I think in the midst of these layoffs we're going to see a shift of a lot of that work to SA.

Unless demand continues to explode, outstripping even the supply of SA, I think we should all expect lower wages and fewer jobs in the coming years.


I used to be in the "let's outsource it" camp, and I found out it does not work. The code you get is mostly useless and needs to be integrated into the whole solution, and you are potentially letting others steal your IP and sell it to somebody else.


In all the anecdotal reports of outsourcing 'not working', I notice that people think of outsourcing as contracting a middleman foreign company rather than hiring individual remote developers. That company then hires locally. Then they wonder why it doesn't work. That's because no good local dev wants to work for an outsourcing company like that.


Exactly this. I had a few Indian colleagues, contracted individually, and they were very smart and capable and a pleasure to work with.

I also had an entire team contracted through a middleman Indian company, and they were barely competent. First they impressed us with an awesome demo (an animated login screen and some generic dialogs... which in hindsight they probably copied from some other project and reused to impress all customers), but afterwards all requests took a lot of time to complete, and the things they delivered were often not the things we had actually requested. We assumed this was an honest misunderstanding and thought "we can clarify the misunderstandings and let them do it over, and it will still be much cheaper than hiring a local team for the same job", but one year later we realized we still had nothing useful, and if we had hired a local team instead, they could have had the same thing done in a month or two, so it would even have been cheaper that way.

But this was not about India; it was about using the middleman. In a different company we hired a local team using a middleman, and it also turned out to be a disaster. Not as horrible as with the Indian middleman, though; it is probably easier to be completely shameless if you never have to meet the people in person.

I guess the lesson is that you can outsource programming, but you can't outsource hiring programmers.


Just because it didn't work for you once doesn't mean it can't work. I know of many US companies using Eastern European and South American people very successfully and getting high-quality contributions.


It's possible, certainly. I started my path to this career back in 1998, and dropped out of college because I was not prepared... it kinda worked out well because the tech bubble popped and I would've had a bad time around the time I would've graduated.

When I went back to school sometime later, the job market had recovered a good bit.

The whole time I had followed the countless predictions of the end of the software bubble, and not a one of them has come true.

I'm not going to say that devs can expect infinite growth and FAANG level pay across the entire category of dev jobs. But I think in general it will still be a viable path to a reasonably comfortable, middle to upper middle class lifestyle for many years to come like most other professional careers, and there will always be a strong market on the high end as well for FAANG or those with niche, in demand skillsets.


Yeah i'll believe it when I see it.


I recently came in contact with a lawyer who charged over $400/hr. I was absolutely shocked that they can do that for the trivial work they do. Most cases are cookie-cutter, fill-in-the-template no-brainers, but people tend to go to expensive lawyers just to feel safe and assured in a time of crisis in their life. Lawyers take full advantage of this and tend to exploit people in crisis to the maximum possible extent. This is not delivering value. Same for doctors. The doctors working in maternity wards routinely charge $500 for coming to you, looking at your charts for like 5 minutes, and never showing their face again. The vast majority of their work is cookie-cutter and will eventually be automated, but hospitals take full advantage of people who feel their life is at risk and tend to exploit them to the maximum possible extent, causing $2,000/day maternity bills for absolutely normal pregnancies.

Another point is that developer salaries aren't set by charitable donations. They are set by revenues, and companies pay according to the value people generate. If the author doesn't understand the market, it doesn't mean the market doesn't exist.


> Most cases are cookie cutter and fill-in-template no-brainers

You think the future of no-code/low-code/democratized code isn't the same?


Programmers are highly paid because of the value they bring to their employer, like every other profession, not by how much effort they put into their work.

This is also why stuff like the lawsuit against GitHub Copilot is extremely counterproductive. If AI enables you to build a full-fledged Google Docs clone in one day, your value in the labor market will go up by a lot, certainly not down.

Edited to add: Every company where layoffs have happened recently was a case where management largely squandered the value of their developers (Stripe, Lyft, pre-Elon Twitter, etc.). They had the ability to build 1000x more stuff than they actually did, and are now simply paying the price. Truly innovative companies that use their developers properly will never see layoffs - I can guarantee Replit won't see any layoffs at least for the next four years.


> If AI enables you to build a full-fledged Google Docs clone in one day, your value in the labor market will go up by a lot, certainly not down

If AI enabled everyone and their dog to build Google Docs clones in one day, you can be sure that your market value won't go up but plummet, so keep the trick to yourself.


If you could integrate relevant features of Google Docs into, say, Twitter Ad Manager, enabling an ad agency and the CMO of the corporate advertiser to collaborate (multiplayer edit, comments, real-time in-app chat, sharing links, version control) on a Twitter ad, it would drive more revenue for Twitter. As the cost in developer time of building those Google Docs features goes down, building such a feature becomes more beneficial to Twitter.

The same applies to many other domains.

Naturally, as a programmer with increased capabilities, your value will go up.

See: https://news.ycombinator.com/item?id=33485483


But the key here is "if YOU could". To the extent that AI lowered the minimum requirements to build Google Docs, labour supply would increase and you would be facing a lot of competition, becoming more and more replaceable by people willing to earn as much as you, minus 2 cents. Then it becomes "if WE could". Market value is not productivity.



Depends on how hard that tool is to use.

And call me small-minded, but I don't see much room for innovation now (that wouldn't reduce stock prices or even bankrupt the company; heck, lots of tech companies today are still in the red).


> the lawsuit against GitHub Copilot is extremely counterproductive

The lawsuit against Copilot is about license infringement, not productivity.


Copyright and copyright licences exist to protect authors and publishers.


If you can do more with one programmer, then you need fewer programmers, and demand for programmers goes down.


Did demand for programmers go down in the transitions from pure binary to assembly language, and from low-level languages to high-level languages?

It just became more feasible to build profitable products/services that solve larger problems, which would increase the demand for programmers. As long as people/companies want to make more money, they will keep moving on to more ambitious projects that have just become technologically/economically feasible. In aggregate, they will not stop at the previous technological/profit level, downsize by firing all but the absolute minimum of programmers, and stay content with their stagnating income.


Business expands to the resources available. One programmer can easily deliver a basic website that worked great in the early 2000s, but the same website won't cut it today.


It's not about "more", it's about accessibility. If AI makes creating software products more accessible, the demand for programmers will increase.

If programming was niche and hard to capitalize on then yeah the demand wouldn't increase if you could do "more".

HW engineers seem like a good example of this. HW has gotten tremendously better, but HW engineers haven't seen a commensurate increase in demand.


Looking forward to when 3D printing reduces upfront capital costs enough that it is feasible for a single engineer to bootstrap a profitable hardware business.

The 'GitHub for hardware' of that time would be one heck of a sight, with a huge ecosystem of interchangeable tried-and-tested hardware parts, corresponding to open source software libraries and packages today.


I’ve seen entry-level McDonald’s jobs paying $20, surely a programmer that can automate countless things is worth 5x that ($208k annually)


They pay that much because it's very difficult to find local employees to work for McDonald's; you can hire hordes of great developers for half that salary in India or LATAM.


Oh yeah? They pay US$20 an hour at a local fast-food equivalent in India?


McDonald's is location-dependent. The people you hire actually have to be able to show up.


The industry standard has been over a million dollars of revenue per year per employee. That includes non-developer roles, which all exist either to support or sell what developers make, or to enable you to hire more developers (management).

So these companies have had high margins on hiring developers. I think this also makes sense of "why does company X need so many developers to make product Y?"

The answer is they don't need that many developers to make product Y. But companies have largely seen that the more developers they hire the more money they make.

Of course that's a statistical phenomenon; it doesn't happen right when you hire a developer, and it isn't guaranteed to happen at all. You're hiring as many developers as you can because it increases the chances that some of them will build money-printing machines for you or improve your existing money-printing capabilities.

I can see salaries going down if revenue is dropping across the board and companies switch to focusing more on making money right now and less on maximizing these chances. I don't think it is likely to drop like a rock, though.

Between being roughly zero margin once produced and being general purpose, software does promise to eat everything. But not everyone can afford to pay developers. Prices going down means more people looking to hire.


Very few companies make more than a million per employee. And that is inherently deceptive because you have products (FB and Google Search) that could actually make a lot more but don't.

And you need to consider the return: NFLX has the highest revenue per employee and likely isn't profitable once you actually look at capitalized costs. Their spending on tech is certainly very profitable, but that is because they are a media company...not because they are a tech one (the same is true at Disney, at Warner Discovery, etc.; this kind of measure overstates the economics because they have huge non-employee costs)...so that leaves AAPL (profitable, but also substantial non-employee costs), GOOGL and FB...both protected by corporate governance abnormalities that allow insiders to skim off huge amounts from monopoly products.

Ultimately, your salary is limited by what you are able to charge. Most devs do not have real agency with the product, that is why salespeople make more. It isn't like finance where you can get paid based on revenue.

> But companies have largely seen that the more developers they hire the more money they make.

I haven't seen this. Almost every tech company I have researched (I worked as an equity analyst for a decade) is massively overstaffed. Sales spending is far more accretive. Devs and cloud are a black hole caused by the lack of business knowledge amongst senior managers (even in the C-suite, tech companies have unusually bad managers). The industry is nowhere remotely close to where it needs to be, and won't be for a long time because most people don't have skills that will generate profit.

Prices going down won't mean more looking to hire because the market is going to shrink overall. I think one area that is severely under-penetrated is tech spending by legacy companies, but that won't make up the gap. At large individual companies, the numbers are insane...some companies will cut thousands, and still need to do more (it isn't just devs either, it is data, managerial, support, project management, marketing...there is just bloat everywhere).


> Very few companies make more than a million per employee.

Yeah, and they hire a whole lot of programmers, which is where the top prices come from.

> Almost every tech company I have researched (I worked as an equity analyst for a decade) is massively overstaffed.

Well, it is very directly the reason I have heard given for the practice of always hiring everyone who passes the filter, without regard to such old-fashioned concepts as "job openings".


This isn't a prediction about the future per se, not in the sense of "hey there's a recession, hey there's AI, hey there's X... programmers beware!"

Rather, this is saying "we cannot think of a logical reason why programmers are paid as much as they are in comparison to other skilled fields, thus it is possible we're living in a salary bubble, and it makes sense for programmers to save as much money as possible."


Bad prediction, I guess? Since 2019 my pay has gone up by a triple-digit percentage, at the age of 45.


The article didn't explicitly talk about the near future; even from his advice (that you should save several years' worth of your expenses) you can see that it's something for the long term.

Ironically, reading what you just wrote ("Since 2019 my pay has gone up by triple digit %") and considering that it's the same case for me - we should probably ask ourselves - is that kind of growth healthy, and can it be sustained? While I obviously do enjoy the benefits of it, I have to say that I'm afraid the answer to both of these questions is "no". The only question that remains is when it will stop and collapse.


Fair point. I actually never expected to make this kind of money so i'm just buying real estate for a rainy day.


The comparison with law is interesting, because there are a lot of low-paid lawyers. This is true of programmers, too. In many countries, the average programmer makes about the average salary, often even with a CS degree under their belt (i.e., most of Western Europe). Even in the US, salaries are highly dependent on your region, and on your skill.

Doctors are outliers compared to basically every other profession; no one that highly educated gets exploited as brutally, virtually across all systems and countries. My personal guess is that this is due to their passion and specific character traits being easily exploitable.

I doubt that programmer salaries will fall across the board, although it might be true that low-to-mid-tier skill levels will become almost redundant. The programmers I know in great positions are generally highly intelligent and put in way more hours than the average person; they are simply irreplaceable without great AGI, and they are essential to the modern economy.


> The comparison with law is interesting, because there are a lot of low-paid lawyers.

Yeah, my parents have both been lawyers in a (rich) European country for more than 20 years and they tell me new lawyers earning close to minimum wage are legion.

This blog post was written by a guy who doesn't understand how anything works outside of software engineering and outside of Silicon Valley.


One similar group has it slightly worse than [people] doctors: veterinarians.


Programmers are paid well because VC money is cheap and available, and selling ads and collecting data is considered important. If there's enough of a downturn to torpedo one or both of these conditions, salaries will collapse. Or, if companies truly make an effort to use remote work, salaries will go down on geographic cost arguments.

As the original article states, things are really good. Layoffs are approaching but they're not assumed to be a concern. Saving some bucks for a rainy day is prudent, or just phrase it as "save so you can retire at 40".

There's a whole world of programming out there that handles niche markets or odd geographic areas and doesn't make the big bucks. Keep that in mind. And lots and lots of people in cheaper areas than SV. And maybe AI or other programming assistance will eat some of your lunch?


A large component of pay was also "funny money" shares. You have all these companies that say they are free-cash-flow "profitable" while still losing hundreds of millions a quarter in GAAP terms. That system relies on employees taking non-cash remuneration, and when share prices collapse it all falls apart.

Devs are uniquely unable to understand the system they are part of because the system has had no basis in economic reality. Most of the public companies, the best of the best that VCs managed to cash out through, are going to fail (without significant changes to how they do business). And devs carry on like they are Hollywood actors: I don't get out of bed for less than $500k/year. No one is profitable paying the big salaries of 2021. The majority of the big salaries were due to extremely inept management at tech companies.


I don't want to sound classist or too entitled.

But I am deeply familiar with developers at FAANG unable to handle the complexity of a relatively undemanding project.

Moreover, the time necessary to train a perfectly capable new grad is absurd.

I keep thinking that it is a very difficult job, with absurd upside if well done.


The big overlooked thing here is that doctors and lawyers don't really create any value. They extract value from existing systems (not to disparage their professions - they're both very important).


> we don't understand why programmers are paid so well

Then proceeds to explain exactly why programmers are paid so well.


The information technology industry is a disruptive force in this economy. The way it has disrupted the global economy has changed over time as newer generations of technological innovations (e.g., software, programming languages and development paradigms) have overtaken older generations of information technology. As a consequence, these changes have demanded new sets of technical skills from programmers. Those who master the skills (and a lot who learn just enough of them) can command premium pay because of the labor scarcity they now find themselves in.

In my own 30+-year career, I've had to continually re-skill and up-skill to keep pace with the changes over the years and stay competitive with new programmers entering the industry. I have also been able to command a premium wage (even through the dot com bust) because I was always watching the industry and was able to anticipate and/or pick up whatever skill I needed to know next to remain in play.

I would conclude that the OP is right only if the IT industry itself stops being such a fast-moving, almost COVID-variant-like industry disruptor demanding brand-new, rare skills, or if programmers decide to exit the tech-skills arms race. I think I can go for another ten years at least myself.


Eh, I'm not particularly convinced. I'll buy that law and medicine are overpopulated professions that comparatively offer terrible ROI.

But I get the impression most of us could do blue-collar jobs for fewer hours and less stress and still pull 6 figures.

I would love to have an auto mechanic who was as organized and pragmatic as a top software engineer. The person replacing my sink is going to be getting paid at least $100 an hour.


When I started programming in the 90s I made $30k. It's so much more important now. Add experience and skill, and most are underpaid relative to the value they return.


I think that in the short term there will definitely be a shock to the system and there may be fewer lucrative and easy jobs, but long-term, software is going nowhere and the fundamental value proposition of software engineers hasn't changed.

The only thing that changed is that due to increasing rates companies are tightening their CapEx and focusing on profit over growth.


> If you look at law, you have to win the prestige lottery and get into a top school, which will cost hundreds of thousands of dollars. Then you have to win the grades lottery and get good enough grades to get into a top firm. And then you have to continue winning tournaments to avoid getting kicked out...

This is just not true. My parents and grandparents were all lawyers. They went to law school in their home city (around 600k population) and made a living for themselves. At the time of his retirement my grandfather was very well off and had college savings for both my parents and their kids (me). You can get by just fine without winning a "prestige lottery" and going hundreds of thousands into debt just to attend a 'top' law school.


> What makes me nervous, though, is that we don't really understand why programmers are paid this well, and especially why this has persisted.

Isn't part of this just how valuable software work is? Unlike almost every other industry, there is very little marginal cost to growth and a huge demand for new software-based products. This means a large chunk of the value created is captured by the developers, since they are the limiting factor.

Even with that large chunk of value going towards developers, they still only get a small fraction of the total value created by the software industry. It is simply such a valuable industry with such small marginal costs, that developers are still able to be so valuable in the market.


Developers are paid well in the US. In the EU, the difference between the salary of a garbage man and a scientist or a software engineer may not be all that much (at least by US standards).

So, the salaries may be roughly similar and constant.


Though in general, garbage collecting is well paid for work requiring no prior qualifications. That's due to the fact that it's hard labour with early hours, and people are in general not thrilled at the idea of working with garbage.


> we don't understand why programmers are paid so well

Isn't it for the high demand?


This post fails to acknowledge cost of living. If you work in SV, you're going to be paid well just because of COL. I've seen programmer jobs only paying 50k (in the US, at large companies, in 2022), all because of low COL.

Why is it that there's all this griping about salaries when PMs, EMs, data scientists, etc make similar pay in a similar COL situation?

COL aside, IMO SWEs could make twice as much and it still wouldn't be adequate. Apple's revenue per employee is over $2 million; their median SWE comp is probably only 1/8th of that.

We should unionize


A lot of the pay inflation in recent years comes from inflated stock values; cash comp at big companies tends to plateau after $200k, and a larger portion of your comp is in stock. Hot stock market + stacked refreshers calculated from previous lower values = programmers making $1M a year temporarily, but it doesn't last; new refreshers will be calculated at market highs, and when the market inevitably recedes, the stacking effect works in the opposite direction.
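A hypothetical illustration of that stacking effect (every number below is invented): refreshers are granted as a fixed dollar target, converted to shares at the grant-date price, and vest over four years, so several grants made at a low price can vest together at a much higher one.

    # All figures are made up purely for illustration.
    target_grant_usd = 150_000   # annual refresher target, in dollars
    grant_price = 100.0          # stock price when each grant was made
    vest_price = 250.0           # stock price when the shares vest

    shares_per_grant_per_year = target_grant_usd / grant_price / 4   # 4-year vesting
    stacked_grants = 4           # four years of refreshers vesting in the same year
    realized = stacked_grants * shares_per_grant_per_year * vest_price
    print(realized)              # 375000.0 vs. the 150000 target; swap the prices and it undershoots instead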

On the plus side, the techtok trend died a quick and ignoble death.


archive.org mirror: https://web.archive.org/web/20221105194440/https://www.jefft...

Additionally I'll note this is an article from 2019. Prior discussion: https://news.ycombinator.com/item?id=21904070


Thanks! Macroexpanded:

Programmers Should Plan for Lower Pay? - https://news.ycombinator.com/item?id=21904070 - Dec 2019 (492 comments)


Programmers aren't actually paid all that well. We're paid marginally better than schoolteachers and firefighters. Don't believe me? Look up the salaries for software engineers vs schoolteachers in Boise, ID. Now look up those figures in SF/NYC. They track. People get an incorrect view because they look at the national average, which is skewed because more programmers are forced to move to high-COL areas like SF/NYC just to pursue their profession.


Not quite. Let's compare them with some BLS data. For teachers, I'll pick elementary schoolteachers other than special education, which seem to be in the middle of the pack.

In Boise, an elementary schoolteacher earns $52,500 on average, while the computer programmer earns $75,070 - 43% more. "Software developers" (a different BLS job from programmers) are paid even higher: $96,980, or 84% more than teachers.

The mean wage for an elementary schoolteacher in San Francisco is $86,920, while for a computer programmer it's $126,220 — 45% greater. For software developers, it's $158,320, which is 80% higher than the teachers.

Nationally, this also applies: elementary teachers make $67,080, while programmers make 45% more at $96,650 and software developers 80% more at $120,990.

So while it's true that the ratio of software engineer and schoolteacher salaries is similar between Boise and San Francisco, this ratio is quite large (not "marginally better"), and software businesses being in high cost-of-living areas doesn't seem to explain why software engineers are paid so much (they still earn a lot more than teachers even in cheap areas).

Data is sourced from the BLS:

https://www.bls.gov/oes/current/oes_nat.htm
https://www.bls.gov/oes/current/oes_14260.htm
https://www.bls.gov/oes/current/oes_41860.htm
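For what it's worth, a short script to recompute those premiums from the wage figures quoted above (the results land within a point or so of the stated percentages, depending on rounding):

    # Mean annual wages as quoted in the comment above.
    wages = {
        "Boise":    {"teacher": 52_500, "programmer": 75_070,  "developer": 96_980},
        "SF":       {"teacher": 86_920, "programmer": 126_220, "developer": 158_320},
        "National": {"teacher": 67_080, "programmer": 96_650,  "developer": 120_990},
    }
    for area, w in wages.items():
        prog = w["programmer"] / w["teacher"] - 1
        dev = w["developer"] / w["teacher"] - 1
        print(f"{area}: programmers +{prog:.0%}, developers +{dev:.0%}")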


Thanks, you're right.

But I still think that "programmers are highly paid" is a bit of a myth, and schoolteachers in SF are just underpaid.


> What makes me nervous, though, is that we don't really understand why programmers are paid this well

It's very easy. Look at any tech company's stock price and you have a very good proxy for programmer compensation.

When government and central banks worked together to create a bubble in the monetary base, valuations went nuts and so did job creation. Many of these jobs are useless or unnecessary. Many of them are completely discretionary. Many of them have a negative effect on the bottom line. Many of them employ people who we would normally consider unemployable. Many of them offer the employer atrocious returns.

In classic government fashion, they realized they made a mess, so now they are doing their best to correct this by pulling on every lever they have:

* Wage suppression through record immigration

* Under-representing inflation so that CPI-driven wage adjustments do not rise enough to sustain demand

* Outright job destruction by trying to destroy certain key industries (e.g. oil and gas)

* Making money very much not free, so companies suddenly have to think about costs.

Unfortunately this trend has only begun and technology has been generally fast to act, but mass layoffs are coming to almost every industry, especially if you were not characterized as "essential" during the pandemic.

Inflation isn't just fought through interest rate increases and quantitative tightening; it's fought by ensuring you can no longer afford your current quality of life.


Like any job, it’s probably a small percentage of the workers producing most of the results. I know at my job it’s pretty obvious who is actually consistently working but somehow it isn’t obvious to management or just harder to prove.


One thing that doesn't get talked about enough is the impact of programmers writing software for free[1] on what programmers as a whole are paid.

It could be argued that such free labor reduces the value of programming as a whole, because many if not most people would rather use "free" software than pay for it, so if those "free" options didn't exist, the non-free software that did exist would be both more attractive and more valuable... and so programmers as a whole would be paid more.

But it's not so simple, because virtually the entire internet, all the tech giants, and arguably most companies today rely on "free" software, and their success is partially if not largely due to such software.. and, it's because of their success that these companies can afford to pay the relatively large salaries that programmers earn.

So the relationship between programmers working for free and programmer salaries is a complex one, but it's still intriguing to wonder what programmer salaries would be like today had the free/open source movements not flourished.

The other thing to remember is that programming used to be a pretty esoteric discipline, which was dominated by Americans... who could demand relatively high salaries because they didn't have to compete with the rest of the world. Now there are exponentially more programmers and people from all over the world can program.

Not to mention that math education in America is pretty awful compared to much of the rest of the world, and good math skills help people to be good programmers, resulting in even tougher competition from outside the US. If language, culture, and time zones weren't an issue the competition would be even tougher.

[1] - the term "open source" is often used as shorthand for this, but it's obviously imperfect as there are many programmers who are paid to write open source software, such as those employed by Red Hat, Mozilla, Google, IBM, etc..


With the crazy economy and layoffs I think companies will use this as a time to say they are focusing on keeping expenses low and use that as an excuse to suspend or delay raises. I wouldn't put it past them.


I've asked myself this question often, but each time I look for qualified employees and co-workers I find none. The only thing that could attract someone is high pay.


This is three years old, but is kind of prophetic.

As far as I'm concerned, the money has been fairly corrosive. It's a Damoclean sword, bringing manna but also levels of avarice and corruption that are just heartbreaking.

But I suspect that I'm in the minority in my opinion. I just love the craft of software development, and find it hard to adjust to the new rat race for money.

I do get treated fairly badly, when I mention that money just isn't that important to me. I love coding and architecting. In fact, I do it for free, these days, which was always my dream.


Medical doctors shouldn't have to work 80-hour weeks. These exploitative systems need to be abolished. These should be basic human rights.


The supply of medical professionals is constrained by cartel-like economics around medical school entry (which also prohibits foreigners from entering the market). This creates a shortage of supply.


I agree with the author that there are no signs of pay collapsing anytime soon, but living below your means is always great advice.


>What makes me nervous, though, is that we don't really understand why programmers are paid this well

Because what we do offers MULTIPLES in earnings, and the incremental copies are essentially free. Near-zero marginal cost. It isn't confusing to me. When we build something exquisite and people love it, the company can sell it infinitely. Forever. With almost no added cost.

We take dreams and wahoo ideas and turn them into tangibles. That's art. Contrary to what bootcamps sell you, it's difficult to get right.


I have a friend who is a composer for film.

She has made some of the most exquisite thought products I've ever experienced (a strange way to talk about music, but I'm trying to cleave to the OP). She works 90 hours a week and has been well regarded in her industry for 10+ years.

I've seen her drop everything, on any night of the week, to hit the deadline for wrap, dozens of times in the last year alone.

She has $400CAD to her name, has no car, and rents. If she weren't in a housing co-op, she'd need to move out of the city -- and as it already stands, she lives in the roughest neighbourhood in town.

The moral of this story is that enjoyable, well-regarded work where you feel a deep connection to the end-product almost inevitably becomes part of its own compensation package.

There will be people who will code for free, for 'exposure', and there will be a lot of them, not too long from now at all, because coding is a delight, and is becoming more delightful every day (cf CoPilot).

I think a lot about how the romanticization of coding as a profession parallels the romanticization of the music industry in the 80s and 90s. People work cheap for hallowed dreams. Pride goeth before the fall.


I suspect this is enough info about your friend to identify them (assuming they live in one of the population hubs of their country).

If you switch to they/them pronouns and drop the reference to the specific currency their assets are in, it'd provide a good cloak of fuzz to prevent them from being identified by the description.

I don't know your friend but I think most people would prefer to avoid being called out as being nearly one step away from homelessness (even though I suspect I live in the same city, and the same is true of many hard-working people here)


Yikes. Good points. And a good time to admit that the person described is an amalgam of what I heard from (at least) two artist friends, in somewhat different industries, about things that happened in the early days of the pandemic. No actual creatives were harmed in the making of this post :)

In the future, I'll try to avoid exaggerating, and additionally, think more adversarially.

All the same: it's very, very hard to be a creative right now, if your creativity is anything other than coding. My fellow coders need to realize that the circumstances that have led to our overpay are indeed temporary, because of the way job titles function as positional goods, and become themselves fought over.

The other field I've seen this occur in is postgrad humanities, where the adjunctification of academe has been accomplished by inflating the status of the work. (And I speak as someone who dropped out of a PhD in the humanities.)

If you're expected to be honored just to be in the room, you can expect to earn only an honorarium.

And that's the future I see for our industry as well, eventually.

There must be some sort of general principle of growth and decay involved.


We are still hiring, and it's hard to find a suitable candidate who is willing to take the offer and join our company.


>Summary: we don't understand why programmers are paid so well. If you're a programmer, there's enough of a chance that this is temporary that it's worth explicitly planning for a future in which you're laid off and unable to find similarly high-paying work.

Because to be a good programmer you need to be at least somewhat autistic, and the number of high-functioning autistic people is incredibly small.



This article should be ignored. When will the article "CEOs should plan for lower pay" happen?


Just after CEOs and management do.


Low-skilled devs should plan for AI to replace them.


Funny way to say “Programmers Should Unionize”


I think programmer salaries have been so high because FAANG companies were willing and able to pay through the nose to hoard talent. This is changing.


Nah...


There are a bunch of not commonly stated factors for high SWE salaries:

- Gender disparity in CS...women tend to avoid CS programs, cutting out roughly half the potential talent pool, particularly before 2015.

- Gender disparity in education. Schooling and college admittance rates up to college tend to skew more female, making the actual gender disparity even more pronounced. Software engineering is mostly men, who see half the potential job-market competition virtually drop out past college!

- Programming languages are "English". English is virtually a prerequisite for programming (correct me if I'm wrong). This makes outsourcing to any country other than highly developed parts of India difficult.

- It's changing constantly, which diminishes the value of higher education and experience. Examples: Kubernetes' widespread adoption in 2017, JS frameworks that became popular after 2015, HTML5, wasm, transformers after 2018 (in ML). 16-year-olds in 2017 can be more experienced than college profs in newer tech.

- American tech monopolies. Google, Apple, Microsoft, and Amazon each have on the order of 100,000 engineers. Do they need that many? Probably not. Can they AFFORD that many? Yes. Google makes an average of $1.6 million in profit per employee. They can do that by charging a tax on their network-effect monopoly, and they overwhelmingly hire in the US. This drives wages up across the sector.

- Programming is hard for many of the tech illiterate. Many of us grew up with a computer at the center of our homes. For the vast majority of people on earth this was not the case. It's actually getting worse with the usage of tablets and smartphones.

Here's a couple of WRONG arguments for high salaries:

- Software scales cheaply...this does not matter if you can hire cheap equivalently capable SWEs overseas. A McDonald's employee feeds many more people than a Michelin chef. Doesn't make one get paid more than the other.

- "Good" software engineers should get paid more... This is fine, but I see plenty of shitty SWEs getting paid just fine. Wages are set by supply and demand

- Programmers produce more value to society. Nope. Vaccine makers, bridge makers, factory builders..take your pick. All of these areas now DO involve software though, so the market for folks with mixed skills is getting ever bigger

I expect to see pure SWE jobs diminish in value, but domain expertise SWE jobs increase!


> It's changing constantly which diminishes the value of higher education and experience.

Constant change of tools actually improves the value of a high-quality education in fundamentals.


Now, do CEOs.


From my French perspective, I would say developers aren’t paid enough.

For one reason: market price manipulation.

Half the IT market here is handled by service companies. Their main business is to hire IT guys and sell them at a daily rate to the customer (the final company).

This works very well for market price manipulation when the owners of the middleman company are also the owners of the customer company.

Let me explain, taking banks as the example. They are the biggest IT service customers on the market, followed by the military. In the late 80s, when they got tired of being milked by freelance COBOL developers, they united and created a service company whose business is to hire IT guys and resell them to themselves (the banks).

They hired COBOL devs from North Africa for far cheaper than hiring locals, and they invented business rules in the purchasing departments stating that the bank cannot do business with small companies (too risky). Most freelance devs run a little company for accounting and billing.

Then, when the locals were looking for a new job, they had no choice: they had to go through the service company, which was allowed to contract with the banks (because it is far bigger than a one-man company), and the poor local dev was pressured to cut his price in half because he was competing with the North African devs.

We can also note that since the service company and the banks are owned by the same guys, this eases a lot of accounting and tax optimisation, to say the least.

And here in France, a handful of service companies now manage half of the IT market.

And guess what? There isn't any union for the IT industry.

This situation also helps create the fear that there are not enough IT guys, pushing the government to open the borders wide to allow foreigners to work. How?

Well, when a bank opens a position to be filled, it sends the proposal to several service companies, and those service companies each publish the proposal on job boards. So there can be 5 or 6 open positions on a job board for only one real open position at the final customer.

And afterwards the media write articles about how difficult it is to find people: there is so much demand and so few candidates that we must open the border and let people from around the globe into France to fill the job market. All that while we have almost 10,000,000 unemployed.

I don't know how it works in other countries, but this is mostly how it has worked since the late 80s in France.

Let's not forget about the scheme for siphoning public money (state money from taxes). A ministry opens an IT position at around a 1,500 euros/day rate, while the final worker (the guy actually doing the job) is only able to bill around 600 euros/day. The remaining 900 euros/day go to multiple layers of middleman service companies whose main merit is having a guy able to use a phone, and whose boss often has an uncle or the like in the ministry or government.


Such a scheme in banking is unfortunately not unique to France; I've seen it in other European countries and Canada as well.


[2017]


[2018]


Nah, sorry.

Like it or not, virtually every aspect of life now is dependent on software. A CEO without developers is nothing. A manager without developers is nothing.

In fact, I would argue that we are paid comparatively more than other careers because to be successful in our work we must take the confused requests or requirements we are given and make sense of them, make them fit into the actual business need, and then execute on that improved understanding.

Some might point at recent articles about developers who do little and get paid a lot, but that is not as much an argument against paying developers well as it is evidence that most management is poor. Bad or underperforming developers would not have jobs if management had adequate supervisory and leadership skills.

Regarding lawyers (TFA), one does not need to go to an Ivy League school and land in a top firm to make bank. There are many small firms doing big business. One must just do reasonably well and get lucky with connections along the way. Of course hard work is required, but the same is true of many software developers. The difference is, there's rarely a path to becoming a partner or owner of the firm if you're a developer; but there often is if you are an attorney. I think given what I know now, I might trade with my good friend who makes 8 figures per year doing "boring" (reliable, methodical, and predictable) contract law.


>I think given what I know now, I might trade with my good friend who makes 8 figures per year doing "boring" (reliable, methodical, and predictable) contract law.

Considering the percentile of people who earn 8 figures, what makes you think you would have been capable of being in that percentile had you done law?

If your 8 figure lawyer friend was friends with a tech company founder, would they also say “given what I know now, I might trade with my software developer friend who makes 8 figures per year just from investment returns after going public”?


> Considering the percentile of people who earn 8 figures,

This is a wild jump and distraction from the actual conversation. I think very few people here are talking about top executives and stuff but rather your SWE making 100k-300k/yr. We're not even half way to 7 figures and you jump to... 8? The title of the article even indicates that it is about programmers, not execs. There's very few people programming and making tens of millions a year.

Let's be real here too. These companies are making record profits and the top execs are also making record salaries and you are defending them to pay the average worker... less? It's like you just called billionaires and hundred millionaires middle class. This is propaganda straight out of the Tea Party. Come on. (Same with the article itself)

We're talking about SWEs here and whether "they" are worth 6 figures.


lotsofpulp is not the one who originally mentioned 8 figures, the top-level comment did.


Not the person you're responding to, but anyone in America who was reasonably competitive at American competitive debate (esp policy) could easily push themselves into 8 figure lawyer territory.

In fact, I've never seen a more robust and potent pathway from high school to law than debate. Not even mock trial is this powerful.

To be fair, there is a lot of mentorship in the form of active lawyers who coach or judge at these debate tournaments.


Given that software engineers tend to be paid more than hardware engineers, things being dependent on software isn't a very good argument.


Software scales. Hardware does not. That's the reason why JavaScript developers are paid more than embedded C developers.


As an embedded C developer that just moved to JavaScript, this is very true.


Well, but design a great widget and a manufacturer can scale production as high as necessary.


This is a sick burn tbh. There should be parity between software and hardware engineers, but there isn't.


At a place like SpaceX, software engineers working on internal corporate software make far more than the mechanical engineers in propulsion doing things that have never been done before. It's just a function of the market. You'd never be able to hire any software engineers if you paid them the same.


Markets work differently for the two. There hasn't been much development in hardware because it is hard to get a good profit from it. You make something and it is very easy for someone else to directly copy that physical device. The "source code" is all there for you (especially if you send it to a country that doesn't respect IP to manufacture that product for you). And once you sell the piece of hardware, your sale is done.

On the other hand, it's harder to copy software, especially since many modern tools rely on having data. In addition, your sales can be recurring, and you have a clearer and more predictable revenue stream. This is a much safer business.

I'm not saying that things should be this way, but that's definitely the way they are. Given that, it makes sense to pay these people differently. If you resolved these issues, I think it would be easier to argue for higher pay (not that they don't deserve it in the first place).


Honestly, that makes me wonder whether IP law is either too strong for software and/or too weak for hardware. I don't see an obvious public benefit to software being better protected than hardware.


We can probably answer the hardware one by determining the answer to this question: should it be possible to invent a robot dog that you sell for $75k, only for your manufacturer to sell the same robot dog for $2.5k? The practice is technically illegal, but effectively nothing happens. Should this be allowed?

I think the problem in your logic is that you're treating it like a zero sum game. You're ranking these against one another instead of scoring them compared to a stable moral base. It is highly possible that software is protected adequately and hardware is not. This does not mean you need to bring the protection of software down to bring that of hardware up. This would be either zero sum or negative sum. But these things are not tightly coupled. We can absolutely just increase the protection of hardware while the protection of software is unmoving (or even increases).


Hardware engineers are absolutely necessary. However, they make tools, not solutions. No question, their tools are incredibly complex, and their job is important. However, it doesn't scale in the way that a software need does.

And the thing is, as even absurd but amazing Minecraft examples have shown, software engineers can develop hardware, if inefficiently. True we don't know anything about 5nm lithography or whatever, but that's more a chemistry/physics/manufacturing domain.

Finally, even the hardware depends on software. It requires software internally to do its job, and the development of said hardware absolutely depends on software. Chances are, hardware engineers informed software engineers about their need for doing better hardware work, and the software engineers built better software to enable that.


The most important "developers" in most businesses in non-tech industries, which is to say most businesses, don't program at all: they use Excel. They have to do scoping, development, UX, and testing all by themselves, and rarely have anyone to ask for help except Google.

Oh and they have to do their normal job as well.

In order to have a developer provide value there needs to be quite the system of support around that job function. From a value perspective developers are valuable, but only in the right environment.


> they use excel

Indeed, many companies make great progress because of this. But inevitably they reach a point where it is unmanageable or unmaintainable. Then who do they depend on to keep their business operating (as it is now overly dependent upon Excel)?

I'm glad you mentioned it though, as it is something I have direct experience with. The firm I worked for several years ago was full of very sharp people, such that they made great money on mortgage related debt trading before, during, and after 2008. They did it by applying their effort and knowledge and depending on complex spreadsheets their people had made. But it reached a point where Excel would regularly corrupt the file; and opening the spreadsheet would take 10 minutes on modern hardware. And many activities were ungodly slow.

Who might you imagine took their Excel VBA and rebuilt it (still in Excel, and still with VBA) so it took an order of magnitude less disk space and a fraction of the launch time? Yes! A software engineer. Also, as most of us SWEs do, I didn't know VBA before being given the project and producing a very successful new version.

How much money did they make using my version in the following years? I assure you it was enormous compared to what they paid me to do it.

A good developer is a force multiplier. Our salaries, if we actually do a reasonable job, should be large.


I wouldn't be so cocky about this. I agree with the value that software engineers bring, but you also have to look at the driving factors. Over the last 20 years, the web + smart phone revolution created an unprecedented opportunity whose value was captured by the first wave of companies who figured out how to scale always on services to billions of concurrent customers (Amazon, Google, Facebook). They built their foundations lean in between the dotcom and real estate bubbles, then, over the last 12 years we then had an incredible tech bull run capped off with a pandemic which forced everything online in a way that further ballooned expectations.

Ultimately software engineer salaries were driven by the competition for the elite talent, but the ceiling was created by the market dominance and stock performance of the biggest companies. FAANG hoovered up an incredible number of engineers because they could afford it, not because they could productively use all those engineers. The average FAANG engineer can not reliably generate the value they are paid—most of that comes from a tiny fraction of engineers running the revenue engines of the company.

While I agree the best engineers will only become more valuable, there is good reason to believe that competition will force margins down, and this will put downward pressure on the average engineer's salary over time. We may well have already seen the high-water mark for mid-level software engineer salaries, as we did a while ago for stock options, which the investor class has steadily whittled away via financial engineering that consistently stays one step ahead of the historic perception of the rank and file.


>>>> In fact, I would argue that we are paid comparatively more than other careers because to be successful in our work we must take the confused requests or requirements we are given and make sense of them, make them fit into the actual business need, and then execute on that improved understanding.

It works both ways. From the "other side," communicating requirements to programmers is also hard. The problem is compounded if the requirements involve specialized domain knowledge, including math. And everybody thinks they have the best understanding of business needs. Everybody thinks the "other side" is the problem.

There may be conflicting expectations. For instance, when I communicate requirements, I actually try not to precisely specify every detail. I want the programmers to come up with their own ideas and solutions.

I think the high pay is just a supply/demand issue. Paradoxically, most of us who program think it's easy, but most people who try to learn how to program find it to be prohibitively difficult. And we don't know why.

Also, demand fuels itself in peculiar ways: The visibility and aesthetic appeal of software make it a focal point of the business. Investors are gaa-gaa over it. When the big-wigs review a business unit, they will ask: "What is your digital strategy?" It's hard for an electrical engineer to make their work self-advertising in this way. Second, Brooks' Law suggests that there may be positive feedback cycles in the need for programmers within organizations with large ongoing development efforts. Hiring more programmers makes programming harder.

On the other hand, my general advice to all workers is to plan for lower pay. My spouse and I have done this throughout our careers, and I don't regret it in any way. For instance, we bought half the house that the bank said we could "afford."


I'm sorry to be meta, but geez. You make some reasonable points, and while some of your opinions differ from mine, I think they are valid for your experiences. And yet, you had been downvoted. (I tried to help you back up :) ).

And now because of that meta stuff, I'll take the time to reply. I already talk too much, so I had skipped this one. But now I must.

> communicating requirements to programmers is also hard

I absolutely agree. I also wonder: in business school, or management school, do they have classes to teach people how to break down their needs and articulate them to developers?

For example, if you want a house built, you will talk to an architect who will make a plan, and those plans will conform to certain standards. The plans will be consumed by some building contractor, and that person will arrange a lot of actual physical work. But in the business world, there is rarely such formality. In high-risk domains like air travel and such there certainly is, but in typical companies it's just some executive or marketer with an idea telling the CIO what is desired, and the CIO passes the concept to the product team.

From the dozen or so companies I have worked with or for, I have not seen consistent approaches to this idea->project communication process.

And in my case, I have also done several projects, sometimes long term gigs, for small businesses where I will be working directly with the owner. The owners I worked with knew their business domain very well, but they knew very little about software or technology. Success meant me listening to their initial request and then asking a lot of broader questions to ascertain the actual need. For me it's quite fun, because I get to learn about their real problems, and then I can deliver a great solution which makes their lives better.

> most of us who program think it's easy, but most people who try to learn how to program find it to be prohibitively difficult

I do believe programming is easy, at least in terms of turning a specific need into code. For example, if you have a CSV file that you need ingested into your database, and you know the meaning of each column in the CSV, then it is just a matter of cleaning up the data and type converting it (and having a plan for how to handle fields which cannot be cleaned or are empty).
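
To make the CSV example concrete, here is a minimal Python sketch (the file name, columns, table, and the clean_int helper are all made up purely for illustration) of the cleaning, type conversion, and "plan for bad fields" I mean:

    import csv
    import sqlite3

    def clean_int(value, default=None):
        # Strip whitespace and thousands separators; fall back to the
        # default when the field is empty or cannot be parsed.
        value = value.strip().replace(",", "")
        try:
            return int(value)
        except ValueError:
            return default

    conn = sqlite3.connect("example.db")
    conn.execute("CREATE TABLE IF NOT EXISTS people (name TEXT, age INTEGER)")

    with open("people.csv", newline="") as f:
        for row in csv.DictReader(f):
            name = row["name"].strip()
            age = clean_int(row["age"])   # None if missing or garbage
            if not name:
                continue                  # our plan for rows we cannot clean
            conn.execute("INSERT INTO people (name, age) VALUES (?, ?)",
                         (name, age))

    conn.commit()
    conn.close()

The hard part is rarely the insert itself; it is deciding, column by column, what "clean" means and what to do when a row fails.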

What is NOT easy is taking a big vague idea and programming against it. This I have encountered many times with non-technical people. They just want a "simple" X. So I start asking them questions, and inevitably we reach conditions which could exist but which they don't know how to handle. I politely say, "We have to know what to do when Y occurs, even if we just tell the user to go away."

Also, to be fair, big ideas are hard to program because they are logistical or organizational problems. They are not programming problems though.

> visibility and aesthetic appeal

This is definitely true. People are more motivated (and more willing to pay money) for stuff they can see. However, there are a lot of IT divisions at companies where the IT group is only a cost center. They enable all the other workers, but they are treated as unfortunate costs, like barnacles on a ship. Chances are they are paid much lower than SWEs in companies where software is a first class product.

> plan for lower pay

Another common version of this is "live below your means". I think most of us have fallen for the opposite. In my early years, I was a moron with money because I knew that every job change meant a 20-50% salary increase. And that was true for several iterations! Then it was no longer true, but I had lived as if my salary was large all the way. It was stupid, and I eventually learned.

I think a better approach to living and finances is to budget (rule #1, and it implies tracking expenses and patterns of expenditures) and save so you can weather career/health/financial storms with less stress.

What is perhaps most surprising to me is that the biggest driver of my reduced spending is the overaccumulation of stuff. Properly getting rid of stuff is a job. Sure we can throw it away or drop it off at Goodwill, but often it will just end up in a landfill. Given the current ecological state we are in, that is practically criminal.


Your comment "Like it or not, virtually every aspect of life now is dependent on software" reminded me of Heinlein's "The Roads Must Roll".

In the story, the road technicians were the linchpin to a system that millions of people depended on daily. Some of the technicians decided to go on strike, owning to their beliefs in the "functionalist revolution":

> Concerning Function: A Treatise on the Natural Order in Society, the bible of the functionalist movement, was first published in 1930. It claimed to be a scientifically accurate theory of social relations. The author, Paul Decker, disclaimed the "outworn and futile" ideas of democracy and human equality, and substituted a system in which human beings were evaluated "functionally" - that is to say, by the role each filled in the economic sequence. The underlying thesis was that it was right and proper for a man to exercise over his fellows whatever power was inherent in his function, and that any other form of social organization was silly, visionary, and contrary to the "natural order."

> The complete interdependence of modern economic life seems to have escaped him entirely. ...

> Functionalism did not take hold at once-during the thirties almost everyone, from truckdriver to hatcheck girl, had a scheme for setting the world right in six easy lessons; and a surprising percentage managed to get their schemes published. But it gradually spread. Functionalism was particularly popular among little people everywhere who could persuade themselves that their particular jobs were the indispensable ones, and that, therefore, under the "natural order" they would be top dog. With so many different functions actually indispensable such self-persuasion was easy.

The subtitle of the original 1940 publication: "The higher civilization becomes, the more it is dependent on each unit — and the more it is at the mercy of a few!" - https://archive.org/details/Astounding_v25n04_1940-06_SLiV/p...


> Like it or not, virtually every aspect of life now is dependent on software. A CEO without developers is nothing. A manager without developers is nothing.

Lol this is not true at all. Maybe if you replace developers with employees


How do you respond to the below?

“Every business is a software business”, as Watts S. Humphrey, the father of quality in software and CMMI, said two decades ago. The CEO of Microsoft, Satya Nadella, repeats the message: “all companies are software companies”.


> two decades ago.

I'm hoping you are aware that about two decades ago software devs took a TC hit that took at least a decade to recover from.

Every pre-dotcom-bust software engineer I knew made substantially more pre-bust than post-bust until just a few years ago.

In addition given the last N jobs I've had, while software is important in the world, most companies are vastly over indexed on software. A huge number of tech companies don't make more money than they spend, and we're so deep in a bubble people don't even care.

Plumbing is also hugely important, I don't know a single company that can run without running water, but it doesn't mean we can all be plumbers making 250k+.


Software engineering can optimize more workplace activities than plumbing can.

Once the plumbing is in place and working, you've exhausted the value that can be delivered by it. The only need for additional plumbing after that point is when something serious breaks (e.g. pipe bursts), which might be a once in a decade event at a typical business.

The software engineer plays a part in optimizing accounting, payroll, communications, customer outreach / acquisition, sales, and some legal processes in virtually every business that exists - and those are just the most universal aspects.

"X and Y are both important and therefore people doing X and Y should earn roughly the same amount of money" is the most ridiculous oversimplification I've heard in at least a week.


Are all companies energy companies?

While that statement broadly applies to white-collar and services, what percentage of COGS is software in manufacturing?

Is it 90%? 50%? or maybe only 5%?

Software is useful. That doesn't make it important enough to command high pay in a shrinking economy.


Importance does not command high pay, supply and demand does. The point of “every business is a software business” is to imply that demand for software developers will stay abreast with supply of software developers.

Edit: also, obviously, the buyer needs to be able to pay, which is sort of related to software being able to be used to advance a business’s competitiveness in the market such that the business is able and willing to invest in software development


So it's all about actual numbers (the supply and demand curve). The case of energy (and other industries like transportation and construction) shows that this supply-demand balance can be weak enough to lower employee salaries. Is there any argument that software won't behave similarly?

Speaking about productivity, software is far from a guarantee of increased productivity - see the case of EHR in hospitals.


> Speaking about productivity, software is far from a guarantee of increased productivity - see the case of EHR in hospitals.

Was this conclusively measured? I imagine there are fewer errors or better decisions made due to quick and easy availability of health history.

I trust auditable, easily legible notes over chicken-scratch handwriting faxed from office to office in previous years. With EHR, I can also catch healthcare providers writing notes about stuff that did not happen in the visit to cover their ass, although I imagine many doctors do this because a lot of the minutiae is a waste of time that just serves to help in case of lawsuits or billing disputes. But in any case, I would rather have access to that info than not, in case I need to dispute it.


Broadly (and of course there are a lot of nuances), studies do not show a substantial effect [0], which is a failure of the software considering the promise and the resources invested in this transition. Tens of billions of dollars went into the transition, money which could have been used for other interventions.

The broad economy is different from pure-software companies. Software can ultimately only manage information, but many businesses ultimately operate on physical things, where people matter much more than software.

0. https://pubmed.ncbi.nlm.nih.gov/33441368/


The critical question being missed is not “how many businesses need software”, but “how many businesses need developers”. I suspect the former will end up being way larger than the latter as time passes.


It already is that way, always has been, always will be. One would have to live in a tech hub working in a tech company to think the inverse is true.

The overwhelming majority of businesses have no use for software developers. Plenty of uses for software.

Here in Australia over 99% of businesses are SMEs, and for the overwhelming majority of those, the core business is not software development.

I personally know exactly one web developer, and I know about forty car mechanics, three thousand builders, ninety-seven plumbers, eighteen diesel mechanics, seventy-three plasterers, eight hundred fencers, twenty-four tilers, sixty-four concreters, ten thousand low-voltage electricians, fifty-five high-voltage electricians, seven dentists, four thousand chefs and kitchen staff, eight thousand hospital floor staff, four hundred doctors, two glaziers... to one person who writes code.


I work in a fortune 500 that sees itself explicitly as not a software company. The devs are only viewed as a nuisance by everybody else. The CTO reports to a finance guy. And it's a shit show. I agree with Humphrey and Nadella here and I assume that every company that doesn't get that leaves something valuable on the table.


It’s obviously wrong. Is a restaurant a software business? Ridiculous. Software is great but there are other businesses.

I could be convinced that all companies are energy companies, though.


What's the difference between me making food at home for me to eat and a restaurant? The restaurant is a business. What's a business? It's an economic & logistics model that produces value using known processes and workflows. Say that a different way. It's a repeatable algorithm for producing value. What's our current best toolset for building models and algorithms? Software. We're done.


In this model there's no difference between me making food at home and a hospitality venue making and serving food.

This model assumes that everything can be reduced to an algorithm, a model.

Anyways, getting hungry, enough of this nonsense, I need to go microwave some math.


Lol, I’m sure software will make a great restaurant. Using software does not make a software business.


Nowadays most of them are.

The whole payment system largely depends on software.


But they would still be restaurants without software.


Not for long.

I think the point op is trying to make is without the investments in software to support business modeling, procuring supplies, scheduling staff, payroll, online marketing and all of the front of house functions like reservations, online menus and what have you, your steampunk restaurant is going to be outcompeted eventually.


It’s not true though. If you go to Kenya restaurants exist without computers.


But much fewer


POS, payroll, time-tracking, inventory, food ordering, logistics, market analysis, digital advertising, DoorDash/Uber Eats integration, online menu ordering for pickup, etc.


Exactly, good luck competing with other restaurants without those.


I've been working in structural steel fabrication and construction for over two decades, when I started the product we put out the door had no software in it, and still doesn't today.

There's definitely more software used to aid (that's the A in CAD/CAM) in manufacturing, but software isn't what we sell.


Every employee of every modern company is dependent upon software built by developers of that company or integrated/managed by developers.

Most of the productivity gains of the last 20 years can be attributed to software, and the only employees writing software are software developers.


Every modern company is dependent on a lot of things due to interconnections in the economy, that is nearly irrelevant.

Pay is about supply and demand in the labor force. Necessity will help hold up one side of the equation, but not the other.


This seems like saying that because all modern businesses depend on electricity, working in the utilities sector will always be lucrative. It's true that developers are necessary, but it doesn't mean they are important, or that they will always be in high demand.


Electricity is a commodity, though, and software is very much not.


Indeed this is the whole game. Turn everyone else's contribution into a commodity and eat their lunch. Not just the energy, but the hardware (e.g. Microsoft vs IBM) and even the libraries they provide (e.g. app developers versus apple).

I think software has gained a disproportionate advantage because tech has been changing so fast and software can adapt the easiest (which matters more than it should perhaps due to software patents).


This is somewhat like saying the person who pulls the levers is responsible for the productivity of a factory. Have productivity gains in the last 20 years been that high? It seems like mostly a roll-out of productivity gains from the digital computing/processing and communications hardware which came in a rush 20 years ago and has been incrementally improving since then. They just did a poor job of capturing the benefits from their new hardware. Other than Apple and a few others perhaps. Maybe Qualcomm? This is why it's good to go vertical like Amazon.


> Most of the productivity gains of the last 20 years can be attributed to software, and the only employees writing software are software developers.

While I agree with this, I think it's also true that most of the software responsible for those productivity gains was written by a relatively small subset of developers.


My conspiracy theory is that less than 100 people in the US actually know what they're doing and wrote the important code doing all the work and/or are keeping all of global IT sort of running. I am not one of those 100.


I think you are just feeling impostor syndrome or attributing it to the general developer population.

In reality, there's a lot of boring software which does its job and has meaningful value in companies.

I am not a great developer; I get bogged down by the need for perfection or distaste for inelegant solutions (when I know there _must_ be a proper solution to be found). Even still, I get things done, and I periodically hear about old code I've written that is still valuable to the companies I wrote it for. So imagine what the good devs are doing!

Also don't underestimate the immense developer effort that goes into making enormous, complex systems "work". These systems, and the developers working on them, are essential cogs in a big machine.

That is not to say that all software is correct or reliable. But on average, at least as evidenced by software failures being only periodic rather than constant, the many people behind most software must be doing a halfway decent job.


That's probably uncomfortably close to the truth.

For sure, of all the code I've ever written, 124 lines I wrote for a previous startup that was acquired is responsible for 99.99% of the revenue that code I've written ever "earned".


but considering the law profession (TFA), an Ivy League school is required.


I like this. It makes me feel better. Dunno if it’s true.


> Like it or not, virtually every aspect of life now is dependent on software. A CEO without developers is nothing. A manager without developers is nothing.

Every aspect of life now is dependent on food and mineral resources, too, but that doesn't mean farmers and miners are rich. Lots of CEOs and managers don't have developers, even today, just like they don't have farmers and miners. So I think your explanation is wrong.

> I would argue that we are paid comparatively more than other careers because to be successful in our work we must take the confused requests or requirements we are given and make sense of them

This is not a difference between programmers and maids, taxi drivers, or bouncers. So I think this explanation is wrong too.

— ⁂ —

My preferred explanation is that people's share of the excess value of an endeavor depends mostly on their bargaining power. The people who have little bargaining power relative to the group, because their BATNA is poor and the group's BATNA is good, tend to earn very little. The people who have lots of bargaining power tend to earn a lot, relative to the excess value.

To take an extreme example, a gangster who robs a convenience store is making no positive contribution to the convenience store's value, even making a negative one, but he can still take home a lot of the convenience store's profits, because the clerk's BATNA is a bullet in the head. The store owner's BATNA is to close the store or hire security guards, so the gangster has an incentive to stake out territory to prevent competing gangsters from making the store's profits actually negative. The gangster's BATNA is (depending on factors like murder conviction rates and CCTV resolution) either to spend a couple of bullets and maybe have to wash his shirt, emptying out the cash register anyway, or to walk out of the store emptyhanded. He has more bargaining power when he can more easily choose the first of these.

Key employees whose departure would substantially reduce the profits of a business can, in the same way, demand much higher compensation than easily replaced employees. (So, too, unions who can credibly threaten a strike.) With the rise of managers in the late 19th and 20th centuries, many managers became such employees; now, many programmers are as well, for multiple reasons.

One is that business processes that were previously carried out manually under management direction are now carried out automatically by computers.

Another is that, like managers and unlike taxi drivers, farmers, or assembly-line workers, programmers are not very fungible: the knowledge in their head is what makes the software adaptable. You can easily buy your rides from a different taxi driver or your wheat from a different farmer, but you can't easily buy new GCC features from anybody but the small group of programmers who already know GCC, many of whom are specialized in particular parts of it. When we're talking about your home-grown payroll system, the situation is even more extreme. And of course we have the Fizzbuzz phenomenon, where hiring competent developers is difficult, especially for people who aren't themselves competent.

A third reason is that writing more software can vastly increase the profitability of a business, both because the software is a cheaper replacement for workers, and because it enables new productive activities that weren't possible without the software. This doesn't guarantee higher earnings for programmers (or sysadmins, or data center HVAC technicians) but it does increase the size of the pie the different factions are negotiating shares of.

— ⁂ —

So why are hardware engineers so much worse paid than software engineers? Their contribution to automation and softwarification is, after all, equally critical, and they're doing the same kinds of activities as software engineers, so they're no more fungible.

I think it's because the hardware engineer's BATNA is so much worse.

With ramen money, a software engineer can take six months off to write a game or a web service and make it available to the world. Mark Zuckerberg and Eduardo Saverin launched The Facebook to students of the entire Ivy League from their dorm rooms after a couple of months. They didn't need any investors or permission. 37Signals had similarly shoestring beginnings. Every company that wants to hire programmers has to compete against the startup ecosystem, both VC-funded and bootstrapped.

By contrast, spinning up a production run of a new chip is going to cost you upwards of US$100k, or US$10M for a chip in cutting-edge technologies. You have to convince Samsung and/or TSMC to give you access to their cell libraries, signing NDAs. Even a prototype shuttle run in a last-millennium process node from something like MOSIS or CMP is going to cost you US$3000. Feedback time is measured in months or years, not seconds or minutes. PCB fabrication is not so extreme but there's still no hardware equivalent of a US$5 DigitalOcean droplet where you can try out a web service on 1000 users all over the world.

So companies hiring hardware engineers have a lot of bargaining power by virtue of having the capital to put designs into production.


There's a lot to your reply, much of which I agree with and some I do not, but we cannot afford the space or time to expand this much :).

Trying to keep this brief, I think the issue is a matter of scale. Many of the roles or professions you mentioned do not really scale the way software products do. One might argue that a farmer scales food production, but that is largely a solved problem which can almost be completely automated away. Taxi drivers and other laborers generally only provide one person's worth of production value. Software engineers take ideas from others or themselves and produce things which typically scale up very well. And because that process of taking an idea and producing working software is as much an art as a science, it is not easily replaced. Thus it has somewhat outsized value.

> Key employees whose departure

I strongly suspect that most key employees leave or are released without the company recognizing their importance. I don't believe this has anything to do with software developer salaries. Now the CEO level has successfully led everyone to believe that huge compensation is necessary to retain their special talent (talent which somehow exists even when companies have poor results); but then, executive boards are often staffed with other CEOs.

> So why are hardware engineers so much worse paid than software engineers

What they produce is far less consumable. They make tools or appliances (incredibly complex ones, but still). Software is ever growing and changing. Perhaps in an ideal world we could figure out what we need before writing the code, but it rarely goes that way. And since most businesses don't really understand their markets until they get into things, and consequently need to adapt and change to suit their market, the software has to adapt as well. The hardware isn't changing in most of these cases.

One could argue that software is a consumable in this regard. Certainly with the modern startup approach, software is built in stages and released at each stage, "finished" or not; then it is amended, updated, etc. Hardware just doesn't do this as much. Even with firmware updates (which are software...), it is often possible to ignore them and use what you currently have as long as it suits your needs.


should have a [2019] on it


Thanks! Year added by a volunteer year editor. If anyone else would like to be a volunteer year editor, let us know at hn@ycombinator.com.


(2019)


Thanks! Year added by a volunteer year editor. If anyone else would like to be a volunteer year editor, let us know at hn@ycombinator.com.


The moment you step away from the US-centric and Silicon Valley-centric view, you will find that programmers are paid fair wages, but not an extravagant 100-300k USD a year. For example, embedded programmers in Eastern Europe earn 30-60k USD a year. So it really depends on what you imagine "lower pay" to mean.


"Programmers" who work for non-tech Fortune 1000 companies probably won't see any reduction in pay, particularly if they've been with their employer for more than a few years. The 26 year olds who work for FAANG and adjacent big tech who have job hopped every year to get their "TC or GTFO" cred on Blind are about to see their dreams end in tears. Big tech comp packages from around 2017 on became completely untethered from historical reality.


Why is this downvoted?

Are we really at the stage where we simply downvote people who say things that are hard to hear?

Parent is right: there's an imminent correction, and it will be unevenly distributed, because so is everything else.


>Are we really at the stage where we simply downvote people who say things that are hard to hear?

This behavior is ubiquitous and has been for some time, though the frequency seems to have increased markedly of late.

It's a form of punishment.


I have the following argument, applying market principles to discourse.

There are four categories of claims:

1. Easy to hear and false

2. Easy to hear and true

3. Hard to hear and false

4. Hard to hear and true

The first category is the domain of spin doctors and marketers. Let's ignore it for now.

The second category encompasses a lot of facts, but generally these won't show up in discourse. Because they're easy to hear, they are already known to both parties and agreed-upon. Let's call these "common knowledge" or even (more tendentiously) common-sense.

The third category shows up in discourse as "things you want to make your opponent believe about themselves or their worldview." These are your psyops, basically.

Finally, the fourth category is pretty much every true thing that you need to actually point out. They need pointing-out because of a natural (and relatable!) human tendency to avoid discomfort. If they weren't hard to hear, you would have let yourself hear them already. Thus, most of discourse takes place here.

If the foregoing breakdown is correct, it also suggests that the *hardest-to-hear truths* are going to be the ones which are most significant, because of all the truth claims, they are the ones which have not yet been universally admitted.

HN's current posture toward discourse seems to encourage making hard-to-hear truths even harder, by literally fading them. And moreover, the guidelines prohibiting the discussion of our collective moderation decisions in-thread is itself a way of figuratively moving *even the awareness of the problem itself* from `color: #000` to `color: rgb(130,130,130)`.

This is why I now try to read the greytext first. For every one comment faded because it was offtopic or whatever, there are three which were faded for telling us emperors about our new clothes.


And this was written before GPT-3 etc. started producing… well, it's not amazing code, but I've certainly seen worse from professionals.

I was expecting programming (from natural language instructions) to be the last job that gets automated. Some of us and our jobs may yet survive to the final stage of automation, but I think the minimum bar for skill is rising, even if the market hasn't priced that in yet.

For those of us that remain in the sector, things could go either way; but for new grads? I remember a few years ago a story about software solutions in law that were automating the jobs that taught fresh law graduates how to be good at law, and I think even the current state of generative language models is sufficient to replace many fresh graduates in software.


This is what scares me. I predict that instead of developers salaries going down, there will instead be fewer low-hanging jobs for developers to pluck. We have an increasing workforce of web-stack developers (front-end JS, back-end JS, full-stack JS) that, as far as I can tell, do a LOT of the same stuff in the same language (JS) over and over again. I don't know when, but I am very confident that that skillset will be replaced by AI Software 2.0 in the relatively near future. That will lead to a massive reduction of jobs and a massive influx of SaaS type web-stack software. Obviously that means an increase in demand for AI developers (but only until that cycle repeats on them) and "prompt engineers" and project managers.

I've put time into learning a skillset that I genuinely think will be unnecessary in the near future and the solution is to prepare for a skillset that has almost zero fiscal value in the short term. That's a tiring thought, haha.

I could totally be wrong, but that's a future that I think is highly likely and people who think they can't be replaced are kidding themselves. I need to learn lower level programming ASAP.


Who tells GPT what to write? Who verifies what is written?

As long as you need a human in the loop, that human is the programmer. Programming will never go away, the job just changes.


This is true, a dev will always be needed in some way, but these are early days for AI code generators, and they will probably continue to get better and better. The result of this is hard to say, but at the least it will increase the productivity of each developer. Yeah, if a couple of developers can do the job of what used to take 3 or 4, then that will greatly decrease demand for SWEs...

... although, on the flip side, this could be another case of more powerful tools allowing us to build even more incredible software systems, creating new classes of products for consumers to use (for example, cloud providers like AWS have decreased the need for a business to have a large sysadmin team. This seems like it would reduce the need for system admins, but an argument could be made that cloud providers have allowed many small & medium-sized businesses to create web/phone apps that they wouldn't have in the past, increasing the need for developers and admins).


It has been the latter for the entire lifetime of computers. Personally, I see no reason for that to change in the near future. AGI will happen, but we will still want humans directing its efforts.


"Near" meaning what?

My prediction horizon for this stuff doesn't go past about 2030; not because it's definitely happening by then, but because by then too much will have changed one way or another.

"Directing" meaning what?

A director isn't the same job as a programmer. A natural language interface isn't the same thing as a programming language. "Computer" used to be a profession, a person who computes, a programmer is an interface (for now human) between a computer and a business goal.


Right now? In the context of me saying it's a substitute for a fresh graduate?

> Who tells GPT what to write? Who verifies what is written?

The same people who would previously be doing that to the fresh graduate.

How far we are from a project manager being able to do the same to a whole team, or a CEO being able to do away with a PM, I don't know, but I did explicitly suggest it may still be the last thing to ever get automated.


Copilot hardly replaces a human. It might replace making a Github search manually.


in 2015, Deep Dream hardly replaced artists.

Five years later, and Imagen/Dall-E/SD are on the cusp of blowing up the entire field.

Sure, the celebrity artists will be fine, but the twenty-somethings who took out 90K of debt to attend SAIC are in for a miserable five decades.

Coding, like illustration, will evolve, or die, or both.


In 2015, fully autonomous driving was a year away. We will see. Though I don't mind copilot automating simpler plumbing tasks personally. To replace human programmers, it'd have to replace mathematicians first.


Perhaps (though I'm not as certain as you about the sequencing), but my original claim was about junior devs, not all devs. How does fully autonomous driving AI compare to the motor vehicle accident rate of teenagers with less than 6 months of experience?



