What’s really happening is two things. First, companies typically don’t like paying people lots of money (even if they are great at blowing millions on other mindless things). Second, these companies do not understand that their unwillingness to hire expensive people is causing months of bug-chasing and monkey-patching in their products.
When your organization is under the impression that “a developer is a developer”, new college grads willing to work for peanuts are very attractive. It takes a good manager to understand that someone’s decades of experience or advanced degree really is worth a lot more money, and not just in the long run. Experienced people have seen more programming constructs in more languages, they have encountered more examples of APIs in more libraries, they have made more mistakes and learned from them, they are more likely to be able to apply suitable algorithms and data formats to problems, and so on. Also, having experienced people on staff actually gives your new developers somebody to learn from.
These companies would NEVER hire an older, experienced developer because they know they wouldn't be able to retain them.
It's worth noting that I work on a dev team where the average age is about 40 (I am 27). We have multiple members of the team nearing retirement. Having the older guys around makes my job much easier and gives me resources that I wouldn't have otherwise. We are able to keep these guys around because we work on interesting projects, don't work more than 40 hours a week, and have a great manager.
Not only that, but they don't have to retain them. There is a large supply of "~20 year olds who aren't quite Google material" lined up to take on those $80K developer roles. The "churn and burn" strategy seems to be working for these companies, so why change?
People only care about how long it takes to write something from scratch. That dictates everything, from which frameworks or languages are "hot", to which methodology people use, how projects are managed, and....who gets hired.
There isn't much argument that someone straight out of college with little experience who happens to know the latest trendy thing will probably pump out a semi-functional MVP faster than someone more experienced. More energy, more willingness to work late hours at home because of fewer responsibilities, but also less experience, which means less time spent thinking about the real problems that could come down the road and how to prevent them.
That lets someone code really, REALLY fast. It will blow up down the road, and then that's just "normal": it goes in the bug queue, and tech debt spirals out of control.
I was once in a meeting where someone said, "Well, of course this code is buggy and sucks: it's at least a year old!!!"
I was floored.
Many SF tech startups overtly believe that actually planning your code before you write it is just, well, not very "agile" -- that's old school, that's waterfall, proven to be deficient, that's not how we do things in the new world. Planning an architecture before coding it?? Why, you may as well be wearing a pocket protector and hating on women and minorities from some massive IBM cubicle farm in the suburbs while chainsmoking and punching your code into paper cards.
In the new world, you break things and move fast -- when you're assigned a massive new project, you just sit down and start typing. When your unplanned, non-architected code turns out to have serious structural issues that make it impossible to extend or maintain, why, you just layer another level of crap code on top of it to plaster over the cracks, and another on top of that, and so on.
I wonder how much of this has to do with the churn-and-burn VC culture, where they fund hundreds of startups with just enough money to build an MVP so they can "test for market fit," then burn 99 of them and move on to the next batch.
Peanuts being a ways north of 100k a year at the bigger companies.
> advanced degree really is worth a lot more money, and not just in the long run.
Are we talking PhD?
Agreed. I started working in tech when I was 22 -- now I'm over 30. I've never witnessed old-age discrimination. What I have witnessed are start-ups built on newer tech (AWS over on-prem, NoSQL over relational, etc.) passing on candidates whose last reference book was The Data Warehouse Toolkit and who still want to be called WebMasters.
The law of the tech industry is that when you stop learning, at any age, you lose market value.
You can always catch up and learn something that becomes important or mainstream, but you can't un-waste the time you spent deep-diving on a dead-end technology.