I don't want to start a flame war about diversity, I just want to get the facts straight.
Anyway, the article as a whole is heavy on anecdote and light on data. There is an emphasis on outcome metrics, but the author doesn't bother to examine any data on productivity vs. age.
Asians, especially those from India, have a disproportionate presence in technology jobs. It's no different from any other ethnic industry affiliation. Irish were cops, Greeks own diners, and South Asians are huge in tech, with ethnic social networks that connect folks to jobs, just like any other tightly meshed ethnic community.
> The organizers know how male and white the tech industry is, so they make a special effort to recruit a diverse speaker lineup.
Yes, the author was using social awareness to segue into ageism, but it is still factually incorrect (or at least highly misleading) to state that the industry is predominantly "white".
It's worth mentioning that for 2017 leadership, the Google numbers are significantly different: 26% asian, 68% white (and 75% male). So the decision making is still quite white and male.
In case you missed the memo, the current cultural climate and trendy narrative structure demands that everything, no matter the topic, must include some reference to identity politics. Facts only interfere in this sort of narrative building, so please ignore your data and focus on emotional triggers instead.
As a female developer I really don't like the idea that I might be selected just to fill a quota.
Bonus points if it fits somehow into the very trendy culture war narrative.
My take is that older programmers who maintain their skills on new technology are worth their weight in gold. The financial trick is to pair them with younger developers to create an overall cost effective team. If both the older and younger devs go in with the right attitude, it can be wonderful for both the project and their respective careers.
I also think it's really, really dumb to push developers into management positions if they can't do the job. I'd much prefer putting younger managers with great management skills into the mix than just rotating older devs into that role.
Bottom line: software requires lifelong learning. If you can't or won't stay current, you're in trouble. If you can, then it's just ageism or a bad business model that's keeping you out of awesome projects.
A lot of older Python candidates are being passed over for younger ones. Sure, they might not be learning Rust, but younger Python developers are having an easier time getting a job, and they aren't necessarily learning Rust either.
: https://stackoverflow.blog/2017/09/06/incredible-growth-pyth... https://insights.stackoverflow.com/trends?tags=python%2Cjava...
Does it make sense to consider Python an old language if it's still a living language? The next version is coming out in October, and most new projects started today are running on versions released within the last three years.
This seems pretty different than something like COBOL, where it's technically still being developed but not in anywhere near the same way.
Python's really good at binding C code, which is one of the reasons you see it in so many machine learning and scientific projects (NumPy, scikit-learn, TensorFlow, etc.). I think that's what's given Python a lot of legs in the last few years.
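For anyone who hasn't seen it, the binding story really is that thin. Here's a minimal sketch using only the stdlib's ctypes (NumPy and friends ship compiled extension modules instead, but the underlying idea is the same): calling the C math library's `sqrt` straight from Python.

```python
import ctypes
import ctypes.util

# Load the platform's C math library (e.g. libm.so.6 on Linux).
libm = ctypes.CDLL(ctypes.util.find_library("m"))

# Declare the C signature: double sqrt(double).
libm.sqrt.argtypes = [ctypes.c_double]
libm.sqrt.restype = ctypes.c_double

print(libm.sqrt(2.0))  # 1.4142135623730951
```

Real scientific libraries wrap whole arrays at once for speed, but the same "thin shim over C" principle applies.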
Python, I'd argue, is very much a living language. It was introduced in 1990, so it's 29 years old. The last release was 5 days ago as I write this.
Java is also a living language. It was introduced in 1995, so it's 23 years old. The last release was on September 25.
Cobol was introduced in 1959, so it's 60 years old. The last stable release was 2014 (which is a lot newer than I would have expected). Cobol is a great object lesson, I think, as the parent comment calls out. I don't think many software engineers would consider it a "modern" language, but according to the Wikipedia article, as of 2012, 60% of businesses use Cobol. So it's far from a "dead" language.
Lisp was introduced in 1958, so it's 61 years old. I don't think many developers would consider Lisp dead, either.
Cobol's longevity seems to be tied to the fact that much of the legacy code encapsulates business logic, as opposed to UX logic (websites, mobile apps, etc.) There's more than a few lessons in there I think...
Whereas COBOL today is pretty much the same pile of stuff it was in 1960-something. It's not a vibrant language family with newer members. COBOL does not absorb new concepts.
There are new, young people learning COBOL, but only in order to become maintainers of legacy systems: job security in banking, finance and government sectors. Maybe a few have a perverse curiosity: they want to know what is really behind the dirty five letter word.
New people coming to a Lisp are coming for the Lisp itself. There aren't that many Lisp jobs involving new or legacy coding; new Lisp people learn and make new stuff from scratch.
There is simply no comparison.
I've had managers who were both technical and people-oriented. They both bring a lot to the table. But the older I get, the more I value having a manager I can go to who is good at managing optics and helping me deal with political or institutional roadblocks. In an ideal world, I'd have both.
But fundamentally, if you can't move the decision making power/responsibility/(+ to some degree compensation) out to non-traditional locations in your org structure, you'll naturally have to move those people towards the more traditional locations....
I also think it's really, really dumb to push developers into management positions if they can't do the job.
And it's also true of older programmers who maintain their skills in older technologies. See all the COBOL comments below. If history is any guide, competent Java programmers in 10 years will not be aged out, and will be compensated more than whoever is competent (regardless of age) at the current tech stack.
This has been known forever, yet it still happens - Why?
>> I'd much prefer putting younger managers with great management skills into the mix...
One reason is that it's impossible to tell whether a young manager has great management skills. It's hard enough to tell whether an experienced manager has strong skills.
As a result, technical proficiency is used as a proxy for management potential. It probably has a stronger correlation than picking a random person, but is not great. The new manager is then really unhappy, quits or is fired or eventually learns the new skills and is successful.
The real issue is that management is a pyramid, but IC is a ziggurat with, if you're lucky, an obelisk sitting on top of it.
People like to point to the principal staff engineer or whatever, but there's one of them for every twenty upper-mid level managers making the same money. And all those managers can take their skills to another company, but if an IC guru leaves the company where he made his bones he's probably destined for consulting (at best). Consulting has its upsides, but that's no more for everyone than management.
The tracks and levels are commonly referred to as the "job family architecture" -- don't be afraid to ask a recruiter for this information. Otherwise you'll never really know if you're being properly leveled. (For instance, I believe a Principal SE at Microsoft is roughly equal to a Staff Engineer at Google.)
>This has been known forever, yet it still happens - Why?
This problem exists in hardware departments too. My best guess is that they want managers familiar with the tech/industry/company. My second best guess is that it's a natural human tendency and hasn't been addressed systematically.
I do find that companies seem to know how to evaluate young programmers: just test them on their knowledge of your tech stack. Any young coder who is good and is working in that stack will know the basics. The bad ones won't.
For older coders, I'm not sure that same signal works. I've forgotten more frameworks than most coders will ever learn. I mostly don't care about programming languages, they're roughly the same and I can learn a new one in a couple weeks.
I know technologies like Rails and SQL, but I don't play with the new toys as much as I used to, because lately I would rather solve a new problem or build a new tool than learn a new tool.
But how do you evaluate a coder on their ability to solve problems? Much harder.
Even more difficult, how do you get a 24-year-old coder who only knows React and Firebase to evaluate a 37-year-old coder who has built web frameworks from scratch?
It's another world, and a lot of companies just don't bother.
Thankfully I have no desire to work at any of those companies. I solve problems. If a company can't hire for that I don't belong there. There are plenty of companies trying to hire people to solve actual problems, not just put coder butts in seats.
Yes, you can learn anything at a superficial level in a couple weeks. If you've spent the last ten years maintaining an ASP.NET WebForms app running on IIS, then within a couple of weeks, you can contribute code to a Go microservice running in Kubernetes and using Kafka.
And that kind of "basic competence, with the ability to learn more" is what hiring managers want out of a junior dev, so if you don't keep your skills up, well, great, you're as useful as a junior dev.
But what they want out of senior devs is people who know the ins and outs, who understand what to do and why to do it and can explain that to others, who can avoid the pitfalls of bad tech choices and misguided implementation patterns.
If you've got decades of experience on your resume, my guess is that you want to be evaluated as a senior dev. And if you're a senior dev who doesn't know modern tech, then... you're not that useful. Your encyclopedic knowledge of the WebForms event lifecycle doesn't give you any particular insights into the design challenges of a microservice ecosystem; your ability to tweak IIS for maximum performance won't help you write a Helm chart.
If you want to be hired as a senior dev, you need to understand how the modern world works, and all the posturing around "I could learn it in a week" or "I prefer to focus on problems" doesn't impress anyone.
So yes, it's helpful that people keep current, and obviously waiting for a new developer to get up to speed on the particular stack is always a cost (and it isn't a couple of weeks, for anyone). However, if what you really need is a senior person, it may well be worth that time.
This assumes the hiring manager understands how software is actually produced and maintained, not just what their current tech stack needs. It's surprising how often this isn't true, but a contributing factor is how mid-level managers are made, which is related to this discussion, I think.
You might be great at mentoring... but you have to know the stuff to mentor people in it. You might be great at communicating with stakeholders... but you have to understand the technical constraints of your system in detail to have that communication. You might understand principles of software design and debugging... but you still need to understand the particulars of these technologies to make your design or to focus your debugging.
And yeah, you can learn the tech stack, and a company that really needs a senior dev could do well to hire a senior dev with out-of-date skills and wait for them to get trained up. But don't plan on it taking three weeks, is all I'm saying. To really get to that point of fluency and domain expertise takes longer.
(And also, one of the things that senior devs are supposed to do is evaluate new technologies and figure out when and where adopting them can make sense. If you've fallen so far behind on learning new tech, then it's not clear if this is something you're able to do -- maybe it is, and you just worked for a workplace that stifled adoption of new tech, or maybe you're someone who'll just learn what you need and then stop paying attention.)
That's mostly not true. While there definitely have been advances, tech is also very faddy and repeats itself. Trends are often reactive, techniques repetitive. Someone who has been around for long enough to see some of these waves resolve themselves is in a much better position to evaluate current and upcoming approaches. Technologists also tend to be myopic, and view the things happening in their own tiny slice of the industry as "what's happening". There is a whole universe of interesting stuff on the go that you are at best barely aware of.
Personally, if I'm hiring senior technical people I'm much more interested in their continuing level of curiosity and engagement with something, rather than particular stacks.
So maybe you are worried that your candidate who has been doing ASP.NET WebForms for the last decade on IIS will have a hard time getting up to speed on your use of TypeScript and React... but if I find out that she's also been building Raspberry Pi infrastructure for fun and messing around with the Julia language to implement a cellular automata project, I'm not all that worried about her ability to get up to speed, even though none of that stuff will get used by the team.
But yeah, it doesn't take 3 weeks. It doesn't actually take 3 weeks for a new dev already well versed in your technologies, with rare exceptions. At least not to be firing on all cylinders.
The thing is, a huge number of teams really, really need a good senior dev. They often don't think so, and often erroneously think they already have some... but they don't. Instead they have people who had "Senior" (or "Staff" or whatever) added to their title because they'd been there for a little while. And every piece of work the team produces suffers for it. So yeah, it's typically worth the investment.
Now granted, that assumes you are good at identifying talented people. This is actually a big ask - and one of the issues surrounding the OP's questions really comes back to that. I think that a lot of software and technology development suffers fairly systematically from poor quality middle management. One of the symptoms of this is the "ageism" discussed here, but it's only one of many....
then those skills are still tied to your technical knowledge.
Focusing the hiring too much on that first 30% impact gets you in trouble.
I know developers in their 30s/40s/50s who are both great developers/architects and have current skills. Why would I settle?
Two other points: First, of course when hiring you want to find the “perfect fit”, but you almost never do, so you have to make tradeoffs. Hiring an inexperienced person with the “right” tech stack when you really need to bring experience into your team is a common anti-pattern. Secondly, as a hiring manager you need to properly understand the difference between “10 years experience” and “one year of experience, related ten times”
So if you only know old tech, I'm a lot quicker to believe that you're a "repeated ten times" person.
Also, most coding is maintenance. So yes, an experienced developer in some language is really worth a lot when you initially start out designing your system. But most of the work spent on that system will be maintaining and extending it and to be able to proficiently do that you need to:
1. Grok the problem domain, so you can connect the real-world / problem you're trying to solve with the code in front of you
2. Have a grasp on the limitations of your own system, which will come with experience as you work longer on it
Those things don't really count as languages, they appear to be a mix of development environments, software packages and architectures. Learning these things is critical and hard but there is a key mitigating factor. That factor is that really what needs to be learned is 'how we do things around here' which is typically company specific and needs to be learned by a new hire anyway.
My experience suggests new hires in cognitive roles (at medium-large businesses) take 1-2 years to get to the point where they can be proactively useful, purely from learning who is who, what the corporate history is, what has been tried before, why things are what they are, and what their role's problem domain is really intended to be by their boss. Compared to that set-up time, the cost of knowing or not knowing a technology is quite small.
> This whole "learn a new one in a couple weeks" mindset is toxic and needs to die.
It is true though. Learning a new programming language to the point you can write bug-free code and execute someone else's design is really easy. Learning how to solve a new problem is typically the hard part.
I 'learned' Python in 2 weeks. What I write looks a lot like C code, and I didn't make great use of the standard libraries, but it works fine. Compared to that, learning how to use a large new library for a new problem typically takes months.
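To make that concrete, here's a hedged illustration of my own (not the commenter's actual code): the same small task written in the C-flavored style described, then the idiomatic way. Both work fine, which is the point.

```python
def sum_evens_c_style(nums):
    # C-flavored Python: index-based loop, manual accumulator.
    total = 0
    i = 0
    while i < len(nums):
        if nums[i] % 2 == 0:
            total = total + nums[i]
        i = i + 1
    return total

def sum_evens_idiomatic(nums):
    # Idiomatic Python: generator expression over the sequence itself.
    return sum(n for n in nums if n % 2 == 0)

print(sum_evens_c_style([1, 2, 3, 4]))    # 6
print(sum_evens_idiomatic([1, 2, 3, 4]))  # 6
```

The first version is what two weeks of Python after years of C tends to produce; the second is what fluency with the standard library looks like.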
Not necessarily. Except for maybe Go, those are industry-standard technologies, and you could find someone who knew them and could add value almost immediately.
We are very heavily invested in the AWS ecosystem. The difference between hiring someone who has the same amount of useful development experience but no AWS experience and someone who does and how fast they can hit the ground running is stark.
The language and tools aren't the barriers anymore; the interesting parts, solving business problems and figuring out how to apply the tools, become the fun ones.
Once you've learned and used a half-dozen imperative languages day-to-day, there's very little new when picking up another imperative language. It's mostly mapping common concepts onto new syntax, plus some new features and maybe quirks/gotchas.
Silly little example: After an hour of reading a Go tutorial, and having never looked at the language before, I was able to diagnose a bug a co-worker had been struggling with for two hours, because I recognized the pattern from something similar in Django (a python framework). Now knowing what to look for, a quick Google search had the exact answer to the problem right in the first result.
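The comment doesn't say which bug it was, but a classic example of a gotcha that transfers across languages this way is late-binding closures in loops: it bit Go's loop variables for years (changed in Go 1.22), and Python has its own version. A hedged sketch, in Python:

```python
# Buggy: every lambda closes over the same variable `i`,
# which is 2 by the time the loop finishes.
callbacks = [lambda: i for i in range(3)]
print([f() for f in callbacks])  # [2, 2, 2], not [0, 1, 2]

# Fixed: bind the current value with a default argument.
callbacks = [lambda i=i: i for i in range(3)]
print([f() for f in callbacks])  # [0, 1, 2]
```

Once you've been bitten by this in one language, you recognize the symptom ("all my callbacks see the last value") instantly in another, regardless of syntax.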
I wouldn't be doing anything large or complex in that short a period, but absolutely can become useful with smaller stuff in that period.
And I say this with experience, though in a different context: years ago my manager introduced me to a new codebase by giving me a bug to fix. It was in a Django site, and I'd never done any Python before. Took about half a day to figure out what was going on (for comparison, nowadays it would maybe have been 20-30 minutes), and a bit more to fix.
But I did figure it out on my own, taking some work off of another developer who was focusing on more complex work. I would call that "useful".
I have no idea why all of the old folks (again I am in my mid 40s) consider it some great skill that excuses them from actually keeping up with what’s going on in the industry.
* My bugfixing example was indeed trivial, just the first thing to pop into my mind. It was mainly aimed at the use of the word "useful", which is woefully ambiguous.
* Sounds like we agree when talking about someone who already knows the technology stack.
* No matter how skilled the developer, there will be ramp-up time when they not only don't know the framework, but also don't even know the language.
* That ramp-up time being significantly shorter for a senior dev is what we're talking about, and how selecting for the specific stack like you're describing isn't necessarily going to be worthwhile - learning the domain is likely to take longer than learning the technology.
> Anyone can pull up an IDE in almost any language, step through existing code, pull up a watch window and fix a bug. That’s something developers can do with three years of experience.
Thanks for the compliment I guess; I was describing something from around 3-4 months into my first job.
And if a company is using a popular language/framework they can find senior developers who know the language and the stack. Why should they hire someone who hasn’t taken the time to learn both?
That ramp-up time being significantly shorter for a senior dev is what we're talking about, and how selecting for the specific stack like you're describing isn't necessarily going to be worthwhile - learning the domain is likely to take longer than learning the technology.
Not really. If someone has ever been exposed to a similar framework - take your typical MV* framework, for example - even a developer who's only been active for three years can easily transition.
I have no idea why old folks (putting myself in that category) think they can get by without keeping up. Do you really think that someone who has 15 years of experience but hasn't kept up with the industry won't be at a disadvantage against someone with 5 years who has? On the other end, you have people like me who know what it's like being behind the curve and vowed never to be in that situation again.
You can’t imagine how long it took me to get a 50-year-old who had been doing ASP.NET Web Forms for years to develop in ASP.NET MVC with server-side rendering, let alone client-side frameworks.
He was only eight years older than I was but didn’t keep current.
Moreover, if you're proud of how hard it is to learn ins and outs of your tech stack, you're likely using a shitty tech stack and costing your company tons of money.
But you can’t be proficient at an architectural level with a stack in two weeks no matter how good you are.
Would you say that you could be an iOS/Android developer in two weeks just because you picked up Swift/Java?
But to your actual comment: I'm learning new things all the time. My focus when I'm coding in my spare time is a simple IDE, some project management tools, and a simple game engine I am building. I learn new things in those projects literally every day.
I'm not going to do a bunch of hello world apps in random new "Pro" technologies. I did a lot of that in my 20s, it was fun and I was curious about them, but my interests have shifted.
There are enough jobs I can say "no" to the people who want only coders who are using the newest "Pro" toolset.
Honestly, a good hiring manager will realize you ideally want BOTH that person and people like me on a team. One person will know that Rails 7 solves our new problem, and the other will know that a simple refactor will save 30 hours of headscratching over the next year.
And this is where ageism rears its head.
If two people apply for the same position, and you're OK hiring someone who has only 2 years of experience for the position, don't ask for more from someone who's been working for 20 years.
Trust me: There are plenty of older programmers who do not expect a higher pay because of their age. In my career, whenever I hear "older people expect a higher salary", it has almost always been by employers/managers, not by actual older professionals. On the other hand, I've met quite a few programmers who complained they didn't get an offer or were told they were ruled out because they were "too senior", and the complaint almost always was along the lines of "I was not asked salary expectations".
I have another friend in his 40s who took a job that lets him work from home, and it pays less than he was making when we were working together.
I wrote one parameterized CloudFormation template that creates our build pipeline for any new service. Most of the time we can use the prebuilt Docker containers (maintained by AWS) for our build environment. We have one or two bespoke Docker containers for special snowflake builds.
And if you haven’t read my other posts, and I get accused of being young and inexperienced, let me just say that my first “internet based application” used the Gopher protocol....
But we really don't care about the lock in boogeyman. Out of all of a typical business's risk, lock in is the least of them. We use lambda where feasible, Fargate (serverless Docker) when necessary. Why waste time doing the "undifferentiated heavy lifting"?
Why not? I’m older than you and I have passed plenty of technical interviews on the $latest_tech.
Sure, you can learn a language in two weeks, but a language is far more than syntax. There is the ecosystem, frameworks, third-party packages, etc.
And honestly, I would be more concerned about hiring someone who built their own bespoke framework from scratch than someone who used an existing one. Building frameworks rarely adds business value.
If I were interviewing such a candidate, I'd dig into why they built a custom framework. The answer to that question could be quite enlightening, especially if they mention business reasons for doing so, or if they talk about what they've learned since then.
When I started my own business a decade ago, I decided that existing web frameworks wouldn't suit my needs and built my own. Though I built a framework that suited my needs extremely well, it was a massive distraction from my actual business (which wasn't even tech). In interviews, I freely discuss how my NIH syndrome contributed to my inability to get the business off the ground; I've found that it tends to go over rather well with interviewers.
I go to work to be on a team and pull in one direction and support the leadership and make hard compromises.
I write code at home to understand where the industry will be in 10 years.
If they're over 45, then they likely worked in the late 90s and early 00s. If they were like many of us, then they worked on at least some web applications back then. This was far before jQuery and even Prototype.js on the frontend. Java, PHP and Perl were commonly used on the backend (or where I lived, more often IIS with asp). But the start of the career of someone in their 50s and even 45+ was before any commonly-used framework in open source scripting languages used for web development (unless you consider PHP to be a framework).
Even Spring framework was only created in 2003 (I just checked to confirm). Symfony in PHP was 2005. And those took several years to catch on in their respective ecosystems.
Tons of companies had in-house frameworks back then. Who did you think wrote them?
Even if someone working back then never explicitly built a framework, they used design patterns that mirror today's frameworks. If not, they were writing spaghetti code (and to be fair, that was pretty common back then).
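For illustration (my own minimal sketch, not anyone's actual in-house code): the heart of many of those hand-rolled web frameworks was just a front controller dispatching on a route table, the same shape Rails and Django later standardized.

```python
# Minimal front-controller: a route table mapping paths to handlers.
ROUTES = {}

def route(path):
    """Register a handler function for a URL path."""
    def register(handler):
        ROUTES[path] = handler
        return handler
    return register

@route("/hello")
def hello(request):
    return "200 OK", "Hello, " + request.get("name", "world")

def dispatch(path, request):
    """Look up the handler for a path and invoke it."""
    handler = ROUTES.get(path)
    if handler is None:
        return "404 Not Found", ""
    return handler(request)

print(dispatch("/hello", {"name": "HN"}))  # ('200 OK', 'Hello, HN')
```

Swap the dict for regex matching, add a request object and a template step, and you've reinvented a good chunk of what in-house teams were writing in 2001.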
Based on your username and your assertion ("older than you"), you're what, 45? Have you forgotten what it was like back then?
But when he said he “made his own web frameworks”, I’m assuming he was talking about this decade.
I re-read the comment in question. Not sure why you would have assumed that. Someone who is now 37 likely started programming before the age of web frameworks.
I wouldn't call it a framework, because it's built of small independent pieces, but it's a similar scope to something like Rails.
I did start coding long before web frameworks existed though. :)
Granted, I hear what you're saying, but I'm not convinced your blanket dismissal is the right analysis without digging deeper on the individual details.
Agreed. However, if you're a software company you've already likely made all of those decisions. If I come work for you, I'm stuck with them. It doesn't matter if I like the package manager or the frameworks. That's just the background against which we work.
Our work is making our lives better, and solving the company's problems to the best of our ability, given all those constraints.
> honestly, I would be more concerned about hiring someone who built their own bespoke framework from scratch
Yes! I feel honestly quite nervous about sharing that aspect of my work, because I worry people will see me as out of touch.
But I am quite up front that I don't advocate using my toolkit in production environments. My professional advice is usually quite conservative. I tend to advocate for minimal disruption and using standard tools as they were meant to be used.
However, because I do that all day at work, I see all the cracks and warts in the architecture and when I go home I want to work on the future.
And even though you are correct "building a framework rarely adds business value"... that's not what motivates me. I do think there is value being created, and I will eventually be able to cash in on that. But I am doing it because I am interested in the future, and I want to work on tools that transcend the pointless busywork I have to deal with day to day, in the name of meeting quarterly goals (which TBF is crucially important in a business context).
I get direct enjoyment from working in my own little hobby world where those constraints don't exist, and that is enough motivation for me.
Let’s say you learn C# in “two weeks”. Is that going to be enough time to get proficient with ASP.Net MVC? Entity Framework? IIS? The built-in logging framework? The DI framework? Middleware? What if they are using Xamarin for mobile? Let’s say you did pick up Xamarin (which I haven’t).
Would you know the fiddly bits about doing multithreaded processing with desktop apps using Windows Forms?
It's almost like you're pretending the years between 2008-2012 never happened, but I'm sure that's not the case.
I learned my lesson. For the last 10 years I’ve kept my resume competitive with whatever the $cool_kids technology is and haven’t had a problem competing in the market.
I didn’t play in the Java space back then but there were plenty of Java frameworks (https://dzone.com/articles/what-serverside-java-web-framewor...)
Ruby on Rails was released in 2005.
Knockout JS was introduced in 2010.
Django was released in 2005.
There was also Mojolicious for Perl, released in 2008.
The things I've seen. *shudders*
Hum... My experience is that whatever framework a business uses in any widespread manner, I always have to build another one on top, because generalist libraries completely suck for productivity. Unless you are a single-app shop; then you can just hard-code everything.
I don't get how improving development speed does not add business value.
If there is a problem, they have to wait for a response from the one guru who wrote it.
Most architects who think their problem is a special snowflake are usually wrong.
A custom domain for a business would be something more than logging though. For example, perhaps you want your logging to include some regulatory compliance hooks. Or maybe you want to do logging that also serves as a persistence layer for undo/redo and you need that to be tied deeply into your presentation layer.
In those cases it's much more than just "log this" that's being abstracted, it's all of the business needs around logging.
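As a hedged sketch of what that could look like (the audit-id field and names here are invented for illustration), you can layer a compliance stamp onto the stdlib logging module with a LoggerAdapter:

```python
import logging

class ComplianceAdapter(logging.LoggerAdapter):
    """Stamps every record with an audit id so downstream compliance
    tooling can correlate log lines with regulated transactions."""

    def process(self, msg, kwargs):
        audit_id = self.extra.get("audit_id", "unknown")
        return "[audit:%s] %s" % (audit_id, msg), kwargs

logger = ComplianceAdapter(logging.getLogger("app"), {"audit_id": "txn-42"})
msg, _ = logger.process("transfer complete", {})
print(msg)  # [audit:txn-42] transfer complete
```

This is the thin end of the wedge; real versions grow retention policies, redaction, and persistence hooks, which is exactly when they stop being "just logging" and become business code the next hire has to learn.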
And the next developer can't get away from learning all that code. Either it's tied up as a nice internal company logging framework, or it's spread out as a bunch of concerns through the app. But either way, the newbie has to figure it out.
Again, my experience is that the personalization layer is very easy for a senior developer to pick up; maybe some unconscious architectural decision is biasing it, but I've never seen one have a problem taking my code. Junior developers, by their turn, just can't grasp it, which I consider a feature; they are better off just using the functionality until they get some experience.
>I've forgotten more frameworks than most coders will ever learn.
Forgotten knowledge is worthless
> I mostly don't care about programming languages, they're roughly the same and I can learn a new one in a couple weeks.
I used to think that, and it's possibly true, but the platform that comes with the language certainly can't be learnt in a couple of weeks, so I don't think this is as true as it used to be.
> I know technologies like Rails and SQL, but I don't play with the new toys as much as I used to, becuase lately I would rather solve a new problem or build a new tool than learn a new tool.
This is a real problem for you if companies are hiring for people that know the new toys.
> It's another world, and a lot of companies just don't bother...Thankfully I have no desire to work at any of those companies...There are plenty of companies trying to hire people to solve actual problems
That's great now, but in the future the job market might not be so tight and then people with your attitude might find it really tough.
You might be getting a mid-six-figure salary in your first job, but you probably ought to be socking it away because you can't expect raises.
Not really: you pick something you've forgotten back up much faster than something you've never seen before. It's all in there somewhere.
I do agree that testing for the ability to solve problems is hard and gets neglected.
To play devil's (or manager's) advocate - have you ever crunched the numbers on what it takes to pay you for a couple of weeks? That's not a trivial investment, especially considering the risk that you just thought you could learn a new language in a couple of weeks but were wrong.
(although I agree with the sibling comment — one doesn't really learn most languages in a few weeks)
Architecture and processes that were fine for one environment were horrible for the other.
In development terms, you can write FORTRAN in any language.
Let’s not talk about the one Java developer who thought he could pick up C in two or three weeks. It was ugly.
With an IQ test, the same way you evaluate anyone else on their ability to solve problems.
Better to hire for your particular skills. And younger is cheaper.
I think what's actually happening is some programmers, who happen to be older, don't keep up with their skills. Then when they interview, they bomb it, or talk about how they could do the same thing in an older stack.
That's great, but the company is really invested in the current / new stack, so it doesn't make sense to hire someone who does not have those skills. So they pass them up and find a candidate who knows the software the company is looking for.
But instead of reflecting on themselves and realizing they need to spend time sharpening their skills, the older candidates simply say the company was "ageist" and blame it on age discrimination.
Blaming others is generally easier than blaming yourself, I guess.
On the other hand, ageism is real. When I was younger I was told more than once by managers, straight up: this candidate is too old, find an excuse not to hire. And now in my 40s, when I walk into some companies for an interview, I can feel the decision has been made before I even start talking. Not everywhere, not even most places, but it happens for sure, and it's not even that subtle.
Making a real-time video chat client is now literally a programming exercise; it used to be the domain of multi-million-dollar venture-backed startups that were valued in the billions.
Real-time chat between users is now a feature that can be added to an application in a day, or even a few hours if you follow any of the myriad tutorials available on YouTube.
Go from writing a UI in C/C++ to writing it in any of the higher level languages. Assuming you don't get caught in some design pattern trap that results in massive code bloat, it is now possible to do things in hours that used to take days to weeks.
Heck, transparency is no longer "write some ASM" routines, it is setting an opacity variable!
Some tech stacks, such as those for making basic CRUD apps, may have actually degraded a bit, but at the same time someone who is only slightly technical can now drag and drop their way to an online storefront that is a fair bit more powerful than Viaweb was back in the day!
Heck, it is now possible to go from an empty folder to writing and deploying HTTPS API endpoints in under half an hour.
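For what it's worth, the "empty folder to a working endpoint" claim is believable with nothing but a standard library. The sketch below is plain HTTP, not HTTPS (in practice the TLS part is usually free because the hosting platform terminates it for you), and the route and response shape are made up for illustration:

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

class Hello(BaseHTTPRequestHandler):
    """Toy JSON endpoint: echoes the request path back."""

    def do_GET(self):
        body = json.dumps({"path": self.path, "ok": True}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the demo quiet
        pass

def serve_once():
    """Start the toy server on a background thread and return it."""
    srv = HTTPServer(("127.0.0.1", 0), Hello)  # port 0: pick any free port
    threading.Thread(target=srv.serve_forever, daemon=True).start()
    return srv
```

A few dozen lines, no dependencies; most of the half hour goes to the deploy step, not the code.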
App (not web!) updates can be deployed to users by uploading a file and having a client pull down the changes and seamlessly switch over to the new code next time the app is launched. Versus mailing out disks!
There are app stores that you upload intermediate code to and they'll compile apps to the appropriate architecture for a customer's device!
During 72 hours of coding for Ludum Dare, game developers and artists work together to create games as complex as a full-fledged commercial game would have been 25 years ago.
And of course, many corporate software engineering efforts continue to fail or go massively over budget for largely political reasons.
Likewise, moving from ASM to C to Smalltalk is a night-and-day improvement... that we made in the 1970s. The difference again is what hardware we get to run it on.
Video game construction kits existed in the 80s, and 80s GUIs looked pretty much the same as they do now if you ignore resolution.
Drag and drop application development was huge in the 90s, and the CRUD applications of the time weren't significantly different than now, other than they didn't run in a browser. HTTPS wasn't hard back then, and you could write a Perl endpoint in half an hour easy.
When it comes to things like playing video, it's easy now but I don't even really consider it programming. You're just installing software that does it for you.
Libraries and OSS and StackOverflow and various services have made a real difference. I'm not arguing that we haven't made progress. I'm just saying that I can see how some people feel that this year's incremental advance or retreat is not nearly as exciting if they've seen how the last 20 worked out.
Myself, I've gone back to using Ventrilo because I realized that I really don't care about having video.
Firefox Hello worked remarkably well IMHO, it was super simple to use.
The field is full of video chat clients though. Uber-Conf is popular, although I think the % of problem free calls I've had with them is less than 50%.
Zoom has worked well for me, no real issues.
Facebook Messenger, privacy issues aside, works well.
I've never done this in C/C++ specifically but I have to imagine that the GTK bindings are mostly language agnostic. Using a tool like Glade, you can build a GTK GUI with ease.
Glade was released 21 years ago.
Edit: Ah, it's actually named that and I found a lot of results for it:
This is a topic I'm very interested in (and as I've gotten older, my interest has only grown).
In ~20 years as a coder I've seen only a handful of overt ageist issues. I've seen plenty of ageist language and attitudes, and even more assumptions, but it's a lot harder to say whether any of that would actually make someone turn down a qualified candidate. I've known people who were both experienced and open-minded, as well as those who were experienced and closed-minded.
End result: No idea what to believe. I'm not comfortable assuming one way or the other, because either assumption is harmful if incorrect.
On the other hand, companies who engage in age discrimination are probably shit companies, so are you really losing out by them not considering you?
If they have the money to make my life easier, and I end up not getting it, then yeah, I lose.
If they have the experiences that make me a better dev, and I end up not there, then yeah, I lose.
If they aren't actually that bad, they just have this one area of ignorance, but I never get to experience all those other benefits, then yeah, I lose.
If I'm NOT in a hot tech center and this is one of the relatively few jobs that is available/is a step up, and I don't get it, then yeah, I lose.
I myself have nothing to complain about, at least not in my experiences to date, but in terms of considering the issue overall, I don't think it can be so easily dismissed. I recall plenty of older techies struggling after the end of Y2K and the dot-com bust when the market wasn't "If you want a job, you can get a great one", and I assume there are plenty of people in the above categories.
Except they don't "simply say that".
It's a fact of the industry that is backed up by (a heck of a lot of) experience and observation, unfortunately.
In that sense, it's kind of condescending to deny the reality of the (sometimes painful and frustrating) experiences many people are having in this regard.
My strategy is: live in the Bay Area so I can get paid like crazy, live a very frugal life much below my means, invest diligently, and hopefully have enough money to retire in my late 30s (in a much cheaper area, of course), so that when I'm told I'm too old, I can raise my middle finger and leave the scene with the few million dollars I saved.
I wanted to pop back in time and say thanks. Yeah, you didn't retire in your late thirties like you thought you would, mostly because it's harder to make FI cash than you thought. The crash of 2020 took a few years to get past, but hey, 45 is the new late thirties, as they say now.
The reason I'm thanking you is that you went ahead and struggled and saved anyway while things got tough, and now you are free. Just FYI, I texted my QUIT to the HR bot in the Philippines just before I came back to see you.
Btw, quitting is harder than you think. Just to let you know, I don't blame you for not quitting at 40 when you could have eked by, because they offered you 2x what you'd ever made to stay; yeah, the place was thalidomide for your soul, but then your wife (I'm not telling) got pretty sick and that cushion was worth it.
Also: take care of your health and sleep more. An extra ten years of long hours at the terminal has left me not as healthy as I’d like.
To be fair, I don't plan to retire at 39 or 40, as long as someone hires me with reasonable compensation (2x would be a dream!). I'm purely trying to protect myself against a foobar scenario where indeed, I become obsolete from the point of view of most employers, due to "old" age.
You're assuming that he would actually want to stay in that other place long enough for the house purchase to make financial sense. You have to own a house at least 5 years for it to work out, minimum. Moreover, in the big tech hub you're making more money, so if you keep your living costs low, you'll bank more cash than living someplace cheaper.
Curious, why? I am having no problem being hireable right now at all (literally just joined a FAANG recently).
As far as I know, it would be impossible to get the same pay I'm getting now anywhere else. Even after considering the high cost of living, the spread between income and cost of living is very high, and allows me to stash away a significant amount of savings every year, to pay for my future freedom. I really doubt I'd be able to save that much if I were working anywhere else.
What else would you suggest?
I don't own a house here in the Bay; I dump everything into very diversified index funds and some rentals out of state, since I'm not interested in staying in this place long term or playing million-dollar mortgage roulette.
The day I call it quits, my partner (she does well too) and I can move to Austin, TX, or anywhere else really (we're both immigrants with citizenships in really, really cheap countries), with our fat liquid assets, without having to worry about necessarily finding work in my 40s. At least that's my very optimistic plan; it might completely not turn out like this, or I might die tomorrow.
So it's really not that much, most people who get private or state pensions get a better deal than that from a cash-flow perspective.
Edit: sorry for the ninja edit. I read your comment too quickly.
1) If you keep it under your mattress it will be much less than what I described, because every year your nest egg will shrink due to inflation, so you'll only be able to take $25k non-inflation-adjusted, which is a big difference from my $25k inflation-adjusted (in 60 years, $25k will be roughly $150k at 3% inflation). My assumption is 0% real growth, not 0% nominal growth, which is what you'd get by keeping it under the mattress.
2) The 4% rule is based on a shorter retirement interval (30 years) than what I'm looking for (60 years). Try to go on firecalc.com and look for the statistical odds of 1M giving you 50k/y for 60 years: the failure rate is higher than the success rate, and that's based on historical data.
3) Yes, my assumptions are very conservative, but I don't believe index funds will return 7% nominal over the next few decades, the world is going to face too many problems in my opinion. That being said, pretty much all I have is invested in index funds despite my opinions (mainly because I wouldn't know where else to invest it, since both cash and bonds are sure losers to inflation), so in the best case I'll be pleasantly surprised.
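The arithmetic behind points 1 and 2 is easy to check. A quick sketch using the assumptions above (0% real return, 3% inflation, a 60-year horizon; the $1.5M figure is not stated anywhere above, it's just the nest egg implied by $25k/year for 60 years at 0% real growth):

```python
def real_withdrawal(nest_egg, years):
    # At 0% real return, the egg just gets split evenly over the horizon,
    # giving a constant withdrawal in today's dollars.
    return nest_egg / years

def nominal_value(real_amount, inflation, years):
    # What an inflation-adjusted amount looks like in future dollars.
    return real_amount * (1 + inflation) ** years

# $1.5M over 60 years -> $25k/year in today's dollars, which at 3%
# inflation is roughly $147k in year-60 dollars: the gap between
# "inflation adjusted" and "under the mattress" in point 1.
```

This is why the mattress version fails: the $25k you take in year 60 would buy only about a sixth of what it does today.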
Again, quite possibly I'll die much younger without even enjoying any of that freedom :-)
Also, Austin, while not as bad as other parts of Texas, is still in Texas.
I've lived in six cities on three continents (and I'm talking signing a lease, not staying for a month) and I can assure you Austin is no better or worse than any other metropolitan area regarding... wait, what metric exactly are you using here? Weather? Burrito size? Number of smug, passive aggressive west coast urbanites per square mile?
So, what's the disadvantage?
If I could pop back in time, it would be to (somehow, because I didn't back in 1992) recognize that the Bay Area was where things were happening, and move there instead of staying in Phoenix.
But here is where I am, and while I make a decent salary, I certainly am not retiring any time soon.
The whole age thing is standing in sharp relief in my mind currently, because things are slightly "up in the air" at my current employer; I want to stay employed here, and I'm sure my employer wants to keep me, but recent events may potentially cause the relationship to have to be dissolved if the projects and numbers don't line up. I can understand that from a business perspective - fair's fair.
But the prospect of interviewing yet again, 3 years after getting this position, is not something I look forward to. I have new skills I can bring with me that I didn't have before, so that's a plus, but I worry that the age thing may be a barrier.
I'll probably have to lower my salary ask just to have a chance, but I won't do that unless I don't get any traction after a few months of looking - should I have to do so.
Even with that, though, I tend to wonder if the question in the minds of those hiring will be something akin to "Why isn't he a team lead or a manager?" That's a title I don't have and have never been offered, possibly because I've never been able to stay at a company long enough to get to that level: either they go out of business or get sold 2-3 years after I join, or the company is just too small to have any way to advance you into such a position (I've found that I fit best in companies of under 50 employees, and the fewer the better).
Maybe if there's another time around coming for me, I should look into startups, though that seems like a bust from the get-go (i.e., find a startup in Phoenix that needs a 45+ year old software engineer; that's probably a very small pool, if it exists at all).
It is what it is, I guess. I am fortunate in that I am debt free, at least.
If I could go back in time I would have the done the bay area thing when I was young too. But, life is a bit of give and take - there are some good things that happened in my life not taking that course (even if they didn't all work out as planned). So you kinda have to appreciate those parts of life and accept the fact that you are maybe not in the financial situation you wish you were in.
The only guarantee we have in life is this moment :)
"the software engineering labor market can stay irrational towards older people longer than the 2 decades past 40 where I could stay effectively employable".
The company where I work administers a survey of all tech employees every year, then publishes the results as a series of graphs, which can be broken down at any branch in the org chart. The very first question is "how many years of industry experience do you have?" The graph for my team is, shall we say, bimodal. There's a bell curve between 1-4 years, and a spike at 30 years.
I've been building things for a long time, and I try to impress upon the other developers on my team (or in my org) that I've never, ever had an opportunity to build at the scale that they're being asked to build at in their first gig out of school. If I come off as negative, it's only because I can say with surety that X won't work, because it didn't work in the past. I expect to learn, with my team, whether we can make things work at today's scale.
There are times that I miss hours and hours of uninterrupted coding, and being inventive at small scale. My leadership would take me out to the loading dock and kick the shit out of me if I tried to contribute at that level.
My job is, essentially, to educate, and to slap the team off of local optima. To impress upon them that their customers are king, and that their customer base includes themselves and their teammates, paged out of bed at 3am. And to impress upon their management and the business that we really need to focus on improving operations, rather than adding features, from time to time. It is also to observe the progress of, and advocate for, the developers on my team.
I feel (at the time of writing... Talk to me on Monday afternoon) like I'm lucky that my employer points senior engineers at problems like this.
The expectation in the industry is that Senior people would be even better at it, because of their years of experience.
This, however, is not what I've seen on the ground. Most senior devs I interviewed, who were clearly smart and experienced, took much longer than new grads to solve most of the questions. Personally, I try not to hold it against them if they can clear the bar, but I wonder how many other people do the same.
Also, I know in music they do blind auditions, where people perform behind a curtain to avoid any biases (gender, race, etc.).
Maybe it's time to do something similar in software, since we already prioritize whiteboard interviews above everything else?
Edit: I am not advocating for Leetcode-type questions. I do believe that most companies hire like this (including my current company, where I have little if any say in the hiring process). So I'm just pointing out my observations regarding the status quo.
The fact that a person with years of experience solves an interview question more slowly than a fresh out of college grad should be a huge red flag that our interview questions are bullshit and don't reflect real world challenges.
Have a chat, discuss experience and see how well they actually understand the things they have put on their resume. Dig as deep as you can and see if they actually understand what they say they have done. Also, references are at least as important as the interview.
In either case, there are questions that have multiple difficulties of responses. The more experienced/capable they are, the more nuanced and complete their answers will be. For example, ask a Java person how GC actually works. Most people have very little idea, but the best talent knows a good bit more.
But there are other variables. I've known perfectly intelligent and productive coders who took months to build a really complex, well-architected system that didn't solve the business problem, and hence were a bad hire.
That's an interesting statement. Isn't it usually the job of product managers (or someone in management) to determine whether or not what's being done is actually wanted? It is not typical for developers to be brought on under the broad banner of "make us money please" and then set loose to email users and determine 100% of their own direction. Wouldn't the fact that the wrong program was being built have been caught in an informal "let me see how it's going" sort of status update?
Reality: piles of legacy code that requires deep insight into product, legal, and marketing details to understand and untangle.
Guess which one leetcode script kiddies will suck at?
The question isn't how much you should hold it against older developers, but how much you hold it against leetcode as a way of assessing developers.
Seems like basic logic to me.
For myself (note: am old), I'm just following the advice I've always given to other minorities that are discriminated against to various degrees. Which is: don't whine, and play the cards you were dealt as best you can.
The FAANGS may not, but many other employers can spot talent when they see it. Keep looking--there's a niche for everyone.
Which is practically infuriating to read, once you take even a passing glance at her easily findable profile.
If recruiters are really saying that about her -- this just shows (yet again) how utterly useless they are at what you would think would be their core function:
That is, identifying and promoting technical talent.
I'm not sure how I feel about it. Sometimes I just feel fortunate for the career I've had. I mostly wonder if I'll miss it or be glad to move on to a new chapter.
For me it's a combination of 1) I see the writing on the wall: they don't really want me in this profession any longer and 2) I'm getting tired of it anyway — increasingly less willing to (laterally?) "move my skills".
Begrudgingly learned Swift (I like it tho). Begrudgingly learned every new code management system that has come down the pike....
I'll switch careers and move on, go back to "service sector" pay scale.
Maybe I worry more for the current crop of programmers. Those of my generation did all right — had fun. The industry is changing and not necessarily in a direction that looks enjoyable.
One more protip. At a higher level, moving up further does not mean being a better coder. It means talking to people (often on different teams), thinking and generating ideas (and applying for patents), sometimes taking risks to prove those ideas, figuring out how to improve process, and generally communicating. If you believe you're not good at any of these, you can actually study and practice them and get better at them. I did, and it's working out for me.
This means, right now, half of programmers have been doing it for less than 5 years.
This is the primary reason older programmers seem oddly rare, 87.2% of programmers are less than 15 years into their career, 75% less than 10.
Hopefully this attitude will change as the valley ages and people will learn that a lot of software engineering principles don't change as quickly as the buzzwords do.
D&I benefits those trying to increase the supply of coders and, via supply and demand, reduce the rates. Older engineers are going to want more money, etc. Of course, a nice benefit of the movement (for those who want to exert downward pressure on wages) is that many D&I initiatives also tend to be ageist: accusing the previous generation of enabling bad behavior, stating that the majority of older workers are white/male anyway, etc. Older workers are generally more skeptical of D&I efforts. The D&I supporters have a penchant for getting people fired for expressing any dissenting views against the cause, even if the person expressing the dissent is an under-represented person.
For the ones trying to reduce labor costs, it kills two birds with one stone. They get an expanded pool of labor, plus the D&I crowd does most of the dirty work of pushing out older workers. Now rinse, wash, and repeat with importing workers from low-cost countries. The rich get richer and the rank-and-file engineers get caught up in the battles as either active participants or collateral damage.
It has very little to do with doing things that are morally right and all about money. The moral outrage distracts everybody and keeps the infighting going so that the labor does not organize and target the real problem.
"Keeping up with new technologies" is generally a euphemism for "uses a random set of frameworks I chose". Amusingly, devs who demand everyone learn their frameworks are often the same people who try to stop anyone at their company from using or doing anything genuinely new, because it would undermine their position.
This is, BTW, why there is so much talk about the "shortage" of software engineers right now.
This is a great point. I feel the same at work where a buzzword laden framework is more valuable than directly using a library underneath that framework.
In my experience a database is a database, even if you call it something else. Really, a database isn't too different from flat files with some helper functions to make rooting around in them easier. In turn, that's not too different from storing the information in RAM. Different ways to get at them, but in the end you have the same bits.
Containerization, Lambda, serverless: it's the same old code running in new ways. I don't know what value to edit in the config file, but that's just a Google search away. At the end of the day it's "I need to hit this endpoint and get this back"; same story, different cast of characters.
I don't mean to humblebrag or anything but it seems like a non-issue to me. I understand how computing systems work generally and the laws of physics impose some hard limitations that sort of massage any solution to look similar enough to any other that it isn't that hard to figure out what is going on. Sure I'll lack some of the deep knowledge that someone who has focused solely on these set of technologies may have but honestly most of the programmers I meet don't have that to begin with.
I'm 34, so it might become different when I'm in my forties, but so far it's not getting any harder. Your summary below is spot on.
> I understand how computing systems work generally and the laws of physics impose some hard limitations that sort of massage any solution to look similar enough to any other that it isn't that hard to figure out what is going on. Sure I'll lack some of the deep knowledge that someone who has focused solely on these set of technologies may have but honestly most of the programmers I meet don't have that to begin with.
Additionally, there are different levels of knowing a language. You can write perfectly workable programs in C without ever touching function pointers. So what's your level of mastery?
And then there are more difficult languages. Have you tried C++ or Scala? Ever explored the esoteric intricacies of Java's type system? Lisp? Objective-C? Kotlin? Ada?
Ok, that's languages. How about parallelism? Massive parallelism? Tight resource constraints such as limited memory? Error handling? Modern language and compiler design?
I'm not trying to say you need to know all these things to be a decent coder. My point is there are many harder problems out there and maybe if you're bored you might want to try working on some of them.
I'm 40, and it's easier than ever.
I do get a bit pissy now when I have to spend a week learning some technology that I end up using for 15 minutes to do one thing, and then discard, but that's about it. (That has, unfortunately, happened.) But otherwise, it's easy. I've done non-trivial development now in languages that I don't even properly know, I just know all the surrounding languages and how to Google fast enough to make up for the rest.
I will also get a bit pissy when I do one project over here and they're all in Ansible, and I do another project over here and it uses Salt, and then I tweak something over there and it's in Puppet, and then I have to help out over there and they're in Terraform + Docker + some home grown Compose thing that predates Compose, etc. That general category of things was a legit advancement and I prefer using them to what came before (in my world, nothing but precious snowflake servers), but I'm frustrated that I have to keep relearning the easy "surface syntax" stuff over and over again for what is almost the same exact task.
I will also admit to a bit of impatience with the "re-inventing the wheel" phase of new technologies. For a recent example, the initial AWS "serverless" stuff (dumb name, inevitable idea) was written with like zero attention paid to the basic needs of source control, or the utter inevitability of how your service that was designed to have like 10 5-line scripts in it will be used to implement the next Amazon, so I stayed far away from it. Even now I'm not sure it's all that great. Docker I'm still not all that up on for somewhat similar reasons; for all the "innovation" in it it took years to be able to do basic things halfway properly. I'm definitely kind of done riding the cutting edge in the sense of chasing down communities that I can look at and tell are just going to be chasing their own tails for the next couple of years.
Sometimes a tech community comes out led by a person or a team that has experience, though, and when that happens I'm all over it.
Unsurprisingly, computer programming isn't such special and amazing stuff that, almost uniquely in all the fields of human endeavor, you're worse after doing it for 20 years than you were after doing it for 2. Other than cases where the body itself just breaks down, like professional-level sports, almost nothing works like that. I think it's a historical accident of some very unique circumstances in the 1990s that have lead anyone to believe programming is different. I think the idea has gone a long ways towards dissipating already, but has a long way to go yet, too.
So far, I've not seen any pattern to back this up. I've worked with a few coders in their mid-20s who knew PHP or Java very well but refused to learn a front-end framework or Node.js. They acted as though the burden of learning a new stack, when they'd already learned what they knew from school, was too much to ask.
I'm in my early 30s and have a strong distaste for Electron, but I know a mid-40s coder who loves Electron and would never code a desktop app any other way.
I find myself reading about Smalltalk and NeXTSTEP and am continually impressed by the solutions of the generations before me (and their respect for their users' resources), while the coder in his 50s I've worked with loves that we live in a time where trading RAM and CPU for easy maintainability is the norm.
What bothers me isn't just the ageism, but also the absolutism based on anecdotes. It seems to me that the same person who would dismiss an older coder based on age, is also the kind of person who would, say, "never hire someone with C# experience even if they have Python experience, because we use Python and anybody with C# experience is slow and inflexible" (real thing a manager once said) or "never hire someone who prefers to use Sublime Text because we use a 'real IDE' like Netbeans" (again, thing a manager once said).
Hire the right man or woman for the job. Pick the right tool for the task at hand. Be open to new technologies and use them for their strengths, but also learn from the good and bad of the past. None of this seems that difficult to me, yet it is a continual problem in the field.
When you come out of college, you might be pretty close to the state of the art because you've been doing nothing other than getting to that exact point. Then it suddenly gets harder, because most of your time is spent working on whatever technology you already have instead of ramping up on new ones (plus eventually family and such).
Sure, you can try to weave as much learning as you can into your job, but it's almost never enough. Most people will end up getting further and further behind, or more and more specialized. The harmful effect of the first becomes apparent quickly; the second can actually be quite lucrative for a while until suddenly it isn't. Either way, at some point your knowledge becomes pretty severely devalued until you go back to spending near-full-time to catch up.
It's hard to blame employers for favoring newer knowledge over older, but the effect on older workers is still pretty profound.
For context, I'm 37. I watch conference talks on topics I find interesting, but they're often just mildly informative; definitely not the type of thing I'm going to travel halfway across the planet to see live. A lot of the time it seems to be more about making a name for the speaker or the company they work for. The number and prominence of tech conferences seem to have exploded in the past few years; I don't remember hearing about these at all when I started out (mid-00s).
In my opinion, this is true, but not a problem. The reality is that it's much easier to have a large impact as a manager than as an individual contributor. I have worked with L5's, L6's, and L7's, and honestly I couldn't tell the difference between them. That's not to say the higher-level people didn't deserve to be higher level, as they had more accomplishments, but I personally didn't feel like they had much more impact, if any.
For all but the largest and most complex projects, there aren't really going to be sufficiently difficult problems for it to make a big difference. I don't think I've ever been on a project where, at any level of difficulty, there were more difficult problems than engineers capable of solving them.
> Based on interviews with a half-dozen programmers, it is clear to me that companies should create a qualitatively different role for their most senior individual contributors. Candidates for such roles would be judged by their past effectiveness, the same as managers are, not by a fast-churning checklist of skills. Greater clarity would mean engineers could climb the ladder faster, and the prestige and renewed intellectual challenge of each level would keep programmers motivated into their fifties and sixties.
> Proven engineers who occupy the most senior roles should be deployed to solve the hardest problems on the most critical projects.
At least at Google, being judged based on past effectiveness and being given the hardest problems is already true. Or rather, higher level engineers have greater independence to be able to choose what problems they'd like to solve.
People who are good at development can't be determined by any outside criteria -- you have to talk, test, and learn about them. At my company, we have developers from early twenties to early sixties. We don't care about any race, gender, background, anything -- all of it's immaterial to whether you can create and code. Join us: https://www.datakitchen.io/company.html#hiring
How dumb or gullible does one have to be to actually think this? I'm betting it's mostly the twenty-something-year-olds who would subscribe to such stupidity.
Let's just accept that there is no track for programmers, period. There are no promotions beyond senior at most companies. You peak in your early 30s and then continue to work at about the same compensation and level for the rest of your career. A few people can find more interesting work, but it will require jumping to different companies and a lot of self-training. Unless it's at FAANG, all that work will be for an extra ten, twenty, or thirty thousand dollars. And raises? I've never heard of such a thing in any industry these days.
The best option is to keep on top of your skills, find a job working 25-40 hours a week, possibly remote, and find some meaning in other parts of your life. Work isn't everything. It's not even the most important thing.
There are and will be plenty of jobs for older coders. It's a new profession, but older coders who stay on top of the tech are still plentiful. You just need to know where to look. Also, many people don't make it as coders, period. If there is a shortage of old coders, it's because people drop out when they can't make it. Even better for the rest of us. As if being in your late thirties is old. What bullshit.
However, I have several friends around my age who never bothered to learn anything beyond C++, and they are definitely struggling. They're either stuck in jobs because they're the only person who knows how to recompile some ancient module, or they're out of work and freaked out about the future.
Ageism is real, but so are some of the stereotypes. Older programmers are often going to be more resistant to change, and can be more sure of themselves even when they're wrong. The flipside is they often have forgotten more than a young programmer even knows.
Present-day Valley hiring practices seem to optimize for general aptitude, and downplay or disregard the value of experience.
There's also the undeniable fact that older people in general have built up more obligations and responsibilities, and are often more skeptical of the value of equity because they've been at a few startups and their options never amounted to anything. Therefore, they are going to prefer stability and cash over the promise of future rewards.
If you're 50 and you've spent many years drilling down through increasingly deep scalability and throughput challenges, that will increase your value beyond what any fresh-out could provide.
If instead you spent your time learning a new JS framework each year and are proficient at 30 of them by age 50, good for you, but good luck finding a job.
That's not to say that the specialists have it easy, but they do have an advantage here.
In the first case, I didn't particularly mean specializing, but continuously doing things you really don't even know you can do, continually pushing against what you think is possible for yourself: create a JS framework, for instance, even a terrible one that you abandon after a year before moving on to a completely different project, but one that requires you to dig and research and stretch your brain. The second case is more that you know it's possible and you're just following a tutorial to get up to speed on the techniques; someone else did the "hard" work, and you're not really getting anything out of it.
The first gives you skills that age well: you'll learn lots of gotchas about the internals of how JS (and presumably other language runtimes) work, which will help in lots of different positions. The second just gives you a line item on your resume until that fad is gone.
Prior to this I'd been doing freelance work, and there it's more difficult. Especially if you don't have any particular unique skill, a tight relationship with a solid client, or a high level of marketing savvy, it's hard to compete with kids who are willing to do things at half the price. And I think that's fair: even though I consider myself a very solid coder, most consulting gigs I've worked on could have been handled by someone less skilled and worked out just fine.
To me, the future isn't very bright for anyone who just wants to be an "employee" for the rest of their life, coder or not.
OTOH, the manager path is dangerous in recession times. You lose a lot of "hard" skills, and if you lose your cushy manager job at company A it isn't necessarily obvious to company B what value you have when things are tight.
Depends on where you work. If R&D is core to the business, there will be a non-management ladder and plenty of near-retirement greybeards around the office.
Note: many fashionable names in "tech" don't fit that bill.
OTOH if you learned a little bit about various things, it's very likely that one of them will be that next fad, or its precursor. So when it happens, it'll be that much easier to learn.
So it's a balance. Going either too deep or too broad can backfire.