The Planned Obsolescence of Old Coders (onezero.medium.com)
269 points by coffee on Mar 8, 2019 | 349 comments

I wanted to like this article, but I was annoyed by the second sentence, which asserts that the tech industry is extremely white and male. The "male" part may be true, but the "white" part isn't, and that fact is well known to anyone with even a passing familiarity with the industry. The big tech companies are actually LESS white than the US as a whole; it's Asians who are vastly overrepresented. In 2017, Google's tech hires were 47.1% Asian and 42.1% white. The 2018 Facebook tech hires were 51.3% Asian and 42.7% white. This compares to the overall US population which is 72% white and only 5% Asian.

I don't want to start a flame war about diversity, I just want to get the facts straight.




This reads like you seized on one sentence, stopped reading because you didn't like what was said, and dismissed the whole article. That's not fair IMO. I read the whole thing, and the article uses the first sentences to set the backdrop for the author's argument that older engineers and programmers are forced into early retirement, and that companies both inside and outside the tech world make rookie mistakes that could be avoided by treating engineering as a lifelong profession.

Alternating between complaints about White overrepresentation that aren't even true, politically correct terminology like "People of Color", and an entire article that reads as a love letter to an underrepresented group is just jarring. It makes me wonder how informed the author is about tech if he thinks Whites are overrepresented generally. There's either a lack of accuracy or a lack of moral consistency. Why the hell is the author even bringing up his poorly informed opinions on racial topics in an article about ageism anyway?

Anyway, the article as a whole is heavy on anecdote and light on data and facts. There is an emphasis on outcome metrics, but the author doesn't bother to examine any data on productivity vs. age.

The problem with your argument is that there are other diversity initiatives in the programming world which are mainly gender- and race-focused. That is undeniably true. There hasn't been the same thing at the same scale for older programmers. In fact, there have been many articles on this very forum about how many tech companies filter out older engineers when searching for talent. It's not the first time this topic has been brought up. IBM was actively phasing out older workers and long-term workers. Let's not act like the data isn't there to support the claim, even if it is just anecdotal.

I didn't really see anything substantiating a claim that, e.g., 30 years of engineering experience in tech is more valuable than 20. It was just an assumption.

It's also very disingenuous to use these four oversimplified categories. There are no end of people who appear white, but don't personally identify as such for a litany of perfectly reasonable factors. And how much meaning does "Asian" have, when it's expected to somehow include two billion-person countries with vastly different genetic and cultural backgrounds?

That does not take away from the point that the industry is not as overwhelmingly "white" as people try to portray it.

You're right that it's a lousy set of categories, but I don't think it's disingenuous. Those categories roughly match the ones that large companies in the US have to track for their EEO-1 reports. I can't blame them for not doing more than is required of them for the sake of statistical accuracy.

I'd be interested to see aggregate demographics for software engineering, but I don't think two primarily-SV companies accurately portray our industry/field.

Comparing to the US population is a touch disingenuous. Those are California-based companies, and California has a much larger number of Asian residents than the US as a whole thanks to its proximity to Asia (15% as opposed to the 5% you quoted for the US). They're still over-represented, but not at 9.5x the base rate.

IMO, that’s not disingenuous when talking about an industry and using two well known companies as an example.

Asians, especially from India, have an out-of-proportion presence in technology jobs. It is no different from any other ethnic industry affiliation. The Irish were cops, Greeks own diners, and South Asians are huge in tech, with ethnic social networks that connect folks to jobs, just like any other tightly meshed ethnic community.

I made it pretty clear that I agree they're overrepresented, but quoting the wrong base rate makes it look like a much bigger deal than it is. I work at a NY-based rather than CA-based company, and while the same is true there, we see a lot fewer Asian people in tech than the nearly 50% that West-coast firms claim.

It depends on what you’re doing. Some verticals in my neck of the woods are 65% or more Asian, mostly because the candidate pool is tight and often requires relocation.

FWIW I didn't get that impression. Seemed to me like the author, conscious of the women's holiday today, used the concept of social awareness as a segue into ageism.

From the author's second sentence:

> The organizers know how male and white the tech industry is, so they make a special effort to recruit a diverse speaker lineup.

Yes, the author was using social awareness to segue into ageism, but it is still factually incorrect (or at least highly misleading) to state that the industry is predominantly "white".

I'm looking at that Google report link and it says, for 2017 Tech, 39.2% asian, 53.2% white, whatever that means.

It's worth mentioning that for 2017 leadership, the Google numbers are significantly different: 26% asian, 68% white (and 75% male). So the decision making is still quite white and male.

> I wanted to like this article, but I was annoyed by the second sentence, which asserts that the tech industry is extremely white and male.

In case you missed the memo, the current cultural climate and trendy narrative structure demands that everything, no matter the topic, must include some reference to identity politics. Facts only interfere in this sort of narrative building, so please ignore your data and focus on emotional triggers instead.

While those two companies are large employers, does their labor force make up the majority of developers in Silicon Valley? If not, then that's two companies who have done well with upsetting the old balance, but what about the rest?

The organisers care about the colour of your skin and the type of genitalia you possess, not so much your technical accomplishments.

As a female developer I really don't like the idea that I might be selected just to fill a quota.

I refuse to read or react to articles hanging out troll-bait so that the social trench war may creep to their page and bring in that sweet ad money.

Non engineering departments in tech will be white male majority.

Is that a quantified thing? I've only worked at 3 tech companies, but that was not at all my experience. Engineering was the only one that had predominantly men.

Of course not, but internet comments sections are begging for absurdity and falsehood.

Bonus points if it fits somehow into the very trendy culture war narrative.

For context: I'm VP of a software consultancy responsible for delivering software on time and on budget.

My take is that older programmers who maintain their skills on new technology are worth their weight in gold. The financial trick is to pair them with younger developers to create an overall cost effective team. If both the older and younger devs go in with the right attitude, it can be wonderful for both the project and their respective careers.

I also think it's really, really dumb to push developers into management positions if they can't do the job. I'd much prefer putting younger managers with great management skills into the mix than just rotating older devs into that role.

Bottom line: software requires lifelong learning. If you can't or won't stay current, you're in trouble. If you can, then it's just agism or a bad business model that's keeping you out of awesome projects.

In the article they mention a Python conference. I think Python is a good counter-example. It's an old language and still as strong as ever. On Stack Overflow, it recently passed JavaScript [1]. Unlike JavaScript, new Python code isn't all that different from old Python code.

A lot of older Python candidates are being passed over for younger ones. Sure they might not be learning Rust, but there are younger python developers that are having an easier time getting a job, and they aren't necessarily learning Rust either.

[1]: https://stackoverflow.blog/2017/09/06/incredible-growth-pyth... https://insights.stackoverflow.com/trends?tags=python%2Cjava...

I'll be 50 soon and I've been dabbling in Rust. Not to improve my skills or any of that, but simply because my 40 years of coding experience tells me it really is awesome. I have some specialization that will be good for some time to come. When I read these ageism articles I just keep shaking my head. The last company I worked at had tons of older devs. You startup folks must live in a fucking bubble. (That was for effect, I love HN, in case you oversensitive millennials can't tell)

So did you experience ageism or not? I couldn't tell from your comment.

No. Not even a little. My post was a bit unclear. One company I worked for has lots of older guys and it was clearly good for their products.

> It's an old language and still as strong as ever.

Does it make sense to consider Python an old language if it's still a living language? The next version is coming out in October, and most new projects started today are running on versions released within the last three years.

This seems pretty different than something like COBOL, where it's technically still being developed but not in anywhere near the same way.

It's also worth noting that recent python versions have added tremendous quality of life improvements. If you haven't kept up with the language, then the code you're writing may still run, but it's not "pythonic" anymore.
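
For the curious, a quick sketch of a few of the 3.6/3.7-era additions that make new code read differently from old code (not exhaustive, and the version numbers are from memory):

```python
from dataclasses import dataclass
from pathlib import Path

# f-strings (3.6): inline expression formatting instead of % or .format()
name, version = "Python", 3.7
greeting = f"{name} {version}"  # "Python 3.7"

# dataclasses (3.7): value classes without __init__/__eq__/__repr__ boilerplate
@dataclass
class Point:
    x: int
    y: int

# pathlib (stabilized through the 3.x line): object paths, not os.path string-munging
config = Path("/etc") / "hosts"
```

Code that still builds strings with `%` and classes with hand-written `__init__` will run fine, which is exactly the "runs, but not pythonic anymore" situation described above.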

I haven't touched python in years. Can anyone link to a list of examples? I remember hearing about optional typing, which is always my favorite addition.

Python is indeed getting better and better, but I personally wouldn't include optional typing... yet. Mostly because optional typing in Python is just for tooling, not for the compiler or the runtime [1]. Python is one of my go-to languages, but I haven't found a compelling reason to use type hinting yet.
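
To make the "tooling, not runtime" point concrete: annotations are carried along as metadata but never enforced when the code actually runs.

```python
def add(a: int, b: int) -> int:
    return a + b

# The interpreter ignores the annotations entirely: this "wrong" call runs fine.
result = add("type", "hints")  # -> "typehints"

# The hints are stored as plain metadata for tools like mypy to inspect.
annotations = add.__annotations__  # keys: 'a', 'b', 'return'
```

Only an external checker such as mypy would flag the string arguments; CPython itself happily concatenates them.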

Python's really good at binding C code [2], which is one of the reasons you see it in so many machine learning and scientific projects (NumPy, scikit-learn, TensorFlow, etc.). I think that's what's given Python a lot of legs in the last few years.

[1] https://stackoverflow.com/questions/41356784/how-to-use-type...

[2] https://stackoverflow.com/a/10202569
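
The quickest way to see that C binding in action is `ctypes` from the standard library (a sketch; the big scientific libraries use C extension modules or Cython rather than `ctypes`, but the principle of calling into compiled C is the same):

```python
import ctypes
import ctypes.util

# Locate and load the system C math library (libm on Linux/macOS).
libm = ctypes.CDLL(ctypes.util.find_library("m"))

# Declare the C signature: double sqrt(double)
libm.sqrt.argtypes = [ctypes.c_double]
libm.sqrt.restype = ctypes.c_double

root = libm.sqrt(2.0)  # calls the compiled C function directly
```

No compiler step, no wrapper module: Python talks to the shared library at runtime.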

Not if being old has a negative connotation, or if being old implies that they have no new ideas, neither of which should be true, of technologies or of people.

> Does it make sense to consider Python an old language if it's still a living language?

Python, I'd argue, is very much a living language. It was introduced in 1990, so it's 29 years old. Last release was 5 days ago as I write this [1]

Java is also a living language. It was introduced in 1995, so it's 24 years old. The last release was on September 25. [2]

COBOL was introduced in 1959, so it's 60 years old. [3] The last stable release was 2014 (which is a lot newer than I would have expected). COBOL is a great object lesson, I think, as the parent comment calls out. I don't think many software engineers would consider it a "modern" language, but according to the Wikipedia article, as of 2012, 60% of businesses used COBOL. So it's far from a "dead" language.

Lisp was introduced in 1958... 61 years old. [4]. I don't think many developers would consider Lisp dead, either.

Cobol's longevity seems to be tied to the fact that much of the legacy code encapsulates business logic, as opposed to UX logic (websites, mobile apps, etc.) There's more than a few lessons in there I think...

[1] https://en.wikipedia.org/w/index.php?title=Python_(programmi...

[2] https://en.wikipedia.org/w/index.php?title=Java_(programming...

[3] https://en.wikipedia.org/w/index.php?title=COBOL&oldid=88324...

[4] https://en.wikipedia.org/w/index.php?title=Lisp_(programming...

I consider the 1958 Lisp to be dead. The Lisp family's longevity doesn't stem from maintaining legacy code from 1960, but from the fact that it embodies attractive concepts that continue to captivate new generations of coders, often not in connection to anything mundane, like a job.

Not ragging on Lisp here in any way, but it's been a long time since the original Lisp was viable. It retains its historic significance, and anything that calls itself Lisp still has cons, car, cdr, atom, lists ending in nil (which is a symbol), and such. But in 1958, you had no macros, no lexical closures, no exception handling, no backquote, no objects or structures, no bignum integers, ...

Whereas COBOL today is pretty much the same pile of stuff it was in 1960-something. It's not a vibrant language family with newer members. COBOL does not absorb new concepts.

There are new, young people learning COBOL, but only in order to become maintainers of legacy systems: job security in banking, finance and government sectors. Maybe a few have a perverse curiosity: they want to know what is really behind the dirty five letter word.

New people coming to a Lisp are coming for the Lisp itself. There aren't that many Lisp jobs involving new or legacy coding; new Lisp people learn and make new stuff from scratch.

There is simply no comparison.

Companies should embrace a multi-branch management approach. Have technical managers, experienced engineers who can focus on ensuring that people are doing quality work and building great products. Then alongside them, the team should also have a people manager, who is responsible for navigating the politics of the company.

I've had managers who were both technical and people-oriented. They both bring a lot to the table. But the older I get, the more I value having a manager I can go to who is good at managing optics and helping me deal with political or institutional roadblocks. In an ideal world, I'd have both.

This is somewhat the dream of matrix organizations, but it has its own set of problems.

But fundamentally, if you can't move the decision making power/responsibility/(+ to some degree compensation) out to non-traditional locations in your org structure, you'll naturally have to move those people towards the more traditional locations....

I work at a small matrix organization. Deployments and time allocation are fairly chaotic due to lack of communication, but on the other hand we do have relatively many older people doing technical work and projects are managed by a mix of older and younger people.

> I also think it's really, really dumb to push developers into management positions if they can't do the job.
Even if they can do the job, it may not be the best use of their time. If someone is really good technically, they are more difficult to replace than a competent middle manager. So if that's the trade off you are making, you are probably losing out.

> older programmers who maintain their skills on new technology are worth their weight in gold

This is also true of older programmers who maintain their skills in older technologies - see all the COBOL comments below. If history is any guide, competent Java programmers in 10 years will not be aged out, and will be compensated more than whoever is competent (regardless of age) in the current tech stack.

>> I also think it's really, really dumb to push developers into management positions if they can't do the job.

This has been known forever, yet it still happens - Why?

>> I'd much prefer putting younger managers with great management skills into the mix...

One reason is that it is impossible to tell if a young manager has great management skills. It's hard to tell even whether an experienced manager has strong skills.

As a result, technical proficiency is used as a proxy for management potential. It probably has a stronger correlation than picking a random person, but is not great. The new manager is then really unhappy, quits or is fired or eventually learns the new skills and is successful.


Some companies only have one kind of career track. If you want more money, you have to be in a manager position. They don't have a way for experienced programmers to keep moving up in their careers as programmers.

Is there any tech company where the technical track doesn't plateau below the people-manager track?

The issue isn't the absolute height. The top manager (CEO) is going to make more than the top IC; that's just obvious.

The real issue is that management is a pyramid, but IC is a ziggurat with, if you're lucky, an obelisk sitting on top of it.

People like to point to the principal staff engineer or whatever, but there's one of them for every twenty upper-mid level managers making the same money. And all those managers can take their skills to another company, but if an IC guru leaves the company where he made his bones he's probably destined for consulting (at best). Consulting has its upsides, but that's no more for everyone than management.

I think both are true. There are more "slots" for management than technical leaders at the same compensation band. There are also generally at least some compensation bands in management that are above the highest band tech lead role (architect, chief/principal/distinguished/staff/whatever).

Many companies have separate tracks for independent contributors and managers. You can stay an IC and level up to Distinguished Engineer or Fellow or Staff Engineer or whatever, without needing to go into management.

I think we might need a curated list for those. In 17 years of my career, I've only seen one such company.

It seems pretty common with bigger companies: Google, Apple, Facebook, Amazon, Oracle, Box, Zynga, Electronic Arts, Pinterest, Microsoft, etc.

The tracks and levels are commonly referred to as the "job family architecture" -- don't be afraid to ask a recruiter for this information. Otherwise you'll never really know if you're being properly leveled. (For instance, I believe a Principal SE at Microsoft is roughly equal to a Staff Engineer at Google.)

I guess that would just not be realistic. There was a comment from a while ago that I think makes total sense: the organization has to evaluate the value that you bring and create. If your position moves and directs dozens of people, it will be hard for an individual contributor to match that importance no matter what. Of course, it's fine if you prefer to remain in a more individual, engineering-oriented position. But there is little reason to expect you'd reach the same level of importance to the organization as some top managers/executives.

Agreed, completely. Most ICs overestimate their importance by far. Most people do regardless of title.

In my company it’s also much slower to move up the technical track compared to the manager track. If you want to make money it’s much better to be manager.

GitLab (I work there) does allow you to become Staff-Level which is the same multiplier as a manager.

Yes, that's common: but I have yet to see a case where the tech track goes as high as the management track. For example, "Distinguished/Staff Engineer" might be the equivalent of a "Senior Director" in an org with two or three levels of management above that (VP, SVP/CTO, EVP, etc.).

Google's tech track does go all the way to VP equivalents.

>> I also think it's really, really dumb to push developers into management positions if they can't do the job.

>This has been known forever, yet it still happens - Why?

This problem exists in hardware departments too. My best guess is that they want managers familiar with the tech/industry/company. My second best guess is that it's a natural human tendency and hasn't been addressed systematically.

I come from an industry that sees software written under management that largely has no clue what software is or how to evaluate its quality and tech debt, and thus creates exponential tech-debt time bombs.

I've been doing some job interviewing lately, at 37.

I do find that companies seem to know how to evaluate young programmers: just test them on their knowledge of your tech stack. Any young coder who is good and is working in that stack will know the basics. The bad ones won't.

For older coders, I'm not sure that same signal works. I've forgotten more frameworks than most coders will ever learn. I mostly don't care about programming languages, they're roughly the same and I can learn a new one in a couple weeks.

I know technologies like Rails and SQL, but I don't play with the new toys as much as I used to, because lately I would rather solve a new problem or build a new tool than learn a new tool.

But how do you evaluate a coder on their ability to solve problems? Much harder.

Even more difficult, how do you get a 24-year-old coder who only knows React and Firebase to evaluate a 37-year-old coder who has built web frameworks from scratch?

It's another world, and a lot of companies just don't bother.

Thankfully I have no desire to work at any of those companies. I solve problems. If a company can't hire for that I don't belong there. There are plenty of companies trying to hire people to solve actual problems, not just put coder butts in seats.

This whole "learn a new one in a couple weeks" mindset is toxic and needs to die.

Yes, you can learn anything at a superficial level in a couple weeks. If you've spent the last ten years maintaining an ASP.NET WebForms app running on IIS, then within a couple of weeks, you can contribute code to a Go microservice running in Kubernetes and using Kafka.

And that kind of "basic competence, with the ability to learn more" is what hiring managers want out of a junior dev, so if you don't keep your skills up, well, great, you're as useful as a junior dev.

But what they want out of senior devs is people who know the ins and outs, who understand what to do and why to do it and can explain that to others, who can avoid the pitfalls of bad tech choices and misguided implementation patterns.

If you've got decades of experience on your resume, my guess is that you want to be evaluated as a senior dev. And if you're a senior dev who doesn't know modern tech, then... you're not that useful. Your encyclopedic knowledge of the WebForms event lifecycle doesn't give you any particular insights into the design challenges of a microservice ecosystem; your ability to tweak IIS for maximum performance won't help you write a Helm chart.

If you want to be hired as a senior dev, you need to understand how the modern world works, and all the posturing around "I could learn it in a week" or "I prefer to focus on problems" doesn't impress anyone.

If you really are a senior technical person, you have many valuable skills that are not tied to a tech stack. Otherwise you may have a related title, but you are really just a technology domain expert - which is sometimes very useful and valuable, but it's not the same thing at all.

So yes, it's helpful that people keep current, and obviously waiting for a new developer to get up to speed on the particular stack is always a cost (and it isn't a couple of weeks, for anyone). However, if what you really need is a senior person, it may well be worth that time.

Seeing this assumes the hiring manager understands how software is actually produced and maintained, not just what their current tech stack needs. It's surprising how often this isn't true, but a contributing factor is how mid-level managers are made - which is related to this discussion, I think.

What I'd say to that is, yeah, it's true that a senior dev will have skills that go beyond a particular tech stack -- but if you're working as a dev, then those skills are still tied to your technical knowledge.

You might be great at mentoring... but you have to know the stuff to mentor people in it. You might be great at communicating with stakeholders... but you have to understand the technical constraints of your system in detail to have that communication. You might understand principles of software design and debugging... but you still need to understand the particulars of these technologies to make your design or to focus your debugging.

And yeah, you can learn the tech stack, and a company that really needs a senior dev eventually could do okay to hire a senior dev with out of date skills, and wait for them to get trained up. But don't plan on it taking three weeks, is all I'm saying. To really get to that point of fluency and domain expertise takes longer.

(And also, one of the things that senior devs are supposed to do is evaluate new technologies and figure out when and where adopting them can make sense. If you've fallen so far behind on learning new tech, then it's not clear if this is something you're able to do -- maybe it is, and you just worked for a workplace that stifled adoption of new tech, or maybe you're someone who'll just learn what you need and then stop paying attention.)

While I see what you're saying, a lot of your language seems to come from the idea that tech is a progression, that we are always moving "ahead", and that people get "left behind".

That's mostly not true. While there definitely have been advances, tech is also very faddy and repeats itself. Trends are often reactive, techniques repetitive. Someone who has been around for long enough to see some of these waves resolve themselves is in a much better position to evaluate current and upcoming approaches. Technologists also tend to be myopic, and view the things happening in their own tiny slice of the industry as "what's happening". There is a whole universe of interesting stuff on the go that you are at best barely aware of.

Personally, if I'm hiring senior technical people I'm much more interested in their continuing level of curiosity and engagement with something, rather than particular stacks.

So maybe you are worried that your candidate who has been doing ASP.NET WebForms on IIS for the last decade will have a hard time getting up to speed on your use of TypeScript and React... but if I find out that she's also been building Raspberry Pi infrastructure for fun and messing around with the Julia language to implement a cellular automata project, I'm not all that worried about her ability to get up to speed, even though none of that stuff will get used by the team.

But yeah, it doesn't take 3 weeks. It doesn't actually take 3 weeks for a new dev already well versed in your technologies, with rare exceptions. At least not to be firing on all cylinders.

The thing is, a huge number of teams really, really need a good senior dev. They often don't think so, and often erroneously think they already have some... but they don't. Instead they have people who had "Senior" (or "Staff" or whatever) added to their title because they'd been there for a little while. And every piece of work the team produces suffers for it. So yeah, it's typically worth the investment.

Now granted, that assumes you are good at identifying talented people. This is actually a big ask - and one of the issues surrounding the OP's questions really comes back to that. I think that a lot of software and technology development suffers fairly systematically from poor quality middle management. One of the symptoms of this is the "ageism" discussed here, but it's only one of many....

Yes. These days gRPC is a new technology. And yet RPC has been around in one form or another since the 1960s.

> then those skills are still tied to your technical knowledge.

I think this is a problematic way of looking at it. Think about it this way: if you are hiring a fairly general developer or engineer, their overall impact is going to be determined by something like 30% technical / 30% problem solving and communication / 30% teamwork and fit, which leaves you 10% to move around in those categories. In the much rarer case that you are hiring a domain specialist, it is something like 60-70% technical. But you probably aren't hiring someone like that, or shouldn't be very often.

Focusing the hiring too much on that first 30% impact gets you in trouble.

If you’re competing as a “general developer” with ten years of experience, you’re a commodity that is competing with everyone else and if you aren’t knowledgeable about the tech stack we are using, you’re at a disadvantage. The last thing I want is to hire someone who has been doing Webforms for 15 years when I can find someone that has been keeping up. Why should I have sympathy for someone my age who didn’t have the wherewithal to keep up when I can be aggressive about it?

I know developers in thier 30s/40s/50s who are both great developers/architects and have current skills. Why would I settle?

I think you missed the point I was trying to make, so I'll try to summarize. There are two parts. If I'm hiring you as a 10-year veteran "general programmer", I'm going to be very interested in the non-tech-stack skills you have developed (i.e. the other 60-plus percent of your impact). And let's be honest, the vast majority of 3-years-of-experience "general developers" are weak there. The other point is that technology isn't a straightforward progression, so while it isn't great if you have been "stuck" in a tech stack you consider obsolete, by itself that shouldn't be enough to reject you.

Two other points: First, of course when hiring you want to find the “perfect fit”, but you almost never do, so you have to make tradeoffs. Hiring an inexperienced person with the “right” tech stack when you really need to bring experience into your team is a common anti-pattern. Secondly, as a hiring manager you need to properly understand the difference between “10 years experience” and “one year of experience, related ten times”

Agree with all this, but I'll also say that if you're an experienced developer who hasn't kept up with the times, that's not just a fact in itself, it's also a signal about what kind of developer you are. Most of the best developers are active learners who are always trying new things just as a matter of course.

So if you only know old tech, I'm a lot quicker to believe that you're a "repeated ten times" person.

I get where you're coming from, but I didn't get the impression that GP was ignorant to new developments. In my experience the most important thing is really wrapping your head around a domain to be able to solve concrete problems in it. Whatever framework or programming language you choose to solve those problems with is peripheral, really.

Also, most coding is maintenance. So yes, an experienced developer in some language is really worth a lot when you initially start out designing your system. But most of the work spent on that system will be maintaining and extending it and to be able to proficiently do that you need to:

1. Grok the problem domain, so you can connect the real-world / problem you're trying to solve with the code in front of you

2. Have a grasp on the limitations of your own system, which will come with experience as you work longer on it

> ASP.NET WebForms app running on IIS, ... Go microservice running in Kubernetes and using Kafka

Those things don't really count as languages, they appear to be a mix of development environments, software packages and architectures. Learning these things is critical and hard but there is a key mitigating factor. That factor is that really what needs to be learned is 'how we do things around here' which is typically company specific and needs to be learned by a new hire anyway.

My experience suggests new hires in cognitive roles (at medium-large businesses) take 1-2 years to get to the point where they can be proactively useful, purely from learning who is who, what the corporate history is, what has been tried before, why things are what they are, and what their role's problem domain is really intended to be by their boss. Compared to that set-up time, the cost of knowing or not knowing a technology is quite small.

> This whole "learn a new one in a couple weeks" mindset is toxic and needs to die.

It is true though. Learning a new programming language to the point you can write bug-free code and execute someone else's design is really easy. Learning how to solve a new problem is typically the hard part.

I 'learned' Python in 2 weeks. What I write looks a lot like C code, and I didn't make great use of the standard libraries, but it works fine. Compared to that, learning how to use a large new library for a new problem typically takes months.
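A hypothetical illustration of what "looks a lot like C code" might mean (this example is mine, not from the comment): the same word-counting task written C-style versus leaning on the standard library. Both work fine, which is the commenter's point.

```python
# C-style Python: index-based loops and manual dictionary bookkeeping.
def count_words_c_style(lines):
    counts = {}
    for i in range(len(lines)):
        words = lines[i].split()
        for j in range(len(words)):
            w = words[j].lower()
            if w in counts:
                counts[w] = counts[w] + 1
            else:
                counts[w] = 1
    return counts

# Idiomatic Python: the standard library already does the bookkeeping.
from collections import Counter

def count_words_idiomatic(lines):
    return Counter(w.lower() for line in lines for w in line.split())
```

The first version is what two weeks of Python after years of C tends to produce; discovering `collections`, `itertools`, and friends is the part that takes longer.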

> Those things don't really count as languages, they appear to be a mix of development environments, software packages and architectures. Learning these things is critical and hard but there is a key mitigating factor. That factor is that really what needs to be learned is 'how we do things around here' which is typically company specific and needs to be learned by a new hire anyway.

Not necessarily, except for maybe Go, those are industry standard technologies that you could find someone who knew and could add value almost immediately.

We are very heavily invested in the AWS ecosystem. The difference between hiring someone who has the same amount of useful development experience but no AWS experience and someone who does and how fast they can hit the ground running is stark.

It’s not really gloating; I’m in the same situation as GP - I’ve worked enough with languages and their ecosystems to apply those skills to new languages and ecosystems. It takes me around 5-7 days to grok a new language and get pretty good with its tools/frameworks/idioms. Understanding the basic fundamentals makes it easier to extend to different implementations.

The language + tools aren’t barriers anymore, the interesting part of solving business problems and how to apply the tools become the fun ones.

Yeah, this is how I see it, too. Their example kind of gives away that they didn't understand.

Once you've learned and used a half-dozen imperative languages day-to-day, there's very little new when picking up another imperative language. It's mostly mapping common concepts onto new syntax, plus some new features and maybe quirks/gotchas.

Silly little example: After an hour of reading a Go tutorial, and having never looked at the language before, I was able to diagnose a bug a co-worker had been struggling with for two hours, because I recognized the pattern from something similar in Django (a python framework). Now knowing what to look for, a quick Google search had the exact answer to the problem right in the first result.
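The comment doesn't say which bug it was, but a classic gotcha that shows up in similar form in both Python and pre-1.22 Go is late binding of loop variables in closures; here's a hypothetical Python illustration of the kind of pattern that transfers between languages:

```python
# Each lambda closes over the *variable* i, not its value at definition
# time, so every callback sees i's final value.
callbacks = [lambda: i for i in range(3)]
print([cb() for cb in callbacks])   # [2, 2, 2], not [0, 1, 2]

# The standard fix: bind the current value via a default argument.
callbacks = [lambda i=i: i for i in range(3)]
print([cb() for cb in callbacks])   # [0, 1, 2]
```

Go had the same trap with goroutines capturing the `for` loop variable (the language changed this in Go 1.22), which is exactly the sort of cross-language pattern recognition being described.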

So do you think you could pick up and become a useful iOS Dev in two weeks?

Depends on the context. As part of a team (code already exists that I can reference), and simply "useful"? Less than that, probably about one week at most.

I wouldn't be doing anything large or complex in that short a period, but absolutely can become useful with smaller stuff in that period.

And I say this with experience, though in a different context: Years ago my manager introduced me to a new codebase by giving me a bug to fix. It was in a Django site, and I'd never done any Python before. Took about half a day to figure out what was going on (for comparison, nowadays it would maybe have been 20-30 minutes), and a bit more to fix.

But I did figure it out on my own, taking some work off of another developer who was focusing on more complex work. I would call that "useful".

That’s just the point, as a “senior developer”, I’m looking for someone who can select “New Solution” and start a project from scratch or at least start a new major feature. Anyone can pull up an IDE in almost any language, step through existing code, pull up a watch window and fix a bug. That’s something developers can do with three years of experience. Bug fixing is something you put your more junior developers on.

I have no idea why all of the old folks (again I am in my mid 40s) consider it some great skill that excuses them from actually keeping up with what’s going on in the industry.

Several things on that tangent:

* My bugfixing example was indeed trivial, just the first thing to pop into my mind. It was mainly aimed at the use of the word "useful", which is woefully ambiguous.

* Sounds like we agree when talking about someone who already knows the technology stack.

* No matter how skilled the developer, there will be ramp-up time when they not only don't know the framework, but also don't even know the language.

* That ramp-up time being significantly shorter for a senior dev is what we're talking about, and how selecting for the specific stack like you're describing isn't necessarily going to be worthwhile - learning the domain is likely to take longer than learning the technology.

Oh, and:

> Anyone can pull up an IDE in almost any language, step through existing code, pull up a watch window and fix a bug. That’s something developers can do with three years of experience.

Thanks for the compliment I guess; I was describing something from around 3-4 months into my first job.

> No matter how skilled the developer, there will be ramp-up time when they not only don't know the framework, but also don't even know the language.

And if a company is using a popular language/framework they can find senior developers who know the language and the stack. Why should they hire someone who hasn’t taken the time to learn both?

> That ramp-up time being significantly shorter for a senior dev is what we're talking about, and how selecting for the specific stack like you're describing isn't necessarily going to be worthwhile - learning the domain is likely to take longer than learning the technology.

Not really. If someone has ever been exposed to a similar framework - take your typical MV* framework for example - even a developer who's only been active for three years can easily transition.

I have no idea why old folks (putting myself in that category) think they can get by without keeping up. Do you really think that someone who has 15 years of experience but hasn't kept up with the industry won't be at a disadvantage against someone with 5 who has? On the other end, you have people like me who know what it's like being behind the curve and vowed never to be in that situation again.

You can’t imagine how long it took to get a 50-year-old who had been doing ASP.Net Web Forms for years to unlearn those habits and develop in ASP.Net MVC using server-side rendering, let alone client-side frameworks.

He was only eight years older than I was but didn’t keep current.

There is no monolithic "modern tech" to keep up with. If you think there is, you work in a bubble and don't have nearly enough experience to say anything intelligent about how to hire senior developers.

Moreover, if you're proud of how hard it is to learn ins and outs of your tech stack, you're likely using a shitty tech stack and costing your company tons of money.

That’s true. No one is saying be good at everything. But you should be surveying your local market, seeing what an in-demand tech stack looks like, aligning your skills with it, and having a cohesive story to tell.

But you can’t be proficient at an architectural level with a stack in two weeks no matter how good you are.

Would you say that you could be an iOS/Android developer in two weeks just because you picked up Swift/Java?

Side note: saying my beliefs need to die as the first thing you ever said to me is pretty aggressive.

But to your actual comment: I'm learning new things all the time. My focus when I'm coding in my spare time is a simple IDE, some project management tools, and a simple game engine I am building. I learn new things in those projects literally every day.

I'm not going to do a bunch of hello world apps in random new "Pro" technologies. I did a lot of that in my 20s, it was fun and I was curious about them, but my interests have shifted.

There are enough jobs I can say "no" to the people who want only coders who are using the newest "Pro" toolset.

Honestly, a good hiring manager will realize you ideally want BOTH that person and people like me on a team. One person will know that Rails 7 solves our new problem, and the other will know that a simple refactor will save 30 hours of headscratching over the next year.

No, you can learn programming languages at quite a deep level in just a few weeks. I went from having never written or read a single line of Scala or Java in my life to making production critical commits to a customer-facing product in about two weeks, where the commits involved needing to know a lot about Scala case class patterns, some tricky in-house implicits, structural typing and the numerical package Breeze.

The thing is, because I have developed deep skills at delivering software products in other languages, I know what questions to ask. I know to look into build tooling, certain design patterns, ask what is the equivalent of foo from language X over in language Y. Experience allows you to short circuit huge chunks of the learning process and make bigger strides to knowing complete functional units of the language and tooling.

Yes, you’ll keep learning more after a month, but in all sincerity, if you’ve learned at least one procedural language very deeply and have at least brushed up against other paradigms and design patterns, then you can learn almost any other language, whether it’s C++ or Scala or Rust or Haskell or Fortran or Erlang or anything else... up to about an “advanced intermediate” level in just a few weeks of self-teaching.

I would rather hire someone with that mindset. They are more likely to be interested in solving the actual business problems. The junior guy excited about frameworks and fads is much more likely to create complexity (usually from vast numbers of dependencies and inappropriate technology choices). The flip side of course is that if you already have a mess, then the junior guy might be a better choice to maintain it.

I wouldn’t be excited at all hiring someone whose skillset is ten years out of date. It tells me a lot about their lack of willingness to keep up with technology.

This depends on what you mean by "technology". I would expect them to be aware of emerging trends in the industry. Bonus points also for an interest in the latest computer science. But I don't care if they haven't hacked on groovy gradle scripts or injected Spring AOP pointcuts into Java beans. This sort of tech is accidental ad-hoc complexity and not progress.

Spring's AOP pointcuts (in its first incarnation as AspectJ) and Groovy (pre-Gradle and pre-Apache) are both over 15 yrs old. To not know those is having a skillset that's well over ten years out of date.

Heck, I know what those are and I've only written 20 lines of production Java code. They came up repeatedly when I was researching using Microsoft's (now defunct) Unity framework to do AOP in C#.


It’s not about “emerging trends” but they should be familiar with a chosen technology stack that is in the “plateau of productivity” in the hype cycle.

>If you've got decades of experience on your resume, my guess is that you want to be evaluated as a senior dev.

And this is where ageism rears its head.

If two people apply for the same position, and you're OK hiring someone who has only 2 years of experience for the position, don't ask for more from someone who's been working for 20 years.

They get paid very differently, is the thing.

Why do they? The older candidate may have a higher expectation, but as an employer, the compensation for a particular position should be somewhat fixed. If the older candidate doesn't offer anything more than the "inexperienced" one, offer the same salary you would the inexperienced person. They can always say "No".

Trust me: There are plenty of older programmers who do not expect higher pay because of their age. In my career, whenever I hear "older people expect a higher salary", it has almost always been from employers/managers, not from actual older professionals. On the other hand, I've met quite a few programmers who complained they didn't get an offer or were told they were ruled out because they were "too senior", and the complaint almost always was along the lines of "I was not asked salary expectations".

I have a former manager who is in his 50s. As soon as his kids graduated he self demoted to a full stack developer working with React/C# and Azure.

I have another friend in his 40s who took a job that lets him work from home, and it pays less than he was making when we were working together.

Not as differently as you might think. When I was more junior, I assumed the wage increases would continue as they had been. But they flatten out after 10-15 years of experience, to the point that the salary of someone with 15 years of experience is probably the same as someone with 30 years of experience.

The advantage of older developers is that they have seen the same patterns before. For instance, younger devs might think microservices are the hot thing, while an older dev is thinking it just looks like a poor man's Erlang from the 1970s and knows to wait a few years until all the CI/CD and orchestration is integrated and commoditized.

So while you are “waiting a few years”, the company has new problems to solve right now. So no, you can’t stick around using VB6 for the next 20 years.

yep and I'm pretty sure creating a crapload of devops scripts to support microservices will be the next problem you give the business.

If you are creating a bunch of “dev op scripts”, you’re doing it wrong.

I wrote one parameterized Cloud Formation template that creates our build pipeline for any new service. Most of the time we can use the prebuilt Docker containers (maintained by AWS) for our build environment. We have one or two bespoke Docker containers for special snowflake builds.

And if you haven’t read my other posts, and I get accused of being young and inexperienced, let me just say that my first “internet based application” used the Gopher protocol....

I haven't read any of your other posts and my comments were not directed to you in particular. You're really making my point. If cloud formation templates work then great. I don't know its limitations. I do know that kubernetes has become a standard for orchestration and that it becomes easier over time to manage. A couple years ago there was no support from AWS so maybe not the time to jump in. Likewise with Kafka. Cool tech but really expensive to manage. Now AWS and Confluent will manage so the costs have come way down.

It's really easy with AWS....


But we really don't care about the lock in boogeyman. Out of all of a typical business's risk, lock in is the least of them. We use lambda where feasible, Fargate (serverless Docker) when necessary. Why waste time doing the "undifferentiated heavy lifting"?

It's not toxic, it's dumb.

For older coders, I'm not sure that same signal works. I've forgotten more frameworks than most coders will ever learn. I mostly don't care about programming languages, they're roughly the same and I can learn a new one in a couple weeks.

Why not? I’m older than you and I have passed plenty of technical interviews on the $latest_tech.

Sure, you can learn a language in two weeks, but a language is far more than syntax. There is the ecosystem, frameworks, third party packages, etc.

And honestly, I would be more concerned about hiring someone who built their own bespoke framework from scratch than someone who used an existing one. Building frameworks rarely adds business value.

> And honestly, I would be more concerned about hiring someone who built their own bespoke framework from scratch than someone who used an existing one. Building frameworks rarely adds business value.

If I were interviewing such a candidate, I'd dig into why they built a custom framework. The answer to that question could be quite enlightening, especially if they mention business reasons for doing so, or if they talk about what they've learned since then.

When I started my own business a decade ago, I decided that existing web frameworks wouldn't suit my needs and built my own. Though I built a framework that suited my needs extremely well, it was a massive distraction from my actual business (which wasn't even tech). In interviews, I freely discuss how my NIH syndrome contributed to my inability to get the business off the ground; I've found that it tends to go over rather well with interviewers.

Same. The way I work at home is no way to run a business, and I am painfully aware of that fact.

I go to work to be on a team and pull in one direction and support the leadership and make hard compromises.

I write code at home to understand where the industry will be in 10 years.

> And honestly, I would be more concerned about hiring someone who built their own bespoke framework from scratch than someone who used an existing one

If they're over 45, then they likely worked in the late 90s and early 00s. If they were like many of us, then they worked on at least some web applications back then. This was far before jQuery and even Prototype.js on the frontend. Java, PHP and Perl were commonly used on the backend (or where I lived, more often IIS with asp). But the start of the career of someone in their 50s and even 45+ was before any commonly-used framework in open source scripting languages used for web development (unless you consider PHP to be a framework).

Even Spring framework was only created in 2003 (I just checked to confirm). Symfony in PHP was 2005. And those took several years to catch on in their respective ecosystems.

Tons of companies had in-house frameworks back then. Who did you think wrote them?

Even if someone working back then never explicitly built a framework, they used design patterns that mirror today's frameworks. If not, they were writing spaghetti code (and to be fair, that was pretty common back then).

Based on your username and your assertion ("older than you"), you're what, 45? Have you forgotten what it was like back then?

In the late 90s early 2000s every website was a bunch of spaghetti code with a mix of code and html - the original VB based active server pages, JSP, or even ugly Perl and PHP.

But when he said he “made his own web frameworks”, I’m assuming he was talking about this decade.

> But when he said he “made his own web frameworks”, I’m assuming he was talking about this decade.

I re-read the comment in question. Not sure why you would have assumed that. Someone who is now 37 likely started programming before the age of web frameworks.

No, your parent was correct. I am currently building a complete web application hosting toolkit that I started in earnest in 2015. I worked on it this week.

I wouldn't call it a framework, because it's built of small independent pieces, but it's a similar scope to something like Rails.

I did start coding long before web frameworks existed though. :)

Given the amount of spaghetti I've seen... it's my understanding that being able to think in frameworks is a rare skill. If you find someone who is actually good at designing frameworks, then maybe, just maybe, they have some real skill and built their own framework to practice and hone their craft, because that's a necessary condition for getting good at something.

Granted, I hear what you're saying, but I'm not convinced your blanket dismissal is the right analysis without digging deeper on the individual details.

> a language is far more than syntax. There is the ecosystem, frameworks, third party packages, etc.

Agreed. However, if you're a software company you've already likely made all of those decisions. If I come work for you, I'm stuck with them. It doesn't matter if I like the package manager or the frameworks. That's just the background against which we work.

Our work is making our lives better, and solving the company's problems to the best of our ability, given all those constraints.

> honestly, I would be more concerned about hiring someone who built their own bespoke framework from scratch

Yes! I feel honestly quite nervous about sharing that aspect of my work, because I worry people will see me as out of touch.

But I am quite up front that I don't advocate using my toolkit in production environments. My professional advice is usually quite conservative. I tend to advocate for minimal disruption and using standard tools as they were meant to be used.

However, because I do that all day at work, I see all the cracks and warts in the architecture and when I go home I want to work on the future.

And even though you are correct "building a framework rarely adds business value"... that's not what motivates me. I do think there is value being created, and I will eventually be able to cash in on that. But I am doing it because I am interested in the future, and I want to work on tools that transcend the pointless busywork I have to deal with day to day, in the name of meeting quarterly goals (which TBF is crucially important in a business context).

I get direct enjoyment from working in my own little hobby world where those constraints don't exist, and that is enough motivation for me.

> Agreed. However, if you're a software company you've already likely made all of those decisions. If I come work for you, I'm stuck with them. It doesn't matter if I like the package manager or the frameworks. That's just the background against which we work.

Let’s say you learn C# in “two weeks”. Is that going to be enough time to get proficient with ASP.Net MVC? Entity Framework? IIS? The built-in logging framework? The DI framework? Middleware? What if they are using Xamarin for mobile? Let’s say you did pick up Xamarin (which I haven’t).

Would you know the fiddly bits about doing multithreaded processing with desktop apps using Windows Forms?

> And honestly, I would be more concerned about hiring someone who built their own bespoke framework from scratch than someone who used an existing one

It's almost like you're pretending the years between 2008-2012 never happened, but I'm sure that's not the case.

In 2008 I was in my mid 30s, had been working at one job for nine years as a C/C++ bit twiddler writing COM/DCOM objects that interop’d with VB - in other words I had a skillset that was woefully out of date. I got lucky and found a job that let me pivot into being a modern “enterprise developer” and barely survived rounds of layoffs for the next three years.

I learned my lesson. For the last 10 years I’ve kept my resume competitive with whatever the $cool_kids technology is and haven’t had a problem competing in the market.

What are you referring to?

Ditto, if something extremely special happened in the years 2008-2012 with regard to frameworks, its cataclysmic significance seems to have escaped my notice. Rails? /shrugs

I think maybe they are just referring to the fact that in that era pretty much everyone had to build their own "framework" to get a big web app up, because there weren't a lot of established choices.

Between 2008-2012 there was ASP.Net Web Forms and MVC was becoming popular.

I didn’t play in the Java space back then but there were plenty of Java frameworks (https://dzone.com/articles/what-serverside-java-web-framewor...)

Ruby on Rails was released in 2005.

Knockout JS was introduced in 2010.

Django was released in 2005.

There was also Mojolicous for Perl also released in 2008.

This is what I meant, apologies for being cryptic. Web/JS development was absolute chaos, it just so happened that Facebook & Google won the day and we've converged on some "standards" in React/Vue/Angular, and I use that term very loosely.

The things I've seen. *shudders*

> And honestly, I would be more concerned about hiring someone who built their own bespoke framework from scratch than someone who used an existing one. Building frameworks rarely adds business value.

Hum... My experience is that whatever framework a business uses in any widespread manner, I always have to build another one on top, because generalist libraries completely suck for productivity. Unless you are a single-app shop, then you can just hard-code everything.

I don't get how improving development speed does not add business value.

It means that the next developer will have to learn your bespoke logging, authentication, etc. and other cross cutting concerns instead of just being able to go to stack overflow and finding an answer.

If there is a problem, they have to wait for a response from the one guru who wrote it.

Most architects who think their problem is a special snowflake are usually wrong.

If it's "just logging" then of course just use the provided solution.

A custom domain for a business would be something more than logging though. For example, perhaps you want your logging to include some regulatory compliance hooks. Or maybe you want to do logging that also serves as a persistence layer for undo/redo and you need that to be tied deeply into your presentation layer.

In those cases it's much more than just "log this" that's being abstracted, it's all of the business needs around logging.
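As a sketch of what "business needs around logging" could look like in practice (hypothetical, assuming Python's stdlib `logging`; the `sensitive` flag and audit trail are invented for illustration):

```python
import logging

class ComplianceHandler(logging.Handler):
    """Ordinary logging plus a (hypothetical) regulatory hook:
    records flagged as sensitive are also queued for an audit trail."""

    def __init__(self):
        super().__init__()
        self.audit_trail = []  # stand-in for a real compliance sink

    def emit(self, record):
        # Normal log handling (file, syslog, ...) could also go here.
        if getattr(record, "sensitive", False):
            self.audit_trail.append(self.format(record))

logger = logging.getLogger("billing")
logger.propagate = False  # keep the example's output self-contained
handler = ComplianceHandler()
handler.setFormatter(logging.Formatter("%(name)s: %(message)s"))
logger.addHandler(handler)

# `extra` attaches custom attributes to the LogRecord.
logger.warning("card declined", extra={"sensitive": True})
logger.warning("retrying")  # not flagged, skips the audit trail
print(handler.audit_trail)  # ['billing: card declined']
```

The point being: callers still just say "log this", while the cross-cutting compliance behavior lives in one place, and the next developer has to learn that layer whether it's packaged as a framework or smeared through the app.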

And the next developer can't get away from learning all that code. Either it's tied up as a nice internal company logging framework, or it's spread out as a bunch of concerns through the app. Either way, the newbie has to figure it out.

The point of personalizing a framework is that most developers shouldn't be thinking about logging, authentication, etc.

Again, my experience is that the personalization layer is very easy for a senior developer to pick up; maybe there is some unconscious architectural decision biasing it, but I've never seen one have problems taking over my code. Junior developers, for their part, just can't grasp it; what I consider a feature, they are better off just using the functionality until they get some experience.

So you are against encoding domain knowledge in the software? It's better to re-invent the wheel with every application so that "the next developer" can google their questions?

I don't grant your premise (that it's wheel-re-inventing), but yes: it's usually massively better to build with familiar, boring, transferable, google-able tools. Even if it takes a little more time to build.

Frameworks aren’t about domain knowledge. They are about cross cutting concerns. You can’t imagine how many badly written custom ORM, logging, authentication, web frameworks I’ve come across.

I would presume he built his own as a learning exercise/for fun.

As a 40+ yo dev I want to agree with you but:

>I've forgotten more frameworks than most coders will ever learn.

Forgotten knowledge is worthless

> I mostly don't care about programming languages, they're roughly the same and I can learn a new one in a couple weeks.

I used to think that, and it's possibly true, but the platform that comes with the language certainly can't be learnt in a couple of weeks, so I don't think this is as true as it used to be.

> I know technologies like Rails and SQL, but I don't play with the new toys as much as I used to, because lately I would rather solve a new problem or build a new tool than learn a new tool.

This is a real problem for you if companies are hiring for people that know the new toys.

> It's another world, and a lot of companies just don't bother...Thankfully I have no desire to work at any of those companies...There are plenty of companies trying to hire people to solve actual problems

That's great now, but in the future the job market might not be so tight and then people with your attitude might find it really tough.

I think the point on forgotten frameworks / new tech is -- nothing is _really_ new. Knowing ten web frameworks inside out isn't really 10x more than knowing one, because they have a tremendous amount of overlap. Relatedly, I've forgotten how a couple frameworks work because there's nothing interesting in them that isn't present in the newer ones. I'm quite sure if I had to pick one back up, it would look very familiar; perhaps I know more about it now than I did when I "knew" it.

You didn’t work with a C# ASP.Net WebForm dev who before that was a WinForm dev trying to grok ASP.Net MVC.

Great, but that means that for the "knowing frameworks" box, there is no advantage in hiring an experience dev over a younger one.

The general thrust of this thread is that there is no advantage in hiring an experienced developer over a younger one.

You might be getting a mid-six-figure salary in your first job, but you probably ought to be socking it away because you can't expect raises.

> Forgotten knowledge is worthless

Not really, you pick up something you forget much faster than something you've never seen before. It's all in there somewhere.

Learn a language in a couple of weeks? No way! A couple of weeks is plenty to understand the mechanics of the language, but the biggest aspect of a language is the library.

I do agree that testing for the ability to solve problem is hard and gets neglected.

If at 37 you consider yourself an "older programmer", the industry is fucked. The minimum social security retirement age in the US is 62. That is an older programmer. Full retirement age is 66.

"Why is that old guy still working? Must not have worked hard enough like I do! I'll be retired at 45 from my stock options and 401k" e.g. extrapolating from last 5-10 years of stock returns, which is all they've known...

> I can learn a new one in a couple weeks.

To play devil's (or manager's) advocate - have you ever crunched the numbers on what it takes to pay you for a couple of weeks? That's not a trivial investment, especially considering the risk that you just thought you could learn a new language in a couple of weeks but were wrong.

Sure, but you also have to weigh that against what it takes to pay people to develop the wrong architecture at the wrong time, and the opportunity cost to the company as a whole to be saddled with those bad decisions. That's not trivial, either.

(although I agree with the sibling comment — one doesn't really learn most languages in a few weeks)

Well let me tell you the story about some older experienced “infrastructure consultants” who thought that AWS was just like on prem infrastructure, did a lift and shift, moved my former company “to the cloud” and ended up costing more because they thought they had “learned AWS”.

Architecture and processes that was fine for one environment was horrible for the other.

In development terms, you can write FORTRAN in any language.

Let’s not talk about the one Java developer who thought he could pick up C in two or three weeks. It was ugly.

But during that time, they're not just learning the new language, they're also learning your environment, applications, and tools, which they have to spend anyway. In my opinion, that could accelerate their learning of the language, by seeing it used "for real", rather than in a more restricted tutorial context.

Every job I've ever worked I felt I produced more cash value for the owners than I took. I'm actually quite proud of that fact. It's important to me, and it's one of the things I think about all the time as I'm making decisions on how to work.

Isn’t the whole point of asking programmers to work out and write code for the alien dictionary problem from Leetcode to test their problem-solving ability? There are really no easy answers to interviewing people for jobs, I guess.

Yeah, I'm talking about people being upset that I had to look up the syntax for ActiveRecord associations because I haven't written any in a while.

> But how do you evaluate a coder on their ability to solve problems? Much harder.

With an IQ test, the same way you evaluate anyone else on their ability to solve problems.

That evaluates a person on how well they solve IQ test problems.

It evaluates the quality of the substrate.

Having a high IQ doesn't mean you are going to be able to hit the ground running, solving problems and writing production code in your first week.

Better to hire for your particular skills. And younger is cheaper.

I work in enterprises with 50,000+ employees and billions in annual revenue. I work with older programmers all the time! They're better at programming in Angular than I am. And they are wonderful mentors!

I think what's actually happening is some programmers, who happen to be older, don't keep up with their skills. Then when they interview, they bomb it, or talk about how they could do the same thing in an older stack.

That's great, but the company is really invested in the current / new stack, so it doesn't make sense to hire someone who does not have those skills. So they pass them up and find a candidate who knows the software the company is looking for.

But instead of reflecting and realizing they need to spend time enhancing their skills, the older interview candidates simply say the company was "ageist" and blame it on age discrimination.

Blaming others is generally easier than blaming yourself, I guess.

The scenario you talk about happens for sure. And some of those protests are legitimate; we've made little advance in real productivity since the 70s but all the while churning languages and libraries and environments like fashionable clothes partly because programmers are on the average young and inexperienced. So after a decade or two of riding the wheel, a lot of engineers decide it's pointless and just kind of check out. (Not me, FWIW; I'm on board with React Hooks and looking for an excuse to use ReasonML.)

On the other hand, ageism is real. When I was younger I was told more than once by managers, straight up, this candidate is too old so find an excuse not to hire. And now in my 40s, when I walk into some companies for an interview, I can feel the decision has been made before I even start talking. Not everywhere, not even most places, but it happens for sure and it's not even that subtle.

> we've made little advance in real productivity since the 70s but all the while churning languages and libraries and environments like fashionable clothes partly because programmers are on the average young and inexperienced.

Making a real time video chat client is now a literal programming exercise, it used to be the domain of multi-million dollar venture backed startups that were valued in the billions.

Real time chat between users is now a feature that is added to an application in a day, or even a few hours if you follow any of the myriads of tutorials available on YouTube.

Go from writing a UI in C/C++ to writing it in any of the higher level languages. Assuming you don't get caught in some design pattern trap that results in massive code bloat, it is now possible to do things in hours that used to take days to weeks.

Heck, transparency is no longer "write some ASM" routines, it is setting an opacity variable!

Some tech stacks, such as those for making basic CRUD apps, may have actually degraded a bit, but at the same time someone who is only slightly technical can now drag and drop their way to an online store front that is a fair bit more powerful than Viaweb was back in the day!

Heck, it is now possible to go from an empty folder to writing and deploying HTTPS API endpoints in under half an hour.


More stuff!

App (not web!) updates can be deployed to users by uploading a file and having a client pull down the changes and seamlessly switch over to the new code next time the app is launched. Versus mailing out disks!

There are app stores that you upload intermediate code to and they'll compile apps to the appropriate architecture for a customer's device!

During the 72 hours of coding for Ludum Dare, game developers and artists work together to create games that are as complex as a full-fledged commercial game would have been 25 years ago.

And of course, many corporate software engineering efforts continue to fail or go massively over budget for largely political reasons.

In the 1980s a spell-checker was a very complicated project, and now it's simply a hashmap lookup. But that's not because we invented the hashmap since then, as if nobody in the 80s would have known about it; it just wasn't possible with the RAM limitations at the time.
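That hashmap-lookup core really is a few lines today. A minimal sketch (the tiny word set is a stand-in; a real checker would load a full dictionary file):

```python
# Minimal spell-checker core: membership lookup in a hash-based set,
# O(1) on average per word. The word set here is a toy stand-in.
dictionary = {"hello", "world", "spell", "checker", "hashmap"}

def check(text):
    """Return the words in `text` that aren't in the dictionary."""
    return [w for w in text.lower().split() if w not in dictionary]

print(check("Hello wrold"))  # ['wrold']
```

Everything that made this hard in the 80s was fitting the dictionary in memory, not the algorithm.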

Likewise, moving from ASM to C to Smalltalk is a night-and-day improvement... that we made in the 1970s. The difference again is what hardware we get to run it on.

Video game construction kits existed in the 80s, and 80s GUIs looked pretty much the same as they do now if you ignore resolution.

Drag and drop application development was huge in the 90s, and the CRUD applications of the time weren't significantly different than now, other than they didn't run in a browser. HTTPS wasn't hard back then, and you could write a Perl endpoint in half an hour easy.

When it comes to things like playing video, it's easy now but I don't even really consider it programming. You're just installing software that does it for you.

Libraries and OSS and StackOverflow and various services have made a real difference. I'm not arguing that we haven't made progress. I'm just saying that I can see how some people feel that this year's incremental advance or retreat is not nearly as exciting if they've seen how the last 20 worked out.

Your discussion of "real time video chat clients" has me cracking up; it seemed that a myriad of pieces of software had this figured out in the 2000s, but today I can't think of a single piece of software that I'd actually want to use for video chat. All of them have some fatal Achilles' Heel: Facetime has security problems/previously did not have group chats/is exclusive to Apple OSes, Skype is laggy and has more connection problems than any piece of software I've ever used, Google Hangouts chokes once you get to 3 or 4 participants and frequently has issues with even 2 participants, Duo is only usable on Android/iOS as far as I know... does anybody have a good client that they can actually recommend?

Myself, I've gone back to using Ventrilo because I realized that I really don't care about having video.

I mean sure, video chat is overrated... but the tech is there, it just turned out it isn't quite as desirable as first imagined!

Firefox Hello worked remarkably well IMHO, it was super simple to use.

The field is full of video chat clients though. Uber-Conf is popular, although I think the % of problem free calls I've had with them is less than 50%.

Zoom has worked well for me, no real issues.

Facebook Messenger, privacy issues aside, works well.

Have a look at Vsee. Aimed at doctors, meaning they're probably reliable and secure. And they have a free version.

> Go from writing a UI in C/C++ to writing it in any of the higher level languages.

I've never done this in C/C++ specifically but I have to imagine that the GTK bindings are mostly language agnostic. Using a tool like Glade[0], you can build a GTK GUI with ease.

Glade was released 21 years ago.

[0]: https://developer.gnome.org/gtkmm-tutorial/stable/chapter-bu...

I don't remember where I read this, but that reminds me of the "Perl Paradox": if your job listing requires a "Perl programmer", you get worse programmers, even though a good programmer can adapt equally well to any language, including Perl. If you require a "Perl programmer" specifically, your only applicants will be people who could never adapt to a new language.

Edit: Ah, it's actually named that and I found a lot of results for it.


Interesting. I legitimately liked perl back in the day. The ability to do things like inline c code, etc. was great. But I've been immediately passing on any roles that mention Perl for a long time.

Whoa, I had never even heard of c-inlining. Then again, I never really got into perl.

> they simply say the company was "ageist" and blame it on age discrimination.

This is a topic I'm very interested in (and as I've gotten older, my interest has only grown).

In ~20 years as a coder I've seen only a handful of overt ageist issues. I've seen plenty of ageist language and attitudes, and even more assumptions, but with those it's a lot harder to answer "Yeah, but would they turn down a qualified candidate over that?" I've known people who were both experienced and open-minded, as well as those that were experienced and closed-minded.

End result: No idea what to believe. I'm not comfortable assuming one way or the other, because either assumption is harmful if incorrect.

I am divided on this. On the one hand straight up age discrimination is definitely a thing. A surprising number of startup types don't even know it is illegal so they are completely explicit about it.

On the other hand, companies who engage in age discrimination are probably shit companies, so are you really losing out by them not considering you?

> companies who engage in age discrimination are probably shit companies, so are you really losing out by them not considering you?

If they have the money to make my life easier, and I end up not getting it, then yeah, I lose.

If they have the experiences that make me a better dev, and I end up not there, then yeah, I lose.

If they aren't actually that bad, they just have this one area of ignorance, but I never get to experience all those other benefits, then yeah, I lose.

If I'm NOT in a hot tech center and this is one of the relatively few jobs that is available/is a step up, and I don't get it, then yeah, I lose.

I myself have nothing to complain about, at least not in my experiences to date, but in terms of considering the issue overall, I don't think it can be so easily dismissed. I recall plenty of older techies struggling after the end of Y2K and the dot-com bust when the market wasn't "If you want a job, you can get a great one", and I assume there are plenty of people in the above categories.

> They simply say the company was "ageist" and blame it on age discrimination.

Except they don't "simply say that".

It's a fact of the industry that is backed up by (a heck of a lot of) experience and observation, unfortunately.

In that sense, it's kind of condescending to deny the reality of the (sometimes painful and frustrating) experiences many people are having in this regard.

I think larger enterprises tend to be more friendly to older programmers. The companies move slower, so they have more legacy code and less interest in trends. Less fun to work at, though.

I'm in my early 30s and very aware of this, to the point where if I could go back I'd pick a different career.

My strategy is: live in the bay area so I can get paid like crazy, live a very frugal life much below my means, invest diligently, and hopefully have enough money to retire in my late 30s (in a much cheaper area, of course), so that when I'll be told that I'm too old, I'll raise my middle finger and leave the scene with the few million dollars I saved.

Hi. I’m you from ten years in the future. Do not be afraid.

I wanted to pop back in time and say thanks. Yeah, you didn’t retire in your late thirties like you thought you would, mostly because it’s harder to make FI cash than you thought. The crash of 2020 took a few years to get past, but hey, 45 is the new late thirties, as they say now.

The reason I’m thanking you is you went ahead and struggled and saved anyway while things got tough and now you are free. Just FYI, I texted my QUIT to the HR bot in the Philippines just before I came back to see you.

Btw quitting is harder than you think. Just to let you know, I don’t blame you for not quitting at 40 when you could have eked by: they offered you 2x what you’d ever made to stay. Yeah, the place was thalidomide for your soul, but then your wife (I’m not telling) got pretty sick and that cushion was worth it.

Also: take care of your health and sleep more. An extra ten years of long hours at the terminal has left me not as healthy as I’d like.

Haha thanks for this, I had a good time reading it.

To be fair, I don't plan to retire at 39 or 40, as long as someone hires me with reasonable compensation (2x would be a dream!). I'm purely trying to protect myself against a foobar scenario where indeed, I become obsolete from the point of view of most employers, due to "old" age.

They don't pay you 2x to work at a dream job. It's 2x because it's "thalidomide for your soul". Be careful.

I'm 40 and making the most money I've ever made in my career, fighting off hiring managers begging me to interview, and doing the most exciting work of my career. Also, you should leave the bay area now, not later. Your crazy money is getting eaten away by living expenses and you could have higher discretionary income in a smaller tech hub, and you could buy a house now and build equity instead of renting in the bay area.

Seconded. I make good money, am nearly 40, and work remotely while living in an inexpensive small town near a reasonably fun metropolis. I’m probably going to be able to retire in 5 years. All of this would have been much, much harder in a different career track. Computer science was a great career choice for me.

>Your crazy money is getting eaten away by living expenses and you could have higher discretionary income in a smaller tech hub, and you could buy a house now and build equity

You're assuming that he would actually want to stay in that other place long enough for the house purchase to make financial sense. You have to own a house at least 5 years for it to work out, minimum. Moreover, in the big tech hub, you're making more money, so if you keep your living costs low, you'll bank more cash than living someplace cheaper.
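The 5-year rule of thumb is just amortizing one-time transaction costs over the monthly rent-vs-own gap. A rough sketch with made-up illustrative numbers (the percentages and dollar figures are assumptions, and the model deliberately ignores appreciation, mortgage interest, and opportunity cost):

```python
def breakeven_years(price, buy_cost_pct, sell_cost_pct,
                    monthly_rent, monthly_own_cost):
    """Years until one-time buy/sell transaction costs are recouped
    by the gap between rent and recurring ownership costs
    (taxes, upkeep, insurance)."""
    one_time = price * (buy_cost_pct + sell_cost_pct)
    monthly_gap = monthly_rent - monthly_own_cost
    if monthly_gap <= 0:
        return float("inf")  # owning never pays off in this crude model
    return one_time / (monthly_gap * 12)

# Illustrative only: $400k house, 3% buy-side + 6% sell-side costs,
# $2,000/month rent vs $1,400/month recurring ownership costs.
print(round(breakeven_years(400_000, 0.03, 0.06, 2_000, 1_400), 1))  # 5.0
```

With numbers in that ballpark, you're underwater on the transaction costs for roughly the first five years of ownership.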

> you should leave the bay area now, not later

Curious, why? I am having no problem being hireable right now at all (literally just joined a FAANG recently).

As far as I know, it would be impossible to get the same pay I'm getting now anywhere else. Even after considering the high cost of living, the spread between income and cost of living is very high, and allows me to stash away a significant amount of savings every year, to pay for my future freedom. I really doubt I'd be able to save that much if I were working anywhere else.

What else would you suggest?

Okay, it's crass but let's talk numbers. I'm currently earning $210k/year in Austin, TX. My monthly income is about $12k after taxes, and my rent is $1,400 for a three-bedroom house 20 minutes from the office (wood floors, nice appliances, yadda yadda). I save about $9,000/month, give or take. When I was looking at relocating to the bay area I crunched the numbers and couldn't find a realistic scenario where I wouldn't be taking a relative pay cut.

Damn... I may be more underpaid than I realized... I keep hearing from my employer that they won't pay me comparably to my SF-based co-workers as I'm remote, "Texas is cheap", and "No one there makes over 100k". I think it's time to haggle a bit harder...

No way, I have a friend that's a junior front end developer with three years experience making $90k. When I do get numbers in emails from recruiters it's 140-170 for a senior role.

You have a nice situation, but my goal is mostly saving money for the future while living a decent life now (and admittedly I am minimalist so many things people care about don't bother me much). I have obviously higher cost of living expenses (living frugally I do it in $50k-$60k/y), but just my base salary is higher than your compensation, and with RSUs I get easily into ~400k/y total comp or more depending on the year, so basically I end up saving a lot for the future, which is really the metric I'm optimizing for. I'm in my very early 30s and already saved up ~1.5M liquid.

I don't own a house here in the Bay; I dump everything into very diversified index funds and some rentals out of state, since I'm not interested in staying in this place long term or playing the million-dollar mortgage roulette.

The day I call it quits, my partner (she does well too) and I can move to Austin TX, or anywhere else really (we're both immigrants with citizenships in really, really cheap countries), with our fat liquid assets, not having to worry about finding work in my 40s. At least that's my very optimistic plan; it might completely not turn out like this, or I might die tomorrow.

Man, if I had 1.5 mil in cash laying around I'd be on the first flight to Spain or Italy.

While that might seem like a lot, if you estimate the money is invested to just keep up with inflation and no real growth (very conservative assumption, but still possible), for $1.5M to last you 60 years (i.e. from 35yo to 95yo, completely depleting the capital at the end of your life essentially) you could just withdraw $25k inflation-adjusted pre-tax every year. I'm not sure you could necessarily live that well in Spain or Italy with that, without working. I happen to be originally from a cheap European country, and it's not that obvious.

So it's really not that much; most people who get private or state pensions get a better deal than that from a cash-flow perspective.

What about the 4% rule (put it in an index fund and hope for an average 7% return, leave 3% for inflation and live off the rest)? I always figured that would work out to $40k per year per mil. Is that naive?

Edit: sorry for the ninja edit. I read your comment too quickly.

A few points:

1) If you keep it under your mattress it will be much less than what I described, because every year your nest egg will shrink due to inflation, so you'll just be able to take $25k non-inflation adjusted, which is a big difference from my $25k inflation adjusted (in 60 years, $25k will be $150k at a 3% inflation). My assumption is 0% real growth, not 0% nominal growth, which is what you'd get by keeping it under the mattress.

2) The 4% rule is based on a shorter retirement interval (30 years) than what I'm looking for (60 years). Try to go on firecalc.com and look for the statistical odds of 1M giving you 50k/y for 60 years: the failure rate is higher than the success rate, and that's based on historical data.

3) Yes, my assumptions are very conservative, but I don't believe index funds will return 7% nominal over the next few decades, the world is going to face too many problems in my opinion. That being said, pretty much all I have is invested in index funds despite my opinions (mainly because I wouldn't know where else to invest it, since both cash and bonds are sure losers to inflation), so in the best case I'll be pleasantly surprised.
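For what it's worth, the arithmetic in points 1 and 2 can be sanity-checked with a tiny constant-return simulation (everything in today's dollars, so the return is net of inflation; a steady-return model is optimistic because it hides the sequence-of-returns risk that firecalc exists to measure):

```python
def years_lasted(nest_egg, real_return, annual_withdrawal, cap=100):
    """Count how many inflation-adjusted withdrawals a nest egg
    supports, assuming a constant real return, capped at `cap` years."""
    years = 0
    while years < cap:
        # grow for the year, then take the withdrawal
        nest_egg = nest_egg * (1 + real_return) - annual_withdrawal
        if nest_egg < 0:
            break
        years += 1
    return years

# 0% real growth: $1.5M at $25k/year lasts exactly 60 years.
print(years_lasted(1_500_000, 0.00, 25_000))  # 60
# A steady 4% real return sustains a 4% withdrawal indefinitely
# (hits the 100-year cap) -- real markets aren't steady, of course.
print(years_lasted(1_000_000, 0.04, 40_000))  # 100
```

The gap between those two runs is exactly how much the outcome hinges on the real-return assumption.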

I regret trying to give you advice. You have clearly planned your life much better than I have. :)

That is quite conservative. I would think 3%, or $45k adjusted for inflation each year, is still a very safe assumption.

Totally fair skepticism. If you wanted to make it last just 30 years (standard early-retirement duration) then sure, I'd agree with you, probably even 4% would be totally safe. But 60 years? Too many things can happen, I have to be super conservative before taking the decision (forced or not) to pull the plug, hence my estimation.

Again, quite possibly I'll die much younger without even enjoying any of that freedom :-)

A thing to also take into account is that it is very unlikely you will live the rest of your life without going into some other money-making venture. People who know how to accumulate money tend to accumulate money.

Basically, the people who recommend moving someplace cheap are people who value stability over cash savings, and want to have a big house for a family. If that doesn't fit you, then it doesn't make sense for you. If you're single and live cheap, living in the high-CoL area makes much more sense.

If you are planning on staying in the bay area for 10 years to save money, why not move some place with a lower COL and buy a house that might appreciate significantly in those ten years? You can make good money in Austin.

There's no guarantee housing will appreciate, in fact it's a pretty good bet that it won't. Housing is already unaffordable in many places relative to local wages, so it looks like another bubble that may burst.

Also, Austin, while not as bad as other parts of Texas, is still in Texas.

> Also, Austin, while not as bad as other parts of Texas, is still in Texas.

I've lived in six cities on three continents (and I'm talking signing a lease, not staying for a month) and I can assure you Austin is no better or worse than any other metropolitan area regarding... wait, what metric exactly are you using here? Weather? Burrito size? Number of smug, passive aggressive west coast urbanites per square mile?

> Also, Austin, while not as bad as other parts of Texas, is still in Texas.

So, what's the disadvantage?

It's a felony to receive unregistered chemical apparatus, such as an Erlenmeyer or Florence flask.


That does seem a bit excessive, but is getting a transfer permit actually difficult? And it's not like California isn't doing the "war on drugs" either, with the same success.

I'm 45, soon to turn 46.

If I could pop back in time, it would be to (somehow, because I didn't back in 1992) recognize that the action was in the Bay Area, and move there instead of staying in Phoenix.

But here is where I am, and while I make a decent salary, I certainly am not retiring any time soon.

The whole age thing is standing in sharp relief in my mind currently, because things are slightly "up in the air" at my current employer; I want to stay employed here, and I'm sure my employer wants to keep me, but recent events may potentially cause the relationship to have to be dissolved if the projects and numbers don't line up. I can understand that from a business perspective - fair's fair.

But the prospect of interviewing yet again, 3 years after getting this position, is not something I look forward to. I have new skills I can bring with me that I didn't have before, so that's a plus, but I worry that the age thing may be a barrier.

I'll probably have to lower my salary ask just to have a chance, but I won't do that unless I don't get any traction after a few months of looking - should I have to do so.

Even with that, though, I tend to wonder if the question in the minds of those hiring will be something akin to "Why isn't he a team lead or a manager?" That's a role I've never held or been offered, possibly because I've never been able to stay at a company long enough to get to that level: either they go out of business or get sold 2-3 years after I join, or the company is just too small to have any way of advancing into such a position. (I've found that I fit best in companies of under 50 employees, and the fewer the better.)

Maybe if there's another time around coming for me, I should look into startups, though that seems like a bust from the get-go (ie - find a startup, in Phoenix, who needs a 45+ year old software engineer - that's probably a very small pool, if it exists at all).

It is what it is, I guess. I am fortunate in that I am debt free, at least.

Don't feel bad - I am in this situation as well. I believe your assessment is true with regard to the stereotype / first impression we make on hiring managers, etc.

If I could go back in time I would have done the bay area thing when I was young too. But life is a bit of give and take: there are some good things that happened in my life by not taking that course (even if they didn't all work out as planned). So you kinda have to appreciate those parts of life and accept the fact that you are maybe not in the financial situation you wish you were in.

The only guarantee we have in life is this moment :)

Old people don't become useless; they have much-needed experience that has to be harvested. Just because you use the latest fad doesn't mean you won't dig the same pit, or just a slightly different one. The young who realise this will stay ahead.

For sure, but in this case I apply the investing's theory "the market can stay irrational longer than you can stay solvent", to something like:

"the software engineering labor market can stay irrational towards older people longer than the 2 decades past 40 where I could stay effectively employable".

I will be 53 this year. I am, strictly speaking, an IC. A leaf node.

The company where I work administers a survey of all tech employees every year, then publishes the results as a series of graphs, which can be broken down at any branch in the org chart. The very first question is "how many years of industry experience do you have?" The graph for my team is, shall we say, bimodal. There's a bell curve between 1-4 years, and a spike at 30 years.

I've been building things for a long time, and I try to impress upon the other developers on my team (or in my org) that I've never, ever, had an opportunity to build at the scale that they're being asked to build at in their first gig out of school. If I come off as negative, it's only because I can say with surety that X won't work because it didn't work in the past. I expect to learn, with my team, whether we can make things work at today's scale.

There are times that I miss hours and hours of uninterrupted coding, and being inventive at small scale. My leadership would take me out to the loading dock and kick the shit out of me if I tried to contribute at that level.

My job is, essentially, to educate, and slap the team off of local optima. To impress upon them that their customers are king, and that their customer base includes themselves and their teammates, paged out of bed at 3am. And to impress upon their management and the business that we really do need to focus on improving operations from time to time, rather than just adding features. It is also to observe the progress of, and advocate for, the developers on my team.

I feel (at the time of writing... Talk to me on Monday afternoon) like I'm lucky that my employer points senior engineers at problems like this.

I got to interview a lot of people in the past 2 years, and I noticed that new grads are REALLY good at solving Leetcode-type problems.

The expectation in the industry is that Senior people would be even better at it, because of their years of experience.

This, however, is not what I’ve seen on the ground. Most senior devs that I interviewed, who were clearly smart and experienced, took much longer than new grads to solve most of the questions. Personally, I try not to hold it against them if they can clear the bar, but I wonder how many other people do the same.

Also, I know in music they do blind auditions, where people perform behind a curtain to avoid any biases (gender, race, etc.).

Maybe it’s time to do something similar in software, since we already prioritize whiteboard interviews above everything else?

Edit: I am not advocating for Leetcode types of questions. I do believe that most companies hire like this (including my current company, where I get minor if any say in hiring process). So just pointing out my observations regarding the status quo.

> The expectation in the industry is that Senior people would be even better at it, because of their years of experience.

The fact that a person with years of experience solves an interview question more slowly than a fresh out of college grad should be a huge red flag that our interview questions are bullshit and don't reflect real world challenges.

In one hour, you are supposed to determine what will happen if they are left alone in their office to work for a year. How can you do it?

Exactly. So the arbitrary line drawn by leetcode interviews filters in one particular way and benefits people closer to college age, when these kinds of problems are fresher in the mind. As you get older, the number of leetcode-style problems you actually encounter in real engineering is pretty low.

Have a chat, discuss experience and see how well they actually understand the things they have put on their resume. Dig as deep as you can and see if they actually understand what they say they have done. Also, references are at least as important as the interview.

I can only speak for myself, but hires on my team are usually based on my recommendation and I mainly try to tease out whether the candidate has enthusiasm and strong opinions about the languages and frameworks we work with. If I ask what their favorite stack is, or how they prefer to organize a project, and they just mumble their way through the answer, I think they are probably not a good fit.

Depends on the experience level of the interviewee. If they are fresh out of school I expect them to know algorithms and data structures and maybe some specific topic they studied at an advanced level (like compilers, db, graphics, etc). If they are 5-10 years into a career, they should have a lot of knowledge about whatever it is they've been doing. If someone says they have 5 years of Android programming, I'm going to ask them a lot of Android API questions as well as Java.

In either case, there are questions that have multiple difficulties of responses. The more experienced/capable they are, the more nuanced and complete their answers will be. For example, ask a Java person how GC actually works. Most people have very little idea, but the best talent knows a good bit more.

But there are other variables. I've known perfectly intelligent and productive coders who took months to build a really complex well architected system that didn't solve the business problem and hence were a bad hire.

>I've known perfectly intelligent and productive coders who took months to build a really complex well architected system that didn't solve the business problem and hence were a bad hire.

That's an interesting statement. Isn't it usually the job of product managers (or someone in management) to determine whether or not what's being done is actually wanted? It is not typical for developers to be brought on under the broad banner of "make us money please" and then set loose to email users and determine 100% of their own direction. Wouldn't the fact that the wrong program was being built have been caught in an informal "let me see how it's going" sort of status update?

Right. Maybe it just shows that the interviewer is too inexperienced to ask questions that are relevant.

Human beings went to the moon before we had these coding tests. The online coding test is itself indirect age discrimination; this insanity needs to stop. Except for a few high-end jobs, most jobs don't need any of this deep algorithmic knowledge. It has led to a cottage industry of books, websites, etc. Some younger folks prepare for 6+ months, go to weekend mock tests, and so on. Older folks don't have the time or motivation to do any of that.

I would say most do, which puts seniors at huge disadvantage.

Expectation: solves leetcode on the fly, the job will involve leetcode only.

Reality: piles of legacy code that requires deep insight into product, legal, and marketing details to understand and untangle.

Guess which one leetcode script kiddies will suck at?

If you can see that clearly smart, experienced developers aren't very good at leetcode problems, what makes you think an aptitude for leetcode problems is a good measure of a developer?

The question isn't how much you should hold it against older developers, but how much you hold it against leetcode as a way of assessing developers.

Seems like basic logic to me.

I don’t, but it’s not the reality. Reality is that 80% or more of companies (in my experience) rely on those types of questions for interviews.

Solving leetcode is not software engineering.

One possible reason for the lack of interest in older coders is that they're largely white and male. Hardly a sympathetic demographic in the current cultural environment.

For myself (note: am old), I'm just following the advice I've always given to other minorities that are discriminated against to various degrees. Which is: don't whine, and play the cards you were dealt as best you can.

The FAANGS may not, but many other employers can spot talent when they see it. Keep looking--there's a niche for everyone.

There are tons of smaller companies out there that struggle to find competent developers. They're not necessarily super attractive jobs, and aren't as high profile, but you can get paid well in a lower cost-of-living area to solve some interesting business problems.

If you set down any roots, you're captive to that company in most lower cost-of-living areas. There aren't many alternative employers generally, especially at the same pay.

It's kind of true, but every situation is different. I'm in a smaller city, not on the coast, and I'm doing ok. It's definitely more challenging in some ways. I make it through doing great work and networking. It's taken me a while but I'm building up a network and now I've got more opportunities available. YMMV, I know everyone has a different experience.

I usually joke that “there are always cozy government jobs which are usually starved for good programmers”.

Blenkhorn says that once she was back on the job market, the ageism she experienced was compounded by sexism. Despite her profound technical achievements, she was dismissed by recruiters as irrelevant and dull, as a “mom.” She recently completed a PhD in computer science and hopes the education will improve her chances in the job market.

Which is practically infuriating to read, once you take even a passing glance at her easily findable profile.

If recruiters are really saying that about her -- this just shows (yet again) how utterly useless they are at what you would think would be their core function:

That is, identifying and promoting technical talent.

I'm getting up there in age and have always resigned to the fact that I'll probably be making significantly less money in 10 years or less in a low skilled job unless I get lucky. I don't want to go into a full management role in software so it's just something I've accepted.

I'm not sure how I feel about it. Sometimes I just feel fortunate for the career I've had. I mostly wonder if I'll miss it or be glad to move on to a new chapter.

Same boat — mid 50's. (Crazy how I didn't see that happening.)

For me it's a combination of 1) I see the writing on the wall: they don't really want me in this profession any longer and 2) I'm getting tired of it anyway — increasingly less willing to (laterally?) "move my skills".

Begrudgingly learned Swift (I like it tho). Begrudgingly learned every new code management system that has come down the pike....

I'll switch careers and move on, go back to "service sector" pay scale.

Maybe I worry more for the current crop of programmers. Those of my generation did all right — had fun. The industry is changing and not necessarily in a direction that looks enjoyable.

Early 50's here. I am switching jobs now, I'm at the top of my game, and getting the biggest money I've ever made. There are several factors, I believe. First, keep up to date. Certain things are really popular right now and therefore attract the best pay, and more jobs are available. Second, be in the right place. For example, the SF Bay Area only has so much housing but has a lot of companies who are hiring, so they are competing for the people who are here. Third, project confidence, because you are in fact awesome because of all the cool shit you've built. If you walk into interviews resigned to making less money or taking a step down, I bet you'll find that actually turns out to be true. Fourth, on a regular basis (once a month or so) you should be having conversations with your manager about your career and next steps, and taking time to work on getting there.

One more protip. At a higher level, moving up further does not mean being a better coder. It means talking to people (often on different teams), thinking and generating ideas (and applying for patents), sometimes taking risks to prove those ideas, figuring out how to improve process, and generally communicating. If you believe you're not good at any of these, you can actually study and practice them and get better at them. I did, and it's working out for me.

How about moving into consulting? That's pretty much my plan. Older developers have both the experience and the expertise that companies are looking for in a consultant, and it pays pretty well.

Yeah, I'm worried too, here in my mid-40's. I just hope I can hang on to my current salary long enough to get my kids through college. My son is talking about majoring in CS, though - I'm wondering if I should be talking him out of it.

The doubling rate of programmers has been 5 years for pretty much most of the industry's history.

This means, right now, half of programmers have been doing it for less than 5 years.

This is the primary reason older programmers seem oddly rare: 87.5% of programmers are less than 15 years into their career, 75% less than 10.
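Those percentages fall straight out of the doubling assumption. A quick sketch (assuming steady doubling every 5 years, as stated above):

```python
# Under steady doubling every 5 years, the share of programmers with
# MORE than t years of experience is 2**(-t/5); everyone else is newer.
def share_newer_than(t_years, doubling_period=5):
    return 1 - 2 ** (-t_years / doubling_period)

for t in (5, 10, 15):
    print(f"< {t} years experience: {share_newer_than(t):.1%}")
# < 5 years experience: 50.0%
# < 10 years experience: 75.0%
# < 15 years experience: 87.5%
```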

Now the interesting question is - what's the churn rate for those same programmers? Be it into seat-warming work like management, fluff work like "sales engineering", or exiting the software world completely?

Why do you think "sales engineering" is fluff? Ref: https://en.wikipedia.org/wiki/Sales_engineering

I've worked with at least 3 older IC software engineers (close to retirement age), and their experience has been invaluable in foreseeing issues that younger people don't spot in advance due to limited experience. Also, the breadth of the work that these people have been involved in is very useful in being able to pull experience from other areas into an unrelated project. I genuinely don't understand the bias against this unless the person is unwilling to learn new things, which hasn't been my experience.

Hopefully this attitude will change as the valley ages and people will learn that a lot of software engineering principles don't change as quickly as the buzzwords do.

All of this boils down to the same thing. Labor arbitrage.

D&I benefits those trying to increase the supply of coders and, via supply and demand, reduce the rates. Older engineers are going to want more money, etc. Of course, a nice benefit of the movement (for those who want to exert downward pressure on wages) is that many D&I initiatives also tend to be ageist: accusing the previous generation of enabling bad behavior, stating that the majority of older workers are white/male anyway, etc. Older workers generally are more skeptical of the D&I efforts. The D&I supporters have a penchant for getting people fired for expressing any dissenting views against the cause... even if the person expressing the dissent is an under-represented person.

For the ones trying to reduce labor costs, it kills two birds with one stone. They get an expanded pool of labor. Plus, the D&I crowd does most of the dirty work of pushing out older workers. Now rinse, wash, and repeat with importing workers from low-cost countries. The rich get richer, and the rank-and-file engineers get caught up in the battles as either active participants or collateral damage.

It has very little to do with doing things that are morally right and all about money. The moral outrage distracts everybody and keeps the infighting going so that the labor does not organize and target the real problem.

At this point I am convinced that the stack treadmill most of IT is jogging on gets propped mostly by the people who want to up their salaries by keeping as many people out of their niche as possible. Older devs probably get hit by this harder than most, because I'd imagine the treadmill gets more tiresome as the time goes by, you see more hype cycles, and become more aware of it. But older devs are by no means the only ones who are affected.

"Keeping up with new technologies" is generally a euphemism for "uses a random set of frameworks I chose". Amusingly, devs who demand everyone learn their frameworks are often the same people who try to stop anyone at their company from using or doing anything genuinely new, because it would undermine their position.

This is, BTW, why there is so much talk about the "shortage" of software engineers right now.

> "Keeping up with new technologies" is generally a euphemism for "uses a random set of frameworks I chose".

This is a great point. I feel the same at work where a buzzword laden framework is more valuable than directly using a library underneath that framework.

I'll be 30 fairly soon and am I the only one who doesn't find learning new technologies all that hard? Is it going to get harder soon?

In my experience a database is a database, even if you call it something else. Really, a database isn't too different from flat files with some helper functions to make rooting around in them easier. In turn, that's not too different from storing the information in RAM. Different ways to get at them, but in the end you have the same bits in that string.
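To make the flat-files-plus-helpers view concrete, here's a toy sketch; the class name, file format, and keys are made up purely for illustration:

```python
import json
import os

# Toy key-value "database": a flat file plus a couple of helper
# functions. The point isn't production quality; it's that the
# access patterns differ while the underlying bits stay the same.
class FlatFileDB:
    def __init__(self, path):
        self.path = path
        self.data = {}
        if os.path.exists(path):
            with open(path) as f:
                self.data = json.load(f)

    def put(self, key, value):
        self.data[key] = value
        with open(self.path, "w") as f:
            json.dump(self.data, f)

    def get(self, key, default=None):
        return self.data.get(key, default)

db = FlatFileDB("toy.db")
db.put("user:1", {"name": "Ada"})
print(db.get("user:1"))
```

Swap the JSON file for an in-memory dict, or for a real storage engine, and the interface barely changes; that's the interchangeability the comment is describing.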

Same with programming languages. I may not know the specific one you use, but I move between C, Python, Javascript, and Go without much friction (Jetbrains IDEs are helpful for when you forget specifics of syntax after time away from a language). Sure, I have to look up how something is done in the standard library of one language vs. another, but I've gotten pretty quick at that. If I'm working on an existing codebase, I can look at how similar problems are solved and replicate that style.

Containerization, Lambda, serverless: it's the same old code running in new ways. I don't know what value to edit in the config file, but that's just a Google search away. At the end of the day it's "I need to hit this endpoint and get this back." Same story, different cast of characters.

I don't mean to humblebrag or anything but it seems like a non-issue to me. I understand how computing systems work generally and the laws of physics impose some hard limitations that sort of massage any solution to look similar enough to any other that it isn't that hard to figure out what is going on. Sure I'll lack some of the deep knowledge that someone who has focused solely on these set of technologies may have but honestly most of the programmers I meet don't have that to begin with.

The problem isn’t really learning the new technologies, but more finding the right projects to practice them with. As anyone knows (but especially experienced devs), simply learning something doesn’t convey much mastery, especially if you are used to more mastery than simply newbie levels.

> Is it going to get harder soon?

I'm 34, so it might become different when I'm in my forties, but so far it's not getting any harder. Your summary below is spot on.

> I understand how computing systems work generally and the laws of physics impose some hard limitations that sort of massage any solution to look similar enough to any other that it isn't that hard to figure out what is going on. Sure I'll lack some of the deep knowledge that someone who has focused solely on these set of technologies may have but honestly most of the programmers I meet don't have that to begin with.

Not to knock your experience, but imho C, Python, and Javascript are the BASIC of modern programming languages. (I don't know about Go).

Additionally, there are different levels of knowing a language. You can write perfectly workable programs in C without ever touching function pointers. So what's your level of mastery?

And then there are more difficult languages. Have you tried C++ or Scala? Ever explored the esoteric intricacies of Java's type system? Lisp? Objective-C? Kotlin? Ada?

Ok, that's languages. How about parallelism? Massive parallelism? Tight resource constraints such as limited memory? Error handling? Modern language and compiler design?

I'm not trying to say you need to know all these things to be a decent coder. My point is there are many harder problems out there and maybe if you're bored you might want to try working on some of them.

"I'll be 30 fairly soon and am I the only one who doesn't find learning new technologies all that hard? Is it going to get harder soon?"

I'm 40, and it's easier than ever.

I do get a bit pissy now when I have to spend a week learning some technology that I end up using for 15 minutes to do one thing, and then discard, but that's about it. (That has, unfortunately, happened.) But otherwise, it's easy. I've done non-trivial development now in languages that I don't even properly know, I just know all the surrounding languages and how to Google fast enough to make up for the rest.

I will also get a bit pissy when I do one project over here and they're all in Ansible, and I do another project over here and it uses Salt, and then I tweak something over there and it's in Puppet, and then I have to help out over there and they're in Terraform + Docker + some home grown Compose thing that predates Compose, etc. That general category of things was a legit advancement and I prefer using them to what came before (in my world, nothing but precious snowflake servers), but I'm frustrated that I have to keep relearning the easy "surface syntax" stuff over and over again for what is almost the same exact task.

I will also admit to a bit of impatience with the "re-inventing the wheel" phase of new technologies. For a recent example, the initial AWS "serverless" stuff (dumb name, inevitable idea) was written with like zero attention paid to the basic needs of source control, or the utter inevitability of how your service that was designed to have like 10 5-line scripts in it will be used to implement the next Amazon, so I stayed far away from it. Even now I'm not sure it's all that great. Docker I'm still not all that up on for somewhat similar reasons; for all the "innovation" in it it took years to be able to do basic things halfway properly. I'm definitely kind of done riding the cutting edge in the sense of chasing down communities that I can look at and tell are just going to be chasing their own tails for the next couple of years.

Sometimes a tech community comes out led by a person or a team that has experience, though, and when that happens I'm all over it.

Plus at this point there's literally so many things going on that nobody can keep up, not even the "young bucks". There is not a single person on this planet who is at expert level with all of the cream-of-the-crop Javascript frameworks, and there hasn't been in a long time. There's just too many now. When $YOUNG_GUY is up-to-date on 80% of the hot new shit and $OLD_GUY is sneering at it and up-to-date on 0% of it, back in the 1990s and such, yeah, $YOUNG_GUY might have an advantage. But when $YOUNG_GUY is up-to-date on 3% of the hot new shit and $OLD_GUY is up-to-date on 1.5% of it, and also knows a lot more stuff and can drop into a much wider array of projects and contribute (with the contribution being something other than "Let's rewrite all of this in $HOT_NEW_SHIT before we do anything else!"), the difference is far less extreme, even on the new tech's terms.

Unsurprisingly, computer programming isn't such special and amazing stuff that, almost uniquely in all the fields of human endeavor, you're worse after doing it for 20 years than you were after doing it for 2. Other than cases where the body itself just breaks down, like professional-level sports, almost nothing works like that. I think it's a historical accident of some very unique circumstances in the 1990s that have lead anyone to believe programming is different. I think the idea has gone a long ways towards dissipating already, but has a long way to go yet, too.

I was typing up a comment and couldn't get it right. You nailed it.

It seems many people either think older coders are experienced and wise, or old and stagnant. Younger coders likewise either young, foolish "wheel re-inventers" or up-to-date and ambitious.

So far, I've not seen any pattern to back this up. I've worked with a few coders in their mid-20s that knew PHP or Java very well, but refuse to learn a front-end framework or Node JS. They act as though the burden of learning a new stack when they've already learned what they know from school was too much to ask.

Likewise, I've worked with coders in their 40s and 50s who still love learning new technologies, and are more than happy to learn the new "Javascript framework of the week", especially if they see the benefits it provides over what they currently use.

I'm in my early 30s and have a strong distaste for Electron, but I know a mid-40s coder who loves Electron and would never code a desktop app any other way.

I find myself reading about Smalltalk and NeXTSTEP and am continually impressed by the solutions of the generations before me (and their respect for their users' resources), while the coder in his 50s I've worked with loves that we live in a time where trading RAM and CPU for easy maintainability is the norm.

What bothers me isn't just the ageism, but also the absolutism based on anecdotes. It seems to me that the same person who would dismiss an older coder based on age, is also the kind of person who would, say, "never hire someone with C# experience even if they have Python experience, because we use Python and anybody with C# experience is slow and inflexible" (real thing a manager once said) or "never hire someone who prefers to use Sublime Text because we use a 'real IDE' like Netbeans" (again, thing a manager once said).

Hire the right man or woman for the job. Pick the right tool for the task at hand. Be open to new technologies and using them for their strengths, but also learn from the good and bad things of the past. None of this seems that difficult to me, yet it seems is a continual problem in the field.

I'm 53, which makes me about twice the median age at my company. A lot of this rings true, but I'll add one more twist on the "hard to keep skills up to date" aspect: keeping up with all the technological churn across the industry is damn near a full-time job.

When you come out of college, you might be pretty close to the state of the art because you've been doing nothing other than getting to that exact point. Then it suddenly gets harder, because most of your time is spent working on whatever technology you already have instead of ramping up on new ones (plus eventually family and such).

Sure, you can try to weave as much learning as you can into your job, but it's almost never enough. Most people will end up getting further and further behind, or more and more specialized. The harmful effect of the first becomes apparent quickly; the second can actually be quite lucrative for a while until suddenly it isn't. Either way, at some point your knowledge becomes pretty severely devalued until you go back to spending near-full-time to catch up.

It's hard to blame employers for favoring newer knowledge over older, but the effect on older workers is still pretty profound.

One thing that I've noticed is that a lot of "new" stuff gets less exciting over time, because you have seen it before. First time is always exciting, but when it gets rediscovered later with relatively minor incremental improvements, it's kinda meh. And that, in turn, affects the motivation to learn the details (that you need to learn to be able to use that stuff productively).

Maybe a little off topic, but I find it funny how big a deal people make about going to tech conferences. I never got it.

For context, I'm 37. I watch conference talks on topics I find interesting, but they're often just mildly informative, definitely not the type of thing I'm going to travel halfway across the planet to see live. A lot of the time it seems to be more about making a name for the person or the company they work for. The number and prominence of tech conferences has exploded in the past few years; I don't even remember hearing about these at all when I started out (mid-00s).

For those doing it right, it's about networking and/or making a name for themselves rather than the actual content of the conference. I've gone to a couple and agree that the content itself generally doesn't warrant the trip.

> But the IC track is flawed. The programmers I spoke with said that promotion is slower on the IC track, and the distinctions between titles are blurry. According to David Golden, a 45-year-old engineer at MongoDB, “In the development-only track, there’s a bigger hurdle for me to move to the next level. It’s not clear how you get from one to the other and whether it’s something you can actually do anything about.”

In my opinion, this is both true but not a problem. The reality is that it's much easier to have a large impact as a manager than as an individual contributor. I have worked with L5's, L6's, and L7's, and honestly I couldn't tell the difference between them. That's not to say the higher level people didn't deserve to be higher level as they had more accomplishments, but I personally didn't really feel like they had much more impact, if any.

For all but the largest and most complex projects, there aren't really going to be sufficiently difficult problems that it makes a big difference. I don't think I've ever been on a project where there were more difficult problems than engineers capable of solving those problems for any level of difficulty.

> Based on interviews with a half-dozen programmers, it is clear to me that companies should create a qualitatively different role for their most senior individual contributors. Candidates for such roles would be judged by their past effectiveness, the same as managers are, not by a fast-churning checklist of skills. Greater clarity would mean engineers could climb the ladder faster, and the prestige and renewed intellectual challenge of each level would keep programmers motivated into their fifties and sixties.

> Proven engineers who occupy the most senior roles should be deployed to solve the hardest problems on the most critical projects.

At least at Google, being judged based on past effectiveness and being given the hardest problems is already true. Or rather, higher level engineers have greater independence to be able to choose what problems they'd like to solve.

Employers don't want to hire jr engineers because they're too inexperienced, and are too liable to make mistakes. And employers also don't want to hire older engineers who are too expensive, and are less willing to work nonstop. Those are strong signals about what employers in general think of employees.

Try to replace the words "old developer" with "woman developer" or "black developer" and playback the comments people make. It ends up pretty appalling.

People who are good at development can't be determined by any outside criteria -- you have to talk, test, and learn about them. At my company, we have developers from early twenties to early sixties. We don't care about any race, gender, background, anything -- all of it's immaterial to whether you can create and code. Join us: https://www.datakitchen.io/company.html#hiring

> In 2007, Mark Zuckerberg, then 22, said out loud what many in the software industry think: “Young people are just smarter.”

How dumb or gullible does one have to be to actually think this? I'm betting it's mostly the 20-something-year-olds who would subscribe to such stupidity.

Let's just accept that there is no track for programmers, period. There are no promotions beyond senior at most companies. You peak in your early 30's, and then you continue to work at about the same compensation and level for the rest of your career. A few people can find more interesting work, but it will require jumping to different companies and a lot of self-training. Unless it's at FAANG, all that work will be for an extra ten, twenty, or thirty thousand dollars. And raises? I've never heard of such a thing in any industry these days.

The best option is to keep on top of your skills, find a job working 25-40 hours a week, possibly remote, and find some meaning in other parts of your life. Work isn't everything. It's not even the most important thing.

There are and will be plenty of jobs for older coders. It's a new profession, but older coders who stay on top of the tech are still plentiful. You just need to know where to look. Also, many people don't make it as coders period. If there is a shortage of old coders, it's because people drop out because they can't make it. Even better for the rest of us. As if being in your late thirties is old. What bullshit.

I'm ancient by HN standards, but I still find my skills are in demand. I would categorize myself as a lifelong learner; I got into python over 15 years ago, mongodb almost 10, and I have taken the time to understand what I need to know about front-end frameworks like angular and react. One employer called me a Swiss army knife, and I take that as a compliment. My knowledge is quite broad and quite deep, though I'm not a super expert at any one thing and I can't claim to know everything about everything.

However I have several friends around my age who never bothered to learn anything beyond C++, and they are definitely struggling. They either are stuck in jobs because they're the only person who knows how to recompile some ancient module, or they're out of work and freaked out about the future.

Ageism is real, but so are some of the stereotypes. Older programmers are often going to be more resistant to change, and can be more sure of themselves even when they're wrong. The flipside is they often have forgotten more than a young programmer even knows.

Present Valley hiring practices seem to optimize for general aptitude, and downplay or disregard the value of experience.

There's also the undeniable fact that older people in general have built up more obligations and responsibilities, and are often more skeptical of the value of equity because they've been at a few startups and their options never amounted to anything. Therefore, they are going to prefer stability and cash over the promise of future rewards.

I think to age well you have to consistently push your limits. "Learning new things" and "keeping up with tech" is a bit overrated.

If you're 50 and you've spent many years drilling down through increasingly deep scalability and throughput challenges, that will increase your value beyond what any fresh-out could provide.

If instead you spent your time learning a new JS framework each year and are proficient at 30 of them by age 50, good for you, but good luck finding a job.

The difference between the two cases you give doesn't have anything to do with limit pushing. Instead, it's a matter of specialization. If you develop some real depth, that's a different matter than just being a regular programmer who's been doing slight variations on the same thing for a long time.

That's not to say that the specialists have it easy, but they do have an advantage here.

I think it does though. Or at least I intended it to.

In the first case I didn't particularly mean to specialize, but just to continuously do things you really don't even know if you can, to continually push against what you think is possible for yourself: create a js framework for instance, even a terrible one that you abandon after a year and move on to a completely different project, but that requires you to dig and research and stretch your brain. The second case was more you know it's possible and it's just following a tutorial to get up to speed on the techniques; someone else did the "hard" work, and you're not really getting anything out of it.

The first thing gives you skills that age well: you'll learn about lots of gotchas about the internals of how js and presumably other language runtimes work etc, which will help in lots of different positions; the second just gives you a line item on your resume until that fad is gone.

Anecdotes are one thing, but I think the data are important. The article brought forth some interesting stats about the median age in the industry. Does anyone know where that data comes from, and if it's tracked longitudinally? What I'm curious about is if older programmers statistically are pushed out of the field, or if it's just the case that the field is growing like crazy and newer college grads are flooding into it.

I'm 63. When I was a kid (pick an age), there weren't anywhere near as many programmers as there are now. So while I believe lots of older programmers have gotten out of the field, one way or another, I think the remaining ones are simply hidden in the crowd.

This hasn't been my experience at the larger companies. There's always room to grow and stay in a purely technical role.

Prior to this I'd been doing freelance work and there it's more difficult. Especially if you don't have any particular unique skill or tight relationship with a solid client or a high level of marketing savvy, it's hard to compete with kids that are willing to do things at half the price. And I think that's fair: even though I consider myself a very solid coder, most consulting gigs I've worked on could have been handled by someone less skilled and worked out just fine.

I feel like if you're a knowledge worker who isn't interested in management, the IC path is short-lived in almost any profession (not just SWE), no?

To me, the future isn't very bright for anyone who just wants to be an "employee" for the rest of their life, coder or not.

OTOH, the manager path is dangerous in recession times. You lose a lot of "hard" skills, and if you lose your cushy manager job at company A it isn't necessarily obvious to company B what value you have when things are tight.

>the IC path is short-lived in almost any profession (not just SWE), no?

Depends on where you work. If R&D is core to the business, there will be a non-management ladder and plenty of near-retirement greybeards around the office.

Note: many fashionable names in "tech" don't fit that bill.

Too many people make the mistake of learning laterally, a little bit of framework X, a little bit of framework Y, language X, language Y, then wonder how they can be replaced by people much younger than them. Learn deeply; that is something that cannot be replicated by people without a whole lot of experience. Tech is a double-edged sword: it rewards people who are good at what they do regardless of their age (this might or might not be a good thing depending on where you stand on the age range).

The problem is that you can spend a lot of time learning something deeply when it's in a fad stage, only to see it become obsolete in a few years. And then you'll have to learn the new fad, which is completely different from the old fad.

OTOH if you learned a little bit about various things, it's very likely that one of them will be that next fad, or its precursor. So when it happens, it'll be that much easier to learn.

So it's a balance. Going either too deep or too broad can backfire.
