How popular media portrays the employability of older software developers (arxiv.org)
278 points by sbaltes 16 days ago | 393 comments

I've worked with two types of older engineers:

1) Ones who keep up their skills

2) Ones who don't

The former are a treasure trove of knowledge and skills, and provide substantially more value than anyone junior ever could. Going through so many computing eras gives a higher-level way of thinking about abstraction and a deeper understanding of computer architectures. They've hand-tweaked assembly, C, and Java, and when they're now writing JavaScript or Python, they understand all the layers of metal underneath. They've gone through flow charts, structured, functional, and object-oriented programming, and all the variants thereof. They've written high-speed algorithms to draw lines with pixels, to ray trace, and are now coding GPGPUs.

The latter are liabilities, bringing in 1980s-era best practices. They're working on some legacy BASIC or COBOL system from the seventies, and are surprised they can't find a new job when that system is finally replaced and they're downsized.

I've rarely seen #1 and #2 mix. They're very different crowds, and in very different types of companies.

My own experience is that I need to devote about 20% of my time to keep up, if I'm doing it on an ongoing basis, or about 1-3 months every 2-5 years if I do occasional deep dives. Basically, I dive headlong into whatever is the newest, trendiest stack, and get a product out using that, deeply learning all the deep things behind it too. That's what works for me. YMMV.

>My own experience is that I need to devote about 20% of my time to keep up, if I'm doing it on an ongoing basis, or about 1-3 months every 2-5 years

And that's the problem with software engineering vs. other white-collar careers. For example, my accountant friend is expected to be trained by his employer in the latest accounting practices and legal frameworks; he doesn't devote 1-3 months of his personal time to open-source accounting projects to learn the latest legal framework for fun. That would be crazy for him. Same for my friends in architecture, dentistry, and law. Their employers pay them to learn and gather the expertise needed for their future in the firm.

Whereas, as a software engineer, very few companies (at least in Germany, from my experience) will invest in their existing workforce to train them on the job in the language/framework they plan to use next. Instead they seek to let them go once their expertise is no longer valuable, hire someone already experienced in the needed stack, and repeat the cycle several years or decades down the road.

That's why here you're expected to transition to management as a career progression, since IC roles are not really valued at older ages unless you've dedicated your free time to coding. I don't know about you, but I'd rather spend my free time with my kids and exercising outdoors than coding to keep myself employable in the latest stack.

As a physician, I attend conferences, subscribe to online references, question banks, various journals, take ongoing CME, repeat licensing exams, and spend the equivalent of one workday a week reading those new materials, and try to spend a couple hours refreshing myself on materials outside of my specialty. This amounts to an extra un-reimbursed workday a week, and several thousand dollars a year.

When I worked in a place that offered CME/conference reimbursement, it covered about 1-2K a year, depending on budgeting issues. In my current place, and for all independent or small practice physicians, that comes out of your own pocket.

This does not have any career benefit whatsoever; it’s done so as to be worthy of our patients’ trust.

I wouldn’t mind if it was at least partially reimbursed though. It’s an enormous chunk of change, and not for my benefit.

That's quite a workload you've set for yourself. I honestly don't know how my doctors do even the basic stuff I see them do, like seeing patients and charting. When you have to see 4+ patients in an hour (this is too many!), it would seem to me that charting would be one of the things that ends up going out the window.

I also find it interesting that you go so far as to repeat your licensing exams. Is this common among physicians as a whole? Having known a couple of med students personally, these exams were usually seen as a hurdle to be overcome and a source of stress, but, I suppose it might get easier after a few years of practice. On a related note, I find it hard to imagine that, say, lawyers would routinely re-sit the bar exam for funsies.

Regarding CME, isn't that required to maintain licensure? Or, are you talking about courses above and beyond the minimum to keep your license?

And, BTW, I don't know who you are, where you practice, or even what your specialty is, but you sound like the kind of person I'd like to have be my doctor.

Interesting how people interpret things so differently. A doctor who says that keeping abreast of their field has no benefit to them sounds like the kind of person I'd not want as my doctor.

I agree, but I also know how chronically overworked doctors are. That gives me a bit of sympathy toward the ones who don't want to basically work an extra day a week just to avoid falling behind.

> This does not have any career benefit whatsoever; it’s done so as to be worthy of our patients’ trust.

"I do it for my patients, not for my paycheck." If you have a problem with that, then by all means, I'm sure you can find another physician that would suit you better.

They specified no career benefit. I'm almost certain they view increased trust from their patients as a benefit, and the extra confidence you gain from keeping up with the latest clinical science is hard to measure in terms of personal value but almost certainly comes out to $0 in financial terms (or negative if you value your free time).

> Is this common among physicians as a whole?

Some of it is, some of it isn't. The worst doctor you know (okay, maybe not the worst, but close) is doing at least a couple of major journals and his CME. Really, you'd be surprised, but sitting on the other side of the exam table, believe me - it's the exception that doesn't try to stay fresh. That's really not what distinguishes bad from good from great doctors - it's finding a way to integrate and retain all that knowledge so you can apply it in an unexpected clinical scenario, rather than as watercooler talk or on an exam.

I don't re-sit step exams 1/2/3, but I do a lot of ongoing question banks to refresh my boards, and I do go back to refreshing material from step 1/2/3 all the time (which is what I meant about repeating licensing exams- I see now that phrasing was unclear.) You're right that it's largely a hurdle and a stress, but that's because as a med student you're drinking from a firehose and your career depends on it. Studying it at my leisure, I can dive into things as deep or as superficially as is interesting at the time, and the broader my knowledge gets the more insights I ultimately glean from going back to those fundamentals. Memorizing biochem pathways when studying for boards is hell; refreshing biochem at your leisure just to better understand and retain is... well, if not pleasant, it's certainly not hell.

CME is required to keep your license, but that's not a problem that I have - there are multiple sources of materials that grant CME credits that I already do "for fun", so I've got an over-abundance of credits. I hunt-and-seek interesting CME courses to stay abreast of interesting things. I went into medicine for love of medicine, and the idea of getting so narrow into my niche that I lose sight of all of those other exciting things would be a tragedy to me.

As much as I appreciate the praise, honestly, you'd be surprised by how much even your least-impressive physician puts into staying up to date. There's just so much to know that the moment you stop your knowledge base evaporates.

> This does not have any career benefit whatsoever; it’s done so as to be worthy of our patients’ trust.

Keeping abreast of changes in the field and state of the art offer no benefits and is done solely for patients' trust?

Goodhart's Law

May be what OP is referring to.

I have seen it in teaching. Teachers need xx hours of training per year. Training often satisfies that requirement but provides no significant benefits in terms of pedagogical improvement or content knowledge.

Teachers that want to improve do so by other means. The training keeps us in compliance.

I've known more than a few high school teachers who ended up with Masters or PhD degrees kind of by default via continuing education courses. That would seem to go against your "teachers that want to improve do so by other means" idea, unless I'm confused about the nature of continuing education requirements for teachers.

My assertion is that continuing education credits, or advanced degrees are far from a guarantee of improving a teacher's practice. Continuing education suffers from "box checking." There are a number of reasons for this.

It is in no way a knock on teachers. They are caught up in a bad system and are responding to systemic incentives.

If we apply an always/sometimes/never framework to my assertion, we can find examples where teachers advanced their practice via continuing education. So the teachers you know certainly could have advanced degrees, some even very helpful in improving their practice.

My experience in K-12 as well as studying the history of education reform in America since Sputnik was launched inform this assertion. It has been a recurring theme for 60 years.

Credentialism - I saw this word used today elsewhere and it sums up what I am trying to say


We don't gain or lose patients by it; the most recent changes in the field are often so far from settled clinical practice that they're years from anything that would be considered malpractice; we don't get reimbursed better or for it; patients largely can't tell the difference, so it doesn't change your referral stream.

It does little-to-nothing for our careers. We stay up out of pride, and out of commitment for providing our patients with good care.

Medical literature has a pretty low SNR when you look at it from the point of view of "does this help my patients?" Also, PubMed is a thing that's roughly the medical equivalent of Stack Overflow, so you can do some of this "on the fly" to an extent.

They said no career benefits. They don't usually get a promotion for knowing about X or a salary raise so its effects are indirect if they don't actually use it regularly like say a dentist knowing about dental implant options.

That is a fair point and in that light the statement is far less jarring.

> This does not have any career benefit whatsoever; it’s done so as to be worthy of our patients’ trust.

That's what I said. It's surprising how many people chose to read into my post something that isn't there.

Does this surprise you? Interacting with my doctors has never given me the impression that they stay abreast of current literature, and their employment doesn't seem threatened by the deficit.

In medicine it's largely due to bribes from vendors.

Doctors around here don't do anything close to that, or pay out of pocket. Most will take sponsored vacations paid for by drug companies for being the top prescriber.

Same for dentists, usually paid for by some whitening product they'll push over the next year.

How would you compare your earning potential to a software developer's?

Doctors on average make more, but they also start their careers burdened with overwhelming debt (a few hundred thousand dollars) and work much longer hours than software developers.

Both my parents are doctors, and I'm a software developer. And you know what? I have it really, really good in comparison.

edit: I noticed that you phrased the question as "earnings potential" -- well, in that case, it's comparable. High-level engineers at FAANG make boatloads of money.

Good points. Doctors do appear to have good career longevity though (in the US). The practice my children go to has several doctors in their late sixties. The doctor who delivered my kids was close to seventy years old. So if you look at career earnings, doctors seem to be in a better spot, as they aren't worried about finding a job from ages 50-65. Is that your parents' experience with their colleagues?

In fairness the CME/conference circuit is basically a way for physicians to legally embezzle a free vacation. Many of those conferences are held at places like Jackson Hole, the Bahamas, etc, with relatively short amount of hours per day spent in talks and the rest skiing/relaxing/drinking etc on the hospital dime.

My anecdotal evidence does not match your anecdotal evidence. I'm approaching 50, and I've had about a dozen jobs in tech starting in my teens. I've never worked anywhere that didn't allow on-the-job time for learning, and the majority of my employers have both actively-encouraged it and financed it. I also learn on my own, for fun, as my career started as a hobby and still interests me, but the vast majority of my education has been paid for during normal work hours. I also actively seek out new and interesting technology when switching jobs, and I switch jobs when things stagnate. It's what you have to do in tech, regardless of your age. If you stay too long at a company that isn't advancing your career, you're going to go stale. This isn't specific to tech either. How many mechanics, lab technicians, chefs, marketing people, stock brokers, architects, lawyers, etc. could find a job today if they hadn't learned anything in 20 years?

> I've never worked anywhere that didn't allow on-the-job time for learning

I think the issue is more just that there aren't any clear ethical standards that have been set industrywide, and since developers tend to have limited oversight, at this point it's really just a matter of what standards you set for yourself.

IMHO, for things you're learning that will materially benefit your career, a reasonable standard would be that for every hour you spend on your own time teaching yourself that thing, you can spend an hour of paid time. Whereas for things that only benefit your employer, e.g. niche libraries or outdated frameworks, that should happen entirely on the employer's dime.

No field reinvents itself as frequently as software.

That's only true if you consider SWE to be "learning a JS framework".

The core methods and principles of SWE are almost timeless, other things are well documented and can be learned on the go.

It's extremely difficult to communicate this to a team exclusively populated by those who do not have cross-language experience.

As an anecdote, and not to boast: I have occasionally had to write data visualization and function plotting software at least half a dozen times in as many languages and frameworks, from QuickBasic as a teenage hobby, to C in DOS, Java (Swing and JavaFX), OpenGL, WebGL, JS with charting libraries, and the raw Linux framebuffer. I also tweaked MRTG charts back in the day, wrote a super basic 3D editor in high school for games I never ended up making, etc.

A team I was on once had to add a simple chart to a webapp. When the task came up in meetings and was causing the dev assigned the task some grief, I mentioned that I've had to do some charting work before, and offered to help with any details if they got stuck. Instead of saying something like, "Ok, I'll let you know if I have any questions," they said, "Yeah, but was it D3?"
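For what it's worth, the kernel of every one of those charting jobs, from QuickBasic to D3, is the same scale-and-rasterize step: sample the function, map values into a fixed grid, and plot a mark per column. A minimal dependency-free sketch (hypothetical helper, names mine):

```python
def ascii_plot(f, x_min, x_max, width=60, height=20):
    # Sample f across [x_min, x_max] and map each value onto a character
    # grid -- the same scale-and-rasterize step every charting stack performs.
    xs = [x_min + (x_max - x_min) * i / (width - 1) for i in range(width)]
    ys = [f(x) for x in xs]
    y_min, y_max = min(ys), max(ys)
    span = (y_max - y_min) or 1.0  # avoid dividing by zero for constant f
    grid = [[" "] * width for _ in range(height)]
    for col, y in enumerate(ys):
        row = int((y_max - y) / span * (height - 1))  # top row = maximum
        grid[row][col] = "*"
    return "\n".join("".join(row) for row in grid)
```

Swap the character grid for pixels, a framebuffer, or SVG elements and you have every other variant; the D3 question misses that the hard part transfers.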

If by "anything" in the phrase "hadn't learned anything in 20 years," you mean things like legal principles, rather than specific new precedents and laws, I suspect a lawyer probably doesn't have to learn much after they clear the bar exam. If any lawyers read this, I'd love to be proven wrong. :)

I think you're underestimating how fast the law changes. In a lot of practice areas (I work in tax), there can be substantial changes every few years, and in between, daily work changes as everyone converges on optimal strategies, and then the law changes again. Then there's new software constantly, which you generally need to be familiar with or get left behind. Then there are the gimmicks of the day that your clients find on the internet that you need to be familiar with or risk looking like you don't keep up with best practices (even if the new stuff isn't anywhere near a best practice, or is only applicable to Fortune 500 cos.).

I would go so far as to say that a lawyer who just recently passed the bar exam is substantially worse at the practice of law than a paralegal with 20 years of experience. The legal principles learned in order to pass the bar exam are akin to... basic algorithms (maybe?) for a software engineer. They're important, but they're also not really what the job is on a daily basis.

Nobody can argue what the next accounting rules are. Accounting practices are handed down from above and all accountants have to follow it.

Programming is not like that. If you ask 100 programmers what the new big thing is, you will get 100 different answers. I would much rather take responsibility for my own professional education than outsource it to a company that may not have my career best interests at heart.

>Nobody can argue what the next accounting rules are

This is just not accurate.

Accounting rules not only can be argued, they are all the time.

FASB and GAAP are rife with subjectivity.

I have an accounting degree.

True but the framework is standard, which is why there is no software engineering equivalent to GAAP. GASP?

I think a lot of this is because employee turnover in the software industry is much higher than for most professions.

An accountant, lawyer or architect can reasonably be expected to stay with the same firm for a decade or longer, often their entire career. It makes sense under that context for employers to invest more in long-term skills.

Whereas, once you get to the heart of Silicon Valley, it's not unusual for people to jump employers every 6 months. That's maybe not the rule. But even Google and Microsoft have turnover rates that imply a half-life of no more than a few years for the average employee. The economics of long-term retraining just don't make sense.
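The half-life figure here follows directly from an annual attrition rate. Assuming constant, independent attrition (a simplification), it's just a ratio of logs:

```python
import math

def implied_half_life(annual_attrition):
    # Fraction of a hiring cohort remaining after t years is (1 - r)**t;
    # the half-life solves (1 - r)**t = 0.5, so t = ln(0.5) / ln(1 - r).
    return math.log(0.5) / math.log(1.0 - annual_attrition)

# Illustrative: ~15% annual attrition implies a cohort half-life of
# roughly 4.3 years; ~20% brings it down to about 3.1 years.
```

The attrition rates are illustrative, not sourced figures, but they show why a half-life of "a few years" corresponds to quite ordinary-looking turnover percentages.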

Is this the worst thing in the world, though? It allows savvy workers to continuously jump between companies and renegotiate higher compensation packages. That helps make sure that workers are paid at or near their market value, in a way that doesn't work in the accounting industry, because future employers would look down on that resume history.

I managed a software development team, staffed with a number of folks that each had about 30 years experience. We were a fairly advanced team that wrote image processing pipelines in C++.

When they finally rolled up our team, after 27 years, the person with the least seniority had ten years.

It's entirely possible to keep folks for long periods of time, but it requires some pretty serious management chops, and those skills are not exactly encouraged in today's world, in any field, not just software development.

I worked for a Japanese company. The Japanese wouldn't even acknowledge my employees until they'd been around for a year. Having the same people on hand, year after year, was pretty much required, as our projects always took the long view.

I can't even imagine having an environment where engineers are expected to quit every year and a half.

No matter how good the manager is, if he is beholden to an HR department that only gives slightly above COL raises while the market explodes, people will jump ship.

Says you.

Our HR was run by the Corporate General Counsel. It was pretty much as bad as you can get.

Also, we were paid "competitive" wages (read: below-market).

I was the manager for 25 years (I should mention that the engineer with the most seniority had 27 years -we started together), and I feel I did a pretty good job. Considering they did not "jump ship," I guess something went right, eh?

Nowadays, it seems that people "manage by fad," as opposed to managing humans. It's kind of heartbreaking, really.

Until they find out that, while they were getting 3% raises, their salaries fell well behind those of the developers who came in after them and were paid at the market rate.

Wow. You really want to discount what happened. Why?

Just assume that we were all idiots, and go back to your happy place.

Yes. Anyone who is willing to work for less than market value while new people come in at market rates isn't an "idiot", but they are unwise.

Did you tell them that “we are all family here”?

Did it ever cross your mind that there are more things than money which make a person stay at a company for a longer period?

For me personally, if the money is enough for me to make a comfortable life with my family, other things at work become important. I could never spend most of my days with assholes or do idiotic work even if the pay was way above market value, for instance. I'd also trade money for more free-time, if possible.

And yes, the company I work at does feel like a family. But nobody had to tell me this. It just does so, naturally.

Well, I'm done engaging. It's fairly clear that we have incompatible worldviews, which is sad, because I'll bet we could find many things in common.

I sincerely wish you the best.

That logic doesn't hold up. The overhead in learning your way around a Google or Microsoft-sized codebase is much larger than learning a new language or framework.

That's no surprise. Both of those companies have many individual codebases that are larger than any web framework. For instance, last I checked, Django clocked in at around 60-70k Python LOC [0]. The Windows kernel source, IIRC, is over 1M LOC, and, obviously, much lower level than Django.


[0]: Also, IIRC, 20-30k of that is taken up by the ORM.
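Rough LOC figures like these are easy to reproduce against any checkout. A naive counter (my own sketch; it ignores blank and `#`-comment lines but not docstrings, so it will overcount slightly versus tools like cloc):

```python
from pathlib import Path

def count_loc(root, suffix=".py"):
    # Count non-blank, non-comment lines across all matching files under
    # root. Naive: multi-line strings and inline comments are not handled.
    total = 0
    for path in Path(root).rglob(f"*{suffix}"):
        for line in path.read_text(errors="ignore").splitlines():
            stripped = line.strip()
            if stripped and not stripped.startswith("#"):
                total += 1
    return total
```

Pointing it at a Django checkout versus a corporate monorepo makes the scale difference in the comment above concrete.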

This is a good comment with a lot to unpack, but I want to raise a couple points here.

First, I wonder if firms investing in training could possibly improve turnover, thereby creating a bit of a positive feedback cycle. It doesn't even have to be formal training, either. It could be something as simple as having a weekly journal club, or the equivalent, and encouraging engineers to read at least one research paper a month. [0]

The second aspect, engineers moving jobs just to get raises, seems weird to me from a market efficiency point of view. Interviewing costs companies money -- so much so that it's something they should want to do as little as possible.

Many companies don't keep pace with the open market in terms of raises, which is a primary force driving people to job hop. Are there any studies comparing companies that do at least attempt to keep comp for current employees in line with the open market against those who don't?


[0]: In my experience, reading research papers thoroughly can be a pretty thought intensive process. In grad school, where I studied math, what I would do is read the abstract, decide if it was interesting, then skim the section headings and statements of theorems to see if I wanted to go further. If I did, and I was searching for a particular widget I needed for a proof, then I would read as much as I needed to read to digest the proofs of the useful theorems. If it was for general interest, then I would read the whole thing. I found that once I got an interesting paper in my hands, fully comprehending what it said could take up to 1 day per page for particularly dense papers.

> Whereas, once you get to the heart of Silicon Valley, it's not unusual for people to jump employers every 6 months.

Unless you're serially hopping from one shitty startup to another, it's pretty unusual.

> because employee turnover in the software industry is much higher than for most professions

I wonder if that's actually true, and if so, to which degree. On what basis are you saying this?

> Their employers pay them to learn and gather the expertise needed for their future in the firm.

I think it very much depends on the company and culture. I always had jobs where learning and self-improvement were encouraged and expected (also in Germany), with a budget for conferences and books and a fixed time frame (~20%) for that. These were all companies that primarily did software development -- either direct product development or project work for customers. On the other hand, you have firms where software is treated as an appendage. They might have other great products but an entirely different managerial background. A mindset like "We need a software department, everyone else has one too" can easily lead to mismanagement, and I think a lack of time and budget for self-improvement is one aspect of mismanagement in the business of creating software.

Ah, training - I remember that. Sadly I was only ever "treated" to a couple of "real" (paid-for) courses at the start of my career and the rest was on-the-job.

I've also heard of a lot of employers that are pushing the 20% policy for their software engineers or providing some other time allocated towards upskilling.

My employer requires us to do one hour of training per day. We are allowed to study whatever we want or work on personal projects. I personally don't think it's weird for an engineer to keep up with emerging tech and trends. I'm sure a lot of engineers outside of software do this.

I have a slightly different problem. My employer will pay to help us keep our skills up. We can take training classes and go to 1 related conference per year. We have some time during regular development to work on innovative projects that we'd like to implement or experiment with.

That's all great, but much of it ends up not being applicable. For example, we were trained in a new language about 3 years ago, but we haven't been allowed to use it in our product yet! I've been doing my home projects in it, so I'm ready to go when we do start using it. But most of my coworkers took the class and haven't touched it since. They've likely forgotten everything about it. Likewise, the conferences are nice, but I've never implemented anything useful after having read a paper about it, or seen a presentation on it. (The few times I've tried, it turned out the paper didn't give enough information to do your own implementation!) It does keep me aware of what's going on in the field, but I'm not sure how useful it actually is to my job.

You definitely have to push for it; only in the best of companies will everyone from top to bottom management take care of this proactively. Speak to your manager regularly about needing to expand your skills, tell him why, and show him what you want to do and how he can help. Some managers are not very good at this, and you need to do the majority of the work. Accept it and do your part, but don't think training isn't needed just because your manager doesn't bring it up.

> Whereas, as a software engineer, very few companies(at least in Germany from my experience) will invest into their existing workforce to train them on the job

Maybe I've been fortunate, but in the UK and for a couple of years in Australia, I have had employers (and later clients for my contracting business) who have been happy to throw me at projects far enough outside my comfort zone that I keep learning and stretching my muscles. I feel at the top of my game (much of the time).

> ... (in Germany) ... instead seek to let them go once their expertise is no longer valuable and hire someone already experienced in the needed stack then repeat

Interesting. I thought Germany's labor law highly discouraged this. Isn't "It is easier to divorce than to fire someone" a German saying about your tough labor law?

Note that these other professions have tests and certifications you need to pass. I can see developers howling in anger if they were required to do this.

I don't think it makes sense to have this in the form of required licensure to practice, but I certainly wouldn't mind if there were tests and certifications I could take that would allow me to show prospective employers what I could do. Extra bonus points if having those things on my resume allowed me to avoid the types of interviews where the candidate is essentially put on the spot for an entire day in front of a whiteboard or laptop. To their credit, I believe that's a direction TripleByte might be attempting to go in, but I don't pretend to be able to speak for them.

How often do legal frameworks change?

As a non-expert, I think there's a 50% chance it costs 10x more to teach each new hire for a month.

> They're working on some legacy BASIC or COBAL system from the seventies, and surprised they can't find a new job when that's upgraded and they're downsized.

Wow, exaggerate much?

This doesn't describe the vast majority of "older" workers, it's just another disappointing stereotype and expression of ageism.

The problem is that there is an undue burden put on those in group 1 to prove they are not group 2.

Imagine an engineer walks into an interview. It's a young person, you think nothing of it and you go on with your normal interview. Now a different engineer walks in, and he has grey hair and some wrinkles. You feel the need to dig into whether they're in group 1 or group 2, in addition to your normal interview.

I'm not saying you're wrong, but if we were talking about how there are two groups of women and one of them is a liability a lot more people would be setting off alarms.

Given the process of interviewing, I think that burden falls, to a degree, on anybody who goes through it: young or old; male, female, or non-binary; of any color or creed. A normal interview should probe the qualities of #1 and #2, although in practice many apply downright insane, power-tripping prejudices that a stranger couldn't possibly anticipate, like filtering on whether candidates follow arcane, arbitrary, and archaic fashion "rules", or whether they write a thank-you letter afterwards.

People do think that about women, but it usually goes “Is this the kind of family-centered woman that will be on perpetual maternity leave and not pull her weight?” It’s very hard to do anything about it even if we acknowledge that it happens.

>Ones who keep up their skills

To get a rough idea of the value of up-to-date skills vs apparent age, let's consider a hypothetical:

If a 50-year-old and a 24-year-old both graduate from the same coding boot camp, do they have equal odds of getting past the first interview (let alone actually being hired)?

And how big is the difference in probabilities?

I also note that desktop games are primarily coded in C++/C. If hiring were skill-based, we would expect that industry to be zealously recruiting older engineers.

Sadly, from experience, HR will blindly screen them out and spiel the usual "overqualified" excuse for not progressing them. They regularly do, and get away with it.

I was in exactly that situation once. I bypassed HR and went straight to the manager, who lambasted HR; I got the interview and the job. I still got shafted by HR, though: they messed up my salary, lied about a raise, and generally made my life hell with pettiness and bullying, for want of another way of putting it. One person in HR took my side and told me what her manager was up to, and the next thing I knew, that person was gone. So yes, HR causes many of these ageist issues in the situation you outline.

What I've heard about game development (correct me if you disagree) is that it is typically more demanding or lower-paying than gigs in other fields, and that it's only worth being in that industry if you're passionate about it. I think older engineers trend towards less demanding or higher-paying industries, and are less likely to be as seriously passionate about gaming.

Boot camps aren't "up-to-date skills", but the extent to which 24-year-olds are getting hired off them would still support your point.

Why do you assume young people don't learn C++ to work on games? They do to work on web servers.

The AAA in AAA games refers to media content, not the game engine. Game programmers are a trivial fraction of the C/C++ industry.

24 year olds aren't getting Senior or Mid-Level Dev salaries. They know the new hotness and are cheap -- that's good enough.

> They're very different crowds, and in very different types of companies.

This is not a fact. I've seen both of your types in the same company. Let's not reduce people down to "you're either this, or this". It's not a good way of thinking. People are more complex than that and have different things to offer.

I am fascinated when the #2 group pops up to say there is age discrimination in coding - when you dig into it, the problem is about skills and not age (although I cannot really speak to ageism in the tech sector).

Every other professional career requires you to stay up-to-date in your professional skills and knowledge to stay relevant in your field. Why would coding/tech in general be any different?

Just because there are older folks struggling to find jobs after not keeping up skills doesn't mean there isn't also ageism, and it can become problematic when everyone disregards the ageism by saying it's all just meritocratic and skills based.

I think there are lots of younger developers in the hiring process who start out with a bias of assuming older developers have atrophied skills. Then when that bias makes it harder for older developers to find jobs, the younger developers say "ah, well that's meritocracy for you".

Beautifully said. Ageism is real.

This isn't true at all. Think of other professionals - lawyers, doctors, professors, etc.

A doctor specializes in one aspect - surgeon, anesthesiologist, ER, GP, psychiatry, etc. They might pivot once in their career, but most of them don't, and they're able to find employment as long as they're able and willing.

I know several lawyers, and most of them had to specialize by their early 30's if they ever want to make decent money - family law, real estate, employment, personal injury, whatever. Again, the older the lawyer, the more seniority they have and higher they can bill at most firms.

How many professors in academia do you know that have experience teaching in multiple schools, e.g. business, engineering, social sciences, etc? Not many, usually they have a very narrow niche.

The problem with the tech industry is mostly due to offshoring, the rapid (and pointless) pace of new frameworks and tech that's mostly due to shifting dominant players, and the naivety of most software engineers who've been unwilling or unable to organize and create some sort of protective barrier similar to every other industry (teachers or cop unions, AMA, legal bar, UAW, etc.).

And again, because this is HN, the majority of developers are not worried that they won't be making FAANG salaries with sweet equity and stock options into their 50's. They're worried that they'll be training their 25 year old replacements from Bangalore at the typical mega bank or insurance company, left with only sporadic temp gigs and 6 month contracts at half their salary and with 15 years left before Medicare kicks in.

You can do pretty well as a developer if all you know is eg Cobol or low level C.

Do keep in mind that the industry has exploded in size over the years.

I'm pushing 40 and primarily a Java developer. I'd be surprised if I can't ride this wave all the way to retirement.

Python seems to be headed this way, as well. People tend to forget that Python is almost 30 years old already. That it's held up this long, and that it's still being actively developed and maintained, strongly suggests it will continue to be a viable language in the industry for many more years.

There is still plenty of new development going on in Java, and I hope that continues. But I'd be afraid that if Java is all you know, you're going to increasingly be stuck on critical legacy JEE / Spring apps at banks, insurance companies, etc. Right now that's okay - there's still a lot of innovation in these frameworks. But in 10-15 years, it might be the worst kind of gig left, stuck with offshored and contracting teams of the lowest bidder.

If Cobol is anything to go by, at least pay won't be much of an issue.

>Every other professional career requires you to stay up-to-date in your professional skills and knowledge to stay relevant in your field.

Is this really the case? My limited experience is that the amount of constant learning expected from a coder is an order of magnitude more than in most other fields.

I work in education - if I were to sit on my hands and use anything more than the most basic foundational research and 'best practice' from when I left my post-doc program 10 years ago, I would be unemployed. I work on professional development and skill building constantly.

That's not the exception to the rule, either. You're expected to stay current on trends in your field, and stay ahead of best practices.

#2 describes a trend in every sector and business: there are a lot of people who find it harder to get work as they get older because the value of their skill set/knowledge base evaporates. From coal mining to payroll.

Definitely agree - at my last employer, there were 3 of us in our 40s. Their skill set could best be summarised as "the year 2005": Subversion, MFC, and C++03.

I suspect that in 15 years people will be saying the same about K8s, possibly JS if Wasm takes off, and I suspect ML will be a fairly niche area.

Is "subversion" a skill (unless we're talking about spies ;)? It's just a tool. I've used it. I've used cvs, git and other stuff too. If I had for some reason to work in a company that uses some other tool to handle their code repos, I'd learn it in a month or so (less if it's organized logically). It's like saying "driving a Honda Civic" is a skill - driving is a skill, Honda Civic may be what you're driving right now, and next week you may need to drive a BMW, and being able to do both is what I'd call a skill.

So? Subversion works fine; nothing wrong with it. Are you now a bad developer if you focus on actually getting stuff done instead of spending time migrating to a new tool?

Considering the company is down to its last couple of developers and can't attract new ones because of the old technology stack, it's definitely a problem.

I guess, but those tools alone build powerful fast software, and those programmers are solving problems which translate well to any field requiring problem solving. The software world may change rapidly, but a React / Git stack is no different than a C++ / Subversion stack when both jobs require programmers to solve hard, complex problems.

I see a lot of 40/50-year-olds doing Java/Python/C++ development, with CI/CD and unit-testing skills, who are handy programmers without knowing much about JS/ML/k8s. To me that puts them right in between your #1 and #2.

As a programmer in that age range I know I don't need to spend time learning the nuances of JS if I'm not using JS. If I start on a JS project (god forbid) I can learn that then. Just like I learned C/C++/Java/Python/PHP/Ruby/Clojure etc when I started on one of those projects.

Machine learning - we know that just means linear regression or Bayesian filters plus marketing, and we prefer programming. We've also seen 20 years of "magic bullet" solutions like ML fizzle and die in the real world and know most ML projects never see a day in production.
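Tongue in cheek, but the "linear regression plus marketing" point can be made concrete: a lot of what ships under the ML label is a linear model you could fit in a dozen lines. A toy sketch (made-up churn data, plain gradient descent, no framework - purely illustrative):

```python
import numpy as np

# Hypothetical "ML product": predict customer churn from two features
# (weekly hours of usage, number of support tickets). Under the hood
# it's just logistic regression fit by gradient descent.
X = np.array([[1.0, 5.0], [2.0, 4.0], [8.0, 1.0],
              [9.0, 0.0], [1.5, 6.0], [7.5, 0.5]])
y = np.array([1, 1, 0, 0, 1, 0])  # 1 = churned

Xb = np.hstack([np.ones((len(X), 1)), X])  # add bias column
w = np.zeros(Xb.shape[1])
for _ in range(5000):
    p = 1.0 / (1.0 + np.exp(-Xb @ w))       # sigmoid
    w -= 0.1 * Xb.T @ (p - y) / len(y)      # log-loss gradient step

preds = (1.0 / (1.0 + np.exp(-Xb @ w)) > 0.5).astype(int)
print(preds)  # separates the toy set: low usage + many tickets -> churn
```

The toy data is linearly separable, so plain gradient descent nails it; the point is just that "the model" is a weighted sum and a threshold.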

K8s is great if running k8s is your job. But it is a specialized skill that is only needed to run very large infrastructures, unless your project architecture (read microservices) has gone very wrong.

20/30 year olds think "keeping your skills" up means learning every new programming fad that blows through because they don't have the experience to differentiate the fads from the advancements. It is like telling master carpenters they need to keep switching brands of tools to "keep their skills up". But all these tools do the same things and are 99% identical. They are busy building stuff with the tools they have.

I love the carpentry example, it's perfect! Sometimes I think tech hiring is kind of like hiring a carpenter, and asking him, "Do you use a circular saw? Because we're a circular saw shop. And I don't mean your 1952 Black and Decker saw, I'm talking a modern day Makita. If you don't have modern Makita circular saw experience, you need to work on updating your skills!"

The older I get, the less interested I am in learning new things for the sake of it. I've already programmed with a whole bunch of different environments in the last 20 years; what's the value of learning something new if it doesn't give me any benefits?

It's not being lazy or a "dinosaur", it's just better time management.

The absolute worst devs I've worked with were very "up to date" people who wanted to re-do all the already-working "outdated" stuff to new "modern" standards. Often, it's just a waste of time.

Very much this. Knowing the particular set of keywords that constitutes the latest fashionable language is a short-term skill. Knowing the paradigms that underlie all of them, and being able to learn the keywords in short order if necessary, is a long-term skill. Some companies prefer to "hire to the spec" to save short-term learning costs. Smarter companies look for long-term skills, which will be useful whatever keywords are in fashion this season.

ML includes logistic regression these days too!

Revolutionary advance in industry best practices. /s

From your comment I get the feeling that you think Java/Python/C++ is some obscure legacy tech along the lines of Basic/Cobol.

It also implies that machine learning and devops are the norm for software development.

Which is typically a symptom of having worked at a start-up vs. an established company.

During the dot-com crash, I went from working at a 40-person start-up to a 25,000-employee utility company, and it was a real eye-opener. A lot of my "cutting edge" (for the time) skills were dismissed as being flash in the pan, and all the "real work" was done with tried-and-true technologies. I ended up finding my way back to a start-up a few years later, and everything was reversed again.

That says "startup". ML is the buzzword that investors love, and k8s/devops allows avoiding big investment into infrastructure which may need to be dropped anyway when it turns out the market doesn't actually want yet another "apply ML to click stream to save on ads costs" startup (I'm stereotyping of course but you get the idea).

Makes me wonder how old previous commenter is too if they have that view.

Ageism is rampant in this industry.

It’s actually somewhat ironic considering how much SV pushes for diversity and inclusion... unless you’re over 40.

I'm 42, and until recently worked for a well-known tech company in San Francisco. Most of the time I didn't feel like I was the only person over 30 in the room, but only after moving to another city and starting to work on a truly age-diverse team did I realize how unbalanced my previous team was.

My current team has a good mix of industry experience and excitement for new technologies, which makes planning both effective and exciting.

Part of the problem is that SF is just so damn expensive... it's going to self-select for people that can afford to live without the additional burden of a family and that tends to be people <30.

Some parts of SV push for diversity and inclusion, but when push comes to shove, firms are quite happy to protect established power structures - shitty managers, retaliatory practices, toxic culture, etc.

Seems to me they place them (IMO appropriately) somewhere between legacy and trendy.

1) was "keeping up with their skills", not "trendy". "Keeping up with your skills" does not mean "know JS/ML/k8s" for a wide range of developers.

No, I am one of the people I just described. I dont know where I fit in with the OP. I'm struggling to keep up with the avalanche of new stuff coming along.

As a veteran dev in my 50's, "k8s" gives me the screaming heebie-jeebies. I've just about got my head around Docker. But it's painful seeing a system that should be a nice little monolith serving a few thousand requests an hour split up into microservices and "managed" using k8s for no good reason.

I realise this might make me unemployable in a modern web dev environment. Maybe I can just ride it out until the industry goes through the rest of the cycle and rediscovers simplicity.

k8s basics is pretty simple actually. If you know Docker, k8s basically is a way to keep a bunch of Docker containers running according to a bunch of YAML configs. There are all kinds of fine details, but the gist of the thing is just that. Of course as every tool it's not always used properly.
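To illustrate the "bunch of YAML configs" point, here is a minimal, hypothetical Deployment (names and image are made up) that asks k8s to keep three copies of a container running; `kubectl apply -f` it and the control loop restarts any pod that dies:

```yaml
# Hypothetical example: keep 3 replicas of a web container alive.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: hello-web
spec:
  replicas: 3
  selector:
    matchLabels:
      app: hello-web
  template:
    metadata:
      labels:
        app: hello-web
    spec:
      containers:
      - name: web
        image: nginx:1.25
        ports:
        - containerPort: 80
```

That's most of the gist; the fine details (services, ingress, resource limits) layer on top of this same declarative pattern.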

It's not the complexity that concerns me. I have dealt with more complex things ;)

It's why people use it in the first place. I get the need for it when you're dealing with huge scale. But it seems to be the new default deployment model, for services that really, and I mean really, don't need to scale that much.

And I've seen people justify breaking a nice monolith into microservices (usually badly) so they can deploy it easier using k8s. Which is totally putting the cart before the horse.

> It's why people use it in the first place.

Easier to run a small service in a predictable environment where nobody can step on your toes. Also pretty easy to adjust resource allocations, update pieces independently, isolate screwups (one part going down is sometimes better than the whole thing going down), etc.

I mean, of course you can't approach the task as "we want to deploy on k8s first, no matter what" - of course you have to consider the task at hand and if it works better as monolith - keep the monolith (you can still use k8s - it's just a way to run code, any code can be run within it). But if the task suits the model - e.g. many data processing/transformation/ML workflows do - having a tool like k8s can make deployment easier. One doesn't have to make a religion out of it, but if your problem looks like a bunch of processes working on separate tasks, it may be a useful tool to manage them.

Whether it'd scale better, performance-wise, is a very tricky question, which depends a lot on the task at hand. I think flexibility is a more important aspect. If your task is monolithic and all you need is raw power then maybe k8s isn't the answer.

> Whether it'd scale better, performance-wise, is a very tricky question, which depends a lot on the task at hand. I think flexibility is a more important aspect. If your task is monolithic and all you need is raw power then maybe k8s isn't the answer.

For most (80%+) of the applications I've seen k8s used on, the performance question is not tricky at all. Monolithic performance would definitely be orders of magnitude greater.

I can't help but draw the conclusion that people are using k8s because it looks good on their CV. Whether I'm wise in being skeptical about k8s at my age is a good question.

I can't say much about deployments I haven't seen, but I am using k8s at my current job and where we use it it works quite well and makes deployment easier. I can't tell much details but it's basically processing a bunch of data in real-time in a bunch of ways, organizing them in certain manner and serving certain set of queries from the result. Before anybody asks, no, it's not ads and not clickstreams or anything like that :) And deploying with k8s seems to work decently with that.

Moreover, I can see quite a few places where on my last job (which didn't use k8s) introducing k8s deployment could help in some places. That said, "one size fits all" is never a good strategy. But I think saying "people are using k8s because it looks good on their CV" is wrong. It has its uses, just don't put it where it doesn't belong.

There's a whole lot of current, modern programming that doesn't involve JS, ML, or k8s. Heck, I'm a young programmer and I've done marginal amounts of ML work and avoid k8s beyond a high-level of understanding of what it is.

As an older programmer I have been actively avoiding JS/ML/k8s.

JS is simply garbage we are stuck with, where you have to learn all of this year's footguns to avoid creating bad code. ML is a buzzword of limited scope. k8s is system administration by another name. Web and mobile technologies are useless to learn unless you need them RIGHT NOW, as they have a half-life of 18 months.

I want to learn the "force multiplier" sitting beyond what we are using today. GC languages were the last round of force multiplier, and we haven't had much since.

Right now, the only candidate that looks to be a force multiplier is Rust, but I would jump to something else that looked like a force multiplier.

ML is a "force multiplier", but it has limited scope. It might be worth learning depending upon what field I'm sitting in.

If you mentioned Rust, Go probably goes in the same bucket - seeing a lot of it lately.

Go is just another "managed language" with some oddities (somewhat better support for concurrency useful to servers and some programming in the large improvements).

Dart goes in a similar bucket even with the native compilation.

These languages are all effectively Java with some makeup.

I'm not seeing much force multiplication. I see no new language allowing me to write something more than what I can write now.

It remains to be seen whether Rust will wind up as a force multiplier or not. But it's really the only current candidate even if it's not a great one.

I know quite a few game and embedded developers of all ages from 20s to 60s - none of JS/ML/k8s seems to be particularly relevant to anything they do.

ML the language or Machine Learning?

Haha! Given they listed k8s and JS together with ML, clearly it must be something "trendy", i.e. Machine Learning. I love how acronyms can be translated completely differently depending on where you're coming from.

That's just your wishful thinking. Here on HN I've seen both people who stayed up to date complain about discrimination and interviewers say that they expect older developers to bring more to the table - i.e. a decent older developer will be disadvantaged when competing against a decent younger developer who has learning potential.

Yeah, the issue with age discrimination as an older developer is not that you're entitled to credit for years of experience when, say, you haven't used Java since college; it's that you can't be hired on the same basis and salary as a new grad who has no experience either, despite having demonstrated your ability to learn many times in the past.

It's easy to stereotype people who have unrealistic expectations based on entitlement because of their age, but past a certain point, a lot of employers will reject an older candidate even at the same price as a new one, on the assumption that they can't learn any more.

Of course, if you are employable, you don't bang your head against the wall, you go and do something else. Like any kind of discrimination, if they were forced to accept you, it wouldn't make the culture palatable. It reduces your opportunities though.

I agree in general but "skills" are not the only valuable commodity. Real experience matters and I personally feel a lot of so called experienced veterans are more like "1 year of experience 10 times" instead of "10 years worth of experience". So that is another challenge with people who have been in the industry for a while.

Can you provide metrics for the number of people you've seen in each of these two categories? Also I'm curious about what age demographic you're in and how long you've been in the industry. Hoping that I don't sound combative asking this, I'm just really interested in your perspective.

I won't go into personal specifics, but I'm mid-career. I've been through a few companies, and made it as high up as C-suite in a smaller company, and director-level in a bigger company, before finding I prefer senior IC, tech lead, advanced development, or research roles.

Of the companies I've been to, most were highly-tech focused (and pretty elite teams), but one was a large, distinctly non-tech company (with mainframes, even, and not the modern types, running legacy algorithms). I ran a small skunkworks team there.

I can't give great statistics, aside from saying that at the tech companies, people tended to fall into the category of older is better. Senior engineers were senior.

At the non-tech company, people tended to fall into the category of older is obsolete. The tech team was there for the paycheck. They worked 9-to-5, maintained a healthy work-life balance, and kept fairly boring systems running. The work they did was really outside of the company's core competency. It just needed to be done, and someone needed to do it. If layoffs ever came, I'm not quite sure who would hire them, though. It was culture shock for me. It wasn't specific to this one company either, but to the industry (we had pretty close contacts with both collaborators and competitors). This company was industry-leading, actually.

> I've rarely seen #1 and #2 mix. They're very different crowds, and in very different types of companies.

This has been my experience, too. Generally, the closer to doing technology as the main offering -- e.g. network engineers at an ISP, or developers at a code shop -- the better and more technical they are.

Once you get into the Enterprise, where they're handling specific apps and frameworks, they tend to get stuck in patterns and wither.

That's not to say you can't be an old grognard doing COBOL at a niche code shop, but when the tech itself is the offering you find much more technically competent seniors.

All too often "keep up their skills" is code for "learn what I think is important" and says more about the speaker's (in)ability to know what's important than about actual skill. That's how ageism creeps in - not intentionally, but through lack of awareness of one's own bias.

I agree with this, the nuance is though that many interviewers are convinced they filter out the #2 with their pet CS quiz questions with no regard for experience or what the company actually needs. Not talking FAANGS, this is smaller companies reaching out to recruit senior devs they supposedly need.

I can agree with the methods described in the latter part of this post. There is no right or wrong, but as I began trusting the process of simply deep-diving for deep-diving's sake, I found my core muscles maintained. Being able to grok and push in a short time is a valuable asset.

Any thoughts on how to identify companies where people from group #1 work? I'm thinking of this from both a mentorship and a future employability aspect.

My perspective, as an older developer (I'm 54):

- I love coding, and I still code every day. I have no desire to move into other roles (such as management).

- As for job security, I think it is harder to find a job as a manager than as a developer, simply because there are many more positions for developers than for managers

- Family-wise, it is much easier at my age, because my children are getting into their twenties now. Much more free time for me (including for work if I want), compared to when they were younger and need lots more attention (and being picked up from daycare)

- As for learning - that's one of the attractions of being a developer. There are always new things to take in, it never gets boring

- On the other hand, even if some things in programming are fleeting (the currently popular frameworks etc), there is a core of fundamental knowledge that you gain over the course of your career that will always apply.

50+ here. Never had an issue. Many media articles in this context also "consult" or "quote" or "interview" someone from a company that "consult older software developers how to stay in the market". One note: you have to respect it from both sides, and factor out age when young person is your manager or colleague. Your age is not an argument in a technical discussion.

Not everywhere is a high demand market. Brazil here. The job market really wants low paid workers that will work Saturday nights for peanuts. A lot of my older colleagues emigrate to have good technical jobs. Others become sellers or open a restaurant.

Very much agree with this. As someone who’s managed people 20+ years my senior, it always annoyed me when I overheard them complain about reporting to someone younger. I don’t hold their age against them, it should be reciprocal.

20 years is a huge gap. Those guys were writing operating systems at school before you were born. Of course they'd complain

But the age gap alone is not a valid reason to complain. The young manager could be in their 30's with over a decade of experience themselves, and they may be highly competent at the work they do. Assuming the young person is not qualified to manage veteran programmers is also a form of age discrimination.

If they're managers they should be in management jobs. If they're not, why should they complain about not being management? Management is its own job, not a "reward" granted for time in service. Not even the military works like that.

The point is that it’s still age discrimination. If someone wants their age to not work against them, they have to be willing to look past the age of others.

Besides age is not an indicator of quality of work, nor are college credentials. I’ve seen much older devs run circles around younger ones, and vice versa.

I am sometimes tempted to tell colleagues "well I've done this professionally since before you were born, so...", but so far I've resisted the temptation. Hope I can keep the streak going :)

Even if you're right, it's not a great argument, because there isn't really a rational counterargument to it.

If the argument is only based on having more experience - the "I've done this professionally since before you were born" argument - it can be correctly rejected as an argument from authority fallacy.

If the argument is based on a pattern you've seen before that's germane to the situation at hand - the "I've seen this movie before and it ends with regret and a data breach" argument - it can be much more convincing.

> I am sometimes tempted to tell colleagues "well I've done this professionally since before you were born, so..."

This sounds like you want the respect due to age alone.

The respect due to age alone is horrible. I wouldn't inflict it on my worst enemies.

The respect due to age and experience, on the other hand, is substantial, and something to strive for.

Back up your gray hairs :) with some relevant knowledge and the intelligent people will listen attentively.

> "well I've done this professionally since before you were born, so..."

This sounds like something coming from the advanced or "expert beginner".

You don't influence by pulling the seniority card, you do it with data. If there isn't data to back up your suggested approach being better, it might not be, and you get to learn something new!

Well, often the data is conclusions from my decades of experience. It's not like I can show that in Wikipedia.

What I actually mostly do is saying "Doing X never works, because Y, instead I like to do Z because...", and it has reasonable success.

People want to hear it can work.

Sourcing data is expensive. I think it's best not to underestimate just how much development has to be done based upon gut feel, trust or experience simply because proving it would take too long.

If it's two people arguing their own opinions at each other and neither one has relevant data where exactly do you go from there? The one with 2 years experience or the one with 15?

Sometimes the data is the experience, and it's hard to dump your personal experience on someone and make them ingest it. Sometimes you've tried approach X a couple of times and it always failed, but you don't have a scientific proof it will fail again. You just know from experience if you do X it usually ends up in tears and all-nighters and missed deadlines. Not because you have an Excel table proving it, but because you've lived it.

"Historically this has turned out badly" and back it up.

If we are going with anecdotal evidence: I'm 55, in the top 1% in my field, with comprehensive knowledge of modern and cutting-edge technologies. After age 45 I was denied promotion and had to look for other employment; I was fired (by another employer) at age 52, was out of work for 7 months, and had to take a position way below my level. I'm still there but could be fired at any moment (cost cuts, especially under current circumstances). I receive a lot of calls from headhunters based on my excellent LinkedIn profile, but have never gotten past the first interview in 3 years.

I'm only half kidding here, but have you tried dyeing your hair, wearing slightly "younger" clothes, not including everything on your CV (and definitely no graduation year, etc.)?

It seems ridiculous but I wonder if it would work.

It's not ridiculous: I'm partly bald, so I always shave my head. I tried removing graduation years from my resume and didn't find it useful; that would only work if ageism were the sin of a few. Now I keep all my cards open: if my age is going to be the reason, I prefer to skip the wasted interviews that hiding it would have brought (wasted 2 months and 4 interviews, Amazon, yeah?).

> - On the other hand, even if some things in programming are fleeting (the currently popular frameworks etc), there is a core of fundamental knowledge that you gain over the course of your career that will always apply.

Spot on. Ignore the popular framework of the month.

A lot of developers focus only on the surface (popular tools) and neglect fundamentals, algorithms, OS internals, how CPUs work, networks...

You can't ignore it. If you ignore the framework that is a requirement for 20-30% of advertised positions, then you are limiting your ability to get hired. That may be good advice if you are the one architecting the new project, but not if you are looking for a coding job. If you are looking for a new coding job, look at what people are hiring for and at least get familiar with a good number of those things.

Of course if you aren't having trouble finding a job then you don't need to follow this advice.

I interviewed a whole lot of people when I was in Amazon and no candidate was rejected, in my memory, for not knowing a tool, a framework, one language or one category of technologies.

The interview topics of on-sites are general enough and focused on principles, and, by far, the large majority of candidates are rejected for giving vague and superficial answers.

My experience convinced me that, as a candidate, I don't want to divide my time preparing and interviewing for many "typical" companies. I instead focus on very few interesting ones and maximize my chances there.

"shotgun versus sniper" if you will: you need one good offer, not 10 so-so ones.

Among other reasons, the typical buzzword-driven company is less likely to be an interesting environment to work in.

You can still ignore the framework.

You won't apply to to 100% of jobs. So you already make a selection.

Some jobs require highly specialized skills and knowledge. Keeping up with the framework-of-the-month costs time as well - time you could have spent getting better at Linux kernel programming or Cobol.

"If you ignore the framework that is a requirement for 20-30% of advertised positions,"

The "framework of the month" is not "a requirement for 20-30% of advertised positions". The "framework of the month" requires a minimum of a year to get to that point and 2-3 years is a lot more common, even in the JS frontend space. By the time something gets to 20-30% penetration it's past that phase.

What's the path around this?

I've tried to keep one main language and work with something different each time. It creates diversity in my resume but lets me use my core anchor language to find a role.

How do you pivot to something brand new?

Well, one option is just to ignore framework of the month entirely. It is perfectly feasible to just pay attention to things that prove themselves to the point that they're 10% of the relevant listings and have sustained that for a year or two.

A mature engineer doesn't need to be bleeding edge. It's perfectly fine to merely be tracking technologies that have gotten to their high-growth phase.

The other thing I try to do is have coverage, such that even if I don't know a particular tech, I have a story that says I can learn it easily. I've used enough DBs that I don't need to chase the latest ones; I'm confident that even if I've never used a columnar database, I can pick one up quickly if I need it. I've used enough programming languages that I don't need to go chasing the latest one, because it is frankly mostly a different spelling of things I've already used. And so on. So I don't need to go chasing everything all the time. I generally don't pivot into anything brand new, because within the domains I work in, there isn't much fundamentally new stuff left to pivot into.

(A lot of my recent growth involves learning how to do engineering while being more directly tied into the business and interacting with business people, and learning how to be an interface between business and tech, rather than more types of tech. I'm doing this from an engineering perspective rather than a "management" perspective, because it turns out there is quite a difference between the two once a company scales up enough.)

I was taking "framework of the month" a little less literally.

Yeah, but that's a pretty big gulf.

Why ignore? Take a look, have an idea. Add another data point to the chart of where the industry is going.

Once you have seen, say, 15 frameworks, it becomes easier to map a 16th in the coordinate space, and quickly learn it when needed by reusing the knowledge you already have.

I find keeping up with the popular frameworks improves my knowledge of best practices and patterns.

All good points, brother.

I'll just add that the perspective of many years of experience brings clarity to many discussions.

= have you chosen tooling that is going to last five years?

= is the design as simple as possible or are we building a Rube Goldberg machine that will assert a life of its own?

= how are you going to train people / maintain the system? do we have a resource plan or is this a death march?

I agree that it brings clarity. You need good soft skills though, or else that clarity can become bitterness as your pragmatic advice is ignored and your teammates wonder why you always seem so oppositional.

Hasn't happened to me, but I've had a couple older coworkers who seemed to be in that place, which just seems like such squandered experience.

>> but I've had a couple older coworkers who seemed to be in that place, which just seems like such squandered experience.

It is squandered. It's one thing to know the right path based on experience. It's another to be able to share that experience in a way that benefits others and helps them understand why it's the best choice. Nobody knows everything, either, so having a real conversation can result in learning both ways.

Question regarding your last 2 points: how do you reconcile the fact that "there is always something to learn" with "I'm basically just learning a library/framework that does the exact same thing but with different names in the API"?

I'm being a little hyperbolic here, but after a few decades in the industry it's kind of baffling how few novel ideas are in these new frameworks. There is even a trend to go back to server side HTML and SOAP-like APIs. The tech industry sometimes just looks like the fashion industry. But maybe I'm just too cynical :)

> I'm basically just learning a library/framework that does the exact same thing but with different names in the API?

I won’t say I reconcile anything, but I get a chuckle out of it. Makes it easier to learn the new flavor.

As The Who said, “Meet the new boss. Same as the old boss.”

To add another example to the two you provided: how many different designs and implementations have you seen for pub/sub and message-based systems (topics/queues). I literally lost count. Even fucking Redis has an implementation!
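Stripped of networking, persistence, and delivery guarantees, the topic/callback core that all of these systems keep reimplementing is tiny. A toy in-memory sketch (not any particular product's API) of the pattern:

```python
from collections import defaultdict

class Broker:
    """Toy in-memory broker: a topic name maps to a list of subscriber callbacks."""

    def __init__(self):
        self._topics = defaultdict(list)

    def subscribe(self, topic, callback):
        # Register a callback to run for every future message on `topic`.
        self._topics[topic].append(callback)

    def publish(self, topic, message):
        # Fan the message out to every subscriber of the topic.
        for callback in self._topics[topic]:
            callback(message)

# Usage: one topic with a subscriber, one without.
broker = Broker()
received = []
broker.subscribe("orders", received.append)
broker.publish("orders", {"id": 1})
broker.publish("payments", {"id": 2})  # no subscribers: message is dropped
print(received)  # [{'id': 1}]
```

Everything the real systems differ on (durability, ordering, acks, backpressure) lives outside this core, which is why the pattern keeps getting re-invented.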

They go back at least to the 1970s. (Remember Enterprise Service Busses in the 90s?)

I think it easy to lose sight of the big picture.

Things are constantly improving, even if the road is winding. People just forget some of the horrors of the past. We used to have VB/VBA; now we have JavaScript. We used to have Ada; now we have Rust. C++ has evolved.

Even C has evolved, in a kind of meta way: C itself hasn't evolved much, but the ecosystem around it has evolved a lot. How common was something like Valgrind back in 1990?

Whenever you forget how much we've moved forward, despite occasional setbacks, look at this: https://www.youtube.com/watch?v=fPF4fBGNK0U :-)

I think there is some cyclic property here, but I like to think of it as a sort of meta-refactoring. As we go back to SOAP-like APIs, as we return to mainframe computing, as we rediscover peer-to-peer networks, we take all the knowledge and improvements from exploring an alternate area with us. The advancement is in the nuance.

And, of course, there's also just the Eternal September effect with the vast majority of developers needing to learn things that the grey beards have already learned but have no succinct way to communicate (or just aren't heard over the roar of constantly greater numbers of developers with each passing year)

"SOAP-like APIs"

Do you mean things like Swagger?

[NB There have been few things I've hated more than SOAP]

No, as far as I understand Swagger it's just a tool to document your API. I'm more thinking about some ways GraphQL is used/implemented.

No question about the job security angle: a software engineer in tech is the best job security you can get. I also agree it has better work-life balance than managers have, as long as you can get yourself motivated to get things done.

> a software engineer in tech is the best job security you can get

Can you explain this? Anywhere I have ever worked (Fortune 500 orgs) I have always had an "at will" employment contract as a W-2 employee. They can fire me at any time for any reason. Furthermore, when budgets are cut typically the first group to go is the IT department, since they do not bring in any revenue, unless the company is selling IT products and services.

I think there is a difference between working at a company where software is the product, vs a company where software only supports the main product (i.e. an IT department). Companies where SW is the product value developers more I think.

From a management perspective, at a company where software is not the product, the software team is what's called a cost center. And companies love cutting costs. At a company where software is the product, the software team is treated more as a profit center. In general, you want to be in a profit center, not a cost center. Hell, half our homework in accounting 102 was problem sets calculating whether it was economically beneficial to outsource a given cost center; even at the most basic level, cost centers are places that people want to cut.
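That make-or-buy homework boils down to comparing the vendor's price with the costs that actually go away. A simplified sketch with made-up numbers (real analyses also weigh quality, risk, and other qualitative factors):

```python
def should_outsource(avoidable_costs, outsourcing_price):
    """Textbook make-or-buy test: outsource only if the vendor's price is
    below the costs that actually go away. Sunk or unavoidable costs
    (e.g. an allocated share of building rent) are excluded on purpose."""
    return outsourcing_price < avoidable_costs

# Hypothetical cost center: salaries and tooling disappear if it's cut,
# but the allocated rent does not, so it isn't counted.
avoidable = 900_000 + 150_000   # salaries + tooling, per year
vendor_quote = 1_000_000        # outsourcing price, per year
print(should_outsource(avoidable, vendor_quote))  # True: $1.00M < $1.05M
```

The uncomfortable part for a cost-center team is that this comparison gets run at all; a profit-center team is evaluated on revenue instead.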

Definitely. When an engineer is part of the “front office,” they’re valued and paid. When they’re part of the “back office,” they’re commoditized.

Has much less to do with software than strategy: maximize core competencies and minimize overhead.

> Companies where SW is the product value developers more I think.

And the IT budgets are different. Very different. On a related note, remember when some organizations had IT (and specifically web dev) under their marketing departments?

In case the OP doesn't reply, my interpretation is that there is way more demand than supply, so sure, you're "at will," but you could find another job quickly. As for Fortune 500 and IT layoffs, I imagine we are talking here about technology companies for which "IT" isn't a line item to be cut but _is_ the business. Rules may be different at Best Buy or Bank of America, but I bet not by much.

This. I haven’t been out of work as a developer in 28 years. And I’m definitely not in the top 5% of developers skill or intelligence-wise.

I am not the person you are replying to but I would imagine they mean "job security" as in if you get fired you can find another pretty quickly.

The “in tech” part I think was meant to say in technology/IT focused companies.

Oh, that's nothing to worry about. Any halfway decent software engineer who gets fired on Thursday should be able to just waltz into a new job the following Tuesday.

I keep reading that on Hacker News, so it must be true!

No one who works from home has job security in a global market. Plumbers have job security in a global market.

High demand still provides pretty good job security. I'm fortunate to be in the US, where I get paid a good bit more than I would in another country. Many, many companies want developers in the US for various reasons. Of course, if you work on a generic project that could just as easily be done by people in other countries, that reduces your security.

I think this is the key. Developers in our age range that keep their skills up and keep learning new things can be very productive and an asset on a team. You have to stay humble and admit you still don't know everything.

In my experience (I'm 51 myself), the older the guy, the more likely he's aware he doesn't know everything.

And also the more likely they have wisdom that is far more valuable than whatever the flavor-of-the-month tech skill is anyway.

This ignores the stereotype of the recent college grad, or someone with just a couple of years of experience, who thinks they know everything. I see it everywhere at my work. When you take into account that 90% of software developers of all ages are just not that good at their jobs, things start to make a lot of sense.

+1 on all points, especially the learning part. Continuous learning is a must, IMHO, to work in this sector.

I’ve had a hard time adjusting to the AWS DevOps phenomenon. There is so much damn uninteresting configuration and button-pushing, rather than engineering, that I get bored with it. Yet I feel social pressure to get various AWS DevOps certifications to remain on top of things. I used Jenkins successfully for many years, so I have DevOps experience, but there is something about EC2 instances, Elastic Beanstalk, load balancing, etc. at AWS that just makes me want to retire (yet I still get tremendous joy from programming and designing).

Get used to it. Engineers are expensive, so if they can be replaced with button pushers they will be. Learn to push the buttons if you want to stay in the industry. You'll protect your value as an engineer because an engineer who can push buttons is more valuable than an engineer and a button pusher (especially if the company can get away with not paying an increased salary for the increased responsibility, which they usually can).

How is Jenkins not button pushing?

And besides, AWS has a stable API and a set of CLI tools to actually manage everything in the cloud without button pushing.

I am 29 now. I used Jenkins about 5 years ago because I couldn't find a much better alternative for a single-developer workflow, but it's a glorious pile of buttons with zero consistency and no real API.

We manage, install, and configure Jenkins as code, without clicking buttons. You can fully automate Jenkins activity and setup, from install to job creation, execution, plugin updates, config-change validation, etc. Of course, only power users will take full advantage of Jenkins' power.

Jenkins isn’t really worth the hassle today in my opinion - there are a bunch of new build tools that solve the problem better without all the garbage that comes with it.

And they also look better.

For god’s sake, you could even use GitLab today. What does Jenkins give you? What is there to defend?

For complex pipelines and code reuse between projects (plugins, pipeline libraries, scripting), Jenkins offers more features for power users, and its maturity has been proven, while GitLab CI still suffers from its young age. Other examples, with the cons at the end of this article: https://medium.com/sv-blog/migrating-from-jenkins-to-gitlab-... and https://stackoverflow.com/a/37430097/2309958 (and I can probably find a lot more :-) )

It has a much bigger API than you'd expect, have you even tried to use it? :-)

Yeah. Can you create a user via the API?

First of all, Jenkins was created in 2005, when REST was barely past the stage of being a gleam in Roy Fielding's eye. Yet, almost from the start, almost every page had an API (you'd add /api after the URL for its corresponding HTML page). With XML, JSON and Python (!) formats, with built-in search and filtering.

Secondly, to your point, there you go: https://stackoverflow.com/questions/17716242/creating-user-i... I'm pretty sure that the reason there is no REST API for it is because you're supposed to be using your favorite back-end (LDAP, AD, etc.), with which Jenkins can integrate.

It's quite disingenuous to complain about the Jenkins APIs, there are a lot of them. They're not perfect or designed necessarily like something you'd design in 2020, but they are there.
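To be concrete about that /api convention: you append /api/json (or /api/xml, /api/python) to almost any Jenkins page URL, optionally with Jenkins' built-in "tree" field filter. A small sketch with a hypothetical server URL (you'd need a live Jenkins to actually fetch anything):

```python
def jenkins_api_url(page_url, fmt="json", tree=None):
    """Build the API URL for a Jenkins page by appending /api/<fmt>.
    `tree` is Jenkins' built-in field filter, e.g. "jobs[name,color]"."""
    url = page_url.rstrip("/") + "/api/" + fmt
    if tree:
        url += "?tree=" + tree
    return url

# e.g. the JSON API for a (hypothetical) job page, fetching only two fields:
print(jenkins_api_url("https://jenkins.example.com/job/build-all",
                      tree="lastBuild[number,result]"))
# https://jenkins.example.com/job/build-all/api/json?tree=lastBuild[number,result]
```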

It is indeed also a pile of buttons.

You can try out Microtica (https://microtica.com), a tool that abstracts complex cloud setups and provides easy configuration from the UI, so you don't have to spend time on the boring stuff and can continue the engineering you love. Try it out and let me know what you think.

College professors can get stuff done in their 70s, and so can software engineers, as long as they keep learning; both demand a working brain. Software is getting complex; experience matters.

I used to work for Motorola; engineers with gray hair were gems, full of knowledge, always learning new stuff, and eager to coach newcomers. Unfortunately, that stopped when outsourcing became the fashion.

I'm old too. I keep learning every day (delving into C++17 at the moment), and it just makes my day fulfilling.

I come to HN because it's my access to grey hairs. Unfortunately I haven't ever met a grey hair that really enjoyed coaching.

Coaching and mentoring turns out to be as valuable for the Teacher as it is for the Student. The good ones will take the time to do this - even if the original plan was to shape the thought process of their evil minions. :)

I do. You will.

I'm curious about the attraction phase.

Can you describe what you do/have done on your day job, and what you are doing on your side project to prevent boredom on day job?

I have always worked for product companies, on products that I find interesting. Currently it is a product for margin calls in finance, before it was SMS routing and delivery, before that VoIP. Even if everything I do isn't new or interesting, there is enough variations and challenges, both in the code and also learning more about the domain. The times I have started to feel that it was boring and repetitive, I have changed jobs (but I've stayed at least 5 years in each place).

I also like to try to get better at what I do as a developer. Mostly this is by reading books or taking MOOC courses. I think there is quite a lot to learn about developing SW well (because it is a really complex activity). So that also keeps me interested.

About eight years ago I also started blogging [1] about SW development. I've found that trying to formulate what I think about it has also kept it interesting.

[1] https://henrikwarne.com/

Thanks for sharing. I'm currently doing full-stack web development and don't find it as interesting as I used to (close to 6 years now). I'm trying out various projects, but they either have a steep learning curve, requiring deep math (e.g., machine learning, graphics), or need domain-specific expertise I usually don't have.

I guess it is part of the learning.

I'm a former tech recruiter for startups and now a resume writer and career consultant, and I've written for software engineers from 18 through their late 60's.

- Older software engineers (40+) won't be discriminated against if they are doing cool stuff, and doing different stuff than they were a few years back. I've written this a thousand times before, but we often mistake stagnation for ageism. If you've had the same job for 15 years, working on the same system and using the same languages, you aren't a victim of ageism so much as you are being discriminated against because they don't think you'll be able to learn new tricks.

- On resumes, don't advertise your age if you're a bit older. We don't need to list that internship, first job out of school, or graduation dates from college. List 15-20 years of your career and leave the rest off. It isn't a biography.

> Older software engineers (40+) won't be discriminated against if they are doing cool stuff and are doing different stuff than they were doing a few years back.

But I don't do cool stuff. What I do is to keep a ~$25 million per year money printing system from falling over while the youngsters are in year 7 of the 18 month project to replace it with super cool tech. Making money is so overrated these days.

You'd be surprised how little corporate bureaucracy values the act of "keeping the lights on" for an existing and stable revenue stream. Obviously, keeping the money hose going is good for the business, but it only gets recognized in the case of failure. To put it simply, it doesn't create new value, which is what corporate culture in the US is all about.

^this is a really underrated statement.

Coming from “sysadmin”/tech/keeper of the lights on: there is no win at the bottom line. Keeping the lights on always costs money, they always wish it could cost less (except when inter-company bragging about having the new hot tool), and when you’re doing your best possible job they will forget you exist. If something happens on your watch they can just as easily assume you have not been working at all.

One thing I’ve distilled from all this: Always keep at least some focus on how much your work costs vs how much value it’s providing.

Maybe what you do IS cool, but you expect others don't think it is, or you don't make it sound cool when you talk about it. I've made some pretty mundane work sound cool, but it's not always easy.

You may be right about that. I should note, however, that I am not out there looking for a new gig. I have kids, am getting a master's degree part time, and have a small side business. Yet I still have time to read books at night. Find me another job where I can do all that :)

having been part of many candidate interviews you would have massively scored in my book with this comment. you have values other than work. that's something I value very highly.

I wish there were more candidates like you to interview these days (I am just 42, so take it with a grain of salt), but the youngsters straight from college with a streamlined CV are sometimes difficult to deal with, IMHO.

No one pays for prevention. Few pay for maintenance. Everyone pays for the fixes required from the lack of investment in the former.

You sound like you hold a lot of disdain for "the youngsters," and you'd be a liability on a team if you had any younger teammates.

Oldies can be hard to work with, and you're sort of proving it right here.

I have a disdain for people that don't finish what they've started and don't satisfy requirements. I don't feel as though that has much of anything to do with age. Anyway, if that makes me hard to work with, so be it.

Wait, are you telling me that you develop software to solve business needs and not as a personal art project?! What kind of crazy talk is this?!

> I have a disdain for people that don't finish what they've started and don't satisfy requirements.

And everyone younger than you fits that description? You're quite bigoted; I hope you realize that.

Wouldn't listing education without a graduation date basically mark a resume as "old"?

Not necessarily. If you list say 15 years of experience and a degree with no date, the reader will assume you are "at least 37". We're just trying to get an interview here - when you show up they'll have a better understanding of your actual age (in most cases), but at that point they're committed to the interview and hopefully the candidate performs well enough where age won't be a factor in any decisions.

And you can just use a different style of CV that doesn't use the chronological layout.

Something like a functional resume can do a good job in masking age a bit, but typically we still need to list employers, titles, and years.

At this point in my career, if I was prepping a resume, my education section would have one line:

Wossamotta U, Computer Engineering, BS

Doesn't need a date or any details, my job experience is more relevant, and if they've got questions, they can ask. If I had another good stint of work, I could just put Education: yes. Also going to leave off my college jobs and write Recent Experience. Although I got my last job through networking without writing a resume, and I'd expect the same for future work.

I list education without graduation date because I attended for 2 semesters. I've never considered it labeling me as old, but I've never had a hard time getting jobs, with the exception of a few jobs near me that only hire out of specific universities.

I do the same. I went to college for 2.5 years, never completed a degree. I'm not going to lie and claim a graduation. I used to list the date range I went to school, but now I leave that off entirely. No one has ever once asked about school. With 20+ years of experience on my resume, I doubt they even read the last section.

If you didn't graduate, omitting dates is not at all suspicious. I rarely see someone writing "Earned 80 credits at UCLA (2011-2013)".

> On resumes, don't advertise your age if you're a bit older. We don't need to list that internship, first job out of school, or graduation dates from college. List 15-20 years of your career and leave the rest off. It isn't a biography.

I started doing this years ago. If I listed every job on my resume, it'd be 4-5 pages long. Most of them aren't relevant to what I'm doing now, and only serve to make me look old, or like a job-hopper (which I am--about every three years--but job-hopping in tech isn't a negative unless your list grows too long). At first, I compressed older jobs into one-liners at the end, but now I leave them off entirely. I list 4-5 of the six jobs I've held in the past 22 years.

I have no way of knowing what impact this has on my employability, or how people interpret my age based on it, but I'm 47 years old with 22 years of experience on my resume, so I assume I come off as both old and senior-skilled. I never have to try hard to find jobs.

I feel job hopping in tech has gone from a negative to a positive in the eyes of many who are hiring, as long as the hops are for good reasons (better opportunity, company shut down, finished what you were hired to do, etc.).

Jumping around because you can't keep a job or nobody likes working with you is another story.

> On resumes, don't advertise your age

In Germany you're expected to include your age and a headshot on your CV[0]. I always considered that an insane invitation to prejudice and have only ever done it once, for a job I was guaranteed to get.

I'm sure other countries have other expectations.

In the US it's easy to forget that not everybody is "against" discrimination in hiring, even in tech.

[0]: https://duckduckgo.com/?q=lebenslauf&t=hk&iax=images&ia=imag...

I've written for clients in probably 50+ countries, and many do commonly include photos and birthday. I would hope people are 'against' discrimination.

Your first statement contradicts the second, which only confirms a theory that recruiters aren't the smartest people. If you want to argue that ageism doesn't exist, please provide some stats from your former work. I believe the real situation is even worse than we can observe, precisely because people over 40 have had to cling to old, uncool systems/languages just to keep their jobs.

"doing different stuff than they were doing a few years back" so engineers who fail and need constant redos? Because the project I'm working on has had the same stable and productive architecture and tools for years. The only reason to redo it would be if the opposite were true. Our users don't know what technology we use. They just know the product does what they want, and keeps getting cool new features.

Who said anything about failure? If you're happy doing the same stuff day in and day out, that's great and I hope it works out for you. In my experience, people doing the same thing for a long time tend to struggle to find jobs when it becomes necessary.

If you keep building 'cool new features', you don't appear to be doing the same thing over and over again.

The skills to build from scratch and the skills to maintain are different. If you've only been doing maintenance, why would you expect to be at the top of the list to build something new? Maybe you personally are perfectly skilled to do it. How would anyone know?

I’m curious whether employers associate age with chronic diseases, and take it into account.

I'm 35, so kind of in the middle. Frankly speaking, I'd hate to work in a company where the average age is 20 as much as one where the average age is 60. Sometimes I see job ads with things like 'youthful colleagues', which is crazy. I want to work in an environment where I can talk to someone in their early 20s and gain from their perspective and understanding, and also to someone who's been around the block a few times and can tell off the bat that the shit I came up with minutes ago is neither smart nor useful because of X, Y, Z. Putting all this aside, one thing to consider is that there are tons of companies out there with a reasonably simple operational model which, once streamlined, doesn't require senior people, or only very few. And those companies are very happy to keep the status quo by only employing from certain demographics.

In my country there is an industry in setting up your software company to simulate university life to attract the newly educated developers.

You’ll get cheap, extremely motivated employees, who spend almost all their time at your company thanks to boardgame/role play/pizza nights and you get them to keep up-to-date on techs by having them do weekly tech-workshops and presentations.

They produce a lot of cheap code (sometimes the quality is decent, other times it isn't), and their most talented developers tend to leave for “adult” jobs after a few years, but overall it has been a very successful strategy.

France's startup scene is quite like this.

Is your country America?

It sounds like America, except for the hiring for cheap bit... American companies pay a lot more on avg for their devs than other countries.

But younger engineers can be had more cheaply than experienced engineers. There is also a culture of having to prove yourself which often leads to junior engineers taking positions with companies that are toxic and which dramatically underpay them in order to get some years of experience. I think in our pandemic world, there is also a problem where people who may want to change jobs, due to poor working conditions, will stay put in fear of losing their pay check. These things and more, I'm sure, all compound against all of us, but I think can be felt more strongly at the lower end of the experience range.

Oh yeah, the "pizza party." 50+ developer here. As I age, pizza and games feel more like going to Chuck E. Cheese (a US pizza chain for children's birthday parties). I'd rather watch a soccer match at the nearest bar or play frisbee in the parking lot.

I would go a step further by suggesting that diverse workplaces are healthier since people are more inclined to consider the perspectives of others. The most toxic workplaces I have experienced have been monocultures. In those cases exclusionary behaviour ran rampant, even within a peer group.

Have you worked in a company where the average age was that low?

When I joined Snap about 5 years ago, I was middle aged compared to the average which was probably in 20s - even my CEO is reasonably younger than me :). The company was also pretty small (100?).

From that (single/anecdotal) data point, I'd say it is not as bad as you portray. There is definitely an affinity for trying new things; part of your job is evaluating them for all they're worth rather than for their glitter, and ensuring you communicate that well / provide value. I always try to focus the conversation on the problem we are trying to solve and how the new tech solves it better. The good alternatives usually seem to catch on very quickly (see Go vs. Java, Kotlin vs. Java) compared to the also-ran technologies. Your worth will kick in based on your skill in judging these, and definitely in getting to a mentoring role rather than a gatekeeper role. But it has been a lot of fun.

The agility is also something to keep in mind: Snap allows for extreme career mobility. Recently I switched teams from doing data related work (for the past 15 years?) to a complete unknown of joining our spectacles team and Snap has been supportive. You just don't see the typecasting you tend to hear about in bigger companies.

So yeah - give it a shot with an open mind! You may be surprised!

> I'm 35, so kind of in the middle.

And this is exactly the problem with tech. At 35, you're only about one decade into your career.

Lawyers, doctors, professors, etc. at 35 would still be considered relatively young, with plenty of time to become partner, senior surgeon, obtain tenure, whatever. But you're now at peak salary, and can expect inflation raises now.

And worst of all, you've got 30 years left until you can obtain Medicare, so if you aren't working full time at a job with decent benefits, you're absolutely screwed, especially if you 'dare' decide to have a family, 30 year mortgage, etc.

People often live 'till mid 80s nowadays, and can't receive any sort of benefits (in the US) 'till mid to late 60s. If your career is peaked and trending downhill at 35, and you're not making millions of dollars in your prime (professional sports, fashion model), there is something wrong with your industry.

> Lawyers, doctors, professors, etc. at 35 would still be considered relatively young, with plenty of time to become partner, senior surgeon, obtain tenure, whatever.

After ten years in a mature field, haven't you learned, or even mastered, a large portion of your specialty? You're 95% there already?

I think OP was making the case that at age 35, non-tech workers don't have ten years experience; rather, they are just starting their professional careers due to school.

Not quite 50. Still coding, still enjoying it.

As I've gotten older my time has gone from '20% think about it - 80% code it', to '80% think about it - 20% code it'.

I currently work with a surprisingly well balanced spectrum of developers at my current employer. The median age is in the early 30s, with probably a third of us 40 and older out of 150 developers.

I'd say with some confidence, that the older developers in my company complete as many "tasks" but write less code when doing it, with a lower defect rate.

That 80/20 split (for younger me) is what taught me enough that I can now think it all out ahead of time. When I just started around 20 years ago, it was all new, it was all confusing, and all the answers were hidden behind obscurity, gatekeeping, and strange social norms (no StackOverflow back then). So I had to learn by brute force, smashing my face into every project for somewhere around 100 hours a week.

Now I have some knowledge and perspective, even the ability to pick out the fads and novelties from time to time. I can try new things on small projects and I can go with tried and true for the big things and I tend to understand which projects are better for which approach. I can visualize the data, the models, the inputs and outputs, and think through the logic from beginning to end, all within about 30-50 hours per week. That took lots of late nights and a ton of trial and error.

There's a good place for the less experienced and more experienced on any well balanced team, to be certain.

I’m in my late 30’s and almost all of my good friends with a CS/BIT degree have cycled into management. The interesting thing I’m noticing now is that they all have ideas for projects but zero capacity to build anything, because they’ve lost their building muscles. They also seem reluctant to really learn anything else, which is antithetical to software engineering, where a new framework is out every day.

It's not about losing the building muscle. It's about losing the motivation to do so in tech companies. After many years of being IC, things can get boring. You keep doing the same things, even if you move companies and technologies change. And you always do it in a sub-optimal environment - things like working with tooling you don't like, under managers or with co-workers you don't really like, on projects you're not super excited about, etc. So moving to management is almost inevitable. Not to mention there are other reasons such as better salary and more visibility to change things. Also, people need change in their life. Moving to management is one way to change things. Some people go back to being IC, but most don't.

You know what's funny: I would not call the environment you're describing a tech company. I would call it a tech-enabled company. If you are working at a tech-enabled company, then I 100% agree with you about moving to management becoming inevitable. Tech-enabled ICs are basically rendering JSON/SQL/NoSQL on Desktop/Web/Mobile. If your company is just spinning up .Net Core/Node/Django/fill-in-framework-here to build a website to show some data, that's not a tech company to me.

When I say tech company, I'm talking about companies where software engineering skills actually matter, where an O(n²) algorithm will actually cost your business. In my definition of a tech company there are tons of new things happening. Look at spaces like AR, self-driving, rocketry, machine learning, and computational photography: no one in any of those fields is doing the same thing they were doing even 2 years ago.
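A hedged sketch of what that O(n²) claim means in practice (the function names here are made up for illustration, not from the thread): the same job, deduplicating a stream of IDs, done quadratically versus linearly.

```python
# Toy illustration: deduplicating IDs while preserving order.
# The first version rescans the output list for every element,
# so n elements cost O(n^2) comparisons; the second uses a hash
# set for O(1) average membership tests, making it O(n) overall.

def dedupe_quadratic(ids):
    out = []
    for x in ids:
        if x not in out:      # linear scan on every iteration
            out.append(x)
    return out

def dedupe_linear(ids):
    seen, out = set(), []
    for x in ids:
        if x not in seen:     # constant-time hash lookup on average
            seen.add(x)
            out.append(x)
    return out

assert dedupe_quadratic([3, 1, 3, 2, 1]) == [3, 1, 2]
assert dedupe_linear([3, 1, 3, 2, 1]) == [3, 1, 2]
```

Both return the same answer, but on 10 million IDs the quadratic version does on the order of 10^14 comparisons versus roughly 10^7 set lookups for the linear one. At the scale a large service runs, that gap translates directly into compute cost.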

I dunno man, this sounds like unnecessarily exclusionary gatekeeping on the really broad term “tech company”. Instead of taking this broad term and scoping it down for your own purposes, why not use a narrower term for your narrow definition? This is like when CS graduates claim people without a CS degree aren’t software engineers and shouldn’t be hired into the same roles... to render JSON on a mobile device.

It's just my own mental model that I use to separate companies like Hertz and taxi firms (both of which have apps) from Lyft and Uber.

How would you describe the difference between those two types of companies?

"Negligible", or "a matter of scale", from a technology POV?

Your definition of tech company excludes most projects at Google, Microsoft, Amazon, and Apple.

You're absolutely correct. That's why if I applied to work at any of those companies (I've already worked at Amazon) I would be extremely particular about what project I work on.

Yes, don't think that is wrong though. There are very few teams even in the FAANGs that are working on interesting problems. So instead of talking about tech companies, we should be talking about tech teams.

These are ad companies.

Out of these, only Google has a substantial share of its revenue from ads.

Then you have your own special definition. A "tech company" is a company whose primary business is tech, including companies pushing out boring line-of-business applications like you describe. It is in contrast to companies whose primary focus is something else e.g. agriculture or pharmaceuticals.

IMO, management was worse. Talk about things I didn't like. Moved back to engineering and never regretted it.

I loathe being in management and am totally aware that I'm probably a bad manager, so I've got a lot of respect for engineers who wind up being put in management roles. But the one thing that keeps pushing me to want to go back into management is being put on a team with an abysmal or completely missing culture, which I feel mostly stems from terrible leadership.

I'm finally to the point where I'm confident enough with my career to actually make noise and complain about things that I'm tired of dealing with which I wasn't in my 20s, but now I'm dealing with my complaints not causing any change in the end.

This is the thought I've been obsessed with for the last 6 months, and after evaluating everything I think almost everyone is better off going into management in their late 30's or 40's. There is no way out of it.

Some of the reasons:

1. Boredom of doing the same thing over and over in a different language/framework.

2. Still have to get "permission" from (possibly younger) manager before doing anything.

3. Have to leetcode after work to switch jobs.

4. Have to keep learning latest framework after work.

5. In direct competition with newcomers who are much hungrier and with people who don't have many personal responsibilities.

6. Honestly, it feels a bit weird to be the oldest person on the team by a huge margin.

7. Younger devs assume you must be bad at your job since you haven't grown in your career, and don't give you much respect.

Am 43 and regularly coding in a lead position; I'll chime in on some of your points.

> 1. Boredom of doing the same thing over and over in a different language/framework.

Sounds like being stuck in the same junior developer position for the rest of your life. Not really common.

> 2. Still have to get "permission" from (possibly younger) manager before doing anything.

What's wrong with that? Age is no virtue per se, and really it's the same situation in the military, business, and other organisations elsewhere.

> 6. Honestly, it feels a bit weird to be the oldest person on the team by a huge margin.

Not in an appropriately senior role.

> 7. Younger devs assume you must be bad at your job since you haven't grown in your career, and don't give you much respect.

Never an issue if you can step in and do their job if necessary.

This is kind of what drives me nuts about software engineering. I would love to work a 9-to-5, but in software you have to constantly be learning; it ends up being a 50+ hour a week career. It does pay well, which is great, so I have little room to complain, but I feel like I am always at work.

yeah exactly. Even if you manage your time well to take care of personal responsibilities, you always have a nagging feeling that you are falling behind.

Have they lost their building muscles, or are they just not building what you want how you think they should? At 55, I'm not as prolific as I used to be, but it's not because my skills have atrophied. It's partly because I tend to work on the bits everyone else is avoiding (often because they're difficult), partly because I try to do things right instead of hacking and slashing, partly because I find it hard to concentrate on doing things within the absolutely insane structures and idioms my younger coworkers have created.

Example: the almost universally used service infra where I work is a nightmare of excessive context switches and tuning to avoid starvation/deadlocks. Why? Because the kidz who developed it apparently didn't read enough to know that the basic paradigm it's based on is known to have such problems. The people around me think this is normal or inevitable, and just live with it, but even the person who did most to popularize these ideas recanted a decade ago. Too bad; we're just stuck with it, because it's the young folk who refuse to learn.

Your older coworkers probably haven't lost their "building muscles" and aren't reluctant to learning anything. They're reluctant to repeat or build on past mistakes. Overall, your comment seems like a good example of how older programmers are often misrepresented by those who don't share their experience. Let people represent themselves.

The GP wrote that these people are from the same age cohort, not older. They also aren't coding differently; they don't code at all.

> which is antithetical to software engineering where a new framework is out every day.

... In some fields of software engineering!

There is a whole world of Software Engineering outside Web/Infrastructure. There are plenty of Software Engineering fields out there where slow-changing standards and toolchain reliability are considered a valued feature.

That said, I don't disagree with the need for self-study in Software Engineering; I just disagree that this need originates from some issue with the high churn of languages/tools/frameworks (which is local to Web/Infrastructure).

Even in the web, it doesn't really change as fast as people say. There have been 3 front-end frameworks that actually matter for 6 years. Have there been new backend frameworks constantly coming out? Where does this meme come from?

Hmm, I'm in my mid 30s (I'm still pretending to be 34 on the basis that my birthday was in May, which I contend hasn't _really_ happened yet due to the pandemic lockdown; it's still March), and fewer of my peers than I'd have necessarily expected a decade ago have gone into management; the "senior/lead/principal engineer" route seems more common.

It sure is more fun.

Half the muscle requirement is used up just setting up the "environment" for a newfangled technology/framework, even before Hello World can be compiled.

As an older developer (46), I stay away from the clusterf%%% of modern front end development.

In the last three or four years, I’ve been going back and forth between C# (.Net Core) using Visual Studio and JavaScript (Node), Python, and Go using VS Code. The setup was: 1. Download VS/VS Code. 2. Install the appropriate extensions.

It makes no sense to me to try to stay up to date with front end development when front end developers are rapidly becoming a commodity with a bunch of boot camp grads. The money isn’t worth it.

I'm the same age, but I am also guilty of old habits when attempting new frameworks or languages. For example, I always want to set up the environment with the least amount of "auto-loaded" libraries. I do not use NPM. I still play around with new JS libraries via direct downloads from the official websites or GitHub, and always use local copies of .js, .css, etc. I don't even connect to CDNs for fonts! I want to know all my dependencies. I am paranoid like that. And this can be exhausting.

Can't agree more. Recently started a personal project in .Net Core + Visual Studio. What a joy. Can't get away from the front-end completely, but I can at least enjoy the backend side immensely.

Have you tried Blazor? I wouldn’t invest time in it personally because I don’t see a market for it, but if I were just doing a side project for the project itself and not to learn a new marketable skill, I would give it a go.

I'm in my mid-20's and everyone my age with a BIT/CIS/%Information% degree looks down on actual coding. They all act like it's way beneath them and focus on some hand-wavy consulting stuff. I work with a bunch of people with this degree and I genuinely couldn't tell you what they do other than add connections on Linkedin.

I'm in my mid 30's and I'm becoming more reluctant to learn new things.

In my 20's I used to enjoy coding just for coding itself. I was always excited to learn a new language, a new library/framework/etc.

When I reached 30-ish, I was more interested in building things. The "coding" part was more of a chore to be honest, I just liked doing it because there was a product to build to solve problems. And the thing is, I can probably build any product with the languages I know today. So many of these "hot new things" are just rehashing ideas that have been around for decades.

It's interesting to see multiple people in their 30s here mentioning the decline in interest of learning. I'm young 30s and I'm literally quitting my job because I can't do it anymore - I can't be bothered to learn new stuff I need for a new project. I'd literally rather quit. I feel so broken because of it and I know my next job is going to be a third of what I'm paid now because it won't be software.

Don't get me wrong, I should have been more specific: the reluctance to learn is specific to tech and programming. I learn a lot of new things in my spare time. Even in the field of pure computer science there are a lot of interesting programming theories to learn.

After decades in the field, I feel like I'm mostly always doing the same thing. Grabbing data from here and there, making sure I rate limit and fail gracefully, serialize shit and deserialize a response, or vice-versa. Writing software is just plumbing and it's infuriating when people just keep changing the size and shape of the pipes just for the sake of it. That's why to me the actual product being built is more important than playing with tech just to play with tech.

... same

I went to school for CS and ended up being a data janitor. I get paid FU money to do what is essentially ETL in many fancy and various forms. All the excitement is gone.

Got half a mind to switch to interior design.

That's funny, I've been thinking about getting into indoor/outdoor painting. I don't know if I'm really serious about it because it would probably divide my current salary by at least 4 or 5, but maybe we need to start a business lol

Are you dealing with anxiety from covid/quarantine/everything right now? I think a lot of us are in this same boat, want to get out of tech, etc. I remember what it was like making a $45k paycheck though. I had such little agency in life. Obviously people can get by on that and do much better than I did, but I've gotten so accustomed to having the freedom to basically buy anything I want (within reason) when I want. I don't even flinch at a $150/mo gym membership, which there's no way I could have afforded back then. I remember wanting to learn judo and kendo when I was around 23 and being shocked at how expensive it was, and I couldn't do it until I was in my late 20s.

Do I need any of that or is it worth it? Surely not. But the freedom I feel now making 3x what I did when I worked in retail is something I can't really put into words well.

I agree that going back to $50k a year is going to blow chunks. But it's going to be very different this time versus when I was straight out of college - I have developer skills I can fall back on in the worst case scenario, I have six figures in the bank to fall back on, my student loans and car are both paid off, I'm not moving around constantly, I'm not spending money on expensive dating. I think I will be ok. I've never been an extravagant spender on myself, only on others.

Well, I'm envious. I want to pull the trigger with 3 months of savings in my account, and it's terrifying. I absolutely loathe the people I work for right now and how they're treating their employees during COVID. I'm just really worried that with the way things are right now I won't actually find a paycheck to take home in those 3 months, even with my highly sought-after skills. That, and I don't have the energy to prepare for FAANG interviews and whiteboarding right now; I feel stuck and I hate this anxiety.

Sorry if this is a downer, but honestly even if I wanted a new job immediately I think it would take many many months to find one. I'm planning on it taking up to a year. I've been looking at job boards a lot and even for developer jobs, they're really only hiring senior level positions right now and I'm sure the rare non-senior position is extremely competitive. My last two roles have both been senior, but I would very much be junior for anything outside web development (which I no longer want to do).

Life happens in your 30s. Meaning you get a spouse, house, and kids. Suddenly work becomes less of a priority and even if it doesn't you have a lot less time and energy to dedicate to it.

I also think there's a frustration limit. Like if your spouse is being a jerk, the kids are being dicks, and the plumbing broke you don't want to mess around with a new technology. You want something easy and easy is what you already know.

>I feel so broken because of it and I know my next job is going to be a third of what I'm paid now because it won't be software.

I would not do that. You're probably just burned out. Take some time off and switch jobs before switching careers, especially to one that will pay 1/3 of what you make now. At least start saving 2/3 of your salary for a while to see what it'd be like.

The problem is that I've done that before. I took a year off not working at all, just traveled and fucked around, and then got my current job. Also my partner is great and I have no kids or really any problems at all in home life. I've only lasted at my current job for a year (already gave notice). Granted my current job is very similar to my last one - maybe that's the problem.

I'm currently saving about half of what I make. Finances will be easier when I'm not the only one with an income and my partner is days away from a very probable job offer.

I'm wanting to transition to project management and eventually product management. Product management can pay pretty well ($150k+) once you're established. I can afford to not really save much for 5 years until I'm back into six figure income.

Maybe you're just in a shitty job? I thought I was burned out on programming a few years ago, turns out I was just burned out on a specific stressful job.

I mean, shitty is relative. Compared to manual labor jobs, I'm sure my current job would be a blessing: six figures, sitting around all day, work that's easy if I had the mental fortitude to force myself to do it, and I can get away with only working maybe 20 hours a week.

It's shitty because it doesn't advance me at all, there's no room for growth, no opportunity for learning something that would be meaningful at literally any other job, there is zero social aspect to it since I'm 100% remote and work on projects alone, all my coworkers are on the opposite coast of the country.

I have tried using my free time to get more familiar with ML/AI stuff, but my brain just shuts down as soon as I begin to try. I would love, in theory, to transition toward AI work, but the learning curve feels so steep. The pay would be great, though; I think there is an absolute shit ton of upward room to grow, and I could probably work on some pretty interesting problems (though I'm sure there are tons of "make ads more effective" ML jobs out there too). Maybe if I take some time off work I can try to get back into it, spend 6 months learning, and try to get an entry-level ML job.

It can burn you out even if it's "objectively" better than being a coal miner or cleaning toilets.

Sounds like you might be burnt out rather than broken. If it is burnout, there are ways to recover and enjoy developing software again.

Does either camp have second thoughts about how their careers have progressed? You've highlighted one or two downsides for those who've gone down (or should I say 'up') the management route. Would you trade places with any of those friends? Would either of them trade places with you? Do you have an 'exit' plan out of software engineering or do you plan to stick with it to the very end? I appreciate some of these are personal questions and you may not want to discuss it but these are the perspectives that interest me.

In 2008, I was 35 and had let my career, skills, and salary stagnate. I had been at a company for almost a decade, mostly writing a combination of C, C++, Perl, and VB6 programs for backend processes. I finally woke up, did a career reset, and pivoted toward “enterprise development”.

Fast forward to 2016: I was married, with a stepson who was a freshman, tired of working on yet another software-as-a-service CRUD app at my 3rd job since 2008 as an IC, and jumped on an opportunity to be a dev lead at a medium-size non-software company.

I thought the next step was to either stay a hands-on dev lead/“architect” and just muddle along for the next 20 years, go into management, or go the r/cscareerquestions route, “learn leetCode and work for a FAANG”, and move to the west coast.

Neither sounded appealing. Then management decided to “move to the cloud”. I didn’t know anything about AWS at the time, and I saw how much the “consultants” were making, and that opened my eyes. If these old-school netops folks could pass one certification, click around in the console, and make $200K+ a year, imagine what I could do if I knew AWS from the infrastructure and DevOps side and knew how to develop and architect using all of the AWS fiddly bits.

It took three years and two job changes in between, but I really like consulting. It’s the perfect combination of development, high-level architecture, and customer engagement, and you never know what you will be doing in three months, or in what language.

Thanks for sharing! Could you expand more on those 3 years in making that transition? Do you work at a consulting company or did you start your own?

The company I worked for as a dev lead was acquired by private equity and by the time I had any knowledge about AWS the infrastructure gatekeepers and consultants took over.

I started looking for a job and got lucky that another company was trying to build an in house development department led by a new CTO. They had outsourced all of the development before.

The new CTO was very forward looking and wanted to make the company “cloud native” and improve the processes. He only had a high level understanding of AWS as did I. He took a chance on me and I became both the de facto “cloud architect” and the person he called when he wanted a customer facing project done from the ground up without having to deal with the slow moving “scrum process”.

I was quite happy at the company and probably would have stayed a couple of years, even knowing I could make more money somewhere else. Then Covid hit, along with an across-the-board pay cut.

I was still not really looking; a 10% pay cut at a time when we couldn’t travel or really go out was an inconvenience but not earth-shattering.

Then a recruiter contacted me for a software development position at Amazon. I wasn’t willing to relocate or do the leetCode monkey dance but we talked a little and then she forwarded my information to a recruiter on the AWS side.

I saw the interview process was basically a high level technical interview to determine whether I knew the basics of AWS (I did) and all about the Leadership Principles. I knew I could answer the “tell me about a time when...” questions with the best of them and the interview process was going to be fully remote.

To keep a long story from getting longer - I work at Amazon as an AWS Consultant from the comfort of my own home in the suburbs in a low cost of living area.

That's awesome, congrats! Thank you for sharing those details. Sounds like something I'd like to do some day.

> Does either camp have second thoughts about how their careers have progressed?

Anecdotally from our conversations, I would say that they see computer programming as being less valuable than people programming. My friends do have lots of ideas for things and one in particular will keep saying that he wants to "re-learn iOS dev" or "learn Elixir", but he never does. I've started down the path of learning how to angel invest, which is where I'm trying to learn my people management skills.

> Would you trade places with any of those friends?

Absolutely not.

> Would either of them trade places with you?

I doubt it.

> Do you have an 'exit' plan out of software engineering or do you plan to stick with it to the very end?

I'm already on a trajectory where I won't need to work a traditional job the rest of my life. None of my friends who are climbing the management chain are anywhere near that. I honestly love building and learning (I just finished up a 3.5 day hack-a-thon yesterday). My next path will lead me either to building a company, helping people start tech-enabled companies or helping someone co-found a company. Unless I become a CEO, I don't plan to stop programming.

As with my age, I'm in the middle of both worlds too: I'm a manager, but I also do development whenever that's needed (it's not a tech company). Development, while it can be frustrating, is very enjoyable, as you feel that you're creating something tangible and useful. The management side is never-ending issues. More often than not I have to operate without having a full picture, so decisions end up being arbitrary. You also get to be the person who makes decisions for people when they don't want to make them themselves. From the wider business perspective it's quite interesting, as you get to know a lot about what's going on in the company. I'd love to do development full time, maybe in a smaller company rather than some big corp environment. Ideally, I'd like to do 10-15 years of development and only then go into management, but I'm not sure how viable that is.

> They also seem reluctant to really want to learn anything else which is antithetical to software engineering where a new framework is out every day.

Actually I'd say that's a sign of maturity. They've grown wise to the idea that our industry has a fetish for reinventing the wheel and refuse to take part in it.

"which is antithetical to software engineering where a new framework is out every day."

Hey! Objection!

I'm a software engineer and I have no need to keep up with random web frameworks popping up left and right. I'm fine writing my C++ and C#, thank you. I have to keep up to date, but the stack evolves over a decade, not every quarter (or whatever the cadence is for web stuff).

Become an expert in a field and then it's irrelevant whether you know framework xyz or not; it's no longer a critical requirement. It's critical that you have domain knowledge, and whatever the tech stack is, it's expected you can get up to speed on it just fine on the job.
