Is software development really a dead-end job after age 35-40? (quora.com)
580 points by rbanffy on Nov 17, 2017 | 393 comments

I was going to add this as a reply to a comment downthread but I think it's too important to bury.

The fundamental rule of career progression in the technology industry is to position yourself as a domain expert in something more specific than "programming".

It's very difficult to "own" "programming" in a way where you can use scarcity to drive integer multiple increases of the median salary. It is not as difficult to "own" distributed systems, sensor fusion, software security, or OS kernels.

As a general rule, the people getting outlandish half-mil-a-year offers from big tech companies are domain experts, and the people on Twitter shocked and upset that they're never seeing these offers are either (a) generalists or (b) people who have chosen to specialize on the craft of programming, rather than a domain to which programming can be applied.

That's not a value judgement! There are a lot of ways in which refining a true engineering discipline for assembling software is harder than kernel development or fast computer vision processing. It's just a statement about supply and demand, and about ease of demonstrating value.

The big danger is "owning" some domain that falls out of fashion. You might "own" distributed systems and do great for a time, but then companies start transitioning from that architecture and nobody returns your calls anymore.

Also I've noticed a lot of sub-fields like computer vision or self driving cars where there's a great deal of nerd interest but very little actual employment.

In fact, I have a feeling that computer programming is the stick that's holding the entire STEM tent up---if not for programming as a consolation prize you'd have a whole lot of physics and chemical engineers making lattes at starbucks---and further that web development is the stick that's holding the programming tent up.

I am a programmer. I am pretty good at algorithmics. I love implementing something out of a published paper. I love to navigate the math. I can't sell that though.

I just used to say I was a computer vision expert because, among the many things I can do, that is a relevant domain.

Recently, advances in deep learning made half of the computer vision field obsolete. I just read some cutting edge DL papers, linked that to the things I know, and now I am a deep learning integrator.

I was taught in engineering school that my craft is learning and problem solving. I sell it as knowledge and expertise, but the truth is that I am constantly learning in the domain I sell as my specialty, which is what everyone who is decent at anything actually does.

Hi Iv. I'm looking for help to implement a computer vision paper. Can you contact me, please? Sorry about the message here, couldn't find a way to send directly. rperez333@inboxbear.com

> Also I've noticed a lot of sub-fields like computer vision or self driving cars where there's a great deal of nerd interest but very little actual employment.

I'm strongly convinced that the less (nerd) sexy a field is, the more money there is in it for a good programmer.

The crappy working conditions for programmers in the gaming industry are one example supporting this theory.

It's the same supply-demand issue as any other field. There is no money in writing poetry, but plenty of cash writing ad copy.

'The big danger is "owning" some domain that falls out of fashion'

That is part of the reason for the high pay. There is always risk associated with specialization. You can't become an expert without specializing in something, but if you specialize in the wrong thing your effort may not be rewarded.

> There is always risk associated with specialization

That might be the reason people demand higher pay, but it wouldn't be the reason why companies pay it. They pay it because there's a scarcity of people, and/or because they get some multiplied value back. No company cares if I've specialized in the 'wrong thing' - they just won't hire me.

That’s an important “or”. It’s not always scarcity, it’s often the perceived value. CEOs are just one of many examples where the pay is not at all because of scarcity.

Are there really that many people capable and willing to work as CEO? From my experience there are never enough good managers.

I meant to add 'perceived' on value, and the 'scarcity' aspect can play into the perception of value too.

>In fact, I have a feeling that computer programming is the stick that's holding the entire STEM tent up---if not for programming as a consolation prize you'd have a whole lot of physics and chemical engineers making lattes at starbucks---and further that web development is the stick that's holding the programming tent up.

I swear, I repeat this comment in almost every news article posted about how "We need more STEM graduates" and "The STEM job market is super hot right now".

Yeah, programming is. I know so many graduates of a BS in math, physics, chemistry, biology, ecology that just couldn't find satisfactory employment. They either got a second, mostly unrelated degree (in management or something more industrial) or are working at a mostly unrelated job.

Even in traditional engineering disciplines, while it's easier to get at least OK employment the demand just doesn't seem that strong compared to CS. I have many friends that graduated with civil, mechanical, or chemical engineering degrees (in at least one case, an MS) who are vastly overqualified for their lab tech or drafter positions.

I have a strong background in mech engineering and robotics. Yet, for the two years I worked in that domain, I never had a single recruiter contact me out of the blue. Now, my mostly amateur computer skills have somehow landed me a "tech" (read: IT) job. In the two months since I changed my title on LinkedIn, I've had three recruiters contact me.

There is a difference between engineering and programming. An engineer uses programming to solve an engineering problem. If all you can do is program, you're just not as valuable. Programming is one of many tools I use in my everyday job as an engineer.

I think this goes back to being an expert in a "domain". Being able to apply programming to that domain is far more valuable.

Quite possible, but since everyone uses the term "engineer" without any context or specific qualifications, we should expect any distinctions to dissolve as people realize total morons are using it too.

* I have a bachelor's in electrical engineering from a top-ranked school.

We used to call ourselves programmers. Then it became sexy to say developer. Now it is engineer. (Since, as you pointed out, it means something different to everyone, it's meaningless.)

The future will be a new snazzy term. Same job, different title.

Software development is not engineering.

My point wasn't really the distinction between engineering and programming, but between everything IT-related (red-hot job market) and all-other-STEM related (not so hot job market): sciences (chem, bio, physics, math) and traditional engineering (civil, mechanical, chemical, electrical)

Agreed. See my dad. An expert on systems running AIX and HPUX. There are not a ton of those around anymore. In the 80's and 90's he told me he would get multiple offers all the time. People fighting over him. Today, he has to fight to find jobs. Luckily he just needs to get by a few more years until retirement.

As a computer vision person - the jobs are there. The reason it doesn't look like much employment is that there aren't that many of us relative to the hordes of web developers. Part of the problem is that the bar is high. Most jobs require a few years of practical experience, a research degree, or both.

It's the classic example of complaining that there are no jobs, while the industry complains that there aren't enough talented people. I get a few recruiters pinging me weekly and I'm not looking to move jobs - their story is always that there is a huge demand for good people.

I think the CV job market is unlikely to get much better, at least in terms of numbers of positions available. On the other hand, it's a stable job market that isn't going anywhere. The pay is already above average and it's not likely to go down because of the experience required. There's no shortage of applications and it's not dependent on a particular language, toolset or hardware. Virtually every big factory in the world employs some kind of machine vision system, for instance. And pretty much every computer vision deployment has to be modified for the application.

There are much easier ways to make money though. If you want to be a wealthy programmer, get on the web/app development train and accept that you might need to pivot every couple of years.

You can switch domains.

This is easy to say but kind of unrealistic. I have changed domains, but that was because I wanted to move into the domain of my interest, i.e. embedded, and also was young and wanted to experience the breadth. After investing almost 5 years of my life in it, and endless reading and doing things in my field, I am considered an expert. Now the embedded field in my country is in a steep decline and I am at a crossroads. All the so-called "new" tech does not entice me at this age, as I could instead go very deep into my embedded field. On the other hand, I know that if I don't jump ship I will be jobless pretty soon. This makes me very bitter about choosing a CSE education, as my peers in medicine have way more stability in their domains than me.

I don't know man. There's enough use-cases to make both boats float. Just gotta find it.

By sinking your precious time investments, sure. But you won't be able to claim you have 20 years of experience in X anymore; you will be a newbie for a while. That, and your PhD is already in something else...your papers are already published in other conferences....it can be a problem trying to move from, say, PL to ML.

Sure, but those are the trade-offs. You don't get the reward without taking on a bit of risk.

It's a little ironic to hear talk like this, since around here I hear a lot of "sorry your coal mining job is going away, but you need to get with the times and train for something relevant". Same thing applies here. If you pick a programming specialization that eventually falls out of favor, buck up and learn a new one. Maybe that'll mean a pay cut for a bit as you ramp up, but that's just how it works.

I don't think it's unreasonable for people to expect that they should be able to work in a field for the majority of their careers. Retraining is not an efficient use of one's productive years.

I think that if someone chooses engineering then he must love to learn. For me it would be too boring to do the same thing for my whole life; I always try to learn more languages, more paradigms, architectures, and whatever. I can jump into the code of a different team and be immediately productive. Usually I suggest and implement small changes that have a large impact on the maintainability of the system as soon as I start to grasp it as a whole. I could never do that if I stayed in the same field, in the same environment, for all my life. I think that if someone expects to work in a field doing the same thing for the majority of his career, then he probably should not choose engineering.

But you are doing the same thing. This is the frustration. I've worked in several languages with various tools and it's very common to just be learning slightly different syntax to approach a problem in the exact same way (and then eventually maybe you'll reach a higher plane where you approach it in a way somewhat characteristic of the tool, but still not fundamentally different). Technologies that really change the way you work or think about problems are few and far between, and most of the algorithms and data structures you'll use most of the time in typical applications were discovered in the 1970s. It's not a big deal to pick up a new tool on the job, but I have a hard time motivating myself to spend free time learning yet another MVC framework or whatever.

It's a different story, I guess, if you go into a completely different type of programming from what you're used to (say Web apps to embedded programming), but I don't think that's necessarily easy.

I think it is unreasonable to expect that, especially in a field where even a decade ago people were saying “50% of the skills on your CV become obsolete every 18 months”.

If people were saying that a decade ago they were wrong.

Looks like a lot of people, net, agreed with the claim: https://softwareengineering.stackexchange.com/questions/4417...

This comment seems to be more about putting all your eggs into one basket than the idea that every technology is going to die in a few years. Most of the popular production languages today are 10+ years old, for instance.

Not every technology, but rather random technology from the set of things-which-currently-look-good-when-seeking-work.

Important difference.

For example, at one extreme, all tech lasts exactly three years; at one of several other extremes, 50% of tech is eternal and the rest lasts exactly 1.5 years.

Maybe it’s not unreasonable, but unrealistic?

Huh? Those sentiments (retraining is costly, retraining is necessary) aren't incompatible.

I don't think getting a PhD is an especially good maximize-earnings plan.

I think wall street is still the best option to make a lot of money.

In which case, there is a credible argument that a PhD is integral to income maximization.

I think so, if you want to be a quant.

I wouldn’t disagree with that :).

In theory. I'm sure everybody knows this, but at times you might 1) lose willpower, or 2) find yourself in an unknown climate all of a sudden (say, going from an old codebase to Node.js) that makes the learning curve very steep.

This is why you market yourself/your business as specialising in a particular domain, but you study good fundamentals and general principles, which you then apply to your current domain(s) of interest using whatever development tools are of use in that domain.

Straight up coding is such a pop culture that some subset of good principles are unknown or misunderstood by each successive hip new movement, every one flawed in a different way to their predecessors, but doomed to make similar mistakes by lack of experience. The flaws easily extend into "best practices" around any given tool; and then you stick out like a sore thumb for recognising what should be done, but everybody else cargo cults the orthodoxy.

It's a bit more involved than this. The current market doesn't really follow fundamentals. I dug into advanced theory quite a bit, and that doesn't help one bit when trying to unfold what people mean with their patterns and ways. Quite the opposite sometimes; Angular 1, for instance, was a bit insane in what they assembled, unlike anything fundamental if you ask me.

I suppose I would argue that developers with a lot of experience and good knowledge of fundamentals might well steer clear of a great deal of the unnecessary complexity and short-lived tooling in today's web development world.

It’s good to see I am not alone. It felt like everyone was taking crazy pills when Angular came along, heralded as the greatest thing in web development, and anyone pointing at the fundamental issues was suddenly a pariah around here.

Is this easier for someone who was a "domain expert" in an obsolete domain than for a programming generalist? i.e. could a programming generalist also "own" a domain in a couple years?

I was surprised several times, and have since stopped being surprised, by how someone who was immersed in a technology for years, loved it intimately, and knew every wart on its asshole, so to speak, and loved it anyway (the person you know as the guy who's really into PHP, or the guy who's really into Oracle or jQuery) will not flail when the company abandons that technology. I would always worry, and think to myself, "Man, using PHP at least we have Tom. I'm not sure upgrading to a better technology is worth losing that kind of expertise. And what will poor Tom do now? PHP is his whole life." And of course six months later Tom knows every wart on the asshole of Node and React/Redux. Tom was an expert because he had the ability to take an obsessive interest in something and create his own expertise.

Non-technology domains are similar, with one caveat. Somebody who is a master of a customer domain got that way by being able to take an interest, empathize with customers, get them talking, and keep them talking. They can apply that to a new domain and get up to speed with terrifying speed. The caveat is that some people have domain knowledge from working in a field for years before getting into software development, and they can't replicate that experience in a new domain.

Can't upvote this enough. In my experience, if you have the capacity for the occasional obsessive voracious burst of knowledge acquisition, there's no limit to the amount of technology cycles you'll be able to surf, and you'll always be able to sell your expertise and get good work. I feel bad for the calmer more balanced people who don't have that manic streak, who leave work at work, they'll always be at a disadvantage.

I'd like to think I'm becoming Tom here, right down to the techs you mentioned. I also went through git, apache httpd, bash, and Ruby phases.

My philosophy is that if PHP keeps the lights on, I'd better really know what it is, where it's going, and what its limitations are, because it would be pretty scary not to.

Plus once you learn a language or a database well, the next one is easier. The trick is knowing when it's time for the next one.

Easy to switch, not easy to own.

If by "not easy" you mean "a couple of years", sure.

Don't mistake what I'm saying. I'm using the word "own" rhetorically. Obviously there will be no point in your career where you are likely to be the world's leading expert on computer vision. What you're aiming for is to be the leading expert that is available to employers within a reasonable amount of time at the level of compensation you're hoping for.

It's a spectrum.

Obviously, if you invest heavily in a discipline and it is decisively obsoleted, that's a setback. I'm just saying it's not an insurmountable setback, and I'll add here that it's still unlikely to leave you behind the pure generalists; in the process of building expertise and accomplishment in cryptography or OS kernels or whatever, you're likely to become a pretty far above-average software developer as well.

An LOB programmer could work on computer vision side projects for a couple years and probably put together a neat portfolio of work.

But the few people who are really hiring people who "own" CV want PhDs with a relevant thesis, and you're not going to get one of those working weeknights after the kids go to bed.

Meanwhile, people pursuing PhDs have no idea if their area of research is going to be hot in industry when they're done.

Granted, a lucky few people who follow the autodidactic route may find themselves authors of a really popular OSS project. Or they may be able to luck or maneuver themselves into a position at their current employer, working in a capacity that future employers will see as "relevant industry experience." Obviously this is hardly a sure thing, and if they plan it wrong or just get unlucky they're not going to have a nice outcome.

In the specific case of CV, a programmer looking to move into that area is probably doing themselves a disservice if they just apply for jobs that are actually looking to hire computer vision specialists.

There are so, so many businesses out there that could benefit from CV, but they don't realize it. Maybe they're spending a ton of money on off the shelf OCR software that isn't great for their needs. Maybe they're spending a few hundred thousand dollars a year manually organizing and classifying documents when they could automate it.

These companies won't be hiring someone to own CV for them, because they don't know they need it. They might not even know what computer vision is. And they won't care about whether you have a PhD or a relevant thesis. They'll care about whether you can demonstrate that you can solve their problems and save them a pile of money.

Granted, finding companies with these types of problems is perhaps not something that a typical LOB programmer knows how to approach. And if they see themselves as just a LOB programmer, it might stay like this.

If they make an effort to get to know people and network outside of the usual tech circles, they might just find it possible to position themselves as solvers of specific types of problems - problems to which CV happens to be the best solution.

I know that's not a path that'll work for everyone. But it's one path that leads from 'LOB Programmer' to 'Someone who makes lots of money doing development with computer vision'.

If you want to convince me that computer vision is a bad domain to specialize in for journeyman programmers because it takes post-grad education to get a job, that'll be a quick and easy argument.

As a counterexample, though, most serious work in cryptography (for instance) is done by grad students, but the best jobs in crypto are not automatically going to grad students.

Either way: if your goal is to maximize lifetime earnings, I think dedicating your career to "programming" is a bad call.

I'll have to defer to your knowledge of the market for self-taught cryptologists. My understanding is that most journeyman programmers working in security are in Burb Suite Operator positions.

My point isn't that becoming a subject matter expert isn't a good way to supercharge your earnings for a few years. I agree with you there.

Clearly, a guy with lots of experience with and deep knowledge of the Oracle database could do pretty well for himself 10 years ago. A person with lots of experience with and deep knowledge of OpenStack could do the same five years ago.

My point is that it's not a good, safe or stable way to ride out the last half of a 30+ year long career.

My further point is that demand for lots of these specialists doesn't appear to be particularly elastic, and therefore "become a subject matter expert" is not a universally applicable bit of career advice for every LOB engineer worried about ageism: if everyone successfully took your advice there would be a whole bunch of unemployed crypto experts.

I see the miscommunication here.

It's not my claim that simply by moving to a specialized subfield of programming that you'll instantly quadruple your salary and future-proof your career.

Obviously, you're going to start in some domain speciality as a commoditized practitioner and, with effort, move yourself to more elite ranks of that subfield.

My point is that it is better to work on that problem --- of being an elite in a specific domain --- than it is to be an elite at the general problem of building reliable software.

(It's "Burp", by the way.)

> Clearly, a guy with lots of experience with and deep knowledge of the Oracle database could do pretty well for himself 10 years ago

A guy with lots of experience and deep knowledge of databases in general, though, would do pretty well for himself both 10 years ago, and today. Even better if he worked on some database himself. Really, that's what owning a domain means: you don't want 10 years of experience as a _user_ of some technology, you want that experience as a _builder_ of said technology.

There's a difference between being an Oracle expert and being a database expert. If all you know is Oracle, yes you have a problem. If you understand database theory and implementation and can apply it to RDBMSs and NoSQL databases of various kinds, you are set for life.

Management is always the safest path.

One of the reasons people are switching away from Oracle is that the DBAs are so damn expensive.

"If you want to convince me that computer vision is a bad domain to specialize in for journeyman programmers because it takes post-grad education to get a job, that'll be a quick and easy argument."

Are you sure about that? I mean sure, if we're talking about someone who wants a deep learning research project to improve the metric for recognizing human faces from 99.7% to 99.78%, then we're talking about PhDs.

But there are plenty of "simpler" problems that one can work on, and plenty of more engineering-oriented problems that one can work on. Lots of companies just need to apply semi-standard techniques to problems that are specific to their business, and nothing more complicated or research-y.
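
To illustrate what "semi-standard techniques" can mean here, a toy sketch in pure Python (no real vision library; the tiny hard-coded image and cutoff value are made up for the example): threshold a grayscale image, then count the connected bright regions. This is the kind of building block behind counting defects on a production line or segmenting marks on a scanned document.

```python
# Toy "semi-standard" vision pipeline: binarize a tiny grayscale image,
# then count 4-connected bright regions with an iterative flood fill.
# A real deployment would use a library such as OpenCV, but the shape of
# the solution is the same: simple, well-understood steps tuned to one
# specific business problem.

def threshold(image, cutoff):
    """Binarize a grayscale image (list of rows of 0-255 values)."""
    return [[1 if px >= cutoff else 0 for px in row] for row in image]

def count_blobs(binary):
    """Count 4-connected regions of 1s."""
    rows, cols = len(binary), len(binary[0])
    seen = [[False] * cols for _ in range(rows)]
    blobs = 0
    for r in range(rows):
        for c in range(cols):
            if binary[r][c] and not seen[r][c]:
                blobs += 1
                stack = [(r, c)]
                while stack:
                    y, x = stack.pop()
                    if 0 <= y < rows and 0 <= x < cols \
                            and binary[y][x] and not seen[y][x]:
                        seen[y][x] = True
                        stack.extend([(y+1, x), (y-1, x), (y, x+1), (y, x-1)])
    return blobs

image = [
    [ 10,  10, 200, 200,  10],
    [ 10,  10, 200,  10,  10],
    [ 10,  10,  10,  10, 180],
    [220,  10,  10,  10, 190],
]
print(count_blobs(threshold(image, 128)))  # three separate bright regions
```

Nothing here is research-grade; that's the point. A lot of commercial value sits in wiring well-known steps like these to a specific dataset and acceptance criterion.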

Once you've learned how to master a domain, mastering the next one is easier.

I was thinking that the domains would be broader things that are not likely to fall out of fashion or change faster than an individual can keep up. Things like “security” or “networking.”

Yes. More generally, it should be "the sort of thing you can imagine investing a career in". Does anyone really plan to invest a career in building expertise in Golang (for instance), outside of the small group of people who actually work on the language itself?

(Being a PL design & implementation expert --- not "having very good taste in PLs"! --- seems like a defensible position in the market, except perhaps that there are a lot of people who want to work on compilers and you're competing with a healthy pipeline of them.)

Computer vision has very little actual employment? That's just not true at all. I want what you're smoking lol.

If you think Web Programming is where the interest is at, you haven't been searching "data analyst" on the job boards

Maybe in SV and other tech centers.

In "flyover country" and the midsize college town I quite enjoy living in, a search for "data analyst" currently returns only a handful of BI-related postings. "Web developer" returns tens of postings.

I'm speaking from a UK perspective. I should have mentioned that.

Agree. I think it is financially irresponsible to yourself and your family if you don't have a plan to transition yourself out of a programming job. It is okay if you are financially secure or independent; you can do whatever you want. But seriously, have a plan.

I'm not against having plans, but to have no other option than to transition yourself out of programming? That's an absurd way to look at aging in a technology field.

This is incredibly good and valuable advice. I hope everyone heeds it (if what they want is to increase earning potential, of course).

I'm not sure every programmer should start with trying to specialize in a domain. But definitely, at some point, you're reaching diminishing returns for getting better at programming.

Put another way: Getting better at the craft of programming allows you to write programs faster and better. Specializing in new domains allows you to do things you couldn't otherwise. This is the realization that made me decide to stop learning new programming languages, and instead spend my time learning new domains - I realized that learning Haskell, while incredibly fun and rewarding to my abilities as a programmer, will not let me make fundamentally different things than with Python. Learning e.g. Deep Learning, however, would.
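
To make the "new domains unlock new things" point concrete: the core idea of deep learning is just gradient descent on a differentiable loss, something a generalist can sketch in a few lines once the domain is learned. A minimal illustration (plain Python, no framework; the dataset and learning rate are invented for the example) fits a single weight to the mapping y = 2x:

```python
# Minimal gradient descent: fit one weight w so that w * x ~= 2 * x.
# Real deep learning stacks millions of such parameters behind automatic
# differentiation, but the optimization loop is conceptually this.

data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]

w = 0.0
learning_rate = 0.05
for _ in range(200):
    # Mean squared error: L = sum((w*x - y)^2) / n
    # Gradient:          dL/dw = sum(2 * (w*x - y) * x) / n
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    w -= learning_rate * grad

print(round(w, 3))  # converges to ~2.0
```

The loop is trivial; knowing what to fit, which loss to use, and how to tell whether the result is any good is the domain knowledge being sold.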

Btw - even learning totally other skills - e.g. learning 3d modeling helps me a lot in gaining deeper knowledge of 3d than I would otherwise have - even if it's not 3d programming per se, it's still relevant. And learning new Maths is not as directly applicable or as valuable as learning new software domains, but can open up new avenues you wouldn't otherwise have, that learning "yet more programming languages" or similar won't. (It's still more valuable to work directly on relevant things, of course, but sometimes it's fun to broaden your scope in relevant ways.)

I don't think there is significant overlap between Python's ideal use cases and Haskell's. For one, Haskell's advanced static type system lends itself nicely to formal verification and the handling of domain-specific languages. Both domains are important in safety-critical software. It also has much better ways of dealing with concurrency (STM and linear types). On the other hand, Python's ecosystem seems to be centred around backend web development, data science, and machine learning.

They're both general-purpose high level languages suitable for application programming. The difference isn't so much in the sort of software you can write, but rather what you want your development/testing/debugging experience to be like.

I've never used Python, but I like Haskell for the kinds of problems where I'm trying to do something complicated and a little confusing, and I want the compiler to let me know if I've asked it to do something that doesn't make sense. That sort of thing can save a lot of time in the long run, and I usually feel good that the resulting software is reasonably robust. I expect a good Python programmer could write an equivalent program, it's just not the way I like to work.

Libraries are of course often a deciding factor, and so are performance requirements.

Most programming problems have multiple viable implementation approaches, and implementation language is not usually the gating factor on success.

It takes some esoteric innovative features to make a difference, but that will also cut both ways; it'll make it harder to hire for and scale, too.

I cannot agree with this enough.

I'm a pretty capable "generalist software engineer", but what keeps me employed is expertise in virtualization and high-performance virtual networking. Prior to that, it was expertise in distributed systems and a knack for debugging distributed failures. The history here is relevant: later in the thread Thomas points out that you can switch domains. My time spent in distributed systems with multi-millisecond quorum periods is more or less directly applicable to debugging synchronization issues at CPU clock speeds. The tools used to observe the issue are different, but the reasoning process for untangling the set of plausible partial orderings is the same.
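
The "plausible partial orderings" reasoning described above can be illustrated with Lamport clocks. This is a toy sketch, not anyone's production code: each process keeps a counter, bumps it on local events, and on receipt of a message advances past the sender's timestamp. If event a happened before event b, then stamp(a) < stamp(b); the converse does not hold, which is exactly why distributed (and CPU-level) debugging means reasoning about a set of possible orderings rather than one total order.

```python
# Toy Lamport clocks. Happens-before implies a smaller timestamp, but
# concurrent events (here, a on P and b on Q) can carry equal or
# arbitrary stamps, so timestamps alone never prove two events were
# ordered - only that certain orderings are impossible.

class Process:
    def __init__(self):
        self.clock = 0

    def local_event(self):
        self.clock += 1
        return self.clock

    def send(self):
        self.clock += 1
        return self.clock  # the timestamp travels with the message

    def receive(self, stamp):
        self.clock = max(self.clock, stamp) + 1
        return self.clock

p, q = Process(), Process()
a = p.local_event()   # a = 1 on P
msg = p.send()        # message stamped 2
b = q.local_event()   # b = 1 on Q, concurrent with a
c = q.receive(msg)    # c = max(1, 2) + 1 = 3
print(a, msg, b, c)
```

Note that a and b both get stamp 1: the clocks correctly refuse to invent an order between events that never communicated.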

Specialize in something valuable. Continuously evaluate the next most valuable skill to acquire based on where you are and where you want to be.

Yup, totally.

There's a pretty common trend in GameDev -> Embedded Systems | High-Speed Trading.

The skillsets are very similar while the domains are quite different.

It would be interesting to see some kind of map (like the Civ tech tree) of which domains share skills.

+1. Where would we start collecting data for this?

TBH, I really don't care about career advancement. I don't need "half-mil-a-year offers" to fund my lifestyle, so as long as I can continue to draw a steady paycheck, I'm good.

Not everyone's a climber, and being in "a dead-end job" isn't really a bad thing if you're not ambitious.

I'd just like to spread the message that it is absolutely OK to plateau. You don't need to be a renowned domain expert, you don't need to go into management, you don't need to start your own company... if you're happy where you are, don't sweat it.

The problem comes when you're a 50-year-old individual contributor who gets laid off when the company isn't doing well, and then you can't get a job because of a mix of ageism and the fact that you don't have any in-demand specific skills beyond "can program".

Not saying this will definitely happen, but it's a risk, and you don't need to be a ladder-climber to protect against it.

I think most people would probably do better if they just stopped allowing their expenses to grow with their income, and saved and invested more, both in equities and in forms of passive income (like rental real estate). That way, not only are you insulated from shifts in the industry around your primary employment, but you can also probably retire early.

I agree very much with this, and I will add that this can lead to a reasonably stress-free life, which is even better than it sounds.

To me, the real perk of working in tech is not that there are so many jobs that I can find one that pays incredibly well; it's that I can find one that pays nicely but aside from that has some other perks.

A lot of friends and workmates have told me that I could be working in a better paying place. After looking into quite a few job offers, my conclusion is that for now I'll stay in my 8-to-4 job that pays me twice as much as the average in my area (average in the sense of all the population, not just tech jobs, obviously). I used to have a high-paying, high-stress job; I don't want to go back to that, it's not worth the money.

I must add that I'm EU-based. I understand that in the US, with much less of a safety net, maybe optimizing for money is a more attractive idea. I can't really say for sure.

Honestly that just sounds sad to me. I don't see how finding a plateau in life is ever a good thing.

I just care about things other than my career, and I'd rather spend most of my mental energy on things I do for pure enjoyment and not on things I have to do out of economic necessity.

For example: I've spent a large chunk of my free time over the last few months learning how to typeset signs in fansubs and make them look good, something that has no economic utility whatsoever, but makes me really happy when I typeset something that looks good. Yeah, sure, I could have spent that time furthering my career by developing a marketable tech skill, writing tools for my company on my own time to impress management, or writing an open-source piece of software to pad my GitHub portfolio, but I'd rather do something fun for myself than do any of those.

Substitute the word balance for plateau and you'll begin to understand. Desire is suffering.

Also, let's be real, programming isn't worth infinity money.

It's a plateau in earnings, not a plateau in life.

I wish this were true.

The problem in software development, given how company financing is currently set up, is that you do not need a domain expert to make it.

In other words, it is nearly impossible to be hired as a domain expert because only a very small number of startups need one: just do some "growth hacking", get into YC, and raise money. Maybe the founders need a domain expert to get into YC, but they will kick him or her out as soon as they get the "YC stamp".

And this is the reason why there are no database experts in all these new database startups, no security experts in security startups, nobody who can figure out how to make even a half-decent Dropbox clone, etc. The list goes on.

And there is also ageism: all the "domain experts" are thirty-something.

You're right that most startups don't need domain experts, but most startups won't be paying 250+k salaries, either. So a domain expert would look elsewhere anyways. Even if they still work for a startup, it'd be one that needs their expertise.

I strongly suspect that, even in IT, startups are still a very small proportion of the actual job market.

This is excellent advice. I would add that another key ingredient is never resting on your laurels, and continually learning and improving. Successful people (and companies) often get to a comfortable place where they can get away with doing the same thing over and over, but the money keeps rolling in. It can be very tempting to step off the gas and just enjoy the fruits of your labors for a while.

Some people get to this point and never step back on the gas, because it seems like things are going to be great forever. Once people get too comfortable, it's only a matter of time until it bites them.

IMO it's important to continually reinvent yourself throughout your career, to keep things interesting, but also to always put you in the best position to take whatever the next step is. If you wait until it's time to take that next step before figuring out where you want to go, it's too late.

So... I agree that growing constantly is important.

But I don't agree that having constant side projects and learning the new cool language/framework du jour is necessarily valuable.

I think there are just too many technologies out there, moving too fast, to be proficient in a significant fraction of them, even if you invest all your time in learning them.

I think there's more value in learning how to do engineering well in one language than learning how to do basic stuff in 20 of them. Because the skills and knowledge of going deeper transfer much better than knowing syntax, etc - stuff you can always pick up for the next job once you know what it is.

> The fundamental rule of career progression in the technology industry is to position yourself as a domain expert in something more specific than "programming".

Is it a rule, though? I hope not. I'm a 40-year-old generalist, and so far I haven't had problems finding work. Sure, I'm not going to get a half-mil offer from anyone, but that was never the case at any age.

Agreed. There is a strong drive in industry to commoditize "coding". For example, the wide proliferation of bootcamps. The world being what it is, if the perceived skill is equal companies would prefer to hire the cheaper 20 year old bootcamp grad vs the more expensive 45 yr old. So it's important to grow your skills and knowledge in areas that are not vulnerable to bootcamp commodification.

It’s not just coders who are commoditized. I find every specialist gets commoditized in the minds of people who are not specialists in that specific field. “He’s a programmer, he’s a quant, he’s a surgeon, he’s an astronaut...”. It’s because we have biases where we believe our own path is a bit harder than everyone else’s. I think it’s natural because everyone has a clearer sense of their own struggles, but only a superficial sense of the struggles endured to acquire expertise they don’t have. Furthermore, we have a bias that insufficiently accounts for expertise atrophy: we tend to think of our expertise as the accumulation of all our experience, but if we applied the right discount function we’d see that expertise has to be deliberately practiced to stay at the level it holds in our egos. Few experts do that. The net effect is that the world doesn’t value our own expertise as much as we ourselves are inclined to.

In addition to all the bootcamps, just look at the mania in primary school education about getting every kid to learn to code. If that isn't just a fad and it keeps going on through secondary and beyond then in the next decade or two the commoditization of coding will be complete.

Thanks for sharing. Just wanted to comment that while generalists do not get outlandish compensation offers, they are more likely to qualify for a wider set of jobs. In a way, it is a different strategy in the employment game: you make a less risky bet than specializing in something, and thereby increase your chances of being employed.

Not really... because then you're just competing against a larger pool of people. I.e. a generalist is common, while a domain expert is not, so more generalists means more competition.

Right. That's why the compensation is lower. There are a lot more jobs available for generalist skills though, so it's much easier to find jobs to apply for (which was my original point).

That's why compensation is lower and companies can hire more generalists to soak up supply ...

this worries me sometimes; i have finally (in my early 40s) moved from "specialising in the craft of programming" to getting a job in a domain that excites me (language tooling), but i am now a relative amateur in a field that is full of phds and the like, and i am not sure how large the field itself is going to be in say another 10 years.

i've been wondering if i should pick up something entirely different on the side (e.g. machine learning for broad applicability, though that does not excite me at all), or to instead diversify from my current niche but learn to write tools across a broader variety of languages and ecosystems. it's scary sometimes how the entire programming ecosystem can change overnight and leave once-valuable skills in the dust.

If you asked a random sampling of 1000+ karma HN users what my specialty was, a bunch of them would say "crypto", but I have 1 semester of college and didn't know CBC mode from CFB mode 8 years ago.

Some fields might be locked up by postgrads (crypto engineering is not one of them, and I doubt language tooling is either). You should check before you invest too much in it. But I would advise against being spooked by academic credentials. Good places to work care more about conversance than credentials, and you get to draft off of all the research work the academics publish before moving into industry to compete with you.

thanks, it is useful to step back and think about all the other people in the field who don't have phds, or who (like me) don't even have PL backgrounds, but are still happily working on interesting problems. it's just been "different" moving from working on code architecture to working in a field with way more academic than industry representation; i can't help but notice that i can write the code but i'm far from being able to write the papers.

Let me tell you, don't worry about it. You'll use your skills to smoothly transition to something else.

I've been in this industry since copper and telco were hot shit. I went from doing Telco programming > IT > Web Development > scalable enterprise applications.

Why bother stressing over something you have no control over? Do what you love and the money follows.

so far not worrying about it has worked well for me; i've been in tech a bit over 15 years and enjoyed all of it. but at a certain point all the stories about people aging out of the industry start to add up, and you have to wonder if the time has come to be at least a little more ant and less grasshopper :) at 42 "do what you love and money follows" is still working for me; another decade down the line i'm guessing i'll have to put a bit more active work into staying relevant.

i do have some hope that i've gotten into my current niche (static type inference for dynamic languages) reasonably early in its life cycle, but if i'm wrong i don't want to have not kept up with the larger world at least to some extent.

(to be precise, i'm not so much worried about being able to transition my skills to something else, as about not being given a chance to do so unless i can already demonstrate the new skillset people are looking for.)

I thought "generalists" are evergreen. These big tech companies are mostly (or always) looking for generalists because they understand the basics so well that they can do anything.

This is typical, but not mutually exclusive with the fact that these companies also find themselves, from time to time, wanting someone who has deeper or more specific knowledge in order to resolve issues identified by the generalists.

I second this. I'm currently on a team that has a couple of senior engineers, but neither of them is a true expert in what our team does (iOS). One has decades of web dev experience while the other has decades of Java service experience. Our team badly needs an iOS expert to make the tough decisions and lead architecture.

You missed the point of the parent. To be clear, programming for the most part is easy. Knowing what to program is hard. I don't know what your iOS app does, but let's pretend it's something to do with healthcare. Someone who is an expert in the healthcare domain and can program (or learn whatever stack you are using) is going to be much more valuable than an expert iOS person who has no clue about healthcare. That same person will also remain valuable as technology changes (which it will) since the company will likely still be in healthcare.

> To be more clear, programming for the most part is easy.

I vehemently disagree with this. Kludging together an unmaintainable codebase that barely meets the spec might be easy. Aiming for a notably higher level of quality beyond that is NOT an easy thing at all.

Obviously, the nature of this discussion depends on the complexity of the task at hand. There are many profitable ventures with low levels of complexity in the software implementation, and your comment may hold true for such scenarios.

However, there are also many ventures that require significantly complex software to meet their goals, and they will need skilled developers to have any hope of not collapsing under the weight of shoddy implementations and impenetrable abstractions.

I'm sure there are exceptions, but IME the biggest cause of 'kludging together an unmaintainable codebase' is lack of understanding of the problem domain. Software solutions do not exist in some technical vacuum outside of the problem they were built to solve. Better understanding of the problem domain naturally leads to better software.

My experience is opposite.

On one of my previous jobs, one problem domain was some area of theoretical physics. There was a guy with a brilliant understanding of the problem domain. He had a couple of decades of experience teaching physics at the country’s best university. He could explain very hard problems in an easy-to-understand way. The company filed multiple patents for the amazing stuff he invented.

It didn’t help the guy write code, though. His code wasn’t good: not only unmaintainable, but also slow and unstable (not all algorithms work equally well when mathematically real numbers are replaced by IEEE 754 floats). We ended up just throwing out whatever he coded, and instead learned enough of the problem domain from him to understand and implement what he was actually trying to do.
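A classic instance of the float pitfall described above (illustrative only; this was not necessarily the physicist's actual bug): the textbook variance formula E[x²] − E[x]² is exact over the reals but suffers catastrophic cancellation in IEEE 754 doubles, while an algebraically equivalent formulation, Welford's online algorithm, stays stable:

```python
def var_naive(xs):
    # E[x^2] - E[x]^2: exact over the reals, but catastrophic
    # cancellation in floating point when the mean is large
    # relative to the spread of the data.
    n = len(xs)
    return sum(x * x for x in xs) / n - (sum(xs) / n) ** 2

def var_stable(xs):
    # Welford's online algorithm: a numerically stable one-pass variance.
    mean, m2 = 0.0, 0.0
    for i, x in enumerate(xs, start=1):
        delta = x - mean
        mean += delta / i
        m2 += delta * (x - mean)
    return m2 / len(xs)

data = [1e9 + x for x in (4.0, 7.0, 13.0, 16.0)]  # true variance: 22.5
print(var_stable(data))  # 22.5
print(var_naive(data))   # negative! pure rounding noise
```

Both functions are the "same algorithm" on paper; only one survives the substitution of doubles for reals.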

Not just knowing what to program, but how - clever and mindful architecture will always beat clever programming.

Developing transferable strategic and architectural skills across verticals (client-facing domains or technologies) is a key aspect that I see being missed by programmers.

No framework will make or keep your career, nor will a specific language or piece of tech. Knowing how to apply technology well, in a way that doesn't need to be entirely thrown away every 2-3 years, is invaluable, and as a result it increases a developer's value.

What kind of tough decisions does an "iOS expert" need to take?

Mobile app development is becoming quite clear-cut: one only needs to read the docs and know the APIs.

It only took me a few weeks to get caught up and be a proficient iOS developer, but it takes several months to know all the pods, architecture design patterns, tooling and gotchas to be able to lead a team and avoid months of mistakes and rewrites. The official APIs are a small part of it.

Agreed to some extent. You still need some expertise as a general software developer in order to write performant, stable apps, but it's certainly a lot easier to go from zero iOS knowledge to a fully working app than it was 5 years ago.

That would still be programming specialization, as opposed to domain specialization where programming can be applied.

I used to know this guy who knew the ins and outs of the health club industry, and he could program a little. He made a lot of money. I think this is more in line with what tptacek is describing.

I think it's still a domain specialization to a degree. The domain being, mobile.

Being an expert in iOS is more than just knowing how to program in Swift/Objective-C. It encompasses a deeper understanding of mobile development and the Apple ecosystem.

True senior level iOS people will understand core user interface design principles as they relate to mobile, and will be intimately aware of Apple's Human Interface Guidelines. They'll understand mobile accessibility and be deeply familiar with the myriad of tools in iOS. They'll understand internationalization and how to design an application to support various calendar systems, and text layout paradigms. They know the ins-and-outs of Apple's provisioning and certificate system and can quickly sniff out configuration issues. They understand how important testing is and how well supported it is. They know how to setup automated builds using xcodebuild, and how to support automatic unit and ui tests.

The list goes on and on. Certainly one could just be an app developer and stop there. But being a mobile domain expert in my opinion is a much more valuable albeit niche position.

None of this stuff really sounds "defensible", in the marketing sense, to me. This is just programming. It's the goal of the platform to make programming as accessible as possible to everyone.

It's better to be good at mobile than to be good at nothing at all, sure. But I'd be really careful about picking a specialty that amounts to "my domain of expertise is AppKit", or "my domain of expertise is Scala", or anything like that.

The OP mentions a couple of domains that you could also dismiss as "just programming", but which actually have a depth to them that allows for specialization: security and kernel development.

You could claim that software security is "just programming" for instance... But you likely wouldn't, because it requires a level of understanding that is especially deep, and niche. It doesn't really matter that a security expert likely reads a ton of code, and writes some too, it's that they understand a subset of the industry to a high enough degree that they stand out among their peers.

I propose that you can be a mobile expert, just as you could be a VR expert, for example. At the end of the day it's more than just programming: it's a combination of hardware / software / tools that retains a degree of uniqueness.

I think I managed to communicate about half my point to you, but missed the mark with the other half.

Yes: you can specialize in being a really good programmer for a particular platform. I have a lot of friends who specialize in being Serious Engineers, who understand how to get interesting and important things done with lenses and how to effectively get a whole project away from null pointers and into monadic error checking and how to use type systems to prove programs mostly correct as they're being written.

That is a real specialty. It takes real skill and investment of time to get to the point of being expert with that stuff. I'm not diminishing the value of being an expert software developer.

I'm just saying that basket of specialties is unlikely to command a very high wage premium, not because it's unspecialized, but because it's hard to fashion a value proposition for it relative to your competition in the market.

There will be times during which particular platforms or languages will get hot, and it'll be good to be expert with them. But (a) those phenomena will be more transient than other problem domains, and (b) the wage premium for being specialized in them will still probably be lower than the wage premiums for the other domains.

At any rate: as long as you're thinking about the domains you're specializing in and not a career as a "programmer" or a "builder of software", we can probably agree to disagree about the rest of the details.

> "not because it's unspecialized, but because it's hard to fashion a value proposition for it relative to your competition in the market."

I agree with this entirely. But it makes me realize that perhaps I haven't explained my position well.

I would agree that being a mobile expert is likely going to be LESS lucrative than being, say, an expert in recombinant DNA technology with a fluent C++ background and a PhD in machine learning.

The only issue I see with advising people to invest tons of time and money into niche fields, is exactly what others have pointed out in this thread - You can't really predict what's going to be valuable in 10 years.

So take your pick - Go the safe route and become an expert in a more general field like, for example, mobile. Or pick something incredibly niche, which will shrink your competitor pool and increase your relative value proposition. A classic risk/reward scenario if I've ever seen one.

> Go the safe route and become an expert in a more general field like, for example, mobile.

I don't think this is all that safe, really. What we consider "mobile" development has really only been around for about a decade with the releases of iOS and Android. (Sure, there were mobile platforms before that, but most people didn't really care about them, and specializing in them would likely limit your options, not expand them.) In that time, mobile development has gotten nothing if not easier. So, over time, being a "mobile dev" will become less and less of an important specialization. More people will do it, and the frameworks around it (both those released by Apple/Google, and third-party things on top of it), will make it easier and easier. By that point you might be an "elite" mobile developer who can do certain things on mobile that most others cannot, but the market for that level of specialization will likely also be small.

What about specializing in a platform or language makes it hard to fashion a value proposition for it relative to one’s competitors in the market? (Not disagreeing; just trying to wrap my head around this. Also, I think your comments in this thread have been super useful.)

Every senior developer in the market who codes at all in your platform is going to represent themselves as a subject matter expert in that platform.

To make matters worse, developers will disagree with each other about what the Right Answers are for a given platform or language or engineering approach.

Cryptographers (as a counterexample) also disagree with each other about the Right Answers in crypto. But serious crypto people are generally competing for jobs only against other serious crypto people, and there aren't that many of them.

Moreover, the general domain of cryptography isn't going anywhere. But look at Objective C, or Windows Mobile.

Finally, proving value as a subject matter expert on a language or platform depends on companies having hiring processes that reliably spot expertise. But, of course, hiring processes do a terrible job at that.

Thanks so much for the response. Can you say any more about what kind of domains tend to be lucrative? It seems like all your examples (distributed systems, sensor fusion, software security, OS kernels) seem very “technical”, for lack of a better word, compared to, say, knowing a lot about the health club industry (a different example mentioned in this thread). Do you think that’s an important distinction?

I think knowing a lot about a particular vertical is hugely distinctive and valuable, often more so than technical specialties.

Thanks :-)

You missed the point.

Similar to the notion that you're only worth what your clients think: programming is a means to an end (even though now we need to make programs structured and enjoyable).

True not just for software, but tbh any field.

There are a couple of aspects to this issue. I'm 53.

Management: I would argue that it's critical to garner management skills well before you turn 40. If you wait until 50, you're going to find it very difficult to move from talent-oriented jobs to management ones.

Coding: I hate to generalize, but it's very likely you'll learn so much over your career that you will become an ineffective developer. You will know how to do things well and will have a difficult time doing things just to get them done. It's fairly common for projects to need completion over correctness and quality. This is where younger developers are great. They don't know they're creating technical debt, so they have no angst over it. But you will and this is bad for the project and for you. You probably need to find a place in software development where you can mentor and lead, but reduce your involvement with actual day-to-day coding.

Challenges: I personally suffer from "it-must-be-challenging-or-I-get-bored" syndrome. The longer you write code, the harder this is to suppress and the more you look for shiny things to work on. This is bad for you because it's bad for your employer. If you don't suffer from this, you're amazing and any employer would love to keep you until you're dead.

> .. it's very likely you'll learn so much over your career that you will become an ineffective developer. You will know how to do things well and will have a difficult time doing things just to get them done.

And then you will learn to be pragmatic and solve technical debt only when there is enough time for it or when it brings a clear benefit in terms of business value or development speed.

I can tell this from my own experience. As a developer you typically progress along a path; in the beginning you just don't know how to do things 'well' so you're introducing technical debt without knowing it. Then you'll learn about clean code, design patterns and what not, and you think you have to apply this knowledge everywhere. Until you learn that there are times to be pragmatic as well.

This pattern is not unique to programming, it's universal. See for instance https://en.wikipedia.org/wiki/Shuhari

> Coding: I hate to generalize, but it's very likely you'll learn so much over your career that you will become an ineffective developer. You will know how to do things well and will have a difficult time doing things just to get them done. It's fairly common for projects to need completion over correctness and quality. This is where younger developers are great. They don't know they're creating technical debt, so they have no angst over it. But you will and this is bad for the project and for you.

I disagree: it is when you are young that:

* you spend and lose your time reinventing the wheel, either because you don't know the wheel you want already exists, or because you are sure you can do a better wheel (and after an extra 3 months it ends up being square);

* lose time overdesigning things because you want to apply all your lessons and your new readings, despite the fact that they do not map to the problem you need to solve.

I agree with your points, and also suffer from that syndrome.

So I have to ask - how do you do the management portion of that? Getting into it, I mean, more than just developing the skills or interest.

I've been trying to get into the management side of things but get stymied over and over, and typically people who express disinterest (and no particular aptitude) keep getting chosen to do it. This has had a tendency to wind up with most of the people involved (except the person who gave the promotion) quitting, and it's really frustrating to watch it play out again and again.

I suppose in one case I did get promoted to lead, but I was lead of nothing, because literally everyone else had quit.

The point is I want to move - badly - in that direction, and see lots of opportunity to do lots of things there, but keep getting stymied. How can I get past that?

If you are looking for a smooth transition, get on a large team that is under-managed (say, >14 people and only one manager/tech-lead), and in addition to doing your regular tasks, start getting heavily involved in sprint planning and technical & product specs. Gradually do more and more of that and less individual work. Voila, you are a manager!

I think this is the primary path to management.

When I tried this, what happened was one of the engineers who didn't want to manage, but had been around longer than me, got promoted - and I got moved on to his team. People came to me expecting me to be leading the team (or at least know what's going on), but I suddenly got cut out of doing that. It was chaos.

The person who was selected as lead left a month later, since they weren't that interested in doing it anyway. The powers that be still didn't let me lead the team, though by that point attrition had gotten so high, it kind of didn't make sense. (There wasn't really a team to lead).

The team before that was a similar deal (chosen lead quit not too long after) but I got made a lead... of a team with two people on it (including me). The company went basically bankrupt, and having only 6 months experience with 1 report didn't impress people.

I don't think being a trans woman helps.

I really do try; I come up with new initiatives, I follow through on them, I give talks to educate the team, but in the end, I'm always waiting some number of years for the next magical re-org, which may or may not go my way.

This is simultaneously what I needed to hear and what I didn't want to hear. You pretty much nailed it all for me.

>"it-must-be-challenging-or-I-get-bored" syndrome

I don't know if I can handle this one getting worse.

I hear ya. I've long had the challenge that a problem is interesting and exciting until I see how I would solve it. Then I can barely force myself to implement the solution. I get it done, but if it gets any worse I'll never be able to finish anything.

Has this condition been named and explored further? Anyway, if you have this tendency become an architect at a large company.

In the general sense, this is the wrong place to ask. It's a self-selecting group in that most people over 40 here will be among the group that lasted in the career. It won't give a sense of how large that group is in the wider world, nor how many didn't stick with it.

On the other hand, if you want to hear how to make it past 40 in software, this is probably an excellent place to ask!

Then let me ask: how do you make it past 40 in software? Specifically, how do you position yourself so that you are actively sought after (rather than merely employable)?

I’m 40 years old and I have recruiters coming after me non-stop.

1) Don’t include irrelevant experience on your resume. Nobody cares that you were writing PHP websites in 1997. Try not to put anything on LinkedIn or your resume that would indicate your age unless you’re one of the top people in your field and your experience makes you stand out.

2) Keep up with new languages. Don’t be the guy who only knows Perl when everyone else is using Python. If you aren’t learning Rust and Go today, you’re going to be left behind five years from now. And once everyone is on Go and Rust, you should be learning the next thing.

3) Stay curious. I know you have a family and kids and other commitments, but you have to stay interested in the world. Keep up with business news and science news, and stay connected with pop culture as much as possible. If you’re applying to a startup with a bunch of 20- and 30-somethings, you need to be able to meet them where they are.

4) Don’t get complacent. You’re always a bad quarter away from getting laid off, and that becomes more true the older and higher paid you are. Keep your resume updated. If recruiters aren’t beating down your door, you need to ask yourself why, because if people aren’t trying to hire you, your employer probably isn’t excited to keep paying you either. I was a junior guy on a team of sysadmins 5 years ago. They were all the same age as me, but with many more years’ experience, all at the same company, doing the same thing and really resistant to change. I came in and really dove into the deep end with devops, despite having little programming or sysadmin experience (I had a networking background). Within 5 years I was a senior developer, making more money than anyone else on the team, and eventually got poached by a recruiter offering 40% more money. They’re all still there, barely holding on to their jobs.

I'm 50 and I've been a freelancer since 2006. I position myself either as a developer (Rails and jQuery back then; still Ruby, plus Python and Elixir now, and I'm starting to code with React) or as an architect and coordinator of developers. That works well with small companies. I attend tech events in my city (Milan, Italy) and I organize an event myself. It's a good way to keep the word of mouth going. I don't know if this is a feasible strategy for the next 10 years, but who knows what's going to happen by then anyway. I'll adapt.

The problem with freelancing is you spend a lot of unpaid time lining up jobs. Does that really pay off?

But that is why your rates are (hopefully) so much higher than an employee...

Rule of thumb, your business should be viable at 50% billed time...

I spend a lot of spare time doing work for previous customers while being on a full-time project at the same time. Keeping multiple customers warm is a must. All paid though. A few weeks of not having a full-time engagement means finally being able to clear out the backlog and enjoying long weekends.

Every year I take an unpaid break for over a month around the holiday season. I love it but it always is a bit scary to be away from customers at the same time.

And how has that worked for you financially?

Fair question.

Reasonably well. I made more money with my last job as employee, which was paid above the average of the market. However I own my timetable now and I work mostly from home. That is worth some money too and it improves the quality of life, which is hard to quantify but it's very important. I'll never be an employee again unless I absolutely have no alternatives.

My father-in-law is pushing 70 and still coding. He just semi-retired, but up until last year he was employed, despite having little in the way of social skills and living in a remote area, far from any coding hub. If he was just a bit more personable and willing to move within a couple hours drive of Boston he could have fielded multiple offers. Pure anecdote, but if you're decent, have a little hustle, and don't mind working in a soulless office park, it seems like you'll never struggle to find a gig.

Java and Spring, or C# and .NET. Java is derided as the next COBOL, but there is a lot of new development, lots of jobs, and strong demand for senior talent. Source: me, doing Java since 2003. Age 40+.

Can confirm. Also, I suspect a lot of this "there's no life after 40" stuff mainly relates to startups and certain small companies. It certainly doesn't apply to BigCorps.

I suspect this is true. I hear the "middle-aged programmers aren't employable" story a lot on the internet, but in the large organization where I work, I, in my mid-20s, am conspicuously very junior. Most of my coworkers are middle-aged, and most of the program management are getting into their 50s, as you might expect of people in roles seen as senior.

I work at a mid-sized B2B telecom (just over 500 employees), and looking at the ages of my coworkers, we have more 30+ people than not, and there is a very large contingent of 40+ and even some 50+ people here.

It's definitely the kind of company you want to work at if you're middle-aged.

But how many people who are middle-aged try to "enter" the industry and succeed? I can answer that for you: those who have close friends in power who wish to hire them. As you age yourself, you may see this if you are sensitive to it. We need to separate the two groups, because there are two issues here: 1. aging programmers already working in the industry, and 2. those who are accomplished in another dying field and who decided to (or had the good fortune to be able to) reeducate themselves in a field that is new for them: programming. It's the second group that has trouble even getting an interview, and when they do, they are often passed over for a younger "junior dev" who is not as skilled. I realize I can only speak to what I've seen here in NYC, but I keep seeing it, and I see it a lot. So are we going to acknowledge it? I hope so. I hope we can find a way to allow people to change paths in life when their path ends, if they are humble and willing to work hard. That is the kind of society I'd like to foster.

Skepticism of bootcamps (and other non traditional paths) is not the same thing as agism. This industry is further along in figuring out the former than others, but you're right that it has further to go. It's just not the same as agism, though.

You are correct, of course. But consider someone over 40 with a much deeper understanding of programming in several languages, a CS degree, experience from two bootcamps, teaching experience (at two bootcamps, at a university, and with private clients), coding experience at a school, with private clients, and on their own side projects in several languages, plus a linguistics blog and journalism and technical writing experience. When that person is turned down for a junior technical writing role at a company where the tech writers can't code at the same level, I don't know... it is hard to say what that is, isn't it? But yes, I completely understand skepticism of the breadth and depth of the skills of those whose ONLY experience of programming was at a bootcamp. And when that same company makes a mistake in its own assessment (which you politely point out after you are told you are incorrect in your technical writing), and when that same company hires someone with virtually no experience in any of these things as its next writer hire and also as its manager of technical writing, and those two employees are in their 20s... what would you think? Am I excluding a possible interpretation? "Culture fit"? The candidate we speak of doesn't fit the expected cultural profile in several areas for a tech company. Might it be that in some other form? It is just really tough to say. I realize it stinks to see how blatant ageism can be. It makes all of us feel unsettled about our own futures. It is also just stupid and embarrassing in an industry that prides itself on being so smart. But we have to get real about it. We also have to get real about sexism and racism. It seems to take more than a village (of women at Google, Uber, Tesla, etc.) shouting it out with lawsuits and news articles. We have not even begun to handle race issues in tech, perhaps because so few can break through.

We need to face these things and stop trying to explain them away with coded words like "culture fit" or "updated skills" (when the skills are updated, of course). We need to look around our departments and seriously ask if there was ever a new hire over 40. Then we can dig into these issues. We have to stop knee-jerk dismissing the stories of those who are shafted by these practices. Yes, it is uncomfortable.

> Most of my coworkers are middle-aged, most of the program management are getting into their 50s

List this firm please, it might help someone.

Actually, we all need to grapple with the fact that big corporations are behaving more and more like startups because they can. If Google has a higher standard for entering SWEs based on their age (as has been reported right here on HN by Google SWEs who perform interviews), then we need to take a hard look at that. Yeah, it's illegal and unfair, but no one over 40, and really over 35, is allowed to be a great "junior dev"; companies are only comfortable with over-35s who have been in the industry. The ladder is pulled up at most places for even excellently skilled beginners after 35. One should be able to enter if one has skill at a higher level, but we all need to face that the level one must hit at 35+, as a woman, or as a person in any underrepresented category in this industry, is much higher than it is for those who are a good "culture fit".

Yes, you make good points. A distinction should be made between junior 35+ people and 35+ people with at least a decade of experience.

But I don't think that's exclusively a software engineering issue. If I tried to become an electrician at 40, my guess is I'd encounter some bias, though probably less than in software.

One other thing: by "BigCorps", I'm referring to things like banks, Wal-Mart and so forth, not technology companies like eg Google.

> One other thing: by "BigCorps", I'm referring to things like banks, Wal-Mart and so forth, not technology companies like eg Google.

Yeah, I'd much rather work at a traditional conservative BigCorp than a Silicon Valley company that never grew up from startup mentality.

If you do want to stay in tech, I'd recommend looking at the B2B telecom sector. It's far, far more conservative than any other part of the tech industry. They're the only tech companies I've seen ban T-shirts in the office.

I'm working at a B2B telecom and I haven't seen that level of conservatism, although in general I agree that they are conservative. But it's a wonderful place to work in many respects: it's stable, pays well, and in general fewer clients means fewer headaches and more focused development.

They don't immediately adopt new technologies, but frankly, nowadays I see that as more of a feature than a bug (although once in a blue moon I wish the delay before finally adopting them, once they have successfully proven themselves, weren't so long).

The most glaringly conservative aspect of my current job is its very rigid adherence to process. Again I must say that, having worked in places where it wasn't the case, I'm so glad that process is followed strictly.

The big problem with this job is that I've acquired a ton of domain knowledge that probably won't be useful in another job. This means that I need to keep other tech skills (algorithms, programming languages...) much more finely honed. On the other hand, I've acquired top notch documentation skills.

EDIT: I can also confirm that there are plenty of 40+ people here. Definitely a good place to be when you reach that age, although I'm only in my early thirties. I've also found a high level of diversity in general, which is very, very good.

I'd say this is a great tip for a candidate who isn't underrepresented in tech. I just came from an interview at a conservative midwest company that isn't technically a "tech" company but has a huge tech dept. Got to the end of 5 interviews at breakneck speed and with lots of acclaim from the team about my speed and competence... that is, until they saw me. Everyone, and I mean everyone in the engineering dept., save one woman (she must be a 20Xer), was firmly within the typical tech demographic. I felt I did quite well despite suddenly being asked CTO-type questions. I was told there would be whiteboard algorithms and questions about the company itself. They knew I'd never worked at a big company with a giant code base before, and yet they were enthusiastic about my ability to contribute-- it had come up and was not an issue. They wanted to fly me out and put me up in a hotel. It turned out I didn't need it because I happened to be there, but they offered. They even talked to me about salary numbers. When I got to the on-site, I got questions more fitting for a candidate with many years' experience with a giant code base at a big company. Still, I felt I answered well. In the end, they told me I didn't answer those questions quickly enough (not correctly enough-- just quickly enough). I'm fairly certain that if I were given the chance to actually work at a large company, I'd know the answers to these system design issues without having to think to arrive at the correct answer. I'd know from experience. If I'd looked like more of a "culture fit," let's say, but still over 40, I think it might have been fine. By the way, the city that this company is in is 82% African American. Its tech department was 0% African American.

I was under the impression that these sorts of BigCorps were actually a better place for the underrepresented and frequently have more diverse technical workforces (along race/age/gender lines).

I just got hired by one in Texas that reflects that impression, which I had before I ever interviewed there.

That is good to know. I think Austin shows promise in this way.

Java is absolutely rightly derided as the next COBOL because it is: it is the abstract concept of bureaucracy incarnated in code. But that means it will likely follow COBOL's trajectory. A million legacy systems written in it will underpin the central core of a million businesses, being the (entirely unrecognized) most important component of the entire corporate structure bar none. And those systems will need care and feeding in their old age.

"I don't think Java is all that bad, and I can enjoy well-done object-oriented programming."

This is probably the most accurate opinion of Java the language I've seen, one that isn't tarnished by what anyone has seen in any proprietary codebase.

I would rather program in Java the programming language than Javascript any day of the year, even if I dislike the vast majority of popular Java frameworks like Spring, Hibernate, etc.

Yes, I should have made it clear that Java the language is not inherently terrible. And the JVM itself is really quite nifty. What's unfortunate is the infection of Java codebases (literally all of them I have seen, with the sole exception of short Processing sketches made for graphical effects) with a mindset of design patterns as Lego bricks used to build applications. Java did have a rough start, though; there was clearly an edict at Sun to invent a nomenclature that deviated from any terminology commonly used by Microsoft, which operated only to its detriment.

I think the JVM itself will be Java's saving grace - languages like Scala and even Clojure now are gaining huge popularity that not only leverage the JVM and existing ecosystem but have the ability to rapidly advance and abandon Java's warts. So it's not like COBOL in that regard. Because of the power of the JVM ecosystem I think even Java's glacial pace of adopting new features will be enough to keep up for many years or even decades to come.

Spotted the one who's never written cobol (or jcl).

You are correct that I have never written COBOL (or JCL, for that matter), though I'm not sure why what I said would indicate such a thing. I have seen COBOL and working COBOL systems. They are absurdly verbose; that feels like a hyperbolic understatement, really. Everything is very structured, in the manner of micromanagement in organizations, formed that way in order to intentionally forbid flexibility. Is the COBOL I have seen 'old-fashioned', perhaps? It has been well over a decade, closer to two, since I've looked at any. Perhaps there is a 'lightweight' COBOL?

Find a niche that rewards experience in something other than code. This could be non-technical (e.g. "business"), or it could be something more deeply technical (e.g. scaling, machine learning, search, etc.) To some extent, the older you get, the more this happens, naturally. But you can and should make strategic decisions about where you choose to devote your attention.

The important principle is that you should devote your learning time to things that are more likely to survive. A good way to do this is to pick something that has been around for a long time. Knowing how to write an optimized Linux kernel driver is far more long-term-valuable than, say, a Javascript framework. Knowing how to do quantitative marketing is even more valuable than that.

The absolute worst thing you can do is to chase the new shiny every year. If you do that, you will never be more experienced than the most junior member of your team.

Why would JS be less long-term-valuable? It's only four years younger than Linux, it's more common than Linux, and it has a much larger community (especially compared to the Linux kernel driver one). I don't see either going anywhere in the next 20 years.

It was born as a 'just good enough' hack, jammed into a static document presentation system into which interactive application functionality was shoehorned with pain points at every turn. It currently writhes in a morass of dependency hell, sprouting a hundred duplicate efforts of trying to soften those pain points every day. And it managed to get so big because the W3C continued, year after year, to obstinately refuse to acknowledge the web as an application platform until very recently.

But, they did acknowledge it. And they're now making design decisions to facilitate the web as an application platform, rather than intentionally trying to make application development as difficult as possible to dissuade it. So now we've got WebAssembly and tomorrow or the day after we will have any language anyone wants to use being used. In that arena, Javascript will get pantsed and laughed out of the room. It will probably happen with pretty amazing speed, as there are legions of developers who have been holding their breath for this for 20 years.

Although you've been downvoted, I think your question is fair, so I'll try to give it a fair answer.

In this discussion, when we talk about value, I think it makes sense to look at value in the sense of how valuable a skill is to the longevity of your career.

In that sense, JavaScript has some upsides - it's everywhere, and will likely continue to be everywhere. There will be lots of demand for JavaScript development.

But because JavaScript is everywhere, tons of people know it, and tons of people are learning it. So although there will be lots of demand for JS development, if you're a JS developer, there will also be a ton of people offering a skillset that is broadly similar to yours.

As you mentioned, the demand for something like Linux driver developers is tiny in comparison to the demand for JS developers. But the talent pool is relatively tiny, too. So it might be easier for you to differentiate yourself because when an employer is looking for someone with your skillset, they're not going to have to wade through hundreds of resumes from people who look pretty qualified.

In general, I think it's most valuable to specialize in solving a certain type of problem, where programming just happens to usually be the best way of solving it. It's often easier to sell yourself as a solver of specific kinds of problems than it is to sell yourself as a generalist with experience in JS, HTML, CSS, etc.

It's true that differentiation is a factor in career longevity, but it's not the point I was trying to make, nor the one that I would emphasize. The true advantage of learning how to do kernel development (over, say, a Javascript framework) is mostly that you're learning a technology is likely to be in demand in 20 years.

Given how much JS development thrashes, it's a pretty reasonable bet that whatever you're learning this year will be out of fashion next year, and irrelevant in five years. That's the high-order bit, when you're thinking about your career.

Even for something as "narrow" as Linux kernel development, I can pretty much guarantee that someone will want to hire you in 20 years, so long as you are competent. I can't make that guarantee for most technologies.

I changed it to "Javascript framework" to stop triggering the JS devs here, while making the identical point: if you specialize in a technology that was invented last year, you are taking a big risk that it won't survive. If you pick a technology that has been in active use for over 20 years and is currently popular, you are taking less of a risk. This is harder to do, but like many things that are hard, the reward is higher.

I really wish it were possible to make reference to JS in a less-than-glowing light without being downvoted, but alas. Substitute any other brand-new technology for "Javascript framework" if you're having difficulty seeing the argument.

I did not downvote, but I do think JS is a bad example. JS has been active for over 20 years and is currently popular. Sure, new frameworks are popping up and everybody rallies around "the latest", but come on, it's not like Java and C# are written in stone.

I wouldn't use Java or C# for that comparison, either (though yes, it's a safer bet that Java will be popular 20 years from now -- Javascript has only become popular as a language in the last 5-10 years, whereas Java has been popular since the 90s.)

But would I bet on Javascript over (for example) C? No way. Any new developer would be wise to learn C, even if they completely ignored Javascript. And I don't need any technical details about the language to make that call.

Learn soft skills along with staying up to date on technical skills.

Work on understanding the business needs and not just business requirements for your feature.

Learn how to foster team growth, not just your own personal growth.

Figure out processes to help the team and not just building your feature.

Don't be intimidated by younger engineers who are trying to climb the ladder. Help them succeed.

You have to be better than you were at 25 (more productive, making fewer mistakes, able to take on greater responsibilities). You have to be better than you were at 30.

You don't have to be as actively sought after. You should be staying at positions longer - 5 years, maybe even 10. (Note well, however: By this point, you probably have seen several toxic environments. If you're in one, don't wait - get out. Life is too short to put up with it.)

I have a file where I keep track of headhunters that I think are worth their salt. I'll use it if and when I feel like it's time to move on.

For the record, I'm 55.

Understand business needs. Be able to communicate. Be upstack: infrastructure, DNS, SSL, etc. Go deep on security: many concerns cut across many languages, but if you're a language/framework of the week developer, you're too busy learning the same thing over and over to have that depth, which businesses desperately need.

(I'm 40 and my current role is with a small company where I implement solutions rather than have a title)

> Go deep on security


Security is one of the few cross-discipline, cross-domain specialties where it is possible to be a reasonably good domain expert and still have good coverage across other domains. The fundamentals don't change. (And I say this as someone who's been immersed in the field for 25 years, so of course I'm biased.)

There are few other domains that can offer the same level of constant demand.

The beauty - and the depressing aspect - of security is that maybe 10% of security is about software. The remaining 90% is all about what is between people's earlobes. To become good at security, you'll need to learn how to explain extremely difficult and often subtle concepts, and you'll have to do that for both technical and non-technical crowds. That's a fantastic and continuous trial by fire. It's also lots of fun!

Bonus: because everything is a tradeoff, you really can't avoid the engineering approach. Teaching the concepts and reasons for tradeoffs to less senior developers will be part of the job specification. Fun.

> language/framework of the week

To quote something I have often stated in our interviews - there are only four programming language families. Everything else is syntax.

Can I ask which are those families?

Of course.

1: Imperative - C, Fortran, Pascal, ...

2: Object-oriented - C++, Java, Python, Ruby, (maybe Delphi's Object-Pascal), ...

3: Functional - OCaml, Erlang, Haskell, F#, ...

4: Declarative - Makefiles, QML, SQL, ....

To be perfectly honest, I don't know which bucket I should use for Prolog. It's supposedly logical, declarative and functional at the same time. I've never managed to understand it, despite trying.

And for the record: Perl in basic form is imperative. With the introduction of the "bless" keyword it crosses over into the object-oriented domain, but following the syntax is not necessarily straightforward. [I've spent a non-insignificant number of days auditing OO Perl. It's not a pleasant experience.]
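To make the four buckets concrete, here's a toy sketch (mine, not the commenter's) of the same small task, summing the squares of the even numbers, in the first three families, with the declarative version left as a SQL comment:

```python
from functools import reduce

nums = [1, 2, 3, 4, 5, 6]

# 1. Imperative: spell out every step, mutating an accumulator.
total_imp = 0
for n in nums:
    if n % 2 == 0:
        total_imp += n * n

# 2. Object-oriented: bundle the data with the behavior.
class NumberBag:
    def __init__(self, values):
        self.values = values

    def sum_even_squares(self):
        return sum(v * v for v in self.values if v % 2 == 0)

total_oo = NumberBag(nums).sum_even_squares()

# 3. Functional: compose pure functions, no mutation.
total_fn = reduce(lambda acc, n: acc + n * n,
                  filter(lambda n: n % 2 == 0, nums),
                  0)

# 4. Declarative (SQL): describe the result you want, not the steps.
#    SELECT SUM(n * n) FROM nums WHERE n % 2 = 0;

assert total_imp == total_oo == total_fn == 56
```

Python happens to let one language wear three of the four hats, which is part of why "everything else is syntax" rings true.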

Those are more dimensions than buckets, and not very orthogonal ones at that. In particular there is strong overlap between declarative and functional language features (you could argue functional is just a special case), and less strong overlap between imperative and OO languages, and OO and functional. OCaml fits pretty comfortably in all four of those categories, when you want it to.

If I wanted to jam languages into 4 categories, it would probably be the Algol, Lisp, and ML families, plus All Those Other Languages. :)

I reckon imperative and OO are almost identical; if you don't have objects, you use function pointers or discriminated unions instead and end up with something very similar. OO is a design pattern that often has language support. And I could make a similar argument that functional is mostly just a style of coding, where some more restrictive rules are followed - but it also depends on a few specific language features, e.g. garbage collection and possibly tail calls. You can get by without pattern matching and first class functions, but they make life much more pleasant.

Declarative vs imperative is more interesting: what vs how, building descriptions of problems rather than chains of calculations or lists of steps. I don't think functional style is strongly related to declarative, except in the very low level sense that a functional program doesn't necessarily have a sequential evaluation semantics. But that's increasingly true of seemingly imperative languages also.
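A minimal Python sketch of the "imperative and OO are almost identical" point above (mine, using a closure where the commenter mentions function pointers): captured mutable state plus a plain function behaves just like an object with one method.

```python
# An object-based counter: state and behavior bundled by a class.
class Counter:
    def __init__(self):
        self.n = 0

    def incr(self):
        self.n += 1
        return self.n

# The same thing with no class at all: a closure capturing mutable
# state, handed around as a plain function value.
def make_counter():
    n = 0
    def incr():
        nonlocal n
        n += 1
        return n
    return incr

c = Counter()
f = make_counter()
assert (c.incr(), c.incr()) == (1, 2)
assert (f(), f()) == (1, 2)
```

In C you'd reach the same place with a struct plus function pointers; the "paradigm" boundary is mostly about which spelling the language makes convenient.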

I've had teachers separate languages (or styles, or paradigm) in 3 categories:

* Imperative languages describe how to perform the solution.

* Functional languages describe the solution.

* Logic languages describe the problem.

I'm not sure I actually agree with this categorisation, though.

I'm curious, what makes the ML families different than Algol?

The ML family tends to come from a research background, with a more rigorous approach to typing, solid type inference, and very differently flavored syntax. MLs are functional (first-class functions, expression orientation), whereas the Algol family, in the shape of C, Java, etc., is mostly imperative. MLs tend to have nice pattern matching facilities. There's more, but that's a lot of it. A lot of these features have been transplanted into languages that are basically on the Algol tree, so it's often a bit fuzzy these days.

I'm not quite 40 yet (36), but part of what I've done is the specialization mentioned in other posts here (in my case, distributed systems with a focus on communications tech), and I've also cultivated my "product sense" to the point where (as an individual contributor) I'll often define and design a product and how customers will use it, in addition to doing the actual implementation. In that sense I have a little bit of breadth; I'm not "just" an engineer, I can also address the customer needs that lead to building a product, and then later refining it.

Understanding customer needs and translating that into product definitions is something that will likely never go out of demand, and is needed in industries outside tech. And if demand for distsys goes out of style I'll just learn something else. I've already kind of done that, having cut my teeth on embedded systems, followed by a short stint in mobile before getting to where I am now.

Judging by what I see around me, I don't see the strategy being any less effective in 10 or more years.

I started at 21 and somehow just kept getting older.

More seriously, having had a look around my cohort, there aren't that many people who've left programming altogether, and those that have left have done so for personal reasons. Generally people have moved "upwards" into management-track positions. Small companies are good for this: because the structure is so flat you can easily get a high-ranking job title, which you can then leverage into the next job.

Look "up". Look at the older and more senior people in your organisation. Maybe even directly ask them about careers. Recognise that if nobody around you is over 30, you're either in a very unusual place like an SV startup or you've wandered into Logan's Run.

I started at 17 (Basic on a terminal via a dialup modem to a mainframe).

Physically, I age linearly: n.

Mentally, I age logarithmically: 17 + ln(n).

Apply for a government job. There is less ageism in government than in the private sector. At least in my country, where you must pass a very difficult and competitive exam to get in.

Unfortunately, at least in the US, pay is not very competitive. Even with upward adjustments for cost of living in more expensive markets, many of the people on here with SV or SV-like tech jobs would end up taking a pretty large pay cut to work for the US government. (But it's of course a great option if you're in a bind and your alternative is no job at all. Or if you're a civic-minded individual who does it out of a desire to improve the sad state of our public services.)

- Keep your knowledge/experience relevant

- Stay focused and positive

- Promote collaboration, ownership, and leadership

Think of it like being on a first date: your idealized self. You're real, but like 120% real.

One way to make it past 40 in software is to continue demonstrating that you're learning, creating, and releasing, and to share it publicly like you may have in your 20s.

Be an ASM expert ;)

> if you want to hear how to make it past 40 in software, this is probably an excellent place to ask!

So far in this thread:

- continue to learn / evolve

- go into management(well, someone will mention it)

- Go into: consulting / start your own business / remote work

I'm not too far from 40 myself. Any advice?

If you're coming up on 40 you should know a lot of people. You should have a work history so ridiculously long you need to omit 75% of it to fit it on a two-page resume. You should know the business: which companies are succeeding, which are failing, and which problems are most interesting to you.

This is a huge advantage compared to some recent graduate that has no idea, knows nobody, and is still learning the geography of the industry.

Don't stop learning. Don't stop making contacts, friends, and other connections. At some point you won't need a resume to get a job, you'll just need to know who to call.

Thanks for the advice (and I mean that in all seriousness)! I need to start making some more friends, it seems, and while I've been employed for 12+ years, my work history doesn't quite meet that standard yet... looks like I have some more hustling to do.

I've been in "the business" since 1988, so maybe my experience is different. In that time I've met, worked with, worked for, and managed a number of great people.

It's not so much about hustling as it is developing meaningful professional relationships with people. You don't need to be a huge extrovert to make it work, you just need to engage with people. Solve problems together. Help each other out. If you get someone out of a jam they'll remember it, and when it comes time for a reference maybe they'll be there to back you up and vouch that you're the right material.

It's pretty easy to go about doing your job without really paying attention to anyone else on your team or at your company.

In a certain sense, if one has the talent, luck, or planning to position oneself to make it in programming after 40, one may well survive to well over 40. One's position shifts from worker to expert.

But if the fact is that a large percentage of folks doing programming at 29 won't be doing it at 45 and aren't going to have a good exit plan (even if they made a high salary in that time), then in a sense what's being said is:

"Programming is a dead-end job for those under forty" and about forty is the end-point for them.

I am 50 and I do OK. I also see people around my age and older who do well too. It seems to me that a lot of the people who get filtered out in their 30s should never have been in the profession in the first place, for lack of motivation, interest, or talent. If you are not self-motivated to constantly evolve you have a problem, more so than in a lot of other professions.

"If you are not self-motivated to constantly evolve you have a problem, more so than in a lot of other professions."

I was self-motivated to constantly learn, and had no trouble doing it... when I was young. It was the right profession for me then, when I had the interest and passion.

After a while, though, much of it starts to seem the same. The towering vistas that were once full of mystery, adventure, and discovery turned into endless plateaus of rinse-and-repeat learning of technical minutiae and the buzzword tech of the day. I also developed lots of new interests, and started to want to have an actual life outside of work.

So then I very consciously decided not to strive to excel in my field anymore, because it would just take way too much of my time, which I'd rather spend doing other things. Then, before too long, I burnt out, and took a long break, eventually coming back to the field because I burnt through all my savings and needed money. This happened a bunch of times, with ever longer breaks in between.

Every time, I was able to brush up on the technology knowledge and skills that I needed to get a job, but I was never as excited about it as I was when I was young, and actually started to dread working with it, as I found it mind-numbingly boring.

I should have completely switched careers after my first major burnout, but I didn't, and I've come back to the field again and again instead. This has definitely been a mistake, but here I am. I'm good enough at what I do to get work, and even to impress my managers... while I still haven't burnt out this time around and am still capable of putting in the overtime to get a lot done. But it's just a matter of time until I burn out again. This pattern of not working for extended periods looks horrible on my resume, I haven't learned nearly as much as I would have had I stayed employed the whole time, and my career is nowhere near as advanced as that of people who can hack full-time employment long-term.

I don't think my case is typical, as most people seem to stay employed continuously in the long run. But I can't, and I feel I'm way too old for a career change now... and, anyway, I suspect whatever it is that I'd switch careers to would get boring before long and I'd burn out again. My interests are far too varied and I can't stick to doing any one thing for long before getting bored.

This is not to mention all of the endless corporate bullshit one has to put up with at work. Some people are really career-oriented and can deal with it. I'm not.

That describes me exactly. Coincidentally, I turned 40 a few months ago.

I took my current job with the goal of transitioning into a leadership role, either product or management. I was walking into a situation where I knew people already here and was hoping to leverage that. Those people left a few months after I started, and now I'm kind of stuck on an island. I've approached my current boss but he's completely uninterested in promoting any sort of career development.

Bottom line is I'm burnt out, depressed, and bored out of my mind. The idea of learning Yet Another Web Framework fills me with dread (I'm currently doing Node/React stuff and hating it), not because I don't think I can learn the tech, but because it's no longer fulfilling in any kind of personal or professional sense.

So, I really don't know where I'm going, or what I'm going to be when I grow up.

I know that feeling.

A lot of times I feel like there is no career track after you hit "Senior Software Engineer", especially outside of super-specialty tech. It feels like the only real track is management, and most places don't want to promote or change the management structure at all.

It's honestly very hard to get excited about going from Senior Engineer to Staff or Principal or whatever the next title is. Nothing will change in terms of the job. Pay is pretty well pegged to what you started at, with single-digit percentage increases every year or so.

Skill-wise? Learning new frameworks is just not as exciting as it used to be. Even then, the reality is that whether you're the most skilled dev in the world or a mediocre dev with only nine weeks of bootcamp experience, you're not going to understand the new environment you're dropped into when you start. It's full of weird quirks and historical oddities. It'll take time to understand those and get productive in whatever the particular stack is.

All of which feels like... every time you start a new job, you're starting over. With a little more insight and vision, but fundamentally, doing the same thing over and over.

This is me exactly. Started in the tech field, burnt out and became a teacher, returned to tech for the money, and I'm currently in another burnout period. I agree with everything you said about learning being exciting when you're new, but not after you've been doing it for a while. Also agree with the bit about corporate bullshit and just not being a career-oriented person.

I really think this time I'm out for good. I'm approaching that age where it's hard to get programming jobs anyway. I've got some side businesses doing OK; I'm working on ramping them up, and trying to get some teaching gigs to fill the gap in the meantime. And if all that fails I'm considering getting back into the lawn care business like I did when I was a student.

I used to be very passionate about programming: I started in 4th grade, started my first software business in college, and was always doing side projects.

I think I escaped this because I got into tech fairly late. I dropped out of school, did Windows desktop support, got tired of that, worked for a VoIP company, got tired of that, became a network engineer, got tired of that, went backpacking for three months, became a sysadmin, and almost immediately started shifting towards devops, at which point I hit the career lottery: something that I really enjoyed and also paid a lot of money. But I was already 36. So I'm 42 now and I absolutely love my job and have recruiters beating down my door. I imagine I won't get burned out for five or six years, and by then I'll be ready for management.

I have already felt the way you do and I'm only 31. I think 90% of that feeling comes from the type of problems your job is asking you to solve. It's definitely tough to keep the interesting work flowing in, and avoid the tedious "hey build yet another crud app!"

My strategy for my career is to try to maneuver myself into a position of enough authority to decide priorities around software engineering and what my team is working on. Then I can delegate the boilerplate to the younger engineers and spend more time collaborating on much more interesting and difficult problems.

Do you take vacations? It's a wonder what it does for your state of mind.

Hi, me!

I look around at my bigN and I see all these 20something people who just love what they're working on and are passionate about it, and I get it. I was the same way at that age.

I just don't care anymore, tbh. I want to work on something fun, and things that are fun don't usually provide a paycheck. So I grind away, and the years waste away, and I don't know what to do about it. I'm smart enough to get by without investing a lot of effort, but I'm really just wasting my life.

Isn't this the story of most people in our society?

You need to start consulting, or body shop contracting. Working six months of the year as a warm body is doable, ditto for consulting but you need to do all your own marketing and sales, and that’s not really compatible with completely dropping off the face of the earth. You can do it in five to ten hours a week but you need to be disciplined and consistent about it.

>If you are not self motivated to constantly evolve you have a problem.

Programmers to programmers: You must be a super agile ninja capable of teaching yourself everything at a moment's notice.

Businesses to programmers: Yeah, we still need people doing Java 6 and VB6. We're also going to disregard all of your security and technical debt concerns. We're also going to hire managers who suddenly change a bunch of tech, causing the rest of the team to re-train themselves, harming their career.

I still don't get where this need comes from. The world still needs LOB developers. The uber-ninjas haven't solved that problem yet. It isn't going to maximize your lifetime earnings, but you're not likely to retire under $100k either.

Programmers are extraordinarily hard on themselves and other programmers, in industries where businesses still just don't give a fuck because they are making so much.

I really wish I knew why you guys did this to us.

> Programmers to programmers: You must be a super agile ninja capable of teaching yourself everything at a moment's notice.

I'd be interested in a demographic breakdown of the people who say this. I'd be willing to bet that most of them are in their twenties, live in SF/SV or Seattle, and/or work at startups or insanely demanding trendsetting companies (e.g. Google, Amazon, Tesla, Apple). In other words, these claims are made by people who care about being "cool".

I don't think you'll hear this much from people who live in less trendy areas and work at less trendy companies. Older people and people who work at conservative companies in conservative areas of conservative states (Hi, I work at a B2B telecom in Plano, Texas, and I'm in my 30s) probably aren't going to be making those kinds of claims.

If you stop caring about being "cool", you'll find plenty of work that'll sustain you until you retire.

> > You must be a super agile ninja capable of teaching yourself everything at a moment's notice.

I imagine there's also a good helping of: "Hey, the MVP works and I won't be here in years 2-10 maintaining it, so everything I can see indicates I successfully learned X on my own in a month."

The best software engineers I have ever worked with were all 30+, some 40+, some even 50+. It is simple: practice makes perfect. All of the old-timers were a perfect culture fit as well, if anybody is curious about that. I know entire companies built by only 40+ software engineers too.

What if one evolved into another career?

I did just that, but kept programming as a 'hobby' (a really engaging one, time-wise and all). I feel that, the older I get (I turned 37 a week ago), the more power I wield. Tasks that were daunting when I was younger now seem trivial, new technologies are understood with more in-depth views, and I grasp them faster than ever. Focus is much more intense too. I feel it's not even my proverbial final form. Age is as much an advantage as youth is with its energy (although I still haven't noticed a drop in work stamina yet).

No problem with that. My comment was about staying relevant as a software engineer.

True, the software constantly evolves, but that doesn't mean your hardware has to as well.

I think this bias needs to evolve, and maybe birth year is more descriptive than age.

In 2000, a 40 year old was born in 1960, when the home PC was nearly 20 years off.

A programmer who is 40 today was born in 1977, and they could learn to program while they were in diapers.

These are very different experiences, and the bar will move over time.

I am your 1960 born example (my bday was yesterday!). I'm 57. I've been a professional programmer (read, I've been getting paid for it) for 30 years. I live in a small town in Northern Ontario. I'd like to be paid more, but I'm in the hunt on the payscale. My resume says I'm a full stack dev. You'd believe it if you saw it.

That being said, if you've been in the occupation long enough, you've figured out that it's a continuous-learning occupation. The thing is, the fundamentals I learned in the '80s and '90s still stand me in good stead to this day. I don't write in assembler or even C anymore, but the concepts they taught are still deeply ingrained.

I've got another few years in me before I can retire, and when I do, I'll continue writing code for fun. Simply because I like it.

(Oh ya, my first computer was a 3.5Kb (3583 bytes!) VIC-20. The first computer I ever SAW was a neighbour's PET 4040 he brought home from school over the weekend. I bought the VIC-20 the next day.)

Southern Ontario here. 36. Programming professionally for 14 years or so. Picked it up as a hobby when I was 9 or 10.

The "fundamentals" predate you and I by quite a few decades. Alonzo Church, Turing, Gentzen, Liskov... their work really paved the way for us all. I suspect the lambda calculus will be as useful 30 years from now as it is today. As long as I'm working in a language with first-class functions as values I think I'll be okay.
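To make that concrete (a quick sketch I'm adding, not the commenter's own code): functions as first-class values compose the same way today as in Church's lambda calculus:

```python
# First-class functions: functions are values that can be passed
# around, returned, and combined -- the idea Church formalized in
# the lambda calculus nearly a century ago.

def compose(f, g):
    """Return a new function computing f(g(x))."""
    return lambda x: f(g(x))

def double(x):
    return 2 * x

def increment(x):
    return x + 1

# Build a pipeline by composing plain functions.
double_then_increment = compose(increment, double)
print(double_then_increment(5))  # 2*5 = 10, then 10+1 = 11
```

Any language with functions as values, from Lisp in 1958 to whatever ships next year, supports this same pattern.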

It's funny that many people still hold the belief that even mathematics is a young person's game. There are plenty of examples of mathematicians making significant contributions after the age of 30.

For me I think the life of a programmer begins at thirty.

I was born in 1977 and had a computer that plugged into the TV. I remember copying basic programs out of a book, but do not remember my age at the time. I had no way to save the programs, so it was pretty stressful when the power would get shut off. I continued to tinker with computers/tech (my Nintendo was in multiple pieces, yet still worked :) ) off and on, but didn't get serious about programming until 1995 and have been doing it ever since.

I'm sure some companies have an age bias, but I think it will be less and less as the generation who grew up with computers gets older.

Confirmed. I was born in 1977 and learned to program on a pocket TRS-80 at age 4 (no joke).

Born in 1984, learned to program on a C64. Also at age 4.

Worrying for those of us who saw our first computer at the age of 16, started coding at 19, and still have to make it in the software engineering world. (26 y.o. now)

nah, 4 year olds are bad at learning.

Another 1977-er, but I didn't get to it until I got a second-hand C-64 in 1986.

Yep, 79er here, LOAD "$",8 baby! :)

Oh the memories...


This. Exactly this. In five years' time it'll be the 40-45 range, and in ten years 45-50. There is a point in time where computers and software development really started to take off, becoming ingrained in every person's life from the start.

> There is a point in time where computers and software development really started to take off, becoming ingrained in every person's life from the start.

Computers and software use, maybe; for software development it hasn't happened. (It seemed to be heading that way in the mid-1980s, backed off starting around the beginning of the 1990s, and seems recently to have turned back around, at least in terms of what lots of people are saying. We'll see...)

> A programmer who is 40 today was born in 1977, and they could learn to program while they were in diapers.

oh, those 1st world countries...

The answer in the Quora post is pretty reasonable (answer: no). In my experience, people who get something more out of writing software than a salary tend to stay in software development, at some level, for their entire career.

That said, if you're a software developer and you're also good at communicating and managing people, seriously consider moving into management. While I think the stories of the '10x' developer are true, I also think a good manager can double or quadruple the effectiveness of their entire team. That has a huge impact both on the careers of the people being managed and on the company.

I also think a good manager can double or quadruple the effectiveness of their entire team

Another way of thinking about this: Managing people really well is really hard. Most managers are not the best managers. Average managers often end up doing things that reduce the output of their group in the interest of things not going horribly wrong. The Pareto distribution rears its ugly head, once again.

I wish I had more than one upvote. If you have no management experience, and no mentorship in managing, you have no business managing people. Managing requires empathy and a desire to help your subordinates level up, not SWE skills. Your job is to deliver value to the org and to your individual contributors in equal parts.

At least as a software developer, you're only breaking code. But you start affecting people's lives, emotional well being, and career paths when you're a manager, and I find it rare where a manager understands this.

This is a big problem in startups. Developers with little to no management experience, and sometimes little development experience, become VPs of Engineering/Directors/CTOs and wreak psychological havoc on the people they are supposed to manage. Sometimes they do it on purpose because of insecurities, but many times it's just unintentional.

Usually the bigger problem there isn't as much the individuals, but why the startup needed that role to begin with.

A lot of times early companies will go through the motions of "needing" a VP of Engineering, or a CTO, or whatever. In reality their process does not need that, and those roles are at best meaningless titles; at worst they cause unnecessary process to crop up and hinder productivity.

This perfectly articulates something I tried to get across to upper management last time we hired a manager for our team. It seems so obvious and reasonable, but they wouldn't hear it. So, we ended up hiring someone who not only had no real management experience but who also didn't particularly _want_ to be a manager, instead preferring to write code.

This created conflict exactly as you'd expect and team morale fell apart, projects slipped, etc. Of course, management takes no responsibility for this, instead diffusing blame to employees via suck-it-up type platitudes (couched in "nicer" language but with that essential meaning).

How about team lead?

Team lead is acceptable as it's more of a servant leadership relationship, and under the right circumstances, a team lead should be able to grow into a full blown manager role.

I am a big fan of 'trying out' future managers in the team leader role.

Senior developers don't just write code. They know how to get other people to write code and frame problems in a way that they can be implemented within whatever organization employs them. Many companies value these talents quite highly. They are not that different from the skills that characterize great managers.

When I worked at Microsoft, I asked my dev manager why he took such a boring (as perceived by me) job. He said that he'd peaked at how productive he could be personally, and the only way he could improve was to scale out...

A good manager is really, really hard to find. Good developers are easy to find (relative to good managers). So, while I don't want to get into management, I also recommend it to anyone who might have potential in that regard.

I don't know man. Middle management is a really shitty job. You have to do stuff that you don't believe in (because upper management says so, and they _might_ be right, even if every fiber in your body tells you they're wrong). As a dev, even as a very senior dev - I find that it's ok to be true to yourself, and say that you don't believe in it even if you have to do it. You can also easily switch teams. As a manager - you _have to_ be positive and sell it to your subordinates, who may be (should be!) your friends too. If you don't sell the vision well, there's no real way to know whether it failed because it really was a bad idea, or it failed because you didn't motivate your team to work on it.

This, I found, is too much for me. I have no remorse if I demotivate my managers; and I can keep my mouth shut and not say anything, to avoid demotivating my junior-dev colleagues. But as a manager, "keeping my mouth shut" is not an option.

Why, though? I mean, you’re looking at the best case scenario: developer turning manager boosts productivity of company X.

But what's in it for the developer? If the developer is happy and well-compensated and has no managerial/executive ambitions or fantasies, why ruin a good thing (with a gamble, at that)?

Then don't. GP said "seriously consider moving", not "move". Presumably the developer would do it if they value potential impact over the risk.

Isn’t it a big loss to move into management when one loves to code?

Personally, I loved to code, but management has been much more fulfilling for me. Helping my team grow, professionally and personally, is a hugely rewarding experience. I still get to exercise some problem solving skills (though I let my developers make the technical decisions) and I still help craft a great user experience, which were my favorite parts of being a developer anyway.

In my experience it depends on what you love about coding. I happen to really like systems, and building a good team is much like building a resilient system in code. Each piece supports the others.

But it is not a good idea if you really are passionate about the writing. For example, my daughter majored in English because she wants to write novels, not because she wants to teach English. So in her case I certainly wouldn't tell her to go get a job as an English teacher.

In my opinion, the second worst kind of manager is one that doesn't want to be a manager and would rather stay an individual contributor. Their inner conflict is harmful to the group.

Once you know how to code really well, shouldn't you be in a position to manage developers and ensure that they're doing a really good job of writing code? You'll be able to help them tackle bigger problems, and work on more ambitious solutions, than you could if you're stuck cranking out code with no time to assist.

If it's your job to magnify the output of your developers, when you know intimately how to do their job, you can get spectacular results.

Non-technical managers can't do this, they don't have the depth and insight to help shape their team the same way a developer does.

I would agree with that, particularly if you totally abandon programming. But management skills are useful as you continue your programming career and development.
