Programmers: Before you turn 40, get a plan B (2009) (improvingsoftware.com)
452 points by derwiki 46 days ago | 430 comments

I'm not sure I'm buying that. The JVM? Still rocking it 20 years later. Memory allocation patterns? Still there. The network stack? Well, it doesn't seem to have changed a lot.

The older guys seem like they had the time to properly learn the Unix network tools, the JVM debugging ones, the memory inspection ones. I've known older devs for whom I have the utmost respect because I felt they could just debug the shit out of anything happening on a computer, with tools I don't even know of but that have been around for decades.

And here I am, needing to google how to use tcpdump or jstack, and scrambling to correctly interpret the results.

I agree that new tech keeps stacking up, but I feel it's pretty damn hard to catch up on the old tech that is in fact still very relevant and important, because it's no longer taught, no longer a meetup topic, no longer hyped. And the new stuff is really easy to learn when you realize that it's 90% a rehash of old concepts. (Observables are all the hype in JavaScript? Well, great, I learned that pattern 15 years ago...)
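To make the Observables point concrete: stripped of the streaming extras, it's the classic observer pattern from the old design-patterns books. A minimal sketch in Python (the class and method names are my own invention, not from any library):

```python
class Observable:
    """The decades-old observer pattern: subscribers register a
    callback and are notified whenever a value is emitted."""

    def __init__(self):
        self._subscribers = []

    def subscribe(self, callback):
        self._subscribers.append(callback)

    def emit(self, value):
        # Notify every subscriber in registration order.
        for callback in self._subscribers:
            callback(value)


events = Observable()
received = []
events.subscribe(received.append)
events.emit("hello")
events.emit("world")
# received == ["hello", "world"]
```

Everything the modern frameworks add (operators, unsubscription, schedulers) is layered on top of this core, which is why it feels familiar to anyone who learned the pattern years ago.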

Admittedly, I have no clue whether management realizes that!

But the impact of this knowledge, from what I've observed, is really huge on productivity, and it's especially a boon when production is on fire or when tricky stuff happens in general.

So, respect for the elders, and please come and teach at conferences and meetups; we need more wisdom and less hype!

Yeah, you hear all these horror stories about how developers have a "shelf life."

In the meantime, though, every single older developer I've met has been extremely knowledgeable and frankly much better at the craft than myself.

Look at any hobby -- say, guitar playing. The difference between someone who's been doing it for 4 years and someone who's been doing it for 20 years is ridiculous. They're so much better. I feel software dev isn't that different.

The only catch is that if you're older you do need to follow new technologies. The older devs I've talked to who were pushed out of the field are the types who worked on, say, mainframe technology well into the mid-aughts and never thought about learning anything new.

As long as you get off the sinking ship of technologies that are obviously waning (and you should have at least a decade of warning in advance, I think), I don't think it's honestly that hard to stay current. You don't have to follow every fad, but make sure people will continue to use your main programming language.

Also, startups discriminate on age because they need people who can work until 2 AM and drink their Kool-Aid. But there are a lot of stable, non-startup jobs out there. People who have never left the scene probably think of them as death, but there are actually some really nice enterprise shops out there with great developers.

One thing I've noticed is that there seem to be a lot more older developers and engineers at hardware-oriented companies.

I currently work at Wing Aviation, the Alphabet drone delivery company. We were originally located in the Google X building, which is the old Mayfield Mall in Mountain View. I used to shop at that mall in the 1970s!

X is full of hardware startups, and when I started there I noticed one thing right away: for the first time in a while, I was not the oldest person in the building.

Now that we've moved to our own smaller building, I may be the oldest again, but not by a wide margin. There is a lot of gray hair in our teams.

I'm 67 and have been programming for 50 years.

That's probably no coincidence. The cost of fixing a hardware/firmware defect is much higher than that of a software defect. I'd definitely want to put my best engineers on the product that can't be easily fixed once it's shipped rather than on software that can always be patched.

> I'm 67 and have been programming for 50 years.

Also, that's really impressive. You've probably forgotten more than I've ever learned!

Personally, I find it slightly weird that I've been in the industry longer than some of my co-workers have been alive.

Related experience in the opposite direction: I'm a relatively new software engineer who works on a 13-year-old product. It's fun to see a particular commit and think about what I was doing in middle school, high school, or college at the time it was committed.

I mean, I feel like more programmers were working closer to the metal back in the day.

In the 80s you needed to understand computer hardware much better and there was a bigger fraction of C++ developers. Now all those people have migrated to hardware (or have kept doing it) because that's where their skills are needed.

My experience has been that many younger engineers don't want to work closer to the metal as much. They want to work on the bleeding edge, sexier technologies.

I would enjoy working close to the metal in some capacity; it's just not really what the majority of modern tech businesses require, and moving in that direction is likely not worth the sort of career reset it would take.

I've noticed that most dev work is "full-stack developer". This usually means that you'll be bouncing between back-end (Python, Rails, Node, Go, whatever), front-end (React, CSS), and possibly devops.

I'm not sure how satisfying this is. You probably don't get much chance to dig deep into tough problems, or to practice the craft as you would in a hardware-oriented job.

I've never worked in hardware though, so I'm quite possibly completely wrong about their day-in-the-life. Just rampant speculation.

> As long as you get off the sinking ship of technologies that are obviously waning (and you should have at least a decade of warning in advance, I think)

You also don't need to get in on the ground floor. Every technology boom creates next year's legacy apps; there will be React work for years after the next tool gets popular, to pick one example. The article touches on it, but you can make a good living out of consulting work for older technologies.

Another angle is moving into non-tech domain specialization. Becoming an eCommerce consultant, your value proposition isn't knowing React/Shopify/Demandware etc, it's knowing the eCommerce ecosystem and domain. I know devs who specialize in agribusiness, automotive software, construction, energy infrastructure and such. They build solutions to problems in those spaces that happen to be software. But they get hired based on their time in those industries not their current tech stack.

The guitar playing is a good analogy here. Say you've played classical for years and dabbled in rock and some jazz. Like any musician you've played around a little but probably focus on one style. Your best friend asks you to play flamenco for them at their wedding.

Sure, you won't be as good as someone who's played flamenco all their life. But if flamenco had been invented 5 years ago, you're sure going to pick it up fast and probably be better than someone who has only been playing it for the past few years.

Skills translate. Transfer learning is, unsurprisingly, a real thing. The funny thing is that a guitar player will pick up a brand new instrument and quickly catch up to and surpass someone who has only been playing that instrument for a few years. Even though each instrument is a different "language", per se, there are common patterns in the language tree of music. I don't think any programmer worth their salt would disagree that this is also true for programming.

No matter the language you use, there are common patterns. Someone who has been programming for years generally picks up a new language quickly because of this. Certain languages will also make you a better programmer in general (low-level languages help you understand what's going on behind the curtain).

So should a programmer with 30 years of experience in C and 1 year in Rust be paid more than the programmer with 5 years of Rust experience? Absolutely. For those 30 years they were learning to program, to debug, to solve problems. The skills translate. And I say this as someone under 30. I've seen the wizards solve problems in languages they've never used, because they just understand programming.

>No matter the language you use, there are common patterns. Someone who has been programming for years generally picks up a new language quickly (because of this). So should a programmer with 30 years of experience in C and 1 year in rust be paid more than the programmer with 5 years of rust experience? Absolutely. Those 30 years they were learning to program, to debug, to solve problems. The skills translate.

Tell this to HR/recruiting departments.

I'm a highly experienced C developer with some Python and JavaScript experience on the side, yet I can't get any jobs using those, as HR/recruiters will just filter my resume out every time due to "insufficient Python/JS experience".

HR has no idea how programming skills and experience translate across languages, they're just trained to filter out people based on buzzwords.

On the other side, everybody is looking for people already versed in the languages they need right NOW, and they aren't gonna take a chance on someone proficient in other languages in the hope that they'll master the new ones soon enough. Too risky for business.

I think that's much more true of large companies than small ones. At a small company, the engineers are more likely to be involved in the hiring process, and (unless they're very inexperienced themselves) they'll have a better sense for how skills translate.

For example, on my team, we're writing most of our new code in Kotlin. But even though the language has now been around for the better part of a decade, we regard prior Kotlin experience as only a very slight nice-to-have when we're evaluating someone. If you've programmed in Java or Scala or C# or C++ or pretty much any other statically-typed language, you'll pick it up quickly enough that the time to get productive in Kotlin will be dwarfed by the time to get familiar with our code base.

What we do usually filter out, though, is monoglots. If you have 10 years of experience and have only ever done, say, Ruby on Rails development, you will probably not have the breadth of engineering perspective to succeed on our team. But if you've done RoR as well as something else that's dramatically different, that's fine. We would probably even filter out a Kotlin monoglot in the unlikely event we ever came across one.

You would definitely pass a resume screen at my (large-ish) company. We don't screen people out for lacking specific experience in our tech stack. Don't get me wrong, we like experience in our tech stack, but with 30 years of C, we wouldn't care what else is on your resume.

Maybe you just picked the wrong companies.

30 years of experience in C? Would you mind dropping a resume to my username AT google dot com to give me a chance to get referral bonus? :D

I don't think many employers are interested in training anyone at any age anymore. Margins are often too lean to dedicate productive resources to training. When switching jobs, I always emphasize the aspects of previous jobs and accomplishments that would be most relevant to the employer (e.g., delivering on time, under budget, reductions in reported defects). Having projects on GitHub or a portfolio also speaks volumes.

I like this analogy for a separate reason. Many musicians can relate to hitting a ceiling in terms of ability. You might have X years playing an instrument, but depending on talent, you may have plateaued for some period of time. Anyone who has stayed in one role (in a mature industry) without actively seeking out extracurricular technology is highly likely to experience this.

But plateaus are local. Usually they happen because you don't know how to progress anymore. Once you figure that out, you jump ahead again. I think with age it is easier to become complacent and just say "this is enough". And there's nothing wrong with that. But I think it is different from hitting a real ceiling. You're not at your peak.

> every single older developer I've met has been extremely knowledgeable

To be fair, this observation could be massively influenced by survivorship bias - more talented devs tend to be the ones still programming in older age.

At the same time though the industry has an age distribution that reflects the arc of the industry more than the selection of the workers.

I entered the field in the mid-90s and watched the industry explode at the end of that decade. So as all of those nascent developers looked around, they noticed how few older developers there were and said "boy, this really is a young person's profession!". I remember all of the "before you turn 30, get a plan B" articles then, everyone fearfully looking into management or dubious project management paths.

And as that cadre has aged, of course most new workers are younger, so proportionally they drop from a majority to a minority. But there are a massive number of very gainfully employed, successful older developers, and it has certainly been normalized far more than it was then.

To go with the guitar playing analogy, there are plenty of people who have played the guitar for 20 years and still always play the same 4 chords.

Indeed. My previous job was very Java-focused but wanted a more senior man on the team. So management hired a developer with 20 years of experience in C and Fortran.

Now in my country, when you hire someone, a probation period of two months is very common. In that period, you can be fired and/or leave without cause.

Unfortunately, he was not able to pick up object-oriented programming in those two months and left before the probation period expired.

That argument applies in the other direction as well: it's a relatively young industry, full of big industry-wide disruptions at least once a decade since it has been its own industry. Companies with young median and mean ages may have been more likely to survive (or evolve out of) some of those big early disruptions, but that may not be a long-term equilibrium for the industry.

Keep in mind, too, how much the 80s, 90s, and 00s median/mean ages were impacted by "flash flood" millionaires that could retire early simply by how much money was thrown around in the various "software revolution" and "Dot Com" booms.

I seriously doubt the number of instant millionaires was significant enough to impact the total numbers. It's not as if 25% of programmers became rich enough to retire, and if they did, I'm doing everything wrong.

It was always a matter of luck. If there were ways to predict such a lottery ahead of time, I think the industry would look quite different today.

As with flash floods of rivers, the coverage of such events was complex. It probably affected some companies a lot more than others.

Anecdotally, it used to be an aphorism at Microsoft that if you hadn't made your first million by 30 (or was it 25?) you were doing something wrong. Certainly the demographics at Microsoft showed several clear waves of early retirements from stock booms and bonuses, and for a ~forty year old company the median age is still staggeringly young today, even accounting for industry ageism.

If it wasn't clear, I don't expect those flash floods to happen again, they definitely seem to have been flukes of luck. But I think it shouldn't be ignored that it had an impact on the industry demographics.

> I don't expect those flash floods to happen again

Google, Amazon, Facebook, Netflix... It already happened again multiple times after Microsoft.

Also, most languages are not sinking ships. Certain uses of them are of course. But even languages this site hates, such as C, C++, C#, and Java have very long lives ahead of them for certain uses. Obviously C++ vs Rails... well, you're going to be out of a job. But if you were using C++ for the sorts of things you would have used Rails in 2009 you were already a decade or so behind the curve.

I doubt it. C++ has come a long way in the past decade and still has a lot of momentum.

As for rails? Admittedly I'm in an anti-ruby bubble... but the Ruby developers I do know definitely don't use Rails anymore. Ruby seems to have matured into basically just Chef/Vagrant in my bubble and any Rails apps are being deprecated in favor of the bubble-biased languages.

Rails developer here. It's still quite easy to get a Rails job; maybe not for Google, but easy. There's much less hype but also less competition over jobs (many young developers come with Python/Nodejs backgrounds nowadays, many senior Rubyists left for greener fields).

> But even languages this site hates, such as C, C++, C#, and Java

What makes you think this? I have been on here for years and have never really got this impression.


- Any thread about Rust

- Any thread about an exploit in a C variant

I think you miss the point about "shelf life".

The shelf life is due to a business decision, where the knowledge the older person has is deemed insufficient to justify the compensation they've grown used to.

Sure, you love working with them and it's better for engineering, but the business wants college grads they can pay less, even if they don't do the same quality work.

Importantly, this doesn't have to be logical or good for the business. It just has to make sense in the context of quarterly targets.

It's not just quality, it's capability.

As I'm fond of saying, I may cost twice as much as an entry level engineer, but I can do things that two entry level engineers can't do.

As a former software guy and lifelong computer hacker who moved to desktop support for years, I ran into new people hired as programmers who had to be coached to hit Ctrl-Alt-Del properly to log into their newly deployed laptops.

I'm working on getting hired back into software roles now... I figure if the bar is that low, I can rock it out if I can just convince anyone that I can still code.

Interviews have been interesting.

More significantly, you can probably do things that no amount of entry-level engineers could do. The question is usually whether your (prospective) employer benefits from any of those things, whether they realise it or not.

I have a similar expression. My daily price might be higher, but they will cost you more.

I keep hearing this but sometimes you just need hands on keyboard to push things out fast. Yes I can do a passable job at front end, middleware, databases, and I'm pretty adept at cloud infrastructure and devops. Does that mean I'm as valuable as 5 people? I can only do one of those at the same time.

It depends entirely on how much autonomy and motivation you have. I have to do all of that for my own modest product, but it's only viable because I have complete autonomy to decide what the most efficient approach would be... and I enjoy it.

This isn't typically how it plays out when you start a new job for someone else. Which means they won't reap the rewards of your experience until enough trust has been built to give you the autonomy to make it happen (this may never happen). So it takes time, and not everyone wants to take that time or can see the benefit, and arguably in the startup space there may not be enough runway for it either.

It’s not about autonomy. I work for a small company where I have a reasonable amount of control of how I implement anything. But, I can only do one thing at a time. At a certain point, you need more people to get anything done in a reasonable amount of time.

This may be true, but your company always wants the most business value, not the best/most elegant/most maintainable code. If you can deliver your boss's business interests at even 1.5x, I think you will have a long career.

> As long as you get off the sinking ship of technologies that are obviously waning

But then you would have to compete with young people starting out in the same technologies. And they are more naive, more easily manipulated and exploited; corporations love that and prefer it to 20 years of experience in the field, which they can't even understand could be important to all the new technologies.

That's sometimes the plan. But as always, you get what you pay for. Some management teams are savvier than others.

> In the meantime, though, every single older developer I've met has been extremely knowledgeable and frankly much better at the craft than myself.

While this is true for you, there is a lot of selection bias and survivor bias involved in this.

For these older developers, you also need to consider the comrades they were working with 20 years ago. For every older dev you work with today, there are 10 former buddies of theirs who decided long ago they didn't need to advance beyond VB6, IBM mainframe assembler, or RPG/3.

Or they hit the stock lottery jackpot and have retired, or moved into management, or marketing, or started a sandwich shop.

But I do think there's a bit of a generational shift that the stereotype of outdated engineers hasn't caught up with. When I started out in the mid-90s, anyone more than a little older than me didn't have a CS degree and had come up in a world without source control, continuous integration, automated testing, iterative development processes, open source libraries, etc. So in some ways they were from a different planet than the next generation. Although the tech du jour still changes quickly, programmers in their 40s these days have a comparatively small gap when it comes to education and development process.

I was using source code control and open source software in the mid-80s.

The myth of developers not being able to keep up as they age seems to be mostly propagated by two types of people: 1) managers who want to keep wages low, and 2) young developers who try too hard to prove themselves and don't want to listen to or work with older developers. [0]

[0] Note, I'm not saying all young developers or all managers. Only a small subset of each.

> mainframe technology well into the mid-aughts and never thought about learning anything new

Today, this is called the cloud.

What is the new ship we should all be jumping to?

> Also, startups discriminate on age because they need people that can work until 2 AM and drink their Kool Aide.

There may well be a correlation there, but which is the cause and which is the effect is less clear.

I am at a startup just now, and they are complaining about frontend devs: they don't hang around, and they don't want to touch the old frontend written in Dojo, just the React stuff.

I'll do whatever (I am officially a 80% backend dev who can do enough frontend when needed). The Dojo way of writing things actually seems a lot nicer than React, but it ain't cool.

I don't care about it being "cool", but I'm not going anywhere near a company doing Dojo because it isn't marketable.

> In the meantime, though, every single older developer I've met has been extremely knowledgeable and frankly much better at the craft than myself.

That just means the field isn't totally corrupt with fashion or nepotism or whatever. Survivor Bias is a perfectly plausible explanation, too.

Every single older guitarist I've met has been a fantastically successful rock musician.

>> In the meantime, though, every single older developer I've met has been extremely knowledgeable and frankly much better at the craft than myself.

This may seem strange from where you sit, but trust me: if you stay in the field that length of time, you will probably become that guy. You don't need to set out to be him. The only thing it requires is time and a refusal to stagnate.

I think a lot of the age-ism has more to do with cost than skill. Older developers ask for more pay, negotiate more, and are less willing to put in uncompensated overtime.

The way to get past this is to make sure you develop skills that are exotic and high-end and hard to find so you can't easily be replaced by a cheaper college grad.

I went "full software" at the age of 36. I did this because I saw the writing on the wall for systems/devops work and for being a half-developer, going essentially from writing scripts to full OOP.

I put a ton of time, thought, and effort into the change. It wasn't easy to complete the transition and get hired, but I settled on C#. I chose it because C# and Java are similar enough (my only formal education in programming was in Java) that I could move between the two if I had to. I also saw MS driving the future of .NET really hard, even though about 10 years before it had looked like there was no future and MS was going back to unmanaged code. I also wanted as stable a development platform as possible, in an attempt to avoid the churn of all the kiddos in the web space. I'm still working full time as a C# dev and enjoy it. But if anything ever happens, I'll jump ship to the Java space. Both the city I live in and the less-populated state I'm from have copious amounts of C# and Java jobs available, so I'm not on something so cutting edge that I have to live somewhere specific. It's my opinion that the best balance for most software projects was nailed by Java (and then its descendant, C#). There are reasons they're so popular, and they're not bad reasons.

My dream job is to come full circle and be "the IT guy" at a small company, handling software and systems as a one-man operation. That can be done in the MS space, as less expertise is necessary in general on the systems side to manage the entire stack. Even less so with Azure.

Will I be pushed out? I'm sure. I'm expecting less of a shelf life than the old hats, because so many more people are raising their children to be ready for this field. Also, outsourcing/visa migration has been fully implemented to diminish the industry for workers since about 2000. The industry that was supposed to be the next buoy of the American middle class has been successfully undermined by American hypercapitalism, greatly reducing prospects and the future even for my generation, the golden generation of children that grew up on a Commodore: a special time of astute technologists raised when we had personal computers at home, but before iOS collapsed the barrier to entry and nullified the mystery and effort required to get to entertainment.

Amen to all that.

OA> Herein is the source of the problem. The more irrelevant experience a candidate has, the more lopsided the utility/value equation becomes…

No, the source of the problem is the idea that 10 years of “C++ experience” is “irrelevant” to a project using Rails, as if development experience is somehow locked to the language you happened to do the development in.

Being a valuable senior developer is about 10% what technology you know and about 90% how good you are at working in a team to solve a business problem with a machine, and those fundamental skills haven’t changed much since 1960.

As you point out, the more experience you have, the more you realize that all these “different” technologies are mostly the same wine in different bottles. Or, as I like to say, “it’s all just software”.

I worry the real problem is not that managers are looking for someone who can do the job, but someone they can exploit.

The gleeful exuberance over 'new' things is something you can use against a younger programmer.

The older one who knows the vintages doesn't get as excited, because it's really not that exciting. You haven't discovered Shangri La. We've kinda already done 90% of this before, just maybe not all at the same time.

But there's good exciting and bad exciting, and it takes a while to learn the difference (I work with a bunch of people who apparently have not).

What does bad excitement look like? War rooms, for one. They're the same dopamine hit as Stockholm syndrome, and so the battle-scarred have bonded while the rest of us do everything we can to stay out of war rooms (I haven't been in one in well over a year).

I've started calling these people BASE jumpers, because they're adrenaline junkies. There are not so many older BASE jumpers (I hope it's not because they're all dead). I think you learn to value other things besides adrenaline. Like teamwork.

Maybe the problem is the big tech producers that introduce new technologies over and over, that don't raise productivity but create churn, force software rewrites, etc.

Tech history, looked at from a 40-year perspective, really seems like reinventing wheels over and over. I'm not sure there's another industry with so much churn yet so much rehashing of old ideas.

"It used to be the case that people were admonished to "not re-invent the wheel". We now live in an age that spends a lot of time "reinventing the flat tire!""

Alan Kay AMA: https://news.ycombinator.com/item?id=11941199

That's just the startup world. There are a million chill software jobs out there. If you get sick of it, save up a bunch of money and move to Dallas or Atlanta or any other large-ish city and find a quality enterprise shop.

I think a lot of problems some older developers have is that the culture has shifted. Maybe I'm off the mark, but I feel like older devs come from a world where programming was all about programming and it was less a formal career than it is now.

To succeed in programming, you have to work well in a business. You can't be Linus Torvalds, walking around with a big ego, unless you're so important it's hard to get rid of you.

You have to work with people well, and some of the older people I've worked with seem to understand this less.

Again, not an insult to older devs. Plenty of them are wonderful. And I could be completely wrong. Just an observation.

I agree with your premise but not your reasons. Personally, I'm a programmer; I know enough about business to know I don't have good insight into it.

Everyone I talk to doesn't want a programmer in the older sense. They want people to integrate large external packages. They don't want to test; they want to prop something up and see if it makes money. They don't want to talk about architecture and features and make plans; they just want to respond on a dime to what the market is saying.

Without any value judgement: I personally don't have a lot to offer in that environment, and to varying degrees I think the hiring managers understand that (sadly, quite a number don't).

I agree with you.

I recently started working on a project that the company had outsourced to a WordPress shop, despite it not being a WordPress site.

Fast forward a few years, and no one fully trusts the software, so everything it does gets checked/verified by a human, and they're actually hiring people because the workload is too much.

So they brought it in-house, which is where I come in.

And let me tell you, this codebase is horrific. To give you an idea: they needed a progress bar and ended up writing an FSM with the transitions between nodes being HTTP redirects. Now, the developers who worked on this wouldn't know what an FSM was if it bit them in the ass, but that's what they built. And the data for detecting whether the background processing was still running was duplicated in 3 or 4 places, which means you would get states in this FSM that disagreed with each other about whether or not processing was finished. These pages would literally "argue" with each other over it, redirecting back and forth until the browser threw a too-many-redirects error.

I came across some code yesterday in which the developers had built up a string of multiple INSERT statements. They then split on the string "INSERT INTO", which removes the "INSERT INTO" from each piece. They then looped over the resulting array, prepended "INSERT INTO" back onto the SQL, and called the DB one statement at a time. I don't think these guys realized you can separate MySQL statements with semicolons.
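If I'm reading that code right, the reconstruction step amounted to something like this (a Python sketch; the table, columns, and values are invented for illustration):

```python
# A batch of statements concatenated into one string, as described above.
sql = (
    "INSERT INTO orders (id) VALUES (1);"
    "INSERT INTO orders (id) VALUES (2);"
    "INSERT INTO orders (id) VALUES (3);"
)

# Splitting on "INSERT INTO" discards the delimiter itself, so the code
# had to glue it back onto every fragment before executing them one by one.
statements = [
    "INSERT INTO " + fragment.strip()
    for fragment in sql.split("INSERT INTO")
    if fragment.strip()  # drop the empty piece before the first delimiter
]
# statements[0] == "INSERT INTO orders (id) VALUES (1);"
```

All of that round-tripping exists only because the statements were joined without a usable separator in the first place.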

My point is that the codebase is horrific, written by someone who was used to the style of development you're referring to, and they made an absolute mess.


To your point. The company realizes how bad the software is and I get a surprising amount of respect when talking about what needs to happen. The result is that we're making great strides towards making improvements, and despite how horrific the code is I find myself enjoying the work because it allows me to have an aspect of my skill set respected that can sometimes get ignored by people because it's not forward facing. Good software dev is boring as shit, so when you're doing it well no one notices.

Also, this post got me thinking. I should submit that SQL code snippet to The Daily WTF.

Ah, speaking of code snippets of the day: I found someone’s code trying to check if a promo code is already used and, if not, to apply it:

    $codes = $db->exec('SELECT code from promocode');
    if (!in_array($myCode, $codes)) {
        // use it
    }
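Besides pulling every code into memory, that snippet has a classic check-then-act race: two concurrent requests can both pass the in_array check and both "use" the code. A sketch of the usual fix (hypothetical table schema, sqlite3 standing in for the real DB) is to let the database do an atomic check-and-set:

```python
import sqlite3

# Hypothetical schema; the point is the atomic UPDATE ... WHERE used = 0.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE promocode (code TEXT PRIMARY KEY, used INTEGER NOT NULL DEFAULT 0)"
)
conn.execute("INSERT INTO promocode (code) VALUES ('SAVE10')")

def try_use_code(code):
    # Check-and-set in a single statement: no window between "is it
    # used?" and "mark it used" for a second request to slip through.
    cur = conn.execute(
        "UPDATE promocode SET used = 1 WHERE code = ? AND used = 0",
        (code,),
    )
    return cur.rowcount == 1

print(try_use_code("SAVE10"))  # True: first use succeeds
print(try_use_code("SAVE10"))  # False: already used
print(try_use_code("BOGUS"))   # False: no such code
```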


Well, that's the thing. Most jobs are 80% stuff you don't want to do and 20% stuff you do. You're not being paid to do your hobby, you're paid to make money for the business and do what your bosses say.

There are many places though that understand writing good code, even if it's boring, is important, and any decent programmer can go fish until they find a workplace like that.

Having a solid older developer around is important even if most of the work is boring. You're there to keep the younger folk from making costly mistakes.

> To succeed in programming, you have to work well in a business. You can't be Linus Torvalds, walking around with a big ego, unless you're so important it's hard to get rid of you.

I keep running into the inverse of that at companies. People who put on a smile and a nice demeanor, but then fail to do any actual work and subsequently lie and politic to cover that up, leaving the rest of the group to play cleanup and cover. I've always loved working with the Torvalds, you do your work and all is well.

There's a big difference between being a developer and a programmer. Wrote about it here:


In companies that I've worked in, the traits listed would (very roughly) distinguish a junior from senior developer.

There's various ways of describing these levels, but I think programmer/developer is less clear, and seems to suggest programmer almost as a pejorative.

Being a valuable senior developer is about 10% what technology you know and about 90% how good you are at working in a team to solve a business problem with a machine, and those fundamental skills haven’t changed much since 1960.

That’s true. But if you haven’t learned that in 10-15 years - there is something wrong. That means the difference between someone with 15 years and 30 years would be marginal.

I'm going to disagree. For every year I've spent doing something, I've gotten better at it. While there are definitely people with "1 year of experience repeated 30 times", I can't believe that you hit a point in software engineering where there is nothing material left to learn and/or where everything left to learn has marginal benefit.

There are things left to learn, but how many of those things are valuable to an employer who is just wanting yet another software as a service CRUD app, mobile app, or bespoke internal app that will never be seen outside of the company?

Unless you're working in some subfields of SE which have such depth.

The most popular ones nowadays don't, though... mobile & web have a short half-life.

It’s practically a trope that every time this subject comes up, the top-rated comment is a skeptical response...from a junior developer.

As someone who actually has worked as a programmer on the far side of 40, let me assure you that yes, older programmers have value. Your perceptions are not wrong. But that said, the article is right. The people who do the hiring and firing do not care about what you care about.

As the article noted clearly, the marginal benefit of an older engineer has to exceed the marginal costs. And if we’re being honest, the fact that a graybeard can use tcpdump without reading the docs carries little marginal benefit. Guess what, kid? You’ll figure it out, and you’ll do it quickly enough that the total cost of your learning won’t really compare to the cost of hiring me.

That’s why you see lots of anecdotes about the value of older engineers, but lots of articles from older engineers who know that the discrimination is real. And this is coming from someone who has managed to “stay relevant” a lot longer than most software devs.

The thing is as startups mature and become enterprise level shops, they need people who know how to work with enterprise software. Guess who has all the experience with that? Yep, the people with 10+ years experience. My company is hiring very senior devs like crazy right now for exactly this reason.

I think there are some orgs that definitely benefit, but it’s an 80/20 thing: 80% of software shops are more than content to hire junior devs and live with the costs of their mistakes. (Especially because most of those costs never really get experienced by the people who make the decisions.)

I literally wrote that I don't know whether management values what I value.

I also never wrote that the article is wrong, I wrote that I didn't find it convincing.

I'm simply observing that after more than ten years developing and learning the craft, I still find myself unable to really know the old tools; instead I merely have an idea that some of them exist, and I'm sometimes able to google the correct one.

As such, I'm stating that when I see a graybeard with actual knowledge of fundamentals that I still totally lack, well, I respect that, and I _hope_ that he's not unemployable, because if that's the case, I fear that our industry is doomed to produce ever more buggy software due to a lack of basic understanding.

And I think that, faced with no evidence from the article or from you, it's well within my rights to be skeptical.

So, thank you for the unwarranted condescension.

I honestly wasn’t trying to be condescending to you. I even said, right at the very beginning of my comment, that you’re right about many of your observations. My word choice may have been poor in some places, but overall I defend the content.

You’re (self-admittedly) missing some of the life and business experience that comes with being an older programmer, and then expressing skepticism about direct advice communicated by older programmers. This is also so common as to be a meme:

Person A: “I experienced $something_rare at work”

Person B: “I have never experienced such things. I am skeptical of your claims.”

So sure, you can call it whatever you like - skepticism, disagreement or debate - but when people tell you about their experiences, and then you reject those experiences because you don’t have “evidence”...well, eventually you’re just confirming your own biases.

Again, it’s not that you’re wrong - I actually agree with your perceptions of older devs. I just think you’re missing critical life experiences that connect your observations with the arguments being made in the article.

Well, what you wrote was :

> It’s practically a trope that every time this subject comes up, the top-rated comment is a skeptical response...from a junior developer.

I looked at the top rated comment for the other times this exact same article was posted on Hacker News. For none of those is the top rated comment "a skeptical response...from a junior developer". Actually, it's more often (https://news.ycombinator.com/item?id=16934500, https://news.ycombinator.com/item?id=9361580) from a self-declared senior developer. And none of those top comments really corroborate the article. Talk about confirming one's biases...

But taking into account all the anecdotes from senior devs available just on Hacker News, there's no consensus emerging on whether the article makes sense or not.

So, at this point, I don't really feel the need for a plan B, even if being 40 years old is not really that far away for me. And it seems from the shared stories that whether this is needed or not is really based on personal experience, and is not generally applicable advice. I certainly hope to be on a path where I'm honing my skills well enough to be able to continue being paid to engineer software after I'm 40.

Well, sure: you disagree, and you’re seeking out arguments that confirm your existing beliefs.

In any case, I’ve said what I have to say on the subject. I can’t make you listen to my direct experience, but I’m not going to spend time arguing with you about it.

I had a recent conversation with someone who was into electronics & programming 40+ years ago as a youth. He has had successful careers in other, unrelated fields. He recently was messing around with a Raspberry Pi & remarked how little things have changed & how easy it was to jump back in.

I strongly believe an employer is very short-sighted if they are more concerned with a specific framework than with knowledge as a whole. I agree, learning a framework doesn't take much time. They all borrow their patterns & ideas.

If your needs are urgent for a person to hit the ground running on a specific framework you should probably hire a contractor. If you want an employee you should care more about people & project skills, plus overall programming/IT knowledge.

I’m always mirin’ my current boss when he opens midnight commander and figures out the answer to his question by looking at the hex of the binary.

I needed to change a string in a compiled binary and as I pulled the project down to change and compile from sources, he suggested we just patch the binary and we did in 1 minute with hexedit.
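The hexedit trick can also be scripted. Here's a minimal sketch (the function name is mine, not any standard tool) that overwrites a string in a binary in place, NUL-padding the replacement so that file offsets and the C string terminator survive:

```python
def patch_string(path, old: bytes, new: bytes):
    """Overwrite the first occurrence of `old` in the file at `path`
    with `new`, padded with NUL bytes to the same length so that file
    offsets and the string's terminator are preserved."""
    if len(new) > len(old):
        raise ValueError("replacement must not be longer than the original")
    with open(path, "r+b") as f:
        data = f.read()
        offset = data.find(old)
        if offset < 0:
            raise ValueError("string not found")
        f.seek(offset)
        f.write(new.ljust(len(old), b"\x00"))
```

Same caveat as with a hex editor: this only works for same-length or shorter replacements; growing a string means recompiling or relinking after all.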

I really don’t buy that skills don’t transfer. Where that is actually the case, it must be people getting too locked into abstractions and never really understanding the essence.

"Mirin is a common staple used in Japanese cooking. It's a type of rice wine, similar to sake, but with a lower alcohol and higher sugar content."

Unlikely this is what you meant. Admiring?

It is 100% a colloquialism.

I think it's short for "admiring": admiring -> admirin' -> mirin'

Meaning what?

Urban Dictionary is useful for looking up suggested meanings of colloquialisms. https://www.urbandictionary.com/define.php?term=mirin

What's funny is that it's already getting dated. I remember it was ubiquitous online in the forums I frequented in 2010 just like Urban Dictionary's graph shows. And that is already a decade ago (yikes)!

I find it unlikely that someone referring to himself as mixmastamyk (mix master mike?) isn’t familiar with millennial slang.

> I'm not sure I'm buying that. The JVM? Still rocking it 20 years after.

Java may not be my favorite language, but I respect how the language has both progressed and maintained its heritage. It allows the developer to build upon their skills rather than relearning nearly identical skills every time a new technology comes out.

The problem is that this 70 year old industry still behaves in very immature ways. While this may be acceptable when it comes down to attitudes towards technology, the unfortunate side-effect seems to be the adoption of similar attitudes towards people.

Every time someone says something like, "young people are more in tune with the current state of technology," or, "old people are more capable because they have more experience," they are legitimizing prejudice. That's true even when the statement is being made in an attempt to combat prejudice since it is applying generalizations to individuals.

Granted, it is very difficult to shed generalizations. Just look at my comment about the immaturity of the industry. That isn't universally true. Some companies are going to have a younger workforce, some older, some heterogeneous. Some companies are going to have old people working with old technologies, some are going to have young people working with young technologies, but some are going to have the young with old technologies and vice versa. Yet the generalization comes about because of how we frame the industry, which is one where the new replaces the old.

Which brings me back to why I respect Java: it looks both towards the future and back at the past. It is a language of growth, rather than a language of revolution. Perhaps this inanimate technology has something to teach us about how we regard people: we should be encouraging growth and advancement rather than treating people as disposable.

This is the bottom section of TFA:

> You’ve got a cash cow, milk that sucker!

> I know you love programming because you like technology, so this may go against your very nature, but no one says you’ve got to jump every time some snot-nosed kid invents a new way to run byte-code. You have invested a lot of time and energy mastering the technology you use, and your experience differentiates you. Money follows scarcity, and snow-birding on an older technology, if you can stomach it, may just be the way to protect your earning potential. The industry turns on a dime, but is slow to retire proven technology. It is highly likely that you will still be able to earn some decent coin in the technology you know and love even after a few decades.

Yes. But as the article said, there is a diminishing amount of value after every year of true experience after a number of years. I would say around 10.

I definitely don’t believe in the “10x Engineer” (individual contributor) - yes they do exist but are so rare they aren’t worth talking about. I do believe in being a force multiplier as a team lead/mentor.

On the one hand, sure, percentage-wise you learn less in year 20 than in year 10, because you already know a lot more in year 19. But that is no more true of this field than any other field. Is a doctor, architect, civil engineer, or auto mechanic with 20 years experience more valuable than one with 10 years experience? Heck yes.

20 years ago, much of what I use today didn't exist: there was no AWS, no C# (but C++ was close enough I guess), no mobile where you had to worry about semi-connected networks and syncing, etc. There is no part of the human body that exists today that didn't exist 20 years ago.

You could argue the opposite is true: 20 years ago, there were dangerous misconceptions about how some body parts work. Many chemical pathways were completely unknown. Medicine and biology have both evolved a lot in the last 20 years.

The big picture however is still valid. Concepts and paradigms hold for decades. We still use TCP/IP. Computers still use the Von Neumann Architecture.

Looking at the details however, AWS is just a fancy GUI over time sharing on a mainframe. C# is just new syntax for concepts that are older than I am. While mobile brings new challenges, you no longer have to deal with the challenge of customers connected through a 300 baud modem.

In 1986 I not only had to know 65C02 assembly language to get any performance out of my 1 MHz Apple //e, I had to know that I could get 50% more performance for every access to memory on the first page (the zero page) as opposed to any other page. If I spent time doing that type of micro-optimization today, I would be fired. I couldn’t have imagined then doing the types of things I can do today with modern technology.

In 1995, when I wrote my first paid-for application in college, the Internet was a thing for most colleges; I did some work on a HyperCard-based Gopher server (long story) that wouldn’t have been possible 10 years earlier.

In 2006, I was writing field service software for ruggedized Windows Mobile devices, architecting for semi connected smart devices is a completely different mindset than terminal programming or desktop programming. That wasn’t feasible before hardware became cheap and at least 2G was ubiquitous.

Even then what we could do, pales in comparison to the type of field service implementation I did in 2016 when mobile computing was much more capable, much cheaper and you could get a cheap Android devices and 3G/4G was common place.

But people who think cloud computing is just “sharing mainframes” and don’t rearchitect either their systems or their processes are how we end up with “lift and shifters” and organizations spending way too much money on infrastructure and staff.

Also, anyone who equates managing AWS to a “GUI” kind of makes my point: if you’re managing your AWS infrastructure from a GUI, you’re doing it wrong. 10-15 years ago you didn’t set up your entire data center by running a CloudFormation template or any other type of infrastructure as code.

How has medicine evolved in the last 20 years? I don’t doubt your statement, but you make it sound like common knowledge, and from my point of view (average non-medical person) not much has changed.

All of which is just details, which are much less important than the fundamental skills of building systems with whatever people and tools are available. (And I'm sorry, are you implying no significant changes have occurred in the tools and practice of medicine in 20 years?)

Thinking that moving from on prem to AWS is just an implementation detail is how you end up with “AWS Architects” who were old school net ops guys who only know how to do a “lift and shift” and end up costing clients more, because they pattern matched and thought they could just set up everything like they would on prem.

Just one note about 10x, because I often see people who don't believe in it get the definition wrong. 10x programmers are not ten times better than the average programmer; they are ten times better than the worst programmers. This is based on an actual study and, by the metrics the study chose, this disparity does exist. Of course, measuring programming performance is notoriously intractable.

> But as the article said, there is a diminishing amount of value after every year of true experience after a number of years. I would say around 10.

I don't buy it. 10 years is about when you start moving into the actual expert category. Note I said start.

At 35, I finally had real, full control over multiple languages, could pick up CLR and understand and implement any algorithm in it, finally understood exactly why concurrency was so damn hard and how to mitigate that, and would pass practically every interview with flying colors. I could finally drive my tools with some facility and started to realize gdb was my friend.

At 45, I can predict the errors I and others are likely to make and take steps to mitigate them up front--although I still get irritated when I make the mistake anyway. My comments are now psychic--my team often remarks how "I just thought that I could really use a comment explaining this--and, behold, there it was". I can reduce interviewers to tears and can surprise all but the most knowledgeable experts in their own domains. I reach for gdb far more often, but am still frustrated at how much I don't know about it.

I still only consider myself an "expert" in very few subdomains--none of them involving programming languages.

One of my heavy hitter software guys once said: "Your code is the most straightforward code I have ever read." I apologized for being so simple. His laughing response: "Don't apologize. That was a compliment, dumbass."

> At 35, I finally had real, full control over multiple languages, could pick up CLR and understand and implement any algorithm in it, finally understood exactly why concurrency was so damn hard and how to mitigate that, and would pass practically every interview with flying colors. I could finally drive my tools with some facility and started to realize gdb was my friend.

You may be an expert in multiple languages but as the article said, if the company is looking for Ruby developers to write a CRUD app, they no more care that I spent years doing C than they do the years I spent doing 65C02 assembly in the 80s.

No matter how many subdomains you are an expert in, if the company doesn’t need that experience, it doesn’t matter.

But some things translate much better than others: Django and Rails aren't that different (I'd also put Laravel in there). So I think the real problem is moving from a senior Django role to a senior Rails role and vice versa, but I see no reason why a 10 years Django developer can't get a mid-level Rails job (other than blatant age discrimination that is).

Isn’t that the point? That if you have 10 years worth of experience as Django developer, you aren’t as attractive to someone looking for a Rails developer as someone with 5 years of Rails development.

Simplicity is definitely a virtue. When I maintain the large codebase of a 10+ year project, it is easy to spot where someone tried to be very clever, and it is rarely a benefit to the codebase's maintainability and extensibility in the long term. Things rarely get reused enough to take advantage of an overcomplicated abstract solution. Most of the time, developers do not predict future business requirements correctly and the simple solution would have been the better one.

Seeing someone fire up wireshark, sniff some traffic and solve a complex production issue in five minutes by looking at the raw packets never gets old.

I worked a consulting gig with a 60yo who had just learned Objective-C so he could do mobile work. This was in 2013.

I guess if it’s a seed round company with a 22yo founder they might have some bias. But the rest of the world absolutely needs devs of any age.

> Admittedly, I have no clue if the management realizes that !

Well, that's the key thing in many places: it's management that decides whether to hire and how much to offer, not you :)

The first instinct would be to put 'resources' that cost little into a project, and that usually means people with less experience who require less pay and are easier to drag around. Imagine the difficulty they will have trying to grok your line of reasoning for respecting and going after these older devs.

The javascript ecosystem however...

The entire stack literally (for the actual meaning of literally) changes completely every year. Frameworks, language, package manager, libraries, etc.

I'm pretty sure you have misused the word literally.

Try to update a one year old npm project and you will find that literally 75% of the code in node_modules changes, and also that the app no longer works.

And then it was yarn. And then npm again.

Sure. But learning a new framework is like 100th on the list of hard parts of being an engineer. It's new incantations, but ultimately the loop of solving business problems with automation is precisely the same.

100% of my team had basically zero C++ experience before joining. Our application is developed entirely in C++. I didn't care one bit because learning the programming idioms is so trivial compared to the actually hard stuff.

Ecosystem, sure, but I am still using the JavaScript I learned x years ago when I need to see how something is implemented in Angular or React under the hood, etc.

That’s basically how I find work. I’ve been in Unix admin since the 90s.

Do I know the latest coding pattern some adjunct professor is teaching after copying it from a math book into code, like the college grads I often encounter?


Do they know much about operating at scale? How to USE their tools?

They know what ideas are important to computer science. They often don’t know how to make anything work that isn’t an insecure mess.

I think that’s the important takeaway here: I program less these days because I’m teaching people how to make shit that’s more complex than their reduced-to-a-few-talking-points homework assignments.

Well, TCP is a hot mess that should and will be phased out in favor of more congestion friendly protocols.

We could barely get IPv6 adopted. Replace the whole of TCP ? Most probably not. The successor to TCP, if it ever gets adopted, will look more like TCP than like a new protocol.

Agreed. Entire industries are based on the existing stack and its interoperability.

What are those alternatives?

QUIC (https://en.wikipedia.org/wiki/QUIC) is one alternative.

doesn't quic use bbr, a congestion control scheme developed for and supported in tcp?

> Observables are all the hype in javascript ? Well, great, I've learnt that pattern 15 years ago...

Well, the listener/observer pattern is used to explain observables in order to make them familiar, but no, it's not the same thing.

Observables are streams and you work with them via the composition of streams. When viewed from that high level, the underlying protocol (the observer) becomes irrelevant and, in fact, if you often find yourself working with that protocol, you're doing it wrong.

People familiar with functional programming will be very familiar with this style, because composition is very natural to an FP developer and the composition of all kinds of streams is in the repertoire of FP developers.

But I've seen plenty of colleagues, who are otherwise very capable people, really struggle with the concepts involved. The mentality shift from the imperative programming that people have been taught, to describing actions via function composition and then doing evaluation "at the end of the world" is a mind fuck.
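To illustrate the mindset shift with a toy example (plain Python, not any real Rx API): the listener style wires callbacks imperatively and mutates shared state, while the stream style describes the whole pipeline as a composed transformation and only evaluates when the stream is consumed.

```python
# Listener style: register a callback, push values at it, mutate state.
seen = []
listeners = [lambda v: seen.append(v * 10) if v % 2 == 0 else None]

def emit(value):
    for cb in listeners:
        cb(value)

for v in (1, 2, 3, 4):
    emit(v)

# Stream style: the pipeline is a value you compose; nothing runs until
# the stream is consumed "at the end of the world".
def evens_times_ten(values):
    return (v * 10 for v in values if v % 2 == 0)

print(seen)                                 # [20, 40]
print(list(evens_times_ten((1, 2, 3, 4))))  # [20, 40]
```

Same answer both ways; the difference is whether the logic lives in scattered callbacks or in one composable description.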

Interestingly, functional programming has been with us for some time, being older than Java.

However people are not interested in actual functional programming and more recently there's this trend to classify junk as FP, just because you've got a shitty API that takes functions (often doing side effects) as arguments to other functions, but that's not FP.

So going back to Observables, as a piece of advice, don't mention that in an interview ;-)

Well, it was my observation reading typescript code (I've never written a line in this language) with a background of java, then c#, then scala. The observables part felt really easy even without knowing the language syntax, but I suppose the past 4 years of scala made me instantly translate the code in a functional fashion, and it was all smooth.

On the other hand, while it's wrapped differently, the whole idea of "this object will start emitting events, and we will have functions as callbacks on them, defined in various objects" really feels a lot like good old java observables/listeners, even if it's applied with a different paradigm. The stream part really feels like an implementation detail to me; it's just less boilerplate than before.

Then again, it was just an anecdote based on reading code in a language I don't use ^^ javascript and typescript aren't even mentioned on my resume, for good reasons, so I should be safe in interviews !

This is a repost of a comment I wrote several years ago on the same topic:

I'm 60+. I've been coding my whole career and I'm still coding. Never hit a plateau in pay, but nonetheless, I've found the best way to ratchet up is to change jobs, which has been sad but true - I've left some pretty decent jobs because somebody else was willing to pay more. This has been true in every decade of my career. There's been a constant push towards management that I've always resisted. People I've known who have gone into management generally didn't really want to be programming - it was just the means to kick start their careers. The same is true for any STEM field that isn't academic. If you want to go into management, do it, but if you don't and you're being pushed into it, talk to your boss. Any decent boss wants to keep good developers and will be happy to accommodate your desire to keep coding - they probably think they're doing you a favor by pushing you toward management.

I don't recommend becoming a specialist in any programming paradigm because you don't know what is coming next. Be a generalist, but keep learning everything you can. So far I've coded professionally in COBOL, Basic, Fortran, C, Ada, C++, APL, Java, Python, Perl, C#, Clojure and various assembly languages, each one of which would have been tempting to become a specialist in. Somebody else pointed out that relearning the same thing over and over in new contexts gets old, and that can be true, but I don't see how it can be avoided as long as there doesn't exist the "one true language". That said, I've got a neighbor about my age who still makes a great living as a COBOL programmer on legacy systems.

Now for the important part if you want to keep programming and you aren't an academic. If you want to make a living being a programmer, you can count on a decent living, but if you want to do well and have reasonable job security you've got to learn about and become an expert in something else - ideally something you're actually coding. Maybe it's banking, or process control, or contact management - it doesn't matter as long as it's something. As a developer, you are coding stuff that's important to somebody or they wouldn't be paying you to do it. Learn what you're coding beyond the level that you need just to get your work done. You almost for certain have access to resources since you need them to do your job, and if you don't figure out how to get them. Never stop learning.

I love your comment and can back you on your last point. I know plenty of people who are very successful because they took the time to become experts on some industry that also aligns well with programming. Compare:

Person A has become one of the world's ten foremost experts on the GPS system and other industry-critical location/positioning technologies. She is also a good, above-average programmer but nothing special.

Person B is an academic, a master of C++ who can recite chapter and verse from the language standards and writes bug-free code. He can point out undefined behavior, implementation-defined behavior, and memory leaks with ease in code reviews. He builds entire systems using template metaprogramming and is already an expert on C++28.

Person C is a highly productive generalist. His career jumped from a bank to an enterprise company to an operating system vendor to an online store. Always working on API-to-API middleware, expertly pushing Protobufs and JSON around and designing vast systems, but never gaining any expertise in an actual application topic.

Person A is going to be much more marketable later on in life, assuming she placed her bet on the right industry vertical. Person B and C may have good, successful early careers, but are often at risk of being replaced by Yet-Another-Protobuf-Slinger fresh out of college. Be as good a programmer as you can, but also build up knowledge of the business or detailed knowledge of a specific technology application that you know is not going away.

Indeed, this has been my career path too. I'm in my 50s and have always avoided 'going vertical', as I think of it - becoming a specialist in one little area. Stick to the bread and butter and just keep learning new stuff; really, programming hasn't changed much at all if you know C and assembler, a bit of hardware, and some maths - all the rest is just rearrangement of the words.

There’s a trap in software dev careers. If you fall into it, you can really get stuck post 40.

If you work at a small-ish dev organization - especially in-house dev in a non-tech company - you can rise pretty far and become pretty senior, and your indispensability can net you a decent income. But on the open job market those skills don’t transfer as well. The senior roles in small dev shops are filled by promoting from within (because they value in house knowledge), and the senior roles in BIG dev shops are filled by hiring people who know how to operate in a big company, and you don’t have those skills.

Best way to avoid the trap is, try to work somewhere big for a bit before you get to 40, to keep that door open and make it possible to be hired into a senior role.

> ...senior roles in BIG dev shops are filled by hiring people who know how to operate in a big company, and you don’t have those skills.

I'm not convinced of that. I was recently hired as a senior engineer at a large tech company that has more engineers than the sum total number of employees at every other company I worked at previously. My skills transferred just fine.

I'm sure of that, but the problem is not when on the job, but while interviewing.

I reckon it's possible to acquire most of those skills at small companies. It just really depends on the small company. It's also easy to get stuck in a deadend of specialised practices at a small company if you're not careful, and those people will struggle to get jobs later.

Not sure what you mean.

Recruiters and interviewers focus too much on "technology of the day" that will always change and be out of date, and not enough on foundational knowledge that is always transferable to new tech.

So if you don't have the right modern tech stack on your resume then you have no chance of getting hired, even if you are perfectly capable of doing the work.

This is where "resume driven development" comes from. Smart and rational developers will choose tech stacks that look good on their resume, not necessarily what is best for the work at hand, because they want to make sure that they stay hireable in the future. It's no good to do the best work you can today if you cannot get hired in the future. This is partly why new tech stacks are always so hyped, even when they are more or less rehashes of what already exists.

Big companies seem to typically be on the opposite side, and not care about tech of the day.

That is not specific to big companies vs small companies, which is what I was asking about upthread.

If you're saying that it doesn't matter if you're at a small or large company using an internal company specific tech stack, then I agree. Both will leave you as unhireable even if your actual development skills can transfer just fine.

To add to your comments, there are remarkable opportunities for developers of all walks of life in the market. It takes some luck to find an open door, but people are willing to pay for excellent talent if they can find it.

Yet another +1 on this: I worked at the same tiny shop (a partnership-like handful of people) for about 15 years; we navigated a few technological transitions throughout those years and made more money than we probably should have, but when I (at age 40+) found myself at a larger shop, learning the latest tech stacks was definitely not the biggest challenge. Indeed, the challenge was slowing down a bit, submitting thoughtful PRs, coordinating with designers, writing documentation... you know, actual engineering process. I got my hands smacked mightily a few times at first, and I got PRs rejected, but I got the hang of it. And yes, my long-dormant telephone magically started ringing again as the technologies, people, and companies on my LinkedIn profile started catching recruiters' eyes again like they did back in the day. It can happen to anyone, but I can see how it'd be particularly common among people over 40. After all, I spent my entire 30s with the attitude of "this is working and making plenty of cash, why change it?"

+1 This is really solid advice and what I inadvertently did.

I'm in my 30s and spent a couple years at a no-name shop out of college. I was getting 0 recruiters calling me. Once I realized it was a dead end I switched over to working at a Unicorn for a few years and have since switched again to big tech. Now recruiters are interested in me for senior dev roles.

^ This became apparent when I looked at older "role model" employees in big orgs, back when I was still very junior and would wash up there after startups crapped out.

Probably the best thing is to change around every couple of years. IMO, small/medium-but-growing companies offer the best opportunities to work on your technical skills, while larger companies offer the chance to grow your ability to navigate complex organizations.

I want to say thank you for all the "youngsters'" posts. As someone who has been in the industry for over four decades, I do get lost in some of the newfangled things. Turns out, most of the time it is a language change, not a dramatic world-view change.

Of the three points of the article, I disagree with the last two (a major shift every 10 years; each shift leveling the playing field).

I have seen some things that felt like major shifts, but once under the hood, they tend to be re-shapings and combinations of existing technologies. This has often put me ahead of newcomers: I had not only the current technology, but I also understood the underlying historical technology.

There is no such thing as "irrelevant experience" in my opinion. For example - it is unlikely that Algol, SNOBOL or Fortran will make a new revival. But all that experience gave me the edge to be able to recognize the shortcomings or advantages of newer programming languages.

So thanks for all the nice words here. It's time for me to swap my tapes in my PDP.

I don't know. I feel like there's a pretty clear power law governing the skill-ceiling below which additional experience adds actual value to dev work. ~80% of programming labor (testing, basic REST services, static web content, managing a small-to-medium-size SQL database, etc.) has a very low skill-ceiling. I've been in the industry for 7 years, and I now run my own show doing all of these things. It's just simple grunt work. Then there's a very long tail of projects that require actual software engineering, where the skill ceiling is very, very high. Those latter positions exist all over the place, but you have to either create them or actively seek them out.

Conversely the skill-ceiling on managing technical projects is incredibly high. So although one certainly can efficiently stay on as an engineer, there is a much larger pool of opportunities in technical management, and they are much easier to find if you're grinding away at a FANG / enterprise company / growing startup / whatever.

If you want to do work that rewards many years of experience, it's there. Management work is just much more likely to fall into your lap (which, let's face it, is how the median employee finds anything).

For additional context, I also don't really think "programmer" is a profession in a vacuum. You need to level up inside a business vertical to really start adding specialized value, otherwise yes, you're a fungible cog. There's no shame in that, it just is what it is.

> ~80% of programming labor (testing, basic REST services, static web content, managing a small-to-medium-size SQL database, etc.) has a very low skill-ceiling.

You’re specifically talking about web development and have listed 0 of the things I, an embedded developer, do in an average day. All programming !== web development. There are more difficult problems out there than throwing up a web page. Especially in performance-sensitive applications.

Sure, I also didn't mention kernel programmers, database engineers, machine learning specialists, game engine programmers, or any of the myriad other semi-specialized programming domains.

I am specifically talking about the kind of development that employs, at a cursory glance, more than 65% of the development workforce. The lion's share of the remainder works on executables (commercial software) that run on a consumer device (most of which, again, is incredibly similar to web-client development in 2019), and within those specialized domains, I suggest the power-law largely still holds.

If you have some numbers to suggest that embedded engineering is a particularly good field for engineers who want a profession that rewards a high degree of specialization, I'm all ears.

Also: having worked on performance-sensitive backends for most of my career, most applications are performance-sensitive only in the shallowest sense and the marginal returns for improving performance do not in general justify the cost. One can (again, like I said) seek out domains where understanding how to build very low-latency or very energy efficient or highly concurrent (or whatever measure of performance you care about - very durable? very reliable? etc.) applications matters. But that's not 80% of the work out there.

Edit: also I find embedded engineering to be a very strange choice of counter-point. My last job alongside embedded engineers (at TE Connectivity) saw two of them defect to become fungible Java backend engineers so they could find stable work and transition into management. The remainder worked on specialized switches. Their day-to-day was managing a piece of software that polled chip readers and a REST service that hosted that information. I don't recall ever overhearing a conversation about optimizing TCAM usage or packet switching latency under load or whatever; maybe some of them got into the weeds on that every once in a while. I suspect, again, that this is largely akin to having kernel development skills on a team that largely operates in user-space -- not part of the 80% -- but I am absolutely not an expert in the field.

The whole point of the post was that knowledge-intensive domains exist (and are quite common, 20% is a large fraction), and that some domains reward vast amounts of knowledge (hence power-law distributed, not normally/exponentially/etc. distributed), but that you had to seek them out.

I'd say 90+% programmer jobs these days are working on web applications one way or another (whether UI, back end, or a service to support back end). Even when they're rich client UIs, like phone applications, the main guts of the application generally lives online and requires an internet connection to access.

The fact that you've summarized web development as just throwing up a web page, and that you've assumed web development is never performance sensitive, leads me to believe that although you're an embedded developer - you're likely an inexperienced one.

> Conversely the skill-ceiling on managing technical projects is incredibly high.

You don't always have to go into management to follow this path though. At companies I've worked at, as you become more senior as an IC, your role expands from contributing locally on your team to contributing to the entire org. There is also often a split between the technical leadership and the people leadership on teams, which allows lead engineers and managers respectively to focus on each area.

IMO an underrated and difficult aspect of modern tech is figuring out technical practices that scale with team size, so that you can add more people without sacrificing quality, focus, speed, etc. I'd say microservices and continuous deployment are two examples of technical practices that address this, but there are more.

> grinding away at a FANG

I've started to propose we s/FANG/Big N/g. "FANG" leaves off other obvious big companies, e.g. Uber, and it wouldn't scale to keep trying to add them to the acronym. Here N is used like when one talks about a list "N items long".

Uber and Lyft don’t belong on any “Big N” list until they show a profit rather than pissing away billions a year.

But that highlights the inconsistent usage of the term. Often it's used to just mean "a company with a lot of engineers with similar practices and culture"

How much profit does Amazon show?

Amazon made $2.62 billion in the last quarter, and $3.5 billion the quarter before that.

Amazon makes a ton of money, and (quite transparently I might add) they plow a lot of their earnings from some of their businesses into investing into their other businesses, which is what most investors want.

This is completely different from Uber and Lyft, where it's not clear that their profit potential supports their current valuations.

Amazon has demonstrated its ability to show a profit. Uber and Lyft have not.

If the acronym is only about profit then it is both misused and excludes other obvious companies like Microsoft.

It’s not entirely about profit, but I would argue you have to show a profit, or, demonstrate you could show a profit like Amazon has, to be considered a “top company” in any sense.

How does it exclude Microsoft, who reported $8.4 billion net income for Q2 2019?

Does it seem odd to exclude Microsoft from FAANG when it's the largest?

MAGA - Microsoft Apple Google Amazon


I just realized what else that stood for...

It's fine, we'll print the message on hats so people know we're talking about something else haha.

It's okay. Might as well settle into the next 5 years and try to have a good time. Lots of people are.


I agree, I guess I just meant "enterprise" - the purpose of including FANG was to call out that there's nothing particularly distinguished about working for those companies, the workloads are very similar to ABC enterprise workloads these days.




> While a technology shift doesn’t completely negate the skills of veterans, it certainly levels the playing field for recent grads.

It can more than level the playing field, especially when the hiring managers are particularly narrow-minded. I'm seeing more and more "degree in data science" (an academic discipline that didn't exist 10 years ago) as a requirement for what are effectively programming jobs. Of course, we can agree that this is stupid, and that not getting a job at such a company is "dodging a bullet", but unemployed is a pretty nasty bullet in and of itself.

This is generally true.

It's a combination of many things: brain "degradation", ageism, simple economic math (fresh meat is cheaper and more malleable), and other more important interests like having a real life. And when you really mute all the marketing BS, your delusions, and wishful thinking, the actual work of being a developer/programmer in a professional setting is a really shitty job and a sad life for the majority of people in those jobs.

I'm a few years from 40 and not having experience or interest in large organizations, I don't know what to do.

Programming is cool, but I always found it to be a tool to get somewhere, and that somewhere hasn't happened and I don't even know what it is or looks like.

But either way, always have a plan B and don't forget about it.

> the actual work of being a developer/programmer in a professional setting is a really shitty job and sad life

The thing is, it doesn't have to be and, until relatively recently, honestly wasn't. It's a shitty job and a sad life because we have open offices, ticket-tracking systems and daily standups. There was a time when programming was exciting and rewarding.

I don't have a problem with standups. I actually like having ticket tracking systems (how else are you supposed to know what is high priority to work on?). But, open offices are objectively terrible. Research shows it, people here complain about it all the time, and yet companies keep building them.

Where I'm at, the floor is divided in such a way that teams sitting together can sometimes have their own little "bull pen" area that's a little bit separated from other teams (if only by a whiteboard wall). This helps, and I like working in an area with just my team, but I would prefer real walls.

I hate standups with a fucking passion.

All a standup is to me is an interruption 30-45 minutes into the day, which causes me to lose an hour or more of time every single day. I basically just fart around with things that don't need much focus until the standup happens.

The worst part is that I suggested that instead of standups we just send what we would have said in the standup via slack. That lasted 3 days. The reason no one liked it? "I don't read what other people put into slack". Yeah, ex-fucking-actly. It's literally not helpful because you don't actually need to know what someone is working on that day, and if you do it's because you're also working on it and you can communicate that privately.

Sounds like your team might be doing standups wrong. Firstly they should be 3-8 mins tops. Secondly, I consistently find that when someone mentions what they're working on, more days than not, someone shares something useful (hey, we did something similar last year . . .) and when someone is blocked, someone agrees to get them going again. It's also just nice to have a sense what the whole team is working on.

[EDIT] I also misread the length of the standup. And agreed, it's tough to have an interruption 30-45 mins in, as you lose that time. Wonder if you'd be better off with mid-day standups or something, so at least the interruption was between worthwhile chunks of time?

if your standup is taking 30-45 minutes you are doing it wrong.

All you need is: What you did yesterday, what you are doing today, and what you are stuck on/need someone else's time with.

Some clarifying questions/interruptions from others in the stand-up are fine ("yesterday I got stuck on foo and I-" "Was it foo-bar, or foo-quux?" "Quux" "Oh did you try the flibble button?" "Why would I need to do that?" "Well since last month's release if you don't use the flibble button..." etc), but if they go on for more than 30 seconds they need to be shut down and done outside of the standup.

My team has up to 20-25 people in a standup and we're usually done well inside of 15 minutes.

Other tips are scheduling them before a set time (e.g. before the canteen opens, so if you over-run you are late to lunch), or doing them early in the day so they don't interrupt the daily grind too much -> get in at 8:30, get your coffee, 20 minutes of email triage -> 10 min standup, and it's 9am and you've got the whole rest of the day to do real work.

If it takes 30-45 minutes something is wrong.

I hated standups as well until I worked with right people.

You misunderstood: the standup happens 30-45 minutes into the workday.

Show up 5 minutes before standup, or 5 minutes into it, then. Problem solved. ;)

On a more serious note, read email, get a drink, and plan your day. You’ll be talking about the plan in less than an hour, anyway.

I would add code reviews to that. The advent of those was when it really stopped being fun for me, because it's really hard to keep that from turning into "your boss nitpicking every line you write."

In my opinion this is much, much better than trying to decipher what the heck that one co-worker pushed to production and how exactly it broke things. Of course there are ways to smuggle rubbish through code review, but it's still better than no reviews.

Both have problems. The best system is one where a passing review means "I feel comfortable maintaining this;" too often a passing review means "this is how I would have written it."

I have suffered under far too many "this is how I would have written it" reviews. It can be really demoralizing to write perfectly good, working, readable, performant, well-tested code only to have it rejected and have to rewrite it because the reviewer wishes you had used a different C++ feature.

I tried something like your "I feel comfortable maintaining this" approach recently when I was the reviewer. We have an internal geodata visualization tool that I've done some work on lately, and a talented C++ developer added a cool and useful new feature to the JavaScript front end.

He asked me to review it and said, "I'm sure you will find a lot of things I should change! This is my first real JavaScript project other than some hobby stuff."

And he was right. There was a lot of stuff I would have done differently. There was a mix of jQuery and document.getElementById, a fair amount of repeated code, var instead of const or let, and so on.
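For illustration, the kind of before/after cleanup I had in mind looked roughly like this (invented names and data, not his actual code):

```javascript
// Invented sketch, not the actual code from the review.
// "Before": older style - var, string concatenation, repeated formatting logic.
function labelPointsBefore(points) {
  var labels = [];
  for (var i = 0; i < points.length; i++) {
    labels.push(points[i].lat.toFixed(4) + ", " + points[i].lon.toFixed(4));
  }
  return labels;
}

// "After": const, arrow functions, template literals, repetition factored out.
const formatPoint = ({ lat, lon }) => `${lat.toFixed(4)}, ${lon.toFixed(4)}`;
const labelPoints = (points) => points.map(formatPoint);

const pts = [{ lat: 51.5, lon: -0.13 }];
console.log(labelPointsBefore(pts)); // [ '51.5000, -0.1300' ]
console.log(labelPoints(pts));       // same output, half the code
```

Both versions behave identically; the point of the review comments would only have been style and maintainability, which is exactly why they weren't urgent.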

Our "house style" for reviews would have been for me to comment on every single one of these things line by line and expect him to change them all to my satisfaction. It would then require a second review pass to clear up misunderstandings, and a third to see if it all ended up OK.

Instead I told him:

"Your new feature is awesome! You're right, there are a lot of nitpicky things I would change in the code to bring it up to more modern JS style, and I'm glad you asked about that. But I didn't find anything that looks broken or dangerous. And the feature works, right? I could spend half a day writing comments explaining everything I would do differently, and that would keep you busy for another half day fixing it up. But I know you have more important things to work on right now, so maybe we can try something different. Go ahead and submit the code as is so people can start using it. When I get a little time I will just go ahead and make the changes I'm thinking of. I'll add review comments on my own changes and send it to you for review so you can see what I changed and why. That will avoid a lot of back-and-forth. I think it will save us both a lot of time and be a more pleasant experience too."

Needless to say, the developer liked this idea. And his manager was within earshot when we discussed it and thanked me for thinking of this approach.

I still haven't made that update, and this reminds me to do it sometime soon. But the code still works, people are using it productively, it has not failed once, and who really cares if there are some minor imperfections in the code style? None of our code is perfect!

You'll never do that update right? The more you wait, the more that developer will get annoyed when you interrupt them with it. :)

In this case actually I would have taken the half hour and sat with them to explain what I think should be done differently and why, prioritized by importance.

Reviews done through tools work if there aren't many comments and if the basic design is ok.

I've seen reviews where someone's not skilled enough and they have to be taught idioms in dozens of comments in 10-20 patch sets. It's pretty horrible, but the positive side is that at least they're not merging crap.

> You'll never do that update right?

I resemble that remark! Yes, procrastinator here. ;-)

The thing is, his code is plenty good for now. Even if I never make that update, the code works and people are getting useful results from it. There's nothing overly complicated about it either; I or anyone else could pick it up and easily make any simplifications and improvements whenever needed.

And this developer is working on far more important things for the company, mostly in C++. I made the call that it would be better for our business to let him get back to that, since any improvements to the JavaScript code style on this internal tool simply weren't that urgent.

But I do appreciate the reminder and I will get to that update soon!

That sounds very pragmatic. I probably would have suggested sitting down and making the changes together. However, anything that avoids a couple of cycles of review is a bonus; that back-and-forth is super corrosive to productivity and often morale.


A few things to consider:

- Linters, to help move away from code style comments
- Peer code reviews (if possible)
- Inform your team/boss of bike shedding
- Use a tool like reviewable.io where comments can be marked as blocking or not. I’ll often comment on preferences and block on issues.
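On the linter point, here's a minimal sketch of an `.eslintrc.js` (assuming ESLint; the rule names below are real core rules, but the exact set is just an example) that turns the most common style nits into automated checks:

```javascript
// Minimal ESLint config sketch - moves "use const, not var" style
// comments out of human code review and into tooling.
module.exports = {
  env: { browser: true, es2017: true },
  parserOptions: { ecmaVersion: 2017 },
  rules: {
    "no-var": "error",       // flag `var`, suggest let/const
    "prefer-const": "error", // flag `let` that is never reassigned
    "eqeqeq": "error",       // require === / !== over == / !=
    "no-unused-vars": "warn", // surface dead code without blocking the build
  },
};
```

Once something like this runs in CI, reviewers can spend their comments on design and correctness instead of style.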

Yep. I've worked on great teams that did all those things. And not-so-great teams that didn't. The latter is really draining.

IMHO, review processes should concentrate on identifying problems, not solutions.

Code review is one of those things where the culture around reviewing and the experience of the reviewers makes all the difference.

A code review where the boss reviews code is pathological. When experienced peers do the review it can be brilliant.

In fact I would only trust a team of very competent seniors not to need code review. Even then, getting another opinion on your code can be worth it.

Everybody needs code review.

Though the way I scan the code during review does change by how much I know and respect the author on a technical level.

That correlates with but is not identical to their formal seniority.

I don't think there are any good work techniques when you're surrounded by assholes.

For me daily stand-ups are a great tool for closely collaborating with colleagues. I get that managers have often turned them into top-down status report meetings. (And that almost anything "Agile" has similarly, despite initial aims, been turned into an instrument of control.) But my best working environments have been in strong teams, and I'd hate to see the baby get thrown out with the bathwater.

The thing is, it doesn't have to be and, until relatively recently, honestly wasn't.

Indeed. It's almost as if it's a bad idea to take people working in a field where creativity and careful thought are fundamental and a wide variety of experience with different but related ideas is often helpful and then try to turn them into interchangeable commodities with the whole process dumbed down to the level of the least competent developer or manager in the team.

For my own career, escaping that sort of foolishness was the single biggest benefit when I made the jump to freelance work and even more so when I first became a founder. As soon as you go independent, you're no longer dealing with a subordinate employer-employee relationship, where everything you do is subject to the whimsy and caprice of whoever pays your salary. Instead, you're dealing with a business to business client-service relationship, where your objective is to provide the service the client needs. How you choose to solve whatever problem they have in terms of technology or process is much more down to your own professional judgement. If you do follow entrepreneurial route, you become effectively both client and service provider in that sense, and it's what you learn from your market research and customer relations that guides your direction in often even more general terms.

There was a time when programming was exciting and rewarding.

IMHO, it still is, as long as you find an environment where using fun things to get useful results is the emphasis. Mind-numbing corporate box-ticking is soul-destroying in any field.

We're privileged to work in a field where all we really need to do useful work is often a laptop, an Internet connection, and a willingness to use our brains, and where that work can still be valuable enough to others to make a very nice living off it, and where there is no shortage of potential customers.

Now, there's nothing wrong with being a competent professional who turns up and writes software during office hours and then goes home to enjoy life with their family/friends/hobbies like anyone else, and maybe for those people the structured corporate environment is helpful. And of course going independent has other challenges that aren't just about technical skills, and those are not for everyone. But for anyone who wants more than an office job and doesn't mind taking on a broader skill set to operate independently, I don't understand why they would continue to work in the kind of toxic employment environment we are discussing today. I suspect that in many cases it is simply ignorance (meant literally, not derogatively) of the possible alternatives and the paths to transition to them.

Incidentally, as a convenient side effect of working as an independent professional or through your own hopefully more enlightened business, the sort of ageist nonsense that motivated today's discussion also largely disappears. For a lot of clients outside the tech bubble, a 25-year-old in trendy clothes spouting the latest buzzwords is much less impressive than a 45-year-old who immediately gives off a competent, professional vibe. And if you're the 45-year-old who did keep up with developments and has had an extra 20 years of honing their skills, your rates can reflect your greater capability and productivity, which is much harder to achieve if you're still someone's employee at that point.

I love your comment and agree with everything you said.

I am a few years away from 40 as well, but as a freelance software developer who works remotely, I can only say I /love/ my job (most of the time) and hope to do it for many more years in this capacity.

Yup. Gray-haired contracting life is a different beast altogether compared to being an employee.

How did you start?.. the fear of not being able to acquire work is what's stopping me.

Started by talking with contractors who worked where I was full-time. Many had long term contracts & were making 2x what I made, doing the same work. Test the market before making the jump. Also, keep in mind that staying at the same job is not as secure as it seems; contracting is less risky than full times these days, especially as you get older.

This comment is so refreshing, could I email you some questions about starting a freelance/consultancy?

That seems relevant to today's subject. Perhaps you would like to post your questions here so others can read and write answers as well?

None of this is true if you work for yourself.

As long as we exempt engineering teams from OKRs I can tolerate it

>open offices

Noise-cancelling headphones are mankind's greatest weapon in the hopeless war against braindead management practices.

Do you guys now understand why I've said it's a sad life?

We are so "important" to mankind, yet the best we can do is to buy Noise-cancelling headphones.

Fair point, but to be honest nearly all modern work is borderline dystopic, regardless of how "important" it's deemed by society.

If you think being a programmer in BigOrg, Inc. sucks, then take a summer and work on a construction site or with a landscaping crew. Trust me, you'll yearn for that loud, open-plan office soon enough. Modern society would quickly implode without plumbers, yet actually being a plumber and crawling around under houses in rat shit is not very fun.

This doesn't work for me. It seems depressing as hell that we have to put in earplugs and pipe distracting media into our brains just to make work tolerable. How is cutting off one of our senses from our environment a solution?

Surely evidence of the declining mind share of unions.

Just want to nitpick on a couple of these:

> brain "degradation"

At 40? If you're suffering from loss of mental capabilities to the point that it affects your performance at work at 40, you have medical issues that need immediate attention.

> simple economic math ( fresh meat is cheaper and more malleable )

I've seen this stated many times, but nothing is forcing anyone to offer a particular salary or demand a particular salary.

Not sure that I agree on the first point. In my 20s, I could easily work 12 hour days, keep tons of complex state in my head, and almost never walked into a room and wondered why I went there (both literally and in the programming metaphorical equivalent).

In my 40s, much of that has changed. I'm beat mentally after a 6-7 hour day of coding. I can only keep smaller portions of the system in my head easily.

It's possible that I was blissfully aware of how shallowly I understood things or how ineffective I was in hours 7-12 of a workday in my 20s (and both of those no doubt have shreds of truth in them), but it seems way more likely that I am noticing a genuine difference in mental ability over the intervening two decades. None of that seems medically abnormal to me.

I'm mid 30s, and if anything I can keep more of the code in my head than when I was in my 20s. My abstractions got better, and I've seen lots of stuff before; I think this has had the effect of compressing everything.

I think of all the edge cases and pitfalls that I wouldn't have in my 20s. I have an easier time reading and understanding documentation, using libraries, reading and understanding other people's code. I think I'm also much more sympathetic to the poor sod who wrote this broken code under a time crunch 5 years ago.

I also get tired earlier, but the code I write is more efficient and maintainable now, leading to greater efficiencies in the long term.

I remember 10 hour days in my twenties that shortly turned out to be a total waste of time.

I think I'm probably a little less sharp than I was in my 20s (now 47), but I'm a FAR more valuable employee

I think you're selling yourself short. In your 20s, you have a lot less mental baggage to deal with, on so many levels. This is a good thing when you need to work hard (which we all did when we were young :) ).

Even if you can only work 6-7 hours a day now, you're most definitely spending that time a lot more efficiently than you did a decade ago. You don't need to keep more information in your head at the moment because you have decades of wisdom guiding your decisions.

It's hard to deny that one's memory deteriorates before age 40. But does it really matter that much to job performance? I'm not sure, but then I'm not a professional software developer, so that might explain our differing views.

I do have to disagree about the number of hours worked being a sign of mental degradation. I'd call that physical degradation, and to be honest, I'm able to work longer days now than when I was 30. Everyone's mileage will vary on that, though.

For me, what dominates everything else wrt mental performance is that I'm much more efficient at learning things now than I was in grad school 20 years ago. Whether it's a new technology, a mathematical proof, or reading someone else's source code, I'm massively more productive than I was at 25. If anything causes me problems, it's that I enjoy learning new things too much, and I spend less time than I should doing the work that pays the bills.

Executive and working memory decline is a real thing that happens with age, I didn't say I'm demented to a point where I suck at my job, but I definitely felt a slight decrease from my 20's to my 30's.

> I've seen this stated many times, but nothing is forcing anyone to offer a particular salary or demand a particular salary.

I didn't think bosses & hiring managers trying to pay the minimum and get the maximum was a contentious point. Our industry is still made up of young people who prefer younger people to do their work, for both economic (often short-sighted) and social reasons.

> Executive and working memory decline is a real thing that happens with age

I thought that wasn't really supported by anything significant. Life and health getting in the way is not a decline of mental abilities.

> I've seen this stated many times, but nothing is forcing anyone to offer a particular salary or demand a particular salary.

What do you mean by this comment?

Younger people are willing to work for less and I don't see how your comment relates to that idea.

Literally everywhere outside of Silicon Valley you can get great jobs in Software that move at a reasonable pace. SV is very different from the rest of US

Not to mention the great overall quality of life that a regular 9 to 5 job as a software developer provides in the rest of the country outside of SF/NY/SEA/LA/BO.

Be a median software developer, pair up with a spouse making the national full-time median, now you've got a $160,000 household income (which doesn't sound special in SV). Married that's $115k-$125k in take-home pay in most of the country.

$160,000 family income vs a $350,000 house. That's the kind of ratio that tilts life a lot further in your favor in many regards.
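The back-of-envelope math above can be sketched out; the salary figures and the ~25% effective combined tax rate are illustrative assumptions, not authoritative tax data:

```python
# Rough household-income math from the comment above.
# All figures are assumptions for illustration only.
dev_salary = 110_000     # assumed median software developer salary
spouse_salary = 50_000   # assumed national full-time median salary
household = dev_salary + spouse_salary

# Rough combined federal/state/FICA effective rate, married filing jointly.
effective_tax = 0.25
take_home = household * (1 - effective_tax)

print(f"household: ${household:,.0f}")   # household: $160,000
print(f"take-home: ${take_home:,.0f}")   # take-home: $120,000
```

With those assumptions the take-home lands inside the $115k-$125k band the comment cites; a higher-tax state pushes it toward the low end.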

Absolutely. I admire the idealism in Silicon Valley 100%, and personally enjoy working here. But it’s important to point out that a) Most people who come here don’t become millionaires and b) You don’t need to be a millionaire to have an amazing quality of life in other parts of the US.

You're absolutely correct, and it's something so many people do not realize.

"The actual work of being a developer/programmer in a professional setting is a really shitty job and sad life for the majority of people on those jobs."

I'd love to give you a tour of what people do for work, and then you can personally decide how bad programming is as a profession.

Funnily enough, there's a good chance I know more about it than you.

I just don't use other people's tremendously shitty jobs to glorify mine.

This is obviously a wild assumption, but you seem to have made up your mind. Good day.

I’m sorry, but programming is not a “shitty job and a sad life.”

While individuals certainly experience really bad situations in every walk of life, programming is an exceptionally valuable and versatile way to make a living, and offers a level of freedom and economic mobility that most people in most fields could only dream of.

While I'm aware of and fully agree with your University of Phoenix pitch, please try to listen and understand that it's only true given the right context (economic, social, time, etc.).

Most people sit at an office in front of a computer for at least 8 hours a day, some of them are developers. Most "problems" they "solve" are very distant from the "Changing the World and give meaning to your life, one PHP line at a time".

It's a great gig, but a sad life when you'll look back on your deathbed.

I'm not sure where you're working but where I work, being a developer is awesome.

Fresh, healthy, high quality, free food catered for 3 meals a day? Check.

Best healthcare possible provided at no cost? Check.

Stocked kitchen with all kinds of snacks, high end coffee, kombucha etc with ability to make requests? Check.

Freedom to come in when I want and leave when I want? Check.

Work remotely when I want? Check.

Any equipment I want at any cost? Check.

Top of market pay for size of company? Check.

Beautiful office with natural sunlight, plants everywhere, and fresh air? Check.

Autonomy and creativity in my role? Check.

Top percentile talent as coworkers who are genuinely amazing collaborators, interesting people and great to work with? Check.

Like with all professions, there are depressing jobs and great jobs. If you're a dev working at a paper mill in the midwest with draconian dinosaurs as management, yeah it might not be the best gig. But if you work at a company that values software and understands the leverage of great developers, then I can't think of a job that's more cushy and fulfilling.

What you say is that for you all those benefits outweigh the actual fact of sitting in front of computer all day. It's hard to give them all up, that's true.

Could you imagine working retail where you have to be on your feet hours a day with no break? You have no choice.

I get to sit or stand when I want (I have a motorized, adjustable standing desk). I can go for a run mid-day if I want. I can hit the gym if I want. I have any number of options. If I choose to sit in front of the computer all day, that's usually MY fault (but to be fair sometimes there's a launch date looming that results in me working extra).

It's all about choice. I have that. Many professions don't.

When I was younger with no profession I tried painting houses and waiting tables. Now *those* are hard. Programming is paradise compared. Fun, challenging and comfortable.

I understand if being indoors isn't for you, but programming is categorically among the highest-paid jobs with the lowest physical requirements. Since all I need is an internet connection and a laptop, it's allowed me to do things like live out of an RV full time traveling, fly to different cities, or spend more time with my family/friends while we are all healthy.

I go to my family reunion and out of 5 of us in the same age bracket, everyone else has to rush back to their jobs in 2 or 3 days, whereas I can spend the full week with the "old folks" - that's time I can't get back.

Am I changing the world with my code? No, not really. Am I helping people in turn work less while still getting to solve increasingly harder problems? Yes. I can only think of a few other professions that would keep me as happy, and none offer the reward/effort level of this.

Maybe the one difference is I started with computers out of a love for problem solving, not for the other things I said. Those are just a nice side benefit I won't give up now that I'm older by a bit.

University of Phoenix pitch, it is not.

I've been the opposite - preferring to work at big, rather boring places. The work itself is not as exciting or fast paced, but things are stable. And I've not seen any signs of age-ism. The two guys on my team pumping out the most code are both 40+, I think 50+ actually. Sharp as tacks. And it doesn't have to be old tech. We're using a lot of Go and React now, for example.

With all that experience you must have forgotten about or never held a truly crappy job. Try working in a call center or retail. Any job where I can get paid to code is better than 90% of the jobs out there.

>the actual work of being a developer/programmer in a professional setting is a really shitty job and sad life

Compared to what? Have you ever worked manual labor, retail, food service industry, etc?

> the actual work of being a developer/programmer in a professional setting is a really shitty job and sad life for the majority of people on those jobs.

Oh man, I cannot relate to this at all. I feel very lucky to be a software developer - there are far more boring jobs out there, that pay far less, with less flexible hours.

I’ve admittedly only had 4 jobs in 20 years but I wouldn’t describe any of them like this. Maybe this happens more at scrappy startups and I’ve worked at more established companies?

>the actual work of being a developer/programmer in a professional setting

What is a "professional setting" in this context?

Sitting in front of a computer for years, trying to fix mundane problems, bugs, implement very important and urgent features that are very useless and not really urgent.

You know, being the brick mason of yet another cathedral of delusions and failure, as most businesses are. Which is life, and I don't really have a problem with that.

But we are all human and after a decade of this shit, it gets hollow and meaningless.

Why not try working on your own thing? Sounds like you're burned out on corporate work.

I did, but I got lost in my own mind, and a solo business is obviously a lot harder (if you're just a grunt programmer).

Without maturity, a strong social network and discipline every solo venture will end up in isolation and depression.

But I still try the usual side projects that get started and not finished like everybody else. :)

Right now, I just wish I could find something 80% business and 20% tech with a feedback loop comparable to a plumber's (work for a day or 2; when it's done it's really DONE; move to the next job).

Oh, it's you again. I don't mean that in a mean way, but your comments do stick out as being burned out/not happy.

I'm happy you opened up a little as to how you feel. If you want something that's 80% business and 20% tech, I can highly recommend getting into "high end" consulting. "High end" just meaning rich people, their houses, and a small smattering of businesses that themselves have a high earnings-to-effort ratio.

I don't mean writing code, I mean doing networking/server work/maybe some desktop stuff. I know that doesn't sound glamorous, but you get to make good money (really good money), most jobs are DONE when you're done, and it's generally very low stress. (Learn and understand VLANs and that level of networking and you're already in demand for a lot of people.)

Most of the job is learning to setup/meet expectations, know how to talk to those kinds of people, and other business end skills. You'll also never age out - these kinds of people have long ago learned to identify other markers and often are older themselves.

I did this for years and it fit all the things you say you'd like and it left me with plenty of time to write code on the side (or not, the time is there to do what you love.)

Doesn’t have to be solo :-) But yeah, I hear you, I did a solo thing for 5 years, probably wouldn’t do it solo for that long again.

Age discrimination is absolutely a thing, but (and this is only based on anecdata) I would say it is much, much worse in the valley than just about anywhere else. The rest of the world seems to have a less damaged culture.

I'm a 56 year old programmer from New Zealand. I moved to the US to join a hot Silicon Valley RISC-V startup earlier this year. I think the average age of our software team is probably over 50. Certainly the median is.

My previous job (and my first ever outside NZ) was working for a multinational electronics company in their Moscow Russia R&D lab, a job I got when I was 52.

Working on semiconductors/electronics is "old tech" that is still very important. It's been around for quite a while.

Compare to Fortran or supercomputers. Was important; isn't so much these days, although it's still used.

What I'm saying is that you got on a train that has been running longer than some other trains, hence you meet older people on the train. As someone who has little interest in management I hope to be as successful as you, in this sense of picking long distance trains.

As someone who ran a startup and worked for several others in the valley, then moved away: I agree.

I’m in Portugal and everyone who’s moved their tech hubs here in the past 2 years is only hiring juniors. And we’re talking about a mix of big companies (who ought to know better) and relatively fresh (<5y) startups.

Because juniors in Lisbon will work for peanuts while anyone with more than a year experience knows they can demand a higher salary for living in Lisbon.

I'm 50 and still programming. Programming is all I know how to do. If that was taken away from me, I would just rot to death. I don't want to be anything else, and I refuse to be forced to do something I don't want to do. I'll learn to farm and support myself that way.

So, is it still preferable for the manager to not hire anyone and let the work sit undone rather than to be forced to work with a detestable "older" worker?

Perhaps the government should start giving disability status to people because they are over 40.

> I'm 50 and still programming. Programming is all I know how to do. If that was taken away from me, I would just rot to death.

Interesting statement. Someone recently told me the biggest problem with US Presidential Candidate Yang's Universal Basic Income plan is that people who have had a successful career in one field their whole life often do not want to learn a new career even if you help them financially while they transition. For reference, the discussion was about winning an election more than solving a problem.

It's very interesting to hear the above statement from someone in tech vs a different industry. Reminds me how similar people across all industries are when often some of us tend to feel unique for some silly reason.

>people who have had a successful career in one field their whole life

It's not only being successful, but also being happy with the characteristics of software dev work.

I'm not sure I can think of a harder field to jump into in middle age than farming.

They probably meant farming on a backyard gardening level, I know plenty of people who got into it in middle age, particularly Mormons who try to be relatively self-sufficient. Not sure about the backyard size needed to actually support yourself though.

> I'll learn to farm and support myself that way.

With the climate apocalypse, that's becoming less and less likely.

Actually if everything folds in completely the way the alarmists are telling us it will, being able to grow your own food will be the only thing that will save you.

Yes. But being able to grow your own food is what's becoming less likely.

Just like all the other predicted climate apocalypses the past 50 years. For a group that should be good at picking up patterns they sure are slow to pick this one up...
