The Planned Obsolescence of Old Coders (onezero.medium.com)
268 points by coffee 12 days ago | 349 comments





I wanted to like this article, but I was annoyed by the second sentence, which asserts that the tech industry is extremely white and male. The "male" part may be true, but the "white" part isn't, and that fact is well known to anyone with even a passing familiarity with the industry. The big tech companies are actually LESS white than the US as a whole; it's Asians who are vastly overrepresented. In 2017, Google's tech hires were 47.1% Asian and 42.1% white. Facebook's 2018 tech hires were 51.3% Asian and 42.7% white. This compares to the overall US population, which is 72% white and only 5% Asian.

I don't want to start a flame war about diversity, I just want to get the facts straight.

https://diversity.google/annual-report/

https://www.facebook.com/careers/diversity-report

https://en.wikipedia.org/wiki/Race_and_ethnicity_in_the_Unit...


This reads like you seized on one sentence, didn't like what it said, and dismissed the whole article without reading further. That's not fair, IMO. I read the whole thing, and the article uses its first sentences to set the backdrop for the author's argument that older engineers and programmers are forced into early retirement, and that companies in the tech world and outside it make rookie mistakes that could be avoided by treating engineering as a lifelong profession.

Alternating between complaints about White overrepresentation that aren't even true and politically correct terminology like "People of Color", while the entire article is a love letter to an underrepresented group, is just jarring. It makes me wonder how informed the author is about tech if he thinks Whites are overrepresented generally. There's either a lack of accuracy or of moral consistency. Why the hell is the author even bringing up his poorly informed opinions on racial topics in an article about ageism, anyway?

Anyway, the article as a whole is heavy on anecdote and light on data and facts. There is an emphasis on outcome metrics, but the author doesn't bother to examine any data on productivity vs. age.


The problem with your argument is that the existing diversity initiatives in the programming world are mainly gender- and race-focused. That is undeniably true. There hasn't been anything at the same scale for older programmers. In fact, there have been many articles on this very forum about how many tech companies filter out older engineers when searching for talent; it's not the first time this topic has been brought up. IBM was actively phasing out older and long-term workers. Let's not act like the data isn't there to support the claim, even if much of it is anecdotal.

I didn't really see anything substantiating the claim that, e.g., 30 years of engineering experience in tech is more valuable than, say, 20. It was just an assumption.

It's also very disingenuous to use these four oversimplified categories. There are no end of people who appear white, but don't personally identify as such for a litany of perfectly reasonable factors. And how much meaning does "Asian" have, when it's expected to somehow include two billion-person countries with vastly different genetic and cultural backgrounds?

That does not take away from the point that the industry is not as overwhelmingly "white" as people try to portray it.

You're right that it's a lousy set of categories, but I don't think it's disingenuous. Those categories roughly match the ones that large companies in the US have to track for their EEO-1 reports. I can't blame them for not doing more than is required of them for the sake of statistical accuracy.

I'd be interested to see aggregate demographics for software engineering, but I don't think that two primarily-SV companies accurately portray our industry/field.

The organisers care about the colour of your skin and the type of genitalia you possess, not so much your technical accomplishments.

As a female developer I really don't like the idea that I might be selected just to fill a quota.


FWIW I didn't get that impression. Seemed to me like the author, conscious of the women's holiday today, used the concept of social awareness as a segue into ageism.

From the author's second sentence:

> The organizers know how male and white the tech industry is, so they make a special effort to recruit a diverse speaker lineup.

Yes, the author was using social awareness to segue into ageism, but it is still factually incorrect (or at least highly misleading) to state that the industry is predominantly "white".


Comparing to the US population is a touch disingenuous. Those are California-based companies, and California has a much larger number of Asian residents than the US as a whole thanks to its proximity to Asia (15% as opposed to the 5% you quoted for the US). They're still over-represented, but not at 9.5x the base rate.
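
For what it's worth, the arithmetic behind those ratios (a rough sketch in Python, using the percentages quoted in this thread):

    asian_share = 47.1            # % of Google's 2017 tech hires, per the comment above
    us_base, ca_base = 5.0, 15.0  # % Asian population: US overall vs. California
    print(asian_share / us_base)  # ~9.4x the national base rate (the "9.5x" above)
    print(asian_share / ca_base)  # ~3.1x the California base rate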

IMO, that’s not disingenuous when talking about an industry and using two well known companies as an example.

Asians, especially those from India, have an outsized presence in technology jobs. It is no different from any other ethnic industry affiliation: the Irish were cops, Greeks own diners, and South Asians are huge in tech, with ethnic social networks that connect folks to jobs, just like any other tightly meshed ethnic community.


I made it pretty clear that I agree they're overrepresented, but quoting the wrong base rate makes it look like a much bigger deal than it is. I work at a NY-based rather than CA-based company, and while the same is true there, we see a lot fewer Asian people in tech than the nearly 50% that West-coast firms claim.

It depends on what you’re doing. Some verticals in my neck of the woods are 65% or more Asian, mostly because the candidate pool is tight and often requires relocation.

I'm looking at that Google report link and it says, for 2017 Tech: 39.2% Asian, 53.2% white, whatever that means.

It's worth mentioning that for 2017 leadership, the Google numbers are significantly different: 26% Asian, 68% white (and 75% male). So the decision making is still quite white and male.


> I wanted to like this article, but I was annoyed by the second sentence, which asserts that the tech industry is extremely white and male.

In case you missed the memo, the current cultural climate and trendy narrative structure demands that everything, no matter the topic, must include some reference to identity politics. Facts only interfere in this sort of narrative building, so please ignore your data and focus on emotional triggers instead.


While those two companies are large employers, does their labor force make up the majority of developers in Silicon Valley? If not, then that's two companies that have done well at upsetting the old balance, but what about the rest?

I refuse to read or react to articles hanging out troll-bait so that the social trench war can creep onto their pages and bring in that sweet ad money.

Non-engineering departments in tech will be majority white male.

Is that a quantified thing? I've only worked at 3 tech companies, but that was not at all my experience. Engineering was the only one that had predominantly men.

Of course not, but internet comments sections are begging for absurdity and falsehood.

Bonus points if it fits somehow into the very trendy culture war narrative.


For context: I'm VP of a software consultancy responsible for delivering software on time and on budget.

My take is that older programmers who maintain their skills on new technology are worth their weight in gold. The financial trick is to pair them with younger developers to create an overall cost effective team. If both the older and younger devs go in with the right attitude, it can be wonderful for both the project and their respective careers.

I also think it's really, really dumb to push developers into management positions if they can't do the job. I'd much prefer putting younger managers with great management skills into the mix than just rotating older devs into that role.

Bottom line: software requires lifelong learning. If you can't or won't stay current, you're in trouble. If you can, then it's just ageism or a bad business model that's keeping you out of awesome projects.


In the article they mention a Python conference. I think Python is a good counter-example. It's an old language and still as strong as ever. On Stack Overflow, it recently passed JavaScript [1]. Unlike JavaScript, new Python code isn't all that different from old Python code.

A lot of older Python candidates are being passed over for younger ones. Sure they might not be learning Rust, but there are younger python developers that are having an easier time getting a job, and they aren't necessarily learning Rust either.

[1]: https://stackoverflow.blog/2017/09/06/incredible-growth-pyth... https://insights.stackoverflow.com/trends?tags=python%2Cjava...
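
To make the stability point concrete (my own sketch, not from the article): a function like this would have looked idiomatic in Python a decade ago and still does today, which is harder to say of most JavaScript of the same vintage.

    def word_counts(lines):
        """Count word occurrences; runs unchanged on quite old and new Pythons."""
        counts = {}
        for line in lines:
            for word in line.split():
                counts[word] = counts.get(word, 0) + 1
        return counts

    print(word_counts(["to be or not to be"]))  # {'to': 2, 'be': 2, 'or': 1, 'not': 1}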


I'll be 50 soon and I've been dabbling in Rust. Not to improve my skills or any of that, but simply because my 40 years of coding experience tells me it really is awesome. I have some specialization that will be good for some time to come. When I read these ageism articles I just keep shaking my head. The last company I worked at had tons of older coders. You startup folks must live in a fucking bubble. (That was for effect; I love HN, in case you oversensitive millennials can't tell.)

So did you experience ageism or not? I couldn't tell from your comment.

No. Not even a little. My post was a bit unclear. One company I worked for has lots of older guys and it was clearly good for their products.

> It's an old language and still as strong as ever.

Does it make sense to consider Python an old language if it's still a living language? The next version is coming out in October, and most new projects started today are running on versions released within the last three years.

That seems pretty different from something like COBOL, which is technically still being developed, but not in anywhere near the same way.


It's also worth noting that recent python versions have added tremendous quality of life improvements. If you haven't kept up with the language, then the code you're writing may still run, but it's not "pythonic" anymore.
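
Two quick examples of what I mean (assuming Python 3.6+, since f-strings landed in 3.6 and pathlib in 3.4):

    from pathlib import Path

    name, count = "report", 3
    # Older style still runs, but reads as dated:
    msg_old = "%s has %d entries" % (name, count)
    path_old = "/tmp/" + name + ".txt"
    # The modern equivalents most reviewers now expect:
    msg_new = f"{name} has {count} entries"
    path_new = Path("/tmp") / f"{name}.txt"
    print(msg_new, path_new)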

I haven't touched python in years. Can anyone link to a list of examples? I remember hearing about optional typing, which is always my favorite addition.

Python is indeed getting better and better, but I personally wouldn't include optional typing... yet. Mostly because optional typing in Python is just for tooling, not for the compiler or runtime [1]. Python is one of my go-to languages, but I haven't found a compelling reason to use type hinting yet.
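
A minimal sketch of what "just for tooling" means: the interpreter runs code that violates the hints without complaint; only an external checker such as mypy flags it.

    def double(x: int) -> int:
        return x * 2

    # CPython ignores the annotations at runtime:
    print(double("ha"))  # prints 'haha', no error
    # A static checker (e.g. running `mypy` on this file) would flag that call.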

Python's really good at binding C code [2], which is one of the reasons you see it in so many machine learning and scientific projects (NumPy, scikit-learn, TensorFlow, etc.). I think that's what's given Python a lot of legs in the last few years.

[1] https://stackoverflow.com/questions/41356784/how-to-use-type...

[2] https://stackoverflow.com/a/10202569
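
As a tiny illustration of how low the barrier is (a sketch using ctypes from the standard library; assumes a Unix-like system where a shared libc can be found):

    import ctypes, ctypes.util

    # Load the C standard library and call one of its functions directly.
    libc = ctypes.CDLL(ctypes.util.find_library("c"))
    print(libc.abs(-42))  # 42, computed by C's abs(), not Python's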


Not if being old has a negative connotation, or if being old implies that they have no new ideas, neither of which should be true, of technologies or of people.

> Does it make sense to consider Python an old language if it's still a living language?

Python, I'd argue, is very much a living language. It was introduced in 1990, so it's 29 years old. Last release was 5 days ago as I write this [1]

Java is also a living language. It was introduced in 1995, so it's 23 years old. The last release was on September 25 [2].

COBOL was introduced in 1959, so it's 60 years old [3]. The last stable release was in 2014 (which is a lot newer than I would have expected). COBOL is a great object lesson, I think, as the parent comment calls out. I don't think many software engineers would consider it a "modern" language, but according to the Wikipedia article, as of 2012, 60% of businesses use COBOL. So it's far from a "dead" language.

Lisp was introduced in 1958... 61 years old. [4]. I don't think many developers would consider Lisp dead, either.

COBOL's longevity seems to be tied to the fact that much of the legacy code encapsulates business logic, as opposed to UX logic (websites, mobile apps, etc.). There's more than a few lessons in there, I think...

[1] https://en.wikipedia.org/w/index.php?title=Python_(programmi...

[2] https://en.wikipedia.org/w/index.php?title=Java_(programming...

[3] https://en.wikipedia.org/w/index.php?title=COBOL&oldid=88324...

[4] https://en.wikipedia.org/w/index.php?title=Lisp_(programming...


I consider the 1958 Lisp to be dead. The Lisp family's longevity doesn't stem from maintaining legacy code from 1960, but from the fact that it embodies attractive concepts that continue to captivate new generations of coders, often not in connection with anything mundane, like a job.

Not ragging on Lisp here in any way, but it's been a long time since the original Lisp was viable. It retains historic significance, and anything that calls itself Lisp still has cons, car, cdr, atom, lists ending in nil (which is a symbol) and such. But in 1958, you had no macros, no lexical closures, no exception handling, no backquote, no objects or structures, no bignum integers, ...

Whereas COBOL today is pretty much the same pile of stuff it was in 1960-something. It's not a vibrant language family with newer members. COBOL does not absorb new concepts.

There are new, young people learning COBOL, but only in order to become maintainers of legacy systems: job security in banking, finance and government sectors. Maybe a few have a perverse curiosity: they want to know what is really behind the dirty five letter word.

New people coming to a Lisp are coming for the Lisp itself. There aren't that many Lisp jobs involving new or legacy coding; new Lisp people learn and make new stuff from scratch.

There is simply no comparison.


Companies should embrace a multi-branch management approach. Have technical managers: experienced engineers who can focus on ensuring that people are doing quality work and building great products. Then, alongside them, the team should also have a people manager, who is responsible for navigating the politics of the company.

I've had managers who were both technical and people-oriented. They both bring a lot to the table. But the older I get, the more I value having a manager I can go to who is good at managing optics and helping me deal with political or institutional roadblocks. In an ideal world, I'd have both.


This is somewhat the dream of matrix organizations, but it has its own set of problems.

But fundamentally, if you can't move decision-making power/responsibility/(to some degree compensation) out to non-traditional locations in your org structure, you'll naturally have to move those people toward the more traditional locations...


I work at a small matrix organization. Deployments and time allocation are fairly chaotic due to lack of communication, but on the other hand we do have relatively many older people doing technical work and projects are managed by a mix of older and younger people.

> I also think it's really, really dumb to push developers into management positions if they can't do the job.
Even if they can do the job, it may not be the best use of their time. If someone is really good technically, they are more difficult to replace than a competent middle manager. So if that's the trade off you are making, you are probably losing out.

> older programmers who maintain their skills on new technology are worth their weight in gold

And it's also true of older programmers who maintain their skills in older technologies. See all the COBOL comments below. If history is any guide, competent Java programmers in 10 years will not be aged out, and will be compensated more than whoever is competent (regardless of age) in the current tech stack.


>> I also think it's really, really dumb to push developers into management positions if they can't do the job.

This has been known forever, yet it still happens - Why?

>> I'd much prefer putting younger managers with great management skills into the mix...

One reason is that it is impossible to tell whether a young manager has great management skills. It's hard to tell whether even an experienced manager has strong skills.

As a result, technical proficiency is used as a proxy for management potential. It probably correlates better than picking a random person, but it is not great. The new manager is then really unhappy, quits or is fired, or eventually learns the new skills and is successful.

Repeat.


Some companies only have one kind of career track. If you want more money, you have to be in a manager position. They don't have a way for experienced programmers to keep moving up in their careers as programmers.

Is there any tech company where the technical track doesn't plateau below the people-manager track?

The issue isn't the absolute height. The top manager (CEO) is going to make more than the top IC; that's just obvious.

The real issue is that management is a pyramid, but IC is a ziggurat with, if you're lucky, an obelisk sitting on top of it.

People like to point to the principal staff engineer or whatever, but there's one of them for every twenty upper-mid-level managers making the same money. And all those managers can take their skills to another company, but if an IC guru leaves the company where he made his bones, he's probably destined for consulting (at best). Consulting has its upsides, but it's no more for everyone than management is.


I think both are true. There are more "slots" for management than technical leaders at the same compensation band. There are also generally at least some compensation bands in management that are above the highest band tech lead role (architect, chief/principal/distinguished/staff/whatever).

Many companies have separate tracks for independent contributors and managers. You can stay an IC and level up to Distinguished Engineer or Fellow or Staff Engineer or whatever, without needing to go into management.

I think we might need a curated list for those. At 17 years of career I've only seen one such company.

It seems pretty common with bigger companies: Google, Apple, Facebook, Amazon, Oracle, Box, Zynga, Electronic Arts, Pinterest, Microsoft, etc.

The tracks and levels are commonly referred to as the "job family architecture" -- don't be afraid to ask a recruiter for this information. Otherwise you'll never really know if you're being properly leveled. (For instance, I believe a Principal SE at Microsoft is roughly equal to a Staff Engineer at Google.)


I guess that would just not be realistic. There was a comment from a while ago that I think makes total sense: the organization has to evaluate the value that you bring and create. If your position moves and directs dozens of people, it will be hard for an individual contributor to match that importance no matter what. Of course, it's fine if you prefer to remain in a more individual, engineering-oriented position. But there is little reason to expect you'd reach the same level of importance to the organization as some top managers/executives.

Agreed, completely. Most ICs overestimate their importance by far. Most people do regardless of title.

In my company it’s also much slower to move up the technical track compared to the manager track. If you want to make money it’s much better to be manager.

GitLab (I work there) does allow you to become Staff-Level which is the same multiplier as a manager.

Yes, that's common: but I have yet to see a case where the tech track goes as high as the management track. For example, "Distinguished/Staff Engineer" might be the equivalent of a "Senior Director" in an org with two or three levels of management above that (VP, SVP/CTO, EVP, etc.).

Google's tech track does go all the way to VP equivalents.

>> I also think it's really, really dumb to push developers into management positions if they can't do the job.

>This has been known forever, yet it still happens - Why?

This problem exists in hardware departments too. My best guess is that they want managers familiar with the tech/industry/company. My second best guess is that it's a natural human tendency and hasn't been addressed systematically.


I come from an industry where software is written under management that largely has no clue what software is or how to evaluate its quality and tech debt - and thus creates exponential tech-debt time bombs.

I've been doing some job interviewing lately, at 37.

I do find that companies seem to know how to evaluate young programmers: just test them on their knowledge of your tech stack. Any young coder who is good and is working in that stack will know the basics. The bad ones won't.

For older coders, I'm not sure that same signal works. I've forgotten more frameworks than most coders will ever learn. I mostly don't care about programming languages, they're roughly the same and I can learn a new one in a couple weeks.

I know technologies like Rails and SQL, but I don't play with the new toys as much as I used to, because lately I would rather solve a new problem or build a new tool than learn a new tool.

But how do you evaluate a coder on their ability to solve problems? Much harder.

Even more difficult: how do you get a 24-year-old coder who only knows React and Firebase to evaluate a 37-year-old coder who has built web frameworks from scratch?

It's another world, and a lot of companies just don't bother.

Thankfully I have no desire to work at any of those companies. I solve problems. If a company can't hire for that I don't belong there. There are plenty of companies trying to hire people to solve actual problems, not just put coder butts in seats.


This whole "learn a new one in a couple weeks" mindset is toxic and needs to die.

Yes, you can learn anything at a superficial level in a couple weeks. If you've spent the last ten years maintaining an ASP.NET WebForms app running on IIS, then within a couple of weeks, you can contribute code to a Go microservice running in Kubernetes and using Kafka.

And that kind of "basic competence, with the ability to learn more" is what hiring managers want out of a junior dev, so if you don't keep your skills up, well, great, you're as useful as a junior dev.

But what they want out of senior devs is people who know the ins and outs, who understand what to do and why to do it and can explain that to others, who can avoid the pitfalls of bad tech choices and misguided implementation patterns.

If you've got decades of experience on your resume, my guess is that you want to be evaluated as a senior dev. And if you're a senior dev who doesn't know modern tech, then... you're not that useful. Your encyclopedic knowledge of the WebForms event lifecycle doesn't give you any particular insights into the design challenges of a microservice ecosystem; your ability to tweak IIS for maximum performance won't help you write a Helm chart.

If you want to be hired as a senior dev, you need to understand how the modern world works, and all the posturing around "I could learn it in a week" or "I prefer to focus on problems" doesn't impress anyone.


If you really are a senior technical person, you have many valuable skills that are not tied to a tech stack. Otherwise you may have a related title, but you are really just a technology domain expert - which is sometimes very useful and valuable, but it's not the same thing at all.

So yes, it's helpful that people keep current, and obviously waiting for a new developer to get up to speed on the particular stack is always a cost (and it isn't a couple of weeks, for anyone). However, if what you really need is a senior person, it may well be worth that time.

Seeing this assumes the hiring manager understands how software is actually produced and maintained, not just what their current tech stack needs. It's surprising how often this isn't true, but a contributing factor is how mid-level managers are made - which is related to this discussion, I think.


What I'd say to that is, yeah, it's true that a senior dev will have skills that go beyond a particular tech stack -- but if you're working as a dev, then those skills are still tied to your technical knowledge.

You might be great at mentoring... but you have to know the stuff to mentor people in it. You might be great at communicating with stakeholders... but you have to understand the technical constraints of your system in detail to have that communication. You might understand principles of software design and debugging... but you still need to understand the particulars of these technologies to make your design or to focus your debugging.

And yeah, you can learn the tech stack, and a company that really needs a senior dev eventually could do okay to hire a senior dev with out of date skills, and wait for them to get trained up. But don't plan on it taking three weeks, is all I'm saying. To really get to that point of fluency and domain expertise takes longer.

(And also, one of the things that senior devs are supposed to do is evaluate new technologies and figure out when and where adopting them can make sense. If you've fallen so far behind on learning new tech, then it's not clear if this is something you're able to do -- maybe it is, and you just worked for a workplace that stifled adoption of new tech, or maybe you're someone who'll just learn what you need and then stop paying attention.)


While I see what you're saying, a lot of your language seems to come from the idea that tech is a progression: that we are always moving "ahead" and that people get "left behind".

That's mostly not true. While there definitely have been advances, tech is also very faddish and repeats itself. Trends are often reactive, techniques repetitive. Someone who has been around long enough to see some of these waves resolve themselves is in a much better position to evaluate current and upcoming approaches. Technologists also tend to be myopic, and view the things happening in their own tiny slice of the industry as "what's happening". There is a whole universe of interesting stuff on the go that you are at best barely aware of.

Personally, if I'm hiring senior technical people I'm much more interested in their continuing level of curiosity and engagement with something, rather than particular stacks.

So maybe you are worried that your candidate who has been doing ASP.NET WebForms on IIS for the last decade will have a hard time getting up to speed on your use of TypeScript and React... but if I find out that she's also been building Raspberry Pi infrastructure for fun and messing around with the Julia language to implement a cellular automata project, I'm not all that worried about her ability to get up to speed, even though none of that stuff will get used by the team.

But yeah, it doesn't take 3 weeks. It doesn't actually take 3 weeks for a new dev already well versed in your technologies, with rare exceptions. At least not to be firing on all cylinders.

The thing is, a huge number of teams really, really need a good senior dev. They often don't think so, and often erroneously think they already have some... but they don't. Instead they have people who had "Senior" (or "Staff" or whatever) added to their title because they'd been there for a little while. And every piece of work the team produces suffers for it. So yeah, it's typically worth the investment.

Now granted, that assumes you are good at identifying talented people. This is actually a big ask - and one of the issues surrounding the OP's questions really comes back to that. I think that a lot of software and technology development suffers fairly systematically from poor quality middle management. One of the symptoms of this is the "ageism" discussed here, but it's only one of many....


Yes. These days gRPC is a new technology. And yet RPC has been around in one form or another since the 1960s.

> then those skills are still tied to your technical knowledge.
I think this is a problematic way of looking at it. Think about it this way: if you are hiring a fairly general developer or engineer, their overall impact is going to be determined by something like 30% technical / 30% problem solving and communication / 30% teamwork & fit, which leaves you 10% to move around among those categories. In the much rarer case that you are hiring a domain specialist, it is something like 60-70% technical. But you probably aren't hiring someone like that, or shouldn't be very often.

Focusing the hiring too much on that first 30% impact gets you in trouble.


If you’re competing as a “general developer” with ten years of experience, you’re a commodity that is competing with everyone else and if you aren’t knowledgeable about the tech stack we are using, you’re at a disadvantage. The last thing I want is to hire someone who has been doing Webforms for 15 years when I can find someone that has been keeping up. Why should I have sympathy for someone my age who didn’t have the wherewithal to keep up when I can be aggressive about it?

I know developers in their 30s/40s/50s who are both great developers/architects and have current skills. Why would I settle?


I think you missed the point I was trying to make, so I'll try to summarize. There are two parts. If I'm hiring you as a 10-year veteran "general programmer", I'm going to be very interested in the non-tech-stack skills you have developed (i.e., the other 60-plus percent of your impact). And let's be honest: the vast majority of 3-year-experience "general developers" are weak there. The other point is that technology isn't a straightforward progression, so while it isn't great if you have been "stuck" in a tech stack you consider obsolete, by itself that shouldn't be enough to reject you.

Two other points: First, of course when hiring you want to find the "perfect fit", but you almost never do, so you have to make tradeoffs. Hiring an inexperienced person with the "right" tech stack when you really need to bring experience into your team is a common anti-pattern. Second, as a hiring manager you need to properly understand the difference between "10 years of experience" and "one year of experience, repeated ten times".


Agree with all this, but I'll also say that if you're an experienced developer who hasn't kept up with the times, that's not just a fact in itself, it's also a signal about what kind of developer you are. Most of the best developers are active learners who are always trying new things just as a matter of course.

So if you only know old tech, I'm a lot quicker to believe that you're a "repeated ten times" person.


I get where you're coming from, but I didn't get the impression that GP was ignorant to new developments. In my experience the most important thing is really wrapping your head around a domain to be able to solve concrete problems in it. Whatever framework or programming language you choose to solve those problems with is peripheral, really.

Also, most coding is maintenance. So yes, an experienced developer in some language is really worth a lot when you initially start out designing your system. But most of the work spent on that system will be maintaining and extending it and to be able to proficiently do that you need to:

1. Grok the problem domain, so you can connect the real-world / problem you're trying to solve with the code in front of you

2. Have a grasp on the limitations of your own system, which will come with experience as you work longer on it


> ASP.NET WebForms app running on IIS, ... Go microservice running in Kubernetes and using Kafka

Those things don't really count as languages; they appear to be a mix of development environments, software packages and architectures. Learning these things is critical and hard, but there is a key mitigating factor: what really needs to be learned is 'how we do things around here', which is typically company-specific and needs to be learned by a new hire anyway.

My experience suggests new hires in cognitive roles (at medium-large businesses) take 1-2 years to get to the point where they can be proactively useful, purely from learning who is who, what the corporate history is, what has been tried before, why things are what they are, and what their role's problem domain is really intended to be by their boss. Compared to that set-up time, the cost of knowing or not knowing a technology is quite small.

> This whole "learn a new one in a couple weeks" mindset is toxic and needs to die.

It is true though. Learning a new programming language to the point you can write bug-free code and execute someone else's design is really easy. Learning how to solve a new problem is typically the hard part.

I 'learned' Python in 2 weeks. What I write looks a lot like C code, and I didn't make great use of the standard library, but it works fine. Compared to that, learning how to use a large new library for a new problem typically takes months.
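
To make that concrete (a toy example of my own): both versions below work, but the first has the "C accent" and the second is what reviewers would call pythonic.

    data = [3, 1, 4, 1, 5]

    # C-accented Python: manual index bookkeeping
    total = 0
    i = 0
    while i < len(data):
        total = total + data[i]
        i = i + 1

    # Idiomatic Python: let the language do the bookkeeping
    total = sum(data)
    evens = [n for n in data if n % 2 == 0]
    print(total, evens)  # 14 [4]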


> Those things don't really count as languages; they appear to be a mix of development environments, software packages and architectures. Learning these things is critical and hard, but there is a key mitigating factor: what really needs to be learned is 'how we do things around here', which is typically company-specific and needs to be learned by a new hire anyway.

Not necessarily. Except for maybe Go, those are industry-standard technologies, and you could find someone who knows them and could add value almost immediately.

We are very heavily invested in the AWS ecosystem. The difference in how fast they can hit the ground running between someone who has the same amount of useful development experience but no AWS experience and someone who has both is stark.


It's not really gloating; I'm in the same situation as GP. I've worked enough with languages and their ecosystems to apply those skills to new languages and ecosystems. It takes me around 5-7 days to grok a new language and get pretty good with its tools/frameworks/idioms. Experience with the basic fundamentals makes it easier to extend to different implementations.

The language and tools aren't barriers anymore; the interesting part, solving business problems and figuring out how to apply the tools, becomes the fun one.


Yeah, this is how I see it, too. Their example kind of gives away that they didn't understand.

Once you've learned and used a half-dozen imperative languages day-to-day, there's very little new when picking up another imperative language. It's mostly mapping common concepts onto new syntax, plus some new features and maybe quirks/gotchas.

Silly little example: After an hour of reading a Go tutorial, and having never looked at the language before, I was able to diagnose a bug a co-worker had been struggling with for two hours, because I recognized the pattern from something similar in Django (a python framework). Now knowing what to look for, a quick Google search had the exact answer to the problem right in the first result.
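
The bug itself isn't named above, so purely as a hypothetical example of the kind of pattern that transfers: late-binding closures in loops trip people up in Python, and a closely related loop-variable trap exists in Go.

    # Hypothetical illustration - not the actual bug from the anecdote.
    # Python closures capture the variable, not its value, so every
    # lambda below sees the final value of i:
    callbacks = [lambda: i for i in range(3)]
    print([f() for f in callbacks])      # [2, 2, 2], not [0, 1, 2]

    # The usual fix: bind the current value at definition time.
    callbacks = [lambda i=i: i for i in range(3)]
    print([f() for f in callbacks])      # [0, 1, 2]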


So do you think you could pick up and become a useful iOS Dev in two weeks?

Depends on the context. As part of a team (code already exists that I can reference), and simply "useful"? Less than that, probably about one week at most.

I wouldn't be doing anything large or complex in that short a period, but absolutely can become useful with smaller stuff in that period.

And I say this with experience, though in a different context: years ago my manager introduced me to a new codebase by giving me a bug to fix. It was in a Django site, and I'd never done any Python before. It took about half a day to figure out what was going on (for comparison, nowadays it would maybe take 20-30 minutes), and a bit more to fix.

But I did figure it out on my own, taking some work off of another developer who was focusing on more complex work. I would call that "useful".


That's just the point. As a "senior developer", I'm looking for someone who can select "New Solution" and start a project from scratch, or at least start a new major feature. Anyone can pull up an IDE in almost any language, step through existing code, pull up a watch window, and fix a bug. That's something developers can do with three years of experience. Bug fixing is something you put your more junior developers on.

I have no idea why all of the old folks (again I am in my mid 40s) consider it some great skill that excuses them from actually keeping up with what’s going on in the industry.


Several things on that tangent:

* My bugfixing example was indeed trivial, just the first thing to pop into my mind. It was mainly aimed at the use of the word "useful", which is woefully ambiguous.

* Sounds like we agree when talking about someone who already knows the technology stack.

* No matter how skilled the developer, there will be ramp-up time when they not only don't know the framework, but also don't even know the language.

* That ramp-up time being significantly shorter for a senior dev is what we're talking about, and how selecting for the specific stack like you're describing isn't necessarily going to be worthwhile - learning the domain is likely to take longer than learning the technology.

Oh, and:

> Anyone can pull up an IDE in almost any language, step through existing code, pull up a watch window and fix a bug. That’s something developers can do with three years of experience.

Thanks for the compliment I guess; I was describing something from around 3-4 months into my first job.


> No matter how skilled the developer, there will be ramp-up time when they not only don't know the framework, but also don't even know the language.

And if a company is using a popular language/framework they can find senior developers who know the language and the stack. Why should they hire someone who hasn’t taken the time to learn both?

> That ramp-up time being significantly shorter for a senior dev is what we're talking about, and how selecting for the specific stack like you're describing isn't necessarily going to be worthwhile - learning the domain is likely to take longer than learning the technology.

Not really. If someone has ever been exposed to a similar framework - take your typical MV* framework, for example - even a developer who's only been active for three years can easily transition.

I have no idea why old folks (putting myself in that category) think they can get by without keeping up. Do you really think that someone who has 15 years of experience but hasn't kept up with the industry won't be at a disadvantage against someone with 5 who has? On the other end, you have people like me who know what it's like being behind the curve and vowed never to be in that situation again.

You can't imagine how long it took me to retrain a 50-year-old who had been doing ASP.NET WebForms for years to develop in ASP.NET MVC using server-side rendering, let alone client-side frameworks.

He was only eight years older than I was, but didn't keep current.


There is no monolithic "modern tech" to keep up with. If you think there is, you work in a bubble and don't have nearly enough experience to say anything intelligent about how to hire senior developers.

Moreover, if you're proud of how hard it is to learn ins and outs of your tech stack, you're likely using a shitty tech stack and costing your company tons of money.


That’s true. No one is saying be good at everything. But you should be surveying your local market, seeing what an in demand tech stack looks like and align your skills with it and have a cohesive story to tell.

But you can’t be proficient at an architectural level with a stack in two weeks no matter how good you are.

Would you say that you could be an iOS/Android developer in two weeks just because you picked up Swift/Java?


Side note: saying my beliefs need to die as the first thing you ever said to me is pretty aggressive.

But to your actual comment: I'm learning new things all the time. My focus when I'm coding in my spare time is a simple IDE, some project management tools, and a simple game engine I am building. I learn new things in those projects literally every day.

I'm not going to do a bunch of hello world apps in random new "Pro" technologies. I did a lot of that in my 20s, it was fun and I was curious about them, but my interests have shifted.

There are enough jobs I can say "no" to the people who want only coders who are using the newest "Pro" toolset.

Honestly, a good hiring manager will realize you ideally want BOTH that person and people like me on a team. One person will know that Rails 7 solves our new problem, and the other will know that a simple refactor will save 30 hours of headscratching over the next year.


No, you can learn programming languages at quite a deep level in just a few weeks. I went from having never written or read a single line of Scala or Java in my life to making production-critical commits to a customer-facing product in about two weeks, where the commits involved needing to know a lot about Scala case class patterns, some tricky in-house implicits, structural typing, and the numerical package Breeze.

The thing is, because I have developed deep skills at delivering software products in other languages, I know what questions to ask. I know to look into build tooling and certain design patterns, and to ask what the equivalent of foo from language X is over in language Y. Experience allows you to short-circuit huge chunks of the learning process and make bigger strides toward knowing complete functional units of the language and tooling.

Yes, you'll keep learning more after a month, but in all sincerity, if you've learned at least one procedural language very deeply and have at least brushed up against other paradigms and design patterns, then you can learn almost any other language - whether it's C++ or Scala or Rust or Haskell or Fortran or Erlang or anything else - up to about an "advanced intermediate" level in just a few weeks of self-teaching.

I would rather hire someone with that mindset. They are more likely to be interested in solving the actual business problems. The junior guy excited about frameworks and fads is much more likely to create complexity (usually from vast numbers of dependencies and inappropriate technology choices). The flip side, of course, is that if you already have a mess, then the junior guy might be a better choice to maintain it.

I wouldn’t be excited at all hiring someone whose skillset is ten years out of date. It tells me a lot about their lack of willingness to keep up with technology.

This depends on what you mean by "technology". I would expect them to be aware of emerging trends in the industry. Bonus points also for an interest in the latest computer science. But I don't care if they haven't hacked on groovy gradle scripts or injected Spring AOP pointcuts into Java beans. This sort of tech is accidental ad-hoc complexity and not progress.

Spring's AOP pointcuts (in their first incarnation as AspectJ) and Groovy (pre-Gradle and pre-Apache) are both over 15 years old. To not know those is to have a skillset that's well over ten years out of date.

Heck, I know what those are and I've only written 20 lines of production Java code. They came up repeatedly when I was researching using Microsoft's (now defunct) Unity framework to do AOP in C#.

https://github.com/BitsCorner/AOP-UnityInterceptor


It’s not about “emerging trends” but they should be familiar with a chosen technology stack that is in the “plateau of productivity” in the hype cycle.

>If you've got decades of experience on your resume, my guess is that you want to be evaluated as a senior dev.

And this is where ageism rears its head.

If two people apply for the same position, and you're OK hiring someone who has only 2 years of experience for the position, don't ask for more from someone who's been working for 20 years.


They get paid very differently, is the thing.

Why do they? The older candidate may have higher expectations, but as an employer, the compensation for a particular position should be somewhat fixed. If the older candidate doesn't offer anything more than the "inexperienced" one, offer the same salary you would offer the inexperienced person. They can always say "no".

Trust me: there are plenty of older programmers who do not expect higher pay because of their age. In my career, whenever I hear "older people expect a higher salary", it has almost always come from employers/managers, not from actual older professionals. On the other hand, I've met quite a few programmers who complained they didn't get an offer, or were told they were ruled out for being "too senior", and the complaint almost always was along the lines of "I was not asked my salary expectations".


I have a former manager who is in his 50s. As soon as his kids graduated he self demoted to a full stack developer working with React/C# and Azure.

I have another friend in his 40s who took a job that lets him work from home, and it pays less than he was making when we were working together.


Not as differently as you might think. When I was more junior, I assumed the wage increases would continue as they had been. But they flatten out after 10-15 years of experience, to the point that the salary of someone with 15 years of experience is probably the same as someone with 30 years of experience.

The advantage of older developers is that they have seen the same patterns before. For instance, younger devs might think microservices are the hot thing, while an older dev thinks it just looks like a poor man's Erlang from the 1970s and knows to wait a few years until all the CI/CD and orchestration is integrated and commoditized.

So while you are "waiting a few years", the company has new problems to solve right now. So no, you can't stick around using VB6 for the next 20 years.

Yep, and I'm pretty sure creating a crapload of DevOps scripts to support microservices will be the next problem you give the business.

If you are creating a bunch of “dev op scripts”, you’re doing it wrong.

I wrote one parameterized CloudFormation template that creates our build pipeline for any new service. Most of the time we can use the prebuilt Docker containers (maintained by AWS) for our build environment. We have one or two bespoke Docker containers for special snowflake builds.

And in case you haven't read my other posts and I get accused of being young and inexperienced, let me just say that my first "internet-based application" used the Gopher protocol...


I haven't read any of your other posts, and my comments were not directed at you in particular. You're really making my point. If CloudFormation templates work, then great; I don't know their limitations. I do know that Kubernetes has become a standard for orchestration and that it becomes easier to manage over time. A couple of years ago there was no support from AWS, so maybe that wasn't the time to jump in. Likewise with Kafka: cool tech, but really expensive to manage. Now AWS and Confluent will manage it, so the costs have come way down.

It's really easy with AWS....

https://aws.amazon.com/about-aws/whats-new/2019/02/deploy-a-...

But we really don't care about the lock-in boogeyman. Of all of a typical business's risks, lock-in is the least. We use Lambda where feasible, Fargate (serverless Docker) when necessary. Why waste time doing the "undifferentiated heavy lifting"?


It's not toxic, it's dumb.

> For older coders, I'm not sure that same signal works. I've forgotten more frameworks than most coders will ever learn. I mostly don't care about programming languages, they're roughly the same and I can learn a new one in a couple weeks.

Why not? I’m older than you and I have passed plenty of technical interviews on the $latest_tech.

Sure, you can learn a language in two weeks, but a language is far more than syntax. There is the ecosystem, frameworks, third-party packages, etc.

And honestly, I would be more concerned about hiring someone who built their own bespoke framework from scratch than someone who used an existing one. Building frameworks rarely adds business value.


> And honestly, I would be more concerned about hiring someone who built their own bespoke framework from scratch than someone who used an existing one. Building frameworks rarely adds business value.

If I were interviewing such a candidate, I'd dig into why they built a custom framework. The answer to that question could be quite enlightening, especially if they mention business reasons for doing so, or if they talk about what they've learned since then.

When I started my own business a decade ago, I decided that existing web frameworks wouldn't suit my needs and built my own. Though I built a framework that suited my needs extremely well, it was a massive distraction from my actual business (which wasn't even tech). In interviews, I freely discuss how my NIH syndrome contributed to my inability to get the business off the ground; I've found that it tends to go over rather well with interviewers.


Same. The way I work at home is no way to run a business, and I am painfully aware of that fact.

I go to work to be on a team and pull in one direction and support the leadership and make hard compromises.

I write code at home to understand where the industry will be in 10 years.


> And honestly, I would be more concerned about hiring someone who built their own bespoke framework from scratch than someone who used an existing one

If they're over 45, then they likely worked in the late 90s and early 00s. If they were like many of us, then they worked on at least some web applications back then. This was long before jQuery and even Prototype.js on the frontend. Java, PHP and Perl were commonly used on the backend (or, where I lived, more often IIS with ASP). But the start of the career of someone in their 50s, or even 45+, predates any commonly used framework in the open source scripting languages used for web development (unless you consider PHP to be a framework).

Even Spring framework was only created in 2003 (I just checked to confirm). Symfony in PHP was 2005. And those took several years to catch on in their respective ecosystems.

Tons of companies had in-house frameworks back then. Who did you think wrote them?

Even if someone working back then never explicitly built a framework, they used design patterns that mirror today's frameworks. If not, they were writing spaghetti code (and to be fair, that was pretty common back then).

Based on your username and your assertion ("older than you"), you're what, 45? Have you forgotten what it was like back then?


In the late 90s and early 2000s, every website was a bunch of spaghetti code with a mix of code and HTML - the original VB-based Active Server Pages, JSP, or even ugly Perl and PHP.

But when he said he “made his own web frameworks”, I’m assuming he was talking about this decade.


> But when he said he “made his own web frameworks”, I’m assuming he was talking about this decade.

I re-read the comment in question. Not sure why you would have assumed that. Someone who is now 37 likely started programming before the age of web frameworks.


No, your parent was correct. I am currently building a complete web application hosting toolkit that I started in earnest in 2015. I worked on it this week.

I wouldn't call it a framework, because it's built of small independent pieces, but it's a similar scope to something like Rails.

I did start coding long before web frameworks existed though. :)


Given the amount of spaghetti I've seen... it's my understanding that being able to think in frameworks is a rare skill. If you find someone who is actually good at designing frameworks, then maybe, just maybe, they have some real skill and built their own framework to practice and hone their craft, because that's a necessary condition for getting good at something.

Granted, I hear what you're saying, but I'm not convinced your blanket dismissal is the right analysis without digging deeper on the individual details.


> a language is far more than syntax. There is the ecosystem, frameworks, third-party packages, etc.

Agreed. However, if you're a software company you've already likely made all of those decisions. If I come work for you, I'm stuck with them. It doesn't matter if I like the package manager or the frameworks. That's just the background against which we work.

Our work is making our lives better, and solving the company's problems to the best of our ability, given all those constraints.

> honestly, I would be more concerned about hiring someone who built their own bespoke framework from scratch

Yes! I feel honestly quite nervous about sharing that aspect of my work, because I worry people will see me as out of touch.

But I am quite up front that I don't advocate using my toolkit in production environments. My professional advice is usually quite conservative. I tend to advocate for minimal disruption and using standard tools as they were meant to be used.

However, because I do that all day at work, I see all the cracks and warts in the architecture and when I go home I want to work on the future.

And even though you are correct "building a framework rarely adds business value"... that's not what motivates me. I do think there is value being created, and I will eventually be able to cash in on that. But I am doing it because I am interested in the future, and I want to work on tools that transcend the pointless busywork I have to deal with day to day, in the name of meeting quarterly goals (which TBF is crucially important in a business context).

I get direct enjoyment from working in my own little hobby world where those constraints don't exist, and that is enough motivation for me.


> Agreed. However, if you're a software company you've already likely made all of those decisions. If I come work for you, I'm stuck with them. It doesn't matter if I like the package manager or the frameworks. That's just the background against which we work.

Let's say you learn C# in "two weeks". Is that going to be enough time to get proficient with ASP.NET MVC? Entity Framework? IIS? The built-in logging framework? The DI framework? Middleware? What if they are using Xamarin for mobile? Let's say you did pick up Xamarin (which I haven't).

Would you know the fiddly bits about doing multithreaded processing with desktop apps using Windows Forms?


> And honestly, I would be more concerned about hiring someone who built their own bespoke framework from scratch than someone who used an existing one

It's almost like you're pretending the years between 2008-2012 never happened, but I'm sure that's not the case.


In 2008 I was in my mid-30s and had been working at one job for nine years as a C/C++ bit twiddler, writing COM/DCOM objects that interop'd with VB - in other words, I had a skillset that was woefully out of date. I got lucky and found a job that let me pivot into being a modern "enterprise developer", and barely survived rounds of layoffs for the next three years.

I learned my lesson. For the last 10 years I’ve kept my resume competitive with whatever the $cool_kids technology is and haven’t had a problem competing in the market.


What are you referring to?

Ditto, if something extremely special happened in the years 2008-2012 with regard to frameworks, its cataclysmic significance seems to have escaped my notice. Rails? /shrugs

I think maybe they are just referring to the fact that in that era pretty much everyone had to build their own "framework" to get a big web app up, because there weren't a lot of established choices.

Between 2008 and 2012 there was ASP.NET Web Forms, and MVC was becoming popular.

I didn’t play in the Java space back then but there were plenty of Java frameworks (https://dzone.com/articles/what-serverside-java-web-framewor...)

Ruby on Rails was released in 2005.

Knockout JS was introduced in 2010.

Django was released in 2005.

There was also Mojolicious for Perl, released in 2008.


This is what I meant; apologies for being cryptic. Web/JS development was absolute chaos. It just so happened that Facebook & Google won the day, and we've converged on some "standards" in React/Vue/Angular - and I use that term very loosely.

The things I've seen. shudders


> And honestly, I would be more concerned about hiring someone who built their own bespoke framework from scratch than someone who used an existing one. Building frameworks rarely adds business value.

Hum... My experience is that whatever framework a business uses in any widespread manner, I always have to build another one on top, because generalist libraries completely suck for productivity. Unless you are a single-app shop - then you can just hard-code everything.

I don't get how improving development speed does not add business value.


It means that the next developer will have to learn your bespoke logging, authentication, and other cross-cutting concerns instead of just being able to go to Stack Overflow and find an answer.

If there is a problem, they have to wait for a response from the one guru who wrote it.

Most architects who think their problem is a special snowflake are usually wrong.


If it's "just logging" then of course just use the provided solution.

A custom domain for a business would be something more than logging though. For example, perhaps you want your logging to include some regulatory compliance hooks. Or maybe you want to do logging that also serves as a persistence layer for undo/redo and you need that to be tied deeply into your presentation layer.

In those cases it's much more than just "log this" that's being abstracted, it's all of the business needs around logging.

And the next developer can't get away from learning all that code. Either it's tied up as a nice internal company logging framework, or it's spread out as a bunch of concerns through the app. Either way, the newbie has to figure it out.
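To make that concrete, here's a minimal sketch of the kind of thing I mean, in Python with made-up names: ordinary logging still happens, but every entry also feeds the business's cross-cutting concerns (a compliance export hook, an undo journal) through one small interface.

    import json
    import logging
    import time
    from typing import Callable, List

    class AuditLogger:
        """Hypothetical wrapper: plain logging plus business hooks."""

        def __init__(self):
            self._log = logging.getLogger("app.audit")
            self._hooks: List[Callable[[dict], None]] = []
            self._journal: List[dict] = []  # doubles as an undo/redo source

        def add_hook(self, hook: Callable[[dict], None]) -> None:
            self._hooks.append(hook)  # e.g. a regulatory export

        def record(self, action: str, **fields) -> None:
            entry = {"ts": time.time(), "action": action, **fields}
            self._log.info(json.dumps(entry))  # the boring "log this" part
            self._journal.append(entry)        # persistence for undo/redo
            for hook in self._hooks:
                hook(entry)

    audit = AuditLogger()
    audit.add_hook(lambda e: print("compliance copy:", e["action"]))
    audit.record("price_change", sku="A-100", old=9.99, new=12.49)

The new hire still has to learn it, sure - but the alternative is the same hooks smeared across every call site.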


The point of personalizing a framework is that most developers shouldn't be thinking about logging, authentication, etc.

Again, my experience is that the personalization layer is very easy for a senior developer to pick up - maybe some unconscious architectural decision of mine biases that impression, but I've never seen one have a problem taking over my code. Junior developers, in turn, just can't grasp it, which I consider a feature; they are better off just using the functionality until they get some experience.


So you are against encoding domain knowledge in the software? It's better to re-invent the wheel with every application so that "the next developer" can google their questions?

I don't grant your premise (that it's wheel-re-inventing), but yes: it's usually massively better to build with familiar, boring, transferable, googleable tools. Even if it takes a little more time to build.

Frameworks aren’t about domain knowledge. They are about cross-cutting concerns. You can’t imagine how many badly written custom ORM, logging, authentication, and web frameworks I’ve come across.

I would presume he built his own as a learning exercise/for fun.

As a 40+ yo dev I want to agree with you but:

>I've forgotten more frameworks than most coders will ever learn.

Forgotten knowledge is worthless

> I mostly don't care about programming languages, they're roughly the same and I can learn a new one in a couple weeks.

I used to think that, and it's possibly true, but the platform that comes with the language certainly can't be learned in a couple of weeks, so I don't think this is as true as it used to be.

> I know technologies like Rails and SQL, but I don't play with the new toys as much as I used to, because lately I would rather solve a new problem or build a new tool than learn a new tool.

This is a real problem for you if companies are hiring for people that know the new toys.

> It's another world, and a lot of companies just don't bother...Thankfully I have no desire to work at any of those companies...There are plenty of companies trying to hire people to solve actual problems

That's great now, but in the future the job market might not be so tight and then people with your attitude might find it really tough.


I think the point on forgotten frameworks / new tech is -- nothing is _really_ new. Knowing ten web frameworks inside out isn't really 10x more than knowing one, because they have a tremendous amount of overlap. Relatedly, I've forgotten how a couple of frameworks work because there's nothing interesting left behind that isn't present in the newer ones. I'm quite sure if I had to go pick one back up, it would look very familiar. Perhaps I know more about it now than I did when I "knew" it.

You’ve never worked with a C# ASP.Net WebForms dev who before that was a WinForms dev, trying to grok ASP.Net MVC.

Great, but that means that for the "knowing frameworks" box, there is no advantage in hiring an experienced dev over a younger one.

The general thrust of this thread is that there is no advantage in hiring an experienced developer over a younger one.

You might be getting a mid-six-figure salary in your first job, but you probably ought to be socking it away because you can't expect raises.


> Forgotten knowledge is worthless

Not really - you pick up something you've forgotten much faster than something you've never seen before. It's all in there somewhere.


Learn a language in a couple of weeks? No way! A couple of weeks is plenty to understand the mechanics of the language, but the biggest aspect of a language is the library.

I do agree that testing for the ability to solve problem is hard and gets neglected.


If at 37 you consider yourself an "older programmer", the industry is fucked. The minimum Social Security retirement age in the US is 62. That is an older programmer. Full retirement age is 66.

"Why is that old guy still working? Must not have worked hard enough like I do! I'll be retired at 45 from my stock options and 401k" e.g. extrapolating from last 5-10 years of stock returns, which is all they've known...

> I can learn a new one in a couple weeks.

To play devil's (or manager's) advocate - have you ever crunched the numbers on what it takes to pay you for a couple of weeks? That's not a trivial investment, especially considering the risk that you just thought you could learn a new language in a couple of weeks but were wrong.
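A back-of-envelope version of that math, as a sketch with entirely made-up numbers:

    salary = 150_000   # hypothetical annual salary
    overhead = 1.4     # hypothetical benefits/overhead multiplier
    weekly = salary * overhead / 52
    print(f"two weeks of ramp-up costs about ${2 * weekly:,.0f}")  # ~$8,100

Not ruinous, but not nothing either - especially multiplied across a team.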


Sure, but you also have to weigh that against what it takes to pay people to develop the wrong architecture at the wrong time, and the opportunity cost to the company as a whole to be saddled with those bad decisions. That's not trivial, either.

(although I agree with the sibling comment — one doesn't really learn most languages in a few weeks)


Well let me tell you the story about some older experienced “infrastructure consultants” who thought that AWS was just like on prem infrastructure, did a lift and shift, moved my former company “to the cloud” and ended up costing more because they thought they had “learned AWS”.

Architecture and processes that were fine for one environment were horrible for the other.

In development terms, you can write FORTRAN in any language.

Let’s not talk about the one Java developer who thought he could pick up C in two or three weeks. It was ugly.


But during that time, they're not just learning the new language; they're also learning your environment, applications, and tools - time they'd have to spend anyway. In my opinion, that could accelerate their learning of the language, by seeing it used "for real" rather than in a more restricted tutorial context.

Every job I've ever worked I felt I produced more cash value for the owners than I took. I'm actually quite proud of that fact. It's important to me, and it's one of the things I think about all the time as I'm making decisions on how to work.

Isn’t the whole point of asking programmers to work out and write code for the alien dictionary problem from LeetCode to test their problem-solving ability? There are really no easy answers to interviewing people for jobs, I guess.

Yeah, I'm talking about people being upset that I had to look up the syntax for ActiveRecord associations because I haven't written any in a while.

> But how do you evaluate a coder on their ability to solve problems? Much harder.

With an IQ test, the same way you evaluate anyone else on their ability to solve problems.


That evaluates a person on how well they solve IQ test problems.

It evaluates the quality of the substrate.

Having a high IQ doesn't mean you are going to be able to hit the ground running, solving problems and writing production code in your first week.

Better to hire for your particular skills. And younger is cheaper.


I work in enterprises with 50,000+ employees and billions in annual revenue. I work with older programmers all the time! They're better at programming in Angular than I am. And they are wonderful mentors!

I think what's actually happening is some programmers, who happen to be older, don't keep up with their skills. Then when they interview, they bomb it, or talk about how they could do the same thing in an older stack.

That's great, but the company is really invested in the current / new stack, so it doesn't make sense to hire someone who does not have those skills. So they pass them up and find a candidate who knows the software the company is looking for.

But instead of reflecting and realizing they need to spend time enhancing their skills, the older interview candidate simply says the company was "ageist" and blames it on age discrimination.

Blaming others is generally easier than blaming yourself, I guess.


The scenario you talk about happens for sure. And some of those protests are legitimate; we've made little advance in real productivity since the 70s, all the while churning through languages and libraries and environments like fashionable clothes, partly because programmers are on average young and inexperienced. So after a decade or two of riding the wheel, a lot of engineers decide it's pointless and just kind of check out. (Not me, FWIW; I'm on board with React Hooks and looking for an excuse to use ReasonML.)

On the other hand, ageism is real. When I was younger I was told more than once by managers, straight up, this candidate is too old so find an excuse not to hire. And now in my 40s, when I walk into some companies for an interview, I can feel the decision has been made before I even start talking. Not everywhere, not even most places, but it happens for sure and it's not even that subtle.


> we've made little advance in real productivity since the 70s but all the while churning languages and libraries and environments like fashionable clothes partly because programmers are on the average young and inexperienced.

Making a real-time video chat client is now literally a programming exercise; it used to be the domain of multi-million-dollar venture-backed startups that were valued in the billions.

Real-time chat between users is now a feature that can be added to an application in a day, or even a few hours if you follow any of the myriad tutorials available on YouTube.

Go from writing a UI in C/C++ to writing it in any of the higher level languages. Assuming you don't get caught in some design pattern trap that results in massive code bloat, it is now possible to do things in hours that used to take days to weeks.

Heck, transparency is no longer "write some ASM" routines, it is setting an opacity variable!

Some tech stacks, such as those for making basic CRUD apps, may have actually degraded a bit, but at the same time someone who is only slightly technical can now drag and drop their way to an online store front that is a fair bit more powerful than Viaweb was back in the day!

Heck, it is possible to now go from an empty folder to writing+deploy HTTPS API endpoints in under half an hour.
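To illustrate - assuming a serverless setup like AWS Lambda behind API Gateway, which terminates HTTPS for you; names here are hypothetical - the entire "server" can be a sketch like:

    import json

    def handler(event, context):
        # API Gateway proxy integration: HTTPS, routing, and scaling
        # are someone else's problem; this function is the whole app.
        name = (event.get("queryStringParameters") or {}).get("name", "world")
        return {
            "statusCode": 200,
            "headers": {"Content-Type": "application/json"},
            "body": json.dumps({"hello": name}),
        }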

Edit:

More stuff!

App (not web!) updates can be deployed to users by uploading a file and having a client pull down the changes and seamlessly switch over to the new code next time the app is launched. Versus mailing out disks!

There are app stores that you upload intermediate code to and they'll compile apps to the appropriate architecture for a customer's device!

During 72 hours of coding for Ludum Dare, game developers and artists work together to create games that are as complex as a full-fledged commercial game would have been 25 years ago.

And of course, many corporate software engineering efforts continue to fail or go massively over budget for largely political reasons.


In the 1980s a spell-checker was a very complicated project, and now it's simply a hashmap lookup. But that's not because we invented the hashmap since then - people in the 80s knew about hashmaps just fine; it just wasn't possible with the RAM limitations of the time.
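To be concrete, here's a sketch of roughly the whole "hard part" today (the word-list path is an assumption; many Unix systems ship /usr/share/dict/words):

    # Load a dictionary into a set: one hash-table lookup per word.
    with open("/usr/share/dict/words") as f:
        dictionary = {line.strip().lower() for line in f}

    def misspelled(text: str) -> list:
        return [w for w in text.lower().split() if w not in dictionary]

    print(misspelled("teh quick brown fox"))  # ['teh']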

Likewise, moving from ASM to C to Smalltalk is a night-and-day improvement... that we made in the 1970s. The difference again is what hardware we get to run it on.

Video game construction kits existed in the 80s, and 80s GUIs looked pretty much the same as they do now if you ignore resolution.

Drag and drop application development was huge in the 90s, and the CRUD applications of the time weren't significantly different than now, other than they didn't run in a browser. HTTPS wasn't hard back then, and you could write a Perl endpoint in half an hour easy.

When it comes to things like playing video, it's easy now but I don't even really consider it programming. You're just installing software that does it for you.

Libraries and OSS and StackOverflow and various services have made a real difference. I'm not arguing that we haven't made progress. I'm just saying that I can see how some people feel that this year's incremental advance or retreat is not nearly as exciting if they've seen how the last 20 worked out.


Your discussion of "real time video chat clients" has me cracking up; it seemed that a myriad of pieces of software had this figured out in the 2000s, but today I can't think of a single piece of software that I'd actually want to use for video chat. All of them have some fatal Achilles' heel: FaceTime has security problems/previously did not have group chats/is exclusive to Apple OSes, Skype is laggy and has more connection problems than any piece of software I've ever used, Google Hangouts chokes once you get to 3 or 4 participants and frequently has issues with even 2 participants, Duo is only usable on Android/iOS as far as I know... does anybody have a good client that they can actually recommend?

Myself, I've gone back to using Ventrilo because I realized that I really don't care about having video.


I mean sure, video chat is overrated... but the tech is there, it just turned out it isn't quite as desirable as first imagined!

Firefox Hello worked remarkably well IMHO, it was super simple to use.

The field is full of video chat clients though. Uber-Conf is popular, although I think the % of problem-free calls I've had with them is less than 50%.

Zoom has worked well for me, no real issues.

Facebook Messenger, privacy issues aside, works well.


Have a look at Vsee. Aimed at doctors, meaning they're probably reliable and secure. And they have a free version.

> Go from writing a UI in C/C++ to writing it in any of the higher level languages.

I've never done this in C/C++ specifically but I have to imagine that the GTK bindings are mostly language agnostic. Using a tool like Glade[0], you can build a GTK GUI with ease.

Glade was released 21 years ago.

[0]: https://developer.gnome.org/gtkmm-tutorial/stable/chapter-bu...


I don't remember where I read this, but that reminds me of the "Perl Paradox" -- that if your job listing requires a "Perl programmer", you get worse programmers, even though a good programmer can equally well adapt to any language, including Perl. But if you require such a person, your only applicants will be people who could never adapt to a new language.

Edit: Ah, it's actually named that and I found a lot of results for it:

https://www.activestate.com/resources/webinars/perl-paradox/


Interesting. I legitimately liked perl back in the day. The ability to do things like inline c code, etc. was great. But I've been immediately passing on any roles that mention Perl for a long time.

Whoa, I had never even heard of c-inlining. Then again, I never really got into perl.


> they simply say the company was "ageist" and blame it on age discrimination.

This is a topic I'm very interested in (and as I've gotten older, my interest has only grown).

In ~20 years as a coder I've seen only a handful of overt ageist incidents. I've seen plenty of ageist language and attitudes, and even more assumptions, but with those it's a lot harder to answer "Yeah, but would they turn down a qualified candidate over that?" I've known people who were both experienced and open-minded, as well as those who were experienced and closed-minded.

End result: No idea what to believe. I'm not comfortable assuming one way or the other, because either assumption is harmful if incorrect.


I am divided on this. On the one hand straight up age discrimination is definitely a thing. A surprising number of startup types don't even know it is illegal so they are completely explicit about it.

On the other hand, companies who engage in age discrimination are probably shit companies, so are you really losing out by them not considering you?


> companies who engage in age discrimination are probably shit companies, so are you really losing out by them not considering you?

If they have the money to make my life easier, and I end up not getting it, then yeah, I lose.

If they have the experiences that make me a better dev, and I end up not there, then yeah, I lose.

If they aren't actually that bad, they just have this one area of ignorance, but I never get to experience all those other benefits, then yeah, I lose.

If I'm NOT in a hot tech center and this is one of the relatively few jobs that is available/is a step up, and I don't get it, then yeah, I lose.

I myself have nothing to complain about, at least not in my experiences to date, but in terms of considering the issue overall, I don't think it can be so easily dismissed. I recall plenty of older techies struggling after the end of Y2K and the dot-com bust when the market wasn't "If you want a job, you can get a great one", and I assume there are plenty of people in the above categories.


*They simply say the company was "ageist" and blame it on age discrimination.*

Except they don't "simply say that".

It's a fact of the industry that is backed up by (a heck of a lot of) experience and observation, unfortunately.

In that sense, it's kind of condescending to deny the reality of the (sometimes painful and frustrating) experiences many people are having in this regard.


I think larger enterprises tend to be more friendly to older programmers. The companies move slower, so they have more legacy code and less interest in trends. Less fun to work at, though.

I'm in my early 30s and very aware of this, to the point where if I could go back I'd pick a different career.

My strategy is: live in the Bay Area so I can get paid like crazy, live a very frugal life much below my means, invest diligently, and hopefully have enough money to retire in my late 30s (in a much cheaper area, of course), so that when I'm told that I'm too old, I can raise my middle finger and leave the scene with the few million dollars I saved.


Hi. I’m you from ten years in the future. Do not be afraid.

I wanted to pop back in time and say thanks. Yeah, you didn’t retire in your late thirties like you thought you would, mostly because it’s harder to make FI cash than you thought. The crash of 2020 took a few years to get past, but hey, 45 is the new late thirties, as they say now.

The reason I’m thanking you is you went ahead and struggled and saved anyway while things got tough and now you are free. Just FYI, I texted my QUIT to the HR bot in the Philippines just before I came back to see you.

Btw, quitting is harder than you think. Just to let you know, I don’t blame you for not quitting at 40 when you could eke by - they offered you 2x what you’d ever made to stay. Yeah, the place was thalidomide for your soul, but then your wife (I’m not telling) got pretty sick and that cushion was worth it.

Also: take care of your health and sleep more. An extra ten years of long hours at the terminal has left me not as healthy as I’d like.


Haha thanks for this, I had a good time reading it.

To be fair, I don't plan to retire at 39 or 40, as long as someone hires me with reasonable compensation (2x would be a dream!). I'm purely trying to protect myself against a foobar scenario where indeed, I become obsolete from the point of view of most employers, due to "old" age.


They don't pay you 2x to work at a dream job. It's 2x because it's "thalidomide for your soul". Be careful.

I'm 40 and making the most money I've ever made in my career, fighting off hiring managers begging me to interview, and doing the most exciting work of my career. Also, you should leave the bay area now, not later. Your crazy money is getting eaten away by living expenses and you could have higher discretionary income in a smaller tech hub, and you could buy a house now and build equity instead of renting in the bay area.

Seconded. I make good money, am nearly 40, and work remotely while living in an inexpensive small town near a reasonably fun metropolis. I’m probably going to be able to retire in 5 years. All of this would have been much, much harder in a different career track. Computer science was a great career choice for me.

>Your crazy money is getting eaten away by living expenses and you could have higher discretionary income in a smaller tech hub, and you could buy a house now and build equity

You're assuming that he would actually want to stay in that other place long enough for the house purchase to make financial sense. You have to own a house at least 5 years for it to work out, minimum. Moreover, in the big tech hub you're making more money, so if you keep your living costs low, you'll bank more cash than living someplace cheaper.


> you should leave the bay area now, not later

Curious, why? I am having no problem being hireable right now at all (literally just joined a FAANG recently).

As far as I know, it would be impossible to get the same pay I'm getting now anywhere else. Even after considering the high cost of living, the spread between income and cost of living is very high, and allows me to stash away a significant amount of savings every year, to pay for my future freedom. I really doubt I'd be able to save that much if I were working anywhere else.

What else would you suggest?


Okay, it's crass but let's talk numbers. I'm currently earning $210k/year in Austin, TX. My monthly income is about $12k after taxes, and my rent is $1,400 for a three-bedroom house 20 minutes from the office (wood floors, nice appliances, yadda yadda). I save about $9,000/month, give or take. When I was looking at relocating to the bay area I crunched the numbers and couldn't find a realistic scenario where I wouldn't be taking a relative pay cut.

Damn... I may be more underpaid than I realized... I keep hearing from my employer that they won't pay me comparably to my SF-based co-workers as I'm remote, "Texas is cheap", and "No one there makes over 100k". I think it's time to haggle a bit harder...

No way, I have a friend that's a junior front end developer with three years experience making $90k. When I do get numbers in emails from recruiters it's 140-170 for a senior role.

You have a nice situation, but my goal is mostly saving money for the future while living a decent life now (and admittedly I am a minimalist, so many things people care about don't bother me much). I obviously have higher cost-of-living expenses (living frugally, I do it in $50k-$60k/y), but just my base salary is higher than your compensation, and with RSUs I get easily into ~400k/y total comp or more depending on the year, so basically I end up saving a lot for the future, which is really the metric I'm optimizing for. I'm in my very early 30s and have already saved ~1.5M liquid.

I don't own a house here in the Bay; I dump everything into very diversified index funds and some rentals out of state, since I'm not interested in staying in this place long term and playing million-dollar mortgage roulette.

The day I call it quits, my partner (she does well too) and I can move to Austin, TX, or anywhere else really (we're both immigrants with citizenship in really, really cheap countries), with our fat liquid assets, and not have to worry about necessarily finding work in my 40s. At least that's my very optimistic plan; it might completely not turn out like this, or I might die tomorrow.


Man, if I had 1.5 mil in cash lying around I'd be on the first flight to Spain or Italy.

While that might seem like a lot, if you estimate the money is invested to just keep up with inflation and no real growth (very conservative assumption, but still possible), for $1.5M to last you 60 years (i.e. from 35yo to 95yo, completely depleting the capital at the end of your life essentially) you could just withdraw $25k inflation-adjusted pre-tax every year. I'm not sure you could necessarily live that well in Spain or Italy with that, without working. I happen to be originally from a cheap European country, and it's not that obvious.

So it's really not that much, most people who get private or state pensions get a better deal than that from a cash-flow perspective.


What about the 4% rule (put it in an index fund and hope for an average 7% return, leave 3% for inflation and live off the rest)? I always figured that would work out to $50k per year per mil. Is that naive?

Edit: sorry for the ninja edit. I read your comment too quickly.


A few points:

1) If you keep it under your mattress it will be much less than what I described, because every year your nest egg will shrink due to inflation, so you'll just be able to take $25k non-inflation-adjusted, which is a big difference from my $25k inflation-adjusted (in 60 years, $25k will be $150k at 3% inflation). My assumption is 0% real growth, not 0% nominal growth, which is what you'd get by keeping it under the mattress.

2) The 4% rule is based on a shorter retirement interval (30 years) than what I'm looking for (60 years). Try to go on firecalc.com and look for the statistical odds of 1M giving you 50k/y for 60 years: the failure rate is higher than the success rate, and that's based on historical data.

3) Yes, my assumptions are very conservative, but I don't believe index funds will return 7% nominal over the next few decades, the world is going to face too many problems in my opinion. That being said, pretty much all I have is invested in index funds despite my opinions (mainly because I wouldn't know where else to invest it, since both cash and bonds are sure losers to inflation), so in the best case I'll be pleasantly surprised.
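If it helps, here's a rough sketch of the arithmetic behind points 1 and 2, under my own conservative 0% real-growth assumption (the numbers are just for illustration):

    def years_lasted(nest_egg, first_withdrawal, inflation=0.03, real_growth=0.0):
        balance, withdrawal, years = nest_egg, first_withdrawal, 0
        while balance >= withdrawal:
            balance -= withdrawal
            balance *= (1 + inflation) * (1 + real_growth)  # nominal return
            withdrawal *= 1 + inflation  # keep purchasing power constant
            years += 1
        return years

    print(years_lasted(1_500_000, 25_000))  # 60 years, as in my estimate
    print(years_lasted(1_000_000, 50_000))  # only 20 years at a 5% initial rate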


I regret trying to give you advice. You have clearly planned your life much better than I have. :)

That is quite conservative. I would think 3%, or $45k adjusted for inflation each year, is still a very safe assumption.

Totally fair skepticism. If you wanted to make it last just 30 years (standard early-retirement duration) then sure, I'd agree with you, probably even 4% would be totally safe. But 60 years? Too many things can happen, I have to be super conservative before taking the decision (forced or not) to pull the plug, hence my estimation.

Again, quite possibly I'll die much younger without even enjoying any of that freedom :-)


A thing to also take into account is that it is very unlikely you will live the rest of your life without going into some other money-making venture. People who know how to accumulate money tend to accumulate money.

Basically, the people who recommend moving someplace cheap are people who value stability over cash savings, and want to have a big house for a family. If that doesn't fit you, then it doesn't make sense for you. If you're single and live cheap, living in the high-CoL makes much more sense.

If you are planning on staying in the bay area for 10 years to save money, why not move some place with a lower COL and buy a house that might appreciate significantly in those ten years? You can make good money in Austin.

There's no guarantee housing will appreciate, in fact it's a pretty good bet that it won't. Housing is already unaffordable in many places relative to local wages, so it looks like another bubble that may burst.

Also, Austin, while not as bad as other parts of Texas, is still in Texas.


> Also, Austin, while not as bad as other parts of Texas, is still in Texas.

I've lived in six cities on three continents (and I'm talking signing a lease, not staying for a month) and I can assure you Austin is no better or worse than any other metropolitan area regarding... wait, what metric exactly are you using here? Weather? Burrito size? Number of smug, passive aggressive west coast urbanites per square mile?


> Also, Austin, while not as bad as other parts of Texas, is still in Texas.

So, what's the disadvantage?


It's a felony to receive unregistered chemical apparatus, such as an Erlenmeyer or Florence flask.

https://www.dps.texas.gov/RSD/Precursor/Laws/index.htm


That does seem a bit excessive, but is getting a transfer permit actually difficult? And it's not like California isn't doing the "war on drugs" either, with the same success.

I'm 45, soon to turn 46.

If I could pop back in time, it would be to (somehow, because I didn't back in 1992) recognize that where things were happening was the Bay Area, and move there instead of staying in Phoenix.

But here is where I am, and while I make a decent salary, I certainly am not retiring any time soon.

The whole age thing is standing in sharp relief in my mind currently, because things are slightly "up in the air" at my current employer; I want to stay employed here, and I'm sure my employer wants to keep me, but recent events may potentially cause the relationship to have to be dissolved if the projects and numbers don't line up. I can understand that from a business perspective - fair's fair.

But the prospect of interviewing yet again, 3 years after getting this position, is not something I look forward to. I have new skills I can bring with me that I didn't have before, so that's a plus, but I worry that the age thing may be a barrier.

I'll probably have to lower my salary ask just to have a chance, but I won't do that unless I don't get any traction after a few months of looking - should I have to do so.

Even with that, though, I tend to wonder if the question in the minds of those hiring will be something akin to "Why isn't he a team lead or a manager?" - a title I don't have and have never been offered, possibly because I've never been able to stay at a company long enough to get to that level: either they go out of business or they get sold 2-3 years after I join, or the company is just too small to have any kind of way to advance into such a position (I've found that I fit best in companies of under 50 employees, and the fewer the better).

Maybe if there's another time around coming for me, I should look into startups, though that seems like a bust from the get-go (i.e., find a startup in Phoenix that needs a 45+-year-old software engineer - that's probably a very small pool, if it exists at all).

It is what it is, I guess. I am fortunate in that I am debt free, at least.


Don't feel bad - I am in this situation as well. I believe your assessment is true in regard to the stereotype / first impression we make on hiring managers, etc.

If I could go back in time I would have done the Bay Area thing when I was young too. But life is a bit of give and take - some good things happened in my life by not taking that course (even if they didn't all work out as planned). So you kind of have to appreciate those parts of life and accept the fact that you are maybe not in the financial situation you wish you were in.

The only guarantee we have in life is this moment :)


Old people don't become useless; they have much-needed experience that has to be harvested. Just because you use the latest fad doesn't mean you don't dig the same pit, or just a slightly different one. The young who realize this will stay ahead.

For sure, but in this case I apply the investing's theory "the market can stay irrational longer than you can stay solvent", to something like:

"the software engineering labor market can stay irrational towards older people longer than the 2 decades past 40 where I could stay effectively employable".


I will be 53 this year. I am, strictly speaking, an IC. A leaf node.

The company where I work administers a survey of all tech employees every year, then publishes the results as a series of graphs, which can be broken down at any branch in the org chart. The very first question is "how many years of industry experience do you have?" The graph for my team is, shall we say, bimodal. There's a bell curve between 1-4 years, and a spike at 30 years.

I've been building things for a long time, and I try to impress upon the other developers on my team (or in my org) that I've never, ever had an opportunity to build at the scale that they're being asked to build at in their first gig out of school. If I come off as negative, it's only because I can say with surety that X won't work, because it didn't work in the past. I expect to learn, with my team, whether we can make things work at today's scale.

There are times that I miss hours and hours of uninterrupted coding, and being inventive at small scale. My leadership would take me out to the loading dock and kick the shit out of me if I tried to contribute at that level.

My job is, essentially, to educate, and to slap the team off of local optima. To impress upon them that their customers are king, and that their customer base includes themselves and their teammates, paged out of bed at 3am. And to impress upon their management and the business that, from time to time, we really do need to focus on improving operations rather than adding features. It is also to observe the progress of, and advocate for, the developers on my team.

I feel (at the time of writing... Talk to me on Monday afternoon) like I'm lucky that my employer points senior engineers at problems like this.


> He was interviewed by a younger engineer who told him, “I’m always surprised when older programmers keep up on technology.”

As someone relatively old, I've been in a lot of developer interviews. I also noticed that older programmers tend to not be as up to date in newer technologies or ways of doing things.

I don't think it's strictly limited to technology. How many 50-year-olds do you know who listen to trap music? Or approve of the way teens dress these days?

It's just the classic "older people tend to be more conservative" fact. At some point most people get stuck in the things they were doing and liking when they were younger. They hate new music, they dismiss newer technologies (to open a rat's nest: Electron, anyone?).

I'm sure I'll get comments like "I'm 18 and I hate trap music and Electron". That's not the point, I'm talking about distributions not individual cases.


> I also noticed that older programmers tend to not be as up to date in newer technologies or ways of doing things.

I've been doing this a long time and it doesn't matter to me.

The newer stuff is most of the time just a variation on a theme that I've seen turn over three or four times since I've been doing this.

What older coders have that younger ones do not, is experience. This can often halve the workload on some projects. The older I get in this profession, the more results I get out of less and less code.

Also, let's not immediately fetishize the newer stuff. It is often a less well implemented retread of older technology. Seems like people have less patience for learning existing systems, so they just re-write some quick and dirty replacement and call it good.

Perhaps it might be better if younger programmers learned the old stuff instead of just badly reinventing the wheel every few years.


*I've been doing this a long time and it doesn't matter to me.*

And this attitude is why older developers find it harder to get a job (I’m in my mid 40s myself).

*What older coders have that younger ones do not, is experience. This can often halve the workload on some projects. The older I get in this profession, the more results I get out of less and less code.*

I keep seeing this as an excuse from older developers who don’t want to keep up to date and stay marketable. Code is code. At some point you have to put lines in the editor to make stuff work.

*Also, let's not immediately fetishize the newer stuff. It is often a less well implemented retread of older technology. Seems like people have less patience for learning existing systems, so they just re-write some quick and dirty replacement and call it good. Perhaps it might be better if younger programmers learned the old stuff instead of just badly reinventing the wheel every few years.*

The market can stay irrational longer than you can stay solvent.


Nice reply. I myself do not find it harder to find jobs, and I'm even older than you.

My post above didn't mention it, but I also learn new stuff. And some of it is really useful.

I don't insist the client use my preferred technologies. I may recommend them, sure, but in the end I'll learn what I have to to get the job done.

This whole thread reminds me of a story:

A small town barber has been cutting hair for decades, and charging his customers a fair price.

One day a national chain opens a salon just across the street and advertises $10 haircuts.

A customer asks, "Aren't you worried they'll drive you out of business?"

The barber grabs a piece of cardboard, scribbles something on it quickly, and props it up in the window.

The sign reads: "We fix $10 haircuts."

I think the best business model for an older developer is to fix $10 haircuts. Seek out those kind of jobs that capitalize on your experience.

There are so many messes out there that if you can bill yourself as a cleanup guy, you'll never run out of clients.


It's not about churning out so many KLOCs of code - it's about solving problems.

I agree completely. But I hear so many developers my age say they don’t need to keep up because they can rewrite Facebook in one line of code.

>Perhaps it might be better if younger programmers learned the old stuff instead of just badly reinventing the wheel every few years.

That's pure gold. Not to say there haven't been advances--there certainly have been. But, there's also the issue of the shininess of new tech giving it undue cred, while simultaneously translating to an outsized denigration of the tech it purports to replace.

Frankly, a lot of these problems have been solved, and solved well.

Look at NoSQL as an example. There are certainly use cases for it, but then people went nuts, exclaiming things like "RDBMS must die" and touting no ACID transactions as a feature/benefit. Then, you get the trickle of reports that maybe NoSQL wasn't the best choice for this or that.

Finally, fast-forward to 2018 and you have announcements from MongoDB that they now support ACID transactions.

Many older coders have seen enough of these cycles, and have adopted a justifiably-cynical posture of "wait for it...".


Yup. And when you chime in with "wait for it..." you get nasty, vitriolic comments thrown at you, like you're going to rot in the basement of a Fortune 10 feeding a legacy system while the $cool_kids enjoy the sunshine on the beach with their new shiny frameworks.

Yet "Wait for it" usually turns out to be correct.


>Perhaps it might be better if younger programmers learned the old stuff instead of just badly reinventing the wheel every few years.

I think if you pick a concrete example of this phenomenon (say, Java applets vs. WebAssembly) and you ask "so why didn't someone take Java applets and update them, why do we need WebAssembly?", the answers to why this happens start to become more apparent. It's not so much that Java applets were good and then we decided they were unfashionable as that Java applets were always kind of ugly and we eventually stopped tolerating them.

For examples from before, say, 2000 - take Turbo Pascal - a lot of the answer is that the thing in question was a proprietary product, so once the financially backed dev team stopped working on it, users were forced to move on to something else. Other times it's because the thing was written for an '80s computer platform that no longer exists outside of emulation and retrocomputing collections. Sometimes it's that the graphical standards the original used were for hardware with very different capabilities, and making the thing usefully work on a newer system would almost require a total rewrite anyway.

Even beyond all that, there's another problem: Software is fairly hard to do literature review for. If I want to know if a particular research idea has been done before, I can hop on Google Scholar and figure it out with a few hours of carefully constructed search queries. Doing the same thing for software is very difficult, especially because most products for the majority of the field's history were proprietary. As a consequence, we don't have our history available to analyze in any easily accessed form. The Internet Archive has been doing what they can to rectify this situation. (See: https://archive.org/details/BeyondCyberpunkMacintosh) Nonetheless, there are still serious problems in this area.

I'm a younger person (23) who shares your frustrations, but I don't think that you can really wag your finger at people when there are real barriers to what you're proposing.


> I think if you pick a concrete example of this phenomena (say, Java applets vs. WebAssembly) and you ask "so why didn't someone take Java Applets and update them, why do we need WebAssembly?", the answers to why this happens start to become more apparent. It's not so much that Java applets were good and then we decided they were unfashionable, as that Java applets were always kind of ugly and we eventually stopped tolerating them.

I've been asking this exact question on various HN threads with WebAssembly announcements, and the answer is always one of "Applets are insecure" or "Java is insecure".

So I don't really follow your example here... whatever security model you want to have, it seems much more obvious to implement it for Java applets than to implement WebAssembly and then implement your preferred security model for WebAssembly. Why do you think we replaced applets with WebAssembly?

(It's kind of a leading question -- my experience as a browser user suggests that we stopped using applets, spent a while not having them, and then started working on WebAssembly. By that analysis, we never replaced applets with WebAssembly -- we replaced applets with nothing, and then replaced nothing with WebAssembly. That would tend to support the more crotchety answer of "people developed WebAssembly because they just forgot about applets".)


My understanding was that the concern wasn't necessarily that the JVM was insecure, or the browser was insecure (although they've all had vulnerabilities at one time or another); it was the integration between the JVM and the browser that was difficult/impossible to get right. This held not only for Java but for any browser plugin (Flash, Silverlight). The entire concept was just bad.

Now, where WebAssembly comes in is as a compilation target for anything to the browser. I had the sorry misfortune of working on a project that had an applet that interacted extensively with the host page. It was excruciating; the APIs were not up to the task. At the time, before the canvas tag, that was the length we had to go to to get "advanced" functionality into a page. Nowadays that could all be done with JS (or X compiled to WebAssembly) and a canvas element. Things evolve.


Native plugins only work on desktops/laptops. They don't work on mobile because of hardware and software diversity, and also because the mobile security model is more consumer-friendly. And mobile users are now more than 50%.

The move onto the web alone was responsible for a lot of that killing. To this day, web tools are not as good as the old desktop ones, but those old ones are all dying. Mobile devices also killed some stuff, but much less.

Before that, computer networking killed most of the running systems. And before those, any new computer would require all new software, so there was no continuation at all.

Things seem to be getting slower with time... maybe except on Javascript frameworks.


Seems like one exception worth mentioning is the arrival of microprocessors and the PC - from mainframe to Apple to network...

Only security and networking seem stable. Everything else is like sand dunes.


Well, in my comments I was restricting my examples mostly to server side software running on UNIX.

This eliminates much of the proprietary lockin and portability problems you mention. I never was saying people should just keep using Delphi into the 21st century.

Also, my comments apply to broad categories of solutions not just software implementations in the narrow sense.

For example: I just replaced (had to, for perf reasons) a NoSQL solution in one of our applications with MySQL.

Ironically, the application would actually scale better with NoSQL in theory, but the choices the developers made along the way (data modelling and API design) made it perform worse on the NoSQL DB than on MySQL.

The tactical solution, which fits our needs and the size of our user base, turned out to be to replace a new shiny thing, with an older technology. The benefit gained was more flexibility with queries, which in turn allowed us to speed things up significantly with little effort.

The previous team burned a lot of time learning the new NoSQL thing to then not get any benefit from it at all.


WebAssembly is definitely a great example of a technology looking for a problem to solve. Java Applets are more comparable to conventional JavaScript, and the latter’s advantage (DOM integration) isn’t even in WASM yet. WASM is probably for something else.

A better comparison might be Java WebStart vs Electron.

This. And older engineers have finer bullshit detectors. ISTM ageism is real, and it's all about the money.

I’d add that survivor bias applies both to technologies and developers.

Creaky old software tools from the 70’s that are actually still in use have fought off decades of competitors that attempted to improve them.


I certainly don't want to chime in with a detraction that misses your point, but I do feel there's a slight difference. (for context, I'm 42)

I think the reason older coders don't "keep up" is because you have to lose ground first. If I want to adopt a new technology, I have to first learn how to handle problems I ALREADY KNOW HOW TO SOLVE with my current technologies. The better I am with solving the problems I encounter already, the more I will have to backpedal before I can go forward in a new tech. And while I definitely get better overall from learning a broader base of techs, over time there are definitely diminishing returns. The cost-benefit of learning something new vs spending that time either learning more of what I'm already in, or just solving any of the infinite things I want to get done is a balance that will slide more and more towards not bothering to learn new tech unless there's a real need.

Add in that this is an industry rife with Imposter Syndrome that DEFINITELY bites you in the face during any moment of struggling to "get" a new tech, and there are plenty of incentives against embracing new tech beyond just "I don't like change". (though that is definitely present - I'm a JS dev currently, so I'm well familiar with the dismissive and arrogant attitudes that can be tossed around).


Same (41). In my experience, most of the money seems to be in paying down the technical debt acquired by projects when they were in the startup phase. So I spend most of my days deciphering antipatterns in order to get down to the abstract logic, then fixing issues using as few transformations as possible since there is no budget for a full rewrite.

So most of the issues I fix are due to mistakes I never would have made, because I already made them 10 or 20 years ago.

The worst part is that I no longer think in code, I think in piping data around and working functionally or declaratively because my brain is so apprehensive of side effects. I've found that younger programmers don't tend to realize how dangerous imperative programming (especially object-oriented programming) can be in the long term.
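A toy sketch of the difference I mean (hypothetical names):

    # Imperative: mutates shared state; distant callers get surprised later.
    def apply_discount_in_place(cart, pct):
        for item in cart:
            item["price"] *= 1 - pct  # side effect: the original cart changed

    # Declarative/functional: same logic, no side effects.
    def apply_discount(cart, pct):
        return [{**item, "price": item["price"] * (1 - pct)} for item in cart]

    cart = [{"sku": "A", "price": 10.0}]
    sale = apply_discount(cart, 0.2)
    print(cart[0]["price"], sale[0]["price"])  # 10.0 8.0 - original untouched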

So what I really struggle with is motivation, because it can be so disheartening to not be listened to, or to work hard on an implementation only to have the elegance of its abstractions muddied beyond recognition by developers who don't take the time to understand them.

It's probably too late for me. I cling to a fantasy though of saving enough money that I can take a step back from programming and work on some of my own demo languages/frameworks that illustrate the flaws with the status quo. Because of handwaving and cargo culting, today's apps and websites encode so little core logic in proportion to lines of code that I'm hopeful I can make a niche for myself working at a higher level of abstraction and bidding less money on contracts someday. But it might just be a fantasy.


> So what I really struggle with is motivation, because it can be so disheartening to not be listened to, or to work hard on an implementation only to have the elegance of its abstractions muddied beyond recognition by developers who don't take the time to understand them.

This post hit really close to home for me. It's especially depressing to be caught between these "pearls before swine" moments and Impostor Syndrome and for a while this made it very difficult for me to work up the courage to go to job interviews.

I am currently working somewhere that I genuinely love, as one of two senior engineers on a team of 7. I think I love it so much because I work with people who care about the quality of their work and respect the experience I bring to the table. They've been around long enough to see the cost of making the same mistakes I made 5 years ago, and have outgrown the "just ship it" mentality so many startups are born with and rewarded for. When you're trying to build a 100-year company, you value good engineering.

Most software jobs in California are at startups that are not trying to build a 100-year company, more like 100-week companies. In fact, most venture-funded startups are trying to have as short a lifespan as possible, because they exist to generate short-term returns through big-money exits instead of creating reliable revenue streams. This often actively disincentivizes high-quality engineering, and I think is a big reason why it can be so depressing to work with startups.


I think it's more along the lines that most new technology is not fundamentally different from the old solutions. Like, how different is HMVC from PAC, for example? How different are cool language X's mixins from those in Lisp or C#? How different, really, are channels and message passing in Go compared to Erlang?

If a new technology is good or seems like it has potential, I dig into it, and due to experience I can generally pick something up to the point of productivity in a day or two, where it takes junior developers weeks or months to get there. Or, at a minimum, I know enough to troubleshoot blocking bugs others can't solve, by knowing where to look for solutions and what those solutions might entail. So to an extent I keep up to date, in that I read the value propositions and reviews of new technologies so I know when they might be an appropriate tool in my toolbox; on the flip side, I'm not in a rush to pick up every flavor-of-the-week framework or library that is more than likely to vanish without a trace in 3-4 years.

Meanwhile, outside of a persistent Google recruiter who's been asking me to do an onsite for the last year, I've gotten dear-john'd for something like three roles in the last month, including one where I'm personal friends with the SVP of engineering. Is it missing keywords on my resume, my age, my career path, my education? Who knows - but unless the market has tanked, it strongly differs from my experience a few years prior.

https://www.dropbox.com/s/haod1gpfjd723lg/Keith-Brings-Resum...


Your resume reeks of someone who lacks focus.

1. Objectives on resumes are useless fluff. Yours doesn’t add anything useful. It’s so vague, I couldn’t even tell what type of job you are looking for. Project management? Test? Software Development?

2. Your whole summary of skills section is word soup. I would completely ignore it. Anyone can list technologies they touched once. It’s too unfocused. In your experience section, list how you actually used those technologies. Why put that you know how to use source control? That’s like listing that you know how to type. Why list 8 different languages? Could you handle a deep technical dive in all eight?

3. In your work experience section, you listed your accomplishments, but I have no idea what technologies you used to accomplish anything. You didn’t list the languages you used in any of your jobs since 2007.

I wouldn’t call you in for an in-person or a phone screen based on your resume. You claim that you can pick up a language in a day or two, but I can’t tell from your resume where you are technically deep in any specific area. Again, it looks like you think that because you can figure out the syntax of a language, you can avoid learning the ecosystem, the tooling, the frameworks, etc.

You can ignore picking up popular frameworks but you won’t be competitive with people who have stayed up to date.

If you came into an interview saying that you could advise on a technology based on “what you read” and you said you decided not to actually put it into practice, that would be a red flag.

I’ve been working for 20 years. I at one point had an encyclopedic knowledge of C and C++, that’s nowhere on my resume, neither is VB6, Perl, or PHP. I don’t want a job in any of those languages and I don’t want to discuss them.

My resume for the last 8 years has been entirely focused on telling the story I want to tell. I’m a backend C# developer/architect with some front end Javascript experience.

Slowly it’s been morphing into an architect who specializes in “cloud native solutions” with AWS and I’ve added Python and Node and hopefully will be adding the $cool_kids front end stack over the next year or two.


Fair feedback. The Summary of Skills is mostly a keyword hack for passive job hunting; I can move it around.

I can deep dive into Elixir, embedded C, C/C++, Java, C#, and PHP. I've used all of these on the job in the last 5 years, excluding Perl, Python, and Ruby; Python and Ruby I've used on side projects, or for tutoring a friend on algorithmic programming for an evangelist role Google asked her to do a follow-up interview for. This is also not a comprehensive list of every language I've ever worked in or touched; I don't list Lisp, F#, Haskell, ActionScript, etc.

It probably makes sense to prune out outdated frameworks, and I should probably be updating this with some of the more recent frameworks I've worked with. But the thing is, I'm bright - top 1% of the top 1% IQ, inability to write a compelling resume aside. I can pick up frameworks like candy; I read the manual. Further, when I want to figure out how to do something that's not initially obvious or covered in the documentation, I just read the source code or look it up - which is what everyone does, but I do it better. That 2-3 day number includes getting a handle on the libraries and frameworks being used on the project. I can pick up just the syntax in the 10-90 minutes it takes to glance over existing code and documentation for most languages. For existing code, I can usually follow the flow on first read without knowing the language in advance. I learned C/C++ when I was 14 from reading intermediate-level game programming books and reverse engineering them to figure out what the constructs do, since there was no internet or other books handy, and the skill stuck with me.

So how do I stress that flexibility on my resume, or tell that story, without coming off as braggadocious?

P.S. Do you mind if I ping you with a modified version in a few days that takes your suggestions into account?


I’ve resurrected an old associated email account to keep this conversation off HN - email me. My email address is in my profile.

But for the more generic comments that are germane to the topic....

*Fair feedback. The Summary of Skills is mostly a keyword hack for passive job hunting; I can move it around.*

You don’t need to move it around, you need to get rid of it. Your resume should express how you’ve used a technology/framework/language in a way that added business value to a company.

Besides, if you’re waiting for the job tooth fairy to show up and hand you a job because you list a bunch of keywords on your resume - you’re doing it wrong. At some point in your career you should have built a network of former coworkers, managers, and even good local recruiters that you can reach out to.

I can deep dive into Elixir, Embedded C, C/C++, Java, C#, and PHP. I've used all of these on the job in the last 5 years, excluding Perl, Python, and Ruby; Python and Ruby I've used on side projects and for tutoring a friend on algorithmic programming for an evangelist role Google asked her to do a follow-up interview for.

“Deep diving” is not knowing the syntax of a language. It is having enough experience to know the ecosystem, libraries, frameworks, and pros and cons, and to have written and been involved in large projects using the language. By definition, you can’t have been deeply focused on that many languages in five years. What exactly is your focus? Your resume doesn’t give a clue as to what you’re good at.

Again, you come across as someone who thinks they are a special snowflake because you “learned C when you were 14.” You’re not. You’re posting in a forum with a lot of greybeards who are looking at this entire exchange just thinking “that’s cute.”

And “knowing algorithms,” LeetCode, and how to invert a binary tree in Python puts you at about the experience level of a new grad.

So how do I stress that flexibility on my resume, or tell that story, without coming off as braggadocious?

From your responses you come across very much as an “expert beginner” (https://daedtech.com/how-developers-stop-learning-rise-of-th...).

I’m a reformed former dev lead. What you call “showing flexibility,” I would see as being a jack of all trades and a master of none. If, in 2019, you’re in any major metropolitan city in the US with years of experience as a software engineer and you’re not getting lots of hits, it’s you.

Your resume doesn’t (a) tell a coherent story about where you’ve been, which technologies you’ve used to bring business value, and what your focus is, or (b) show where you are trying to go.


I'll send you an email; meanwhile, to hopefully clear my good name:

I learned C, algebra, and calculus at 14, for what it's worth. I wrote a general-purpose game GUI system—scrollbars, font system, picture-in-picture, keyboard and mouse handlers, etc.—over thousands of hours during the summer break between junior high and high school. On my own. I mean a little brown kid whose mother dropped out of high school, who didn't know anyone else involved in computer stuff, and who had no access to people with math backgrounds. I wasn't just writing hello world and simple console apps.

I know there are thousands of other developers on here with similar backgrounds, many of whom are much better than I am, but I started off good. I took AP C/C++ my freshman year of high school and was at the top of my class, etc. Although I quit programming for a few years after being refused entry, my sophomore year of high school, into the specific college program I wanted to attend, and after the shift from DOS to Win95 and the GameSDK, which I just didn't enjoy working with as much.

I never got heavily into the 31337 coding or algorithms side, although I've been meaning to now that I'm down to a nice 40-hours-a-month work schedule. My focus has mostly been performance tuning high-volume web sites, system/software architecture, and internal systems and tools. I'm a strong autodidact, so I constantly go out of my way to learn new things, and I usually go out of my way to do things the hard way (e.g., investigate the best possible solution on a given platform/framework for a given problem instead of just going with the tech-debt-laden good-enough solution based on what I already know), but I've been tied up picking up IOT development and Elixir/Phoenix for the last few years instead of Electron or whatever is hip these days.

I'm currently the primary developer and architect on a multi-million-dollar IOT stack that includes a GAE Java client API layer with Objectify, an Elixir/Phoenix backend with Mnesia, Riak, and formerly Redis, and an Angular 4/Bootstrap 4 TypeScript admin page. I've had to help debug issues in the Objective-C iPhone client, and I wrote some of its initial framework code and library wrappers to interface with some of the Elixir libraries and to add a layer of indirection around the generated GAE APIs. I maintain some Node.js admin tools, although I've migrated most of that over to Elixir. I've also applied some patches to a weather-forecast proxy service maintained by another developer in Go, to enable better handling of null values and some other odds and ends, and I've read through the documentation and plenty of articles here and elsewhere discussing the language's high-level pros and cons versus Elixir, to get a feel for it and its use cases, although I don't know it well enough to list, having only spent a few hours here and there with it. I could work in the language today if I had to, but I don't know all of its quirks yet.

In addition to the recent Java/Elixir and some Node experience above, I had to go in last minute and spend a good chunk of time rewriting my client's product line's 2017 firmware after the Chinese hardware team failed to get the product functioning reliably. Prior to that, I had to help troubleshoot SSL connectivity and resolve an SSL cipher issue after another hardware team with much more experience than me in the area had failed to do so. These were my first real exposures to embedded systems work, discounting designing and writing a Mode X GUI on the 386 when I was a kid, and I objectively did incredibly well based on outcomes.

In addition to maintaining these Java, Elixir, TypeScript (the admin page, mostly pushed off on the junior dev as of the last year), and C firmware projects, I maintain a PHP RPG game which, because I wanted to dig into it, I set up a few years ago to run on a Docker cluster: senginx with dynamic IP resolution (because that cost extra on vanilla nginx) routing to PHP-FPM, Elixir, and static containers; MySQL in containers; Postgres; Jira/Atlassian containers; and Consul DNS for hooking things together without losing connectivity after a reboot. I don't mean I checked out some existing containers; I went in and figured out how to do a lot of it myself (I should probably include that somewhere in my resume) so I could give others advice on the stuff.

In the previous five years I've additionally helped troubleshoot some C# issues for JetzyApp, although I'm starting to get a bit rusty. At GreatNonprofits I helped push the change to Bootstrap 3 and responsive design, updated parts of the code and API to run on Laravel (because it looked more interesting than previous PHP frameworks I'd used and had good-enough performance characteristics), and improved the workflow by setting up production-like Vagrant developer sandboxes, CircleCI continuous deployment, etc.

At Microsoft I took a crude, poorly working batch-processing script responsible for converting test data from one system and converted it into a reliable system service. I didn't need to make it a system service; it just seemed like the most reliable approach to get it to where it was supposed to be, and I'd never written one, so I wanted to dig into it.

I continuously go out of my way to learn new things, but I think I've been making the mistake of learning the things I find most interesting instead of what the market most values lately.


You’re digging a hole for yourself. HN is not the place to brag about your superstar coding skills. There are probably literally thousands of people here who have more impressive credentials.

In school, many of us were used to being the smartest kid in the room. It’s just like being the star basketball player in high school, then going to college and being surrounded by a dozen kids who were also the best in their high school.

For instance:

I know there are thousands of other developers on here with similar backgrounds, many of whom are much better than I am, but I started off good. I took AP C/C++ my freshman year of high school and was at the top of my class, etc.

Bragging about being good at C based on how well you did in a programming class isn’t saying much. High school programming classes have historically been easy for anyone who was already a programming geek. Do you know how many of us wasted time on comp.lang.c on Usenet before the web even existed?

On any technical subject that’s posted here, you could have experts in the field that could put either of us to shame.

I don’t think I was a special snowflake, but by the time I was 14 (1988), I turned my nose up at any “high level” language because they were too slow for the little hobby projects I wanted to do on my 8-bit computers. I had already been doing assembly for 3 years. A lot of the greybeards around here can tell you much more impressive stories.

The whole point of this lecture is that your professional experience is no more impressive than that of plenty of people I’ve worked beside every day, and certainly no more impressive than that of people on HN.

I am not trying to discourage you. But as has been said by myself and others, you don’t have a consistent, easy-to-follow resume that tells a story about what your area of expertise is or how you stand out from dozens of other applicants, and I couldn’t tell from reading it where you would be a good fit.


Noted.

I am at lunch and on mobile so I can't get too detailed, but, with respect, I see a number of errors on your resume. You are simultaneously too broad in your objective and too detailed in the body; it's not telling a coherent, targeted story. You can leave off "references available upon request" because that's assumed. You can just list your school instead of noting that you only got an associate's degree. If I were you, I would find a good recruiter and ask them how it reads. Make it more focused and scannable. I put a lot of work into my resume and personal branding on LinkedIn, and it pays huge dividends.

What steps do you take for personal branding? I'm really bad with this networking, self-selling stuff.

Appreciated. I haven't really polished the thing in years; it's a bit rough.

If I want to adopt a new technology, I have to first learn how to handle problems I ALREADY KNOW HOW TO SOLVE with my current technologies.

So much this. I’m in my 40s and I am on my third pivot. The first was as a C++ bit twiddler for 12 years. The second was as an MS-stack “enterprise developer,” and now a third: JavaScript everywhere, the $cool_kids front end stack, AWS, and the Docker ecosystem. It’s been about a year, and it’s realistically going to take 2 to 3 more.


> The cost-benefit of learning something new vs spending that time either learning more of what I'm already in...

That's a good point. But you also need to consider it from the point of view of the person sitting across the table. You might be very productive solving a problem in COBOL (to exaggerate the point), but they are much more comfortable in Python/Vue.js and don't want to burden themselves with legacy ways, because they can find someone solving the problem in the stack they want.


Totally. I hope I didn't come across as saying my reluctance to adopt new techs is JUSTIFIED. I absolutely need to do it to be any good, and even those techs that DO end up being flash-in-the-pan can leave me better off for learning them. I'm just saying any reluctance is UNDERSTANDABLE, not "correct".

It's funny you mention Python because I am in my mid-thirties and was writing Python in high school.

I hear this from older developers a lot, that the problems we are solving today have already been solved. That may be true to some extent, but I think that attitude is dangerous: the ways the problems are solved again have different characteristics and advantages, and generally push the ball forward. I think of it like forest fires clearing out old growth for new, and I think it's best to have a zen-like mindset of being a constant beginner.

I completely agree. I'm not giving a rant about society re-inventing the wheel, I'm talking at a personal level. If I know language X, but I'm trying to learn language Y, I have to learn how to do things that I already know how to do in X. It's a change that HAS to happen to learn, and I'll walk away a better coder, but that doesn't make the process of relearning any more enjoyable or more productive.

I switched jobs recently and tried to adopt a zen-like attitude of beginner's mind. It definitely helped... but I'd be lying if I said I never hit frustration points where I'm trying to go from A to C but I have to learn all about B, when I don't WANT to care about B; I want to get to C. A good attitude helps, but I think the underlying reason is still true and is part of why experienced devs aren't adopting new tech at the rate of newer devs.


I don't know. I've been writing software for over 24 years. I'm not that ancient, but learning C as a youngster helps pad the numbers.

There are an awful lot of cyclical patterns in development, and an awful lot of convergence. HTML5 canvas, for example, is an awful lot like writing custom 2D graphics back in the day. Docker, Vagrant, etc. are neat, but there were chroot, zones, VMware images, Cygwin, etc. before. A lot of hot new language features are just existing patterns from the functional discipline being tacked onto imperative languages, and vice versa.

The fact that the constraints and reasons for adding generics to Java or lambdas to C/C++ are different from the constraints and reasons for adding templates to C++ or lambdas to Lisp doesn't mean you can't jump in, if you understand one well, and identify the pitfalls or gotchas of the newer implementations.
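To make that concrete, here's a minimal sketch (mine, not anything from this thread) of the same ideas in Python's typing module; the parametric polymorphism behind C++ templates and Java generics, and the higher-order functions Lisp had decades earlier, carry over even though the syntax and constraints differ:

    # Illustrative only: parametric polymorphism, the concept behind
    # C++ templates and Java generics, via Python's typing module.
    from typing import Sequence, TypeVar

    T = TypeVar("T")

    def first(items: Sequence[T]) -> T:
        # Works for any element type; a type checker tracks T for us.
        return items[0]

    # Higher-order functions and lambdas, a Lisp staple long before
    # they were grafted onto Java and C++:
    doubled = list(map(lambda x: x * 2, [1, 2, 3]))

    print(first(["a", "b"]), doubled)  # a [2, 4, 6]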

Is it really meaningful to go all in on learning how feature X from 2000, repurposed to solve problem Y, is all that fundamentally different? WebSockets are great; so were long polling and Comet architectures. REST is great; so was SOAP, and so were their non-web COM messaging and cross-application-communication ancestors.
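For instance, here's a hedged sketch of classic long polling, assuming a hypothetical /events endpoint that holds the request open until data arrives; the "push updates to the client" problem WebSockets solve is the same one this solved years earlier:

    # Hypothetical long-polling client; the URL and payload shape are
    # made up for illustration, not taken from any real API.
    import requests

    def handle(events):
        print("got events:", events)

    def long_poll(url):
        while True:
            try:
                # The server holds the connection open (up to ~30s here)
                # and responds as soon as it has something new.
                resp = requests.get(url, timeout=35)
                if resp.ok and resp.content:
                    handle(resp.json())
            except requests.Timeout:
                continue  # no news before the timeout; reconnect and wait

    # long_poll("https://example.com/events")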

There are a lot of pitfalls and unique characteristics to new technologies, but there are a lot more new spins on old ideas, common underlying concepts, and reinventions of the wheel.


> so was SOAP

You had me until that (j/k)(mostly)

> Is it really meaningful to go all in on learning how ....is all that fundamentally different?

That'd be the diminishing returns I mentioned up-thread: each iteration of learning a new way to solve an old problem gives you less. I'd argue it's still more than zero, but we aren't comparing to zero, we're comparing to the opportunity cost.

That said, I think, as a generalization, experienced devs tend to be prematurely dismissive. That is, however, a personal opinion.


Hey, SOAP was great when the alternative was coming up with your own custom RPC protocol, at least if you were using Visual Studio's tooling and nullable types didn't show up to give you a headache. XML was great when the alternative was writing your own custom serialization format. I don't miss them, but I remember them being a step forward.

* stealth edit.


100%. I think this is the biggest trap that an experienced developer can fall into. "I know Tech X, I can still get lucrative jobs in Tech X, there's no reason for me to jump out of my comfort zone and learn Tech Y."

But if you don't do it, you're stuck on the local maximum of Hill X, and as it gradually erodes away, one day you realize that you don't have any lucrative jobs that you're qualified for, and you're so far away from what Tech Z is nowadays that you're effectively starting over from scratch, except that also you need to unlearn a bunch of stuff that's no longer true.

And this happens fast. If you try to coast for even, say, five years, so many of your skills will have turned to dust.

This is why the only real way to have a long-term career as a developer is to genuinely be interested in this stuff for its own sake. You need to want to go learn the latest new tech not just out of a cynical career calculus, but because you think it's genuinely cool and interesting and sounds fun. As long as that's true, I think you'll be able to be a valuable dev even as you get older; as soon as it stops being true, well, time to consider what's next for you.


There's definitely a balance. Switching languages doesn't always make you a better coder. There's the danger of shallowly scratching the surface of a bunch of different languages/frameworks/etc. without deep knowledge in any of them. That by far has been my biggest career challenge: trying to pick the right things to invest in.

It's also often just not true. A lot of the tools we use to solve problems today are miles better than what we used in the past. Imagine trying to process terabytes' worth of data with a parallel compute engine before Spark: you would have to write tons of custom scheduling/workload-balancing code, whereas today I can buy a Databricks instance from Azure and start writing a massively parallel stream-processing demo in an hour.
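As a rough, hypothetical sketch of what that looks like today (the file path and cluster are made up; on a managed Databricks cluster a spark session is already provided):

    # Illustrative PySpark job: a parallel word count over a large text
    # corpus, the kind of work that once meant custom scheduling code.
    from pyspark.sql import SparkSession
    from pyspark.sql.functions import explode, split

    spark = SparkSession.builder.appName("demo").getOrCreate()

    lines = spark.read.text("/data/huge-corpus/*.txt")  # made-up path
    words = lines.select(explode(split(lines.value, r"\s+")).alias("word"))
    counts = words.groupBy("word").count()

    # The engine handles partitioning, shuffling, and retries for us.
    counts.orderBy("count", ascending=False).show(20)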

Now, a lot of companies do over-engineer and cutting edge tools aren't necessary in a lot of cases, but that's a separate problem.


As an older software engineer (52), I have developed a good sense of which "new technology" has a fleeting, here-today-gone-tomorrow lifespan. I have seen a lot come and go. This is especially true now with "web tech," which I quit paying much attention to because it is so ephemeral. It is my math background and numerical computing (e.g., optimization) skills that keep me ahead of the pack. Only the best of the younger generation will keep up with me.

Learn things that will be important for forever. Life is a breath and then we are gone.


From the get-off-my-lawn crowd (60): I don't mind being pushed out, mostly because the work doesn't look that interesting at this point. I managed for 38 years with almost nothing in the way of languages by modern standards (C, C++ when it became available, and a whole bunch of assembly languages for exotic processors). Most of the heavy lifting in the workstation and high-end embedded world was algorithmic.

All they are is dust in the wind.


That is great experience, particularly "a whole bunch of assembly languages for exotic processors". Here's a job listing for you: https://news.ycombinator.com/item?id=19284153

Within 10 years here I had encountered more than 10 different CPU architectures. I've lost count now, but it is probably at least a dozen. The same goes for OSes.


I don't know; anecdotally, these days it seems things are not quite like that anymore.

I'm pushing 40, and I like reggae, acid jazz, and electronic music. I keep up to date (as far as one can) with tech, programming, cloud, and the like.

It's not just me - it feels like a lot of older people around me don't match the old stereotypes. Older colleagues are learning Typescript, my grandmother reads erotica...

Youth culture is of course always changing, but older people are perhaps more liberal than they used to be, and don't feel like they need to behave a certain way anymore.


It's not just you. The youngsters in tech are blatantly ageist. They'll be singing a different tune when they're 40 and the generation behind is already chiseling out their headstones.

> ...my grandmother reads erotica...

I'm proud of your grandmother, but TMI, man. TMI.


You're making a lot of anecdotal generalizations here that contribute to the problem rather than help address it. What kind of music somebody likes or the clothing they're comfortable with shouldn't be part of the conversation at all, and for every anecdotal old-guy that hates Electron, I've met an anecdotal old-guy who's sharp as a tack and pushing the envelope with the latest tech.

> for every anecdotal old-guy that hates Electron

Any sane person with half a brain hates Electron.

(Not that there are any good alternatives, but still...)


The chances that YOU, when you reach the same age, will be interested in pushing the envelope with the latest tech are very low. The chances that you will find the "latest tech" absurd, and would rather starve than push said envelope, are very high. When I was young, I couldn't even imagine a future with the Spring framework, Hibernate, Angular, AWS configuration, etc. The profession is dead for me, and has been dead for quite some time. This is my own anecdotal evidence, sorry.

Your bad attitude doesn't have anything to do with me.

You can't know what your future attitude will be like. Bookmark this post and review it 20 years later :)

But the phenomenon I'm talking about is well established in psychology: https://www.psychologytoday.com/gb/blog/mr-personality/20141...

The older I get, the more 'liberal' I get.

I think it comes from having more and more empathy.

The more I live, the less I feel I can judge people and their choices. Even if it is to prefer Mongo over Postgres :-)


Mongo supports ACID transactions nowadays, so the reason for vitriol has lessened a lot. I like my explicit DB schema, though.

There is a problem with equating being "up to date" with being "competent for the position." Up to date might mean you know how to work in the syntax and conventions of the stack of interest, but that's largely independent of overall competence in software development and discipline. If an older programmer is solid on the latter, getting up to date really doesn't take a large amount of time. It's just harder to assess, like it's hard to assess in interviews whether an engineer is even "up to date"; look at the perennial debate on how best to interview candidates.

> It's just the classic "older people tend to be more conservative" fact.

Can we please kill this myth already? That was extraordinarily true during the era when social changes swept through society, but someone who spent their formative years during the 70s will hardly regard today's society as anything other than an extension of the changes which took place then.

There are many different ways to act conservatively. Resistance to new ideas because "but we've always done it this way" is a lot more common with people whose definition of "always" stretches to five (maybe ten) years.

> Electron anyone?

Well, let's put it this way: Most ideas are bad. But you can't dismiss all ideas on that statistical ground, because then you would miss the really good ones.


>I also noticed that older programmers tend to not be as up to date in newer technologies or ways of doing things.

Serious consideration: I wonder how the cost of "older programmers not being as up to date" compares to the cost of younger programmers too quickly adopting newer technologies.


If you think newer "technology" (read: programming language or framework) is superior to older "technology," you are likely ageist, and part of the problem.

So you think that someone saying that Python or C++ is superior to COBOL is ageist.

Depending on the context, yes.

If your bank has some COBOL systems running to keep accounts up to date, do you really want some team of pinheads to "fix it" by porting it to Python?


> That's not the point, I'm talking about distributions not individual cases.

But you're not talking about distributions. You talk about the individuals you meet in interviews.


>They hate new music

That's a different issue. New music really is garbage; the entire industry has changed over the last several decades. There's a reason so many younger people are listening to classic rock.


Sorry, but this is spoken like a true curmudgeon.

Can I ask where it is you live that you see hordes of young people finding solace in classic rock (or any sort of purely guitar-based music)? Except maybe in US rural communities, where it's all about country music (always has been, always will be), those days are well and truly over. "Nobody wants an electric guitar anymore": https://qz.com/1013293/rock-and-roll-is-dead-sales-of-fender...

Meanwhile, electronic music has taken millennials by storm. The Tomorrowland festival sold out nearly 400,000 tickets in 40 minutes a few weeks ago. Same kind of numbers every year for Electric Daisy Carnival in the US.

Never in history has there been so much quality music released every day in all genres, because musicians can now do so without the gatekeepers of the past, much like YouTube has shattered the TV networks.

The consequence of that is that it's all about fragmented niches, and discovery can be challenging. But there is gem after gem in my Spotify "Discover Weekly" playlist. There has never been a more exciting time to be a music lover!


Everyone has different tastes, but there is some nice new music out there.

This for example: http://www.hawaiianreggae.org/artists

Nice and mellow, lots of young new artists. It's a pretty good scene.


These days, to find new music that's actually any good, you have to look really hard, and everything that isn't mainstream has veered off in very different directions (meaning any given artist/subgenre probably won't appeal to you). It isn't like the "old days," when mainstream music was actually good. There's a reason all those rockers from the 70s and 80s (and some from the 60s!) are still touring.

Your comment hilariously proves the point.

Ask the kids what they think about today's mainstream music. They will say it's amazing. Who do you think drives the charts?

Yes, yes, I know: you are, or you know, a couple of youngsters who think anything newer than Pink Floyd is shit. And you think the charts are not organic, that they're "marketing," where the record labels pay for entry and rank.


I think it's more that older coders tend to have more commitments; there's less time to keep up with all the new things while juggling your job, your family, and whatever clubs your kids are members of.

One of my responsibilities to my family (I’m in my mid 40s) is to make sure that we are all fed and have a roof over our heads. If that means taking time to keep up with the $cool_kids and stay employable, so be it.

If you look at my resume over the past 10 years and the tech I’ve used, it’s indistinguishable from younger developers.


To a certain extent you also just need to be aware of where you are and where the market is going. You don't need to keep up with every single new technology, you just need an idea of where you fit in the current market and what skills you will need to pick up if / when you go to switch jobs.

You need to pick up the skills before you switch jobs, because other people will already know them and be able to hit the ground running. I’m already asking for above-median salaries (not talking about FAANG salaries), so I need to show a level of architectural efficiency. Meaning no, I am not going to study LeetCode, and I’ll probably laugh if you ask me to invert a binary tree, but I will draw out and discuss a well-designed architecture and software development process, and answer questions that show I have proficiency in the technologies I claim to know.

45. I'm certain "trap music" is a typo.

https://en.m.wikipedia.org/wiki/Trap_music

But... it was popular in the late 90s, which corresponds to us mid-40s people anyway.

