
Ask HN: Are older devs more resistant to adopting new tech? - baron816
I've noticed a lot of comments here recently from self-proclaimed tech veterans talking about how they won't even look at new technologies until they've had a few years to mature, or complaining about new patterns/paradigms that put engineers' enjoyment/productivity ahead of raw compute performance. Is this a systemic thing? I'd imagine after 20+ years of writing code a certain way, I would be loath to completely change my approach, but I'm concerned that this could be a source of (potentially justified) ageism in the industry.
======
mywittyname
Veterans have seen technologies come and go. If they were keen to hop on every
new technology that came out during their career, they'd literally never get
anything done. The risk of adopting a technology that will fail high and the
results can be disastrous.

It might be seen as stodgy to pick Java alongside a relational database for a
brand new application over the latest JS hotness plus NoSQL. But we know that
Java is going to still be around in 20 years, we know the language and
architecture can do what we plan, and it can easily expand and scale as
the product grows. We know what deployment, integration, maintenance, etc.
look like.

What young developers don't see is the long, long, long line of failed
technologies that were hot, then faded into obscurity. It's very hard to bet on
winners.

~~~
nkkollaw
This is exactly how it is.

Unfortunately, it seems to me that new devs always have to be kept entertained
and know/care very little about the business, and would rather tank the
company having fun experimenting and then move on than get the job done.

Maybe I'm getting old.

------
kareemm
I've been writing code professionally since 2001, and on an amateur basis
since learning BASIC and machine language on my Commodore 64 in 1989.

The thing that many younger devs don't (yet) get is that tech works in cycles.
Today's hotness is yesterday's tried-and-true. As an example, Java (J2SE) came
out in 1998 and my guess is - based on open positions on e.g. Stack[1] - that
you'll still be able to write Java for decades.

So for an old guy to adopt new tech:

1. I'd want there to be obvious benefits _that outweigh the costs_ for the
new approach. As an example, I'm not convinced FE frameworks are the best
approach for CRUD apps. If you're building Trello and need crazy
interactivity, sure. But you incur a bunch of cost vs. say Django / Rails /
Laravel with intercooler or Stimulus sprinkled into interactive pages. And
what's the huge cost for? Skipping a few page refreshes?

2. I'd want tech to be mature enough that I feel good it's going to be around
and therefore is worth investing in. As you get older your time becomes more
valuable as you tend to take on other obligations (kids, spouse). Hacking on
new tech late at night or all weekend is less palatable or possible than it
used to be.
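To make point 1 concrete, here's a minimal sketch of the server-rendered style being described (the names and markup are hypothetical, and `string.Template` stands in for a real template engine): HTML is built on the server, and interactivity is layered on as a small JS "sprinkle" rather than a full front-end framework.

```python
# Toy stand-in for the Django/Rails style: the server renders full HTML
# pages; a JS "sprinkle" would add behaviour on top.
from string import Template

PAGE = Template("""\
<ul id="todos">
$items
</ul>
<!-- a Stimulus-style sprinkle could attach behaviour here, e.g.
     <button data-action="click->todo#toggle">done</button> -->
""")

def render_todos(todos):
    # Each request returns a complete page; updating means a page refresh,
    # which is exactly the trade-off weighed above against an SPA.
    items = "\n".join(f"  <li>{t}</li>" for t in todos)
    return PAGE.substitute(items=items)

print(render_todos(["buy milk", "ship release"]))
```

The whole "framework" is a template and a function; the cost you skip is the build tooling and client-side state management of an SPA.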

Curious to hear what others think though.

1- As of writing there are 425 jobs that mention Java
([https://stackoverflow.com/jobs?q=java](https://stackoverflow.com/jobs?q=java)).
Surprisingly there are only 341 that mention React
([https://stackoverflow.com/jobs?q=react](https://stackoverflow.com/jobs?q=react)).

~~~
twunde
2.5 I want tech to be either mature enough or beneficial enough that I'm
willing to deal with all the bugs AND specifically the very real possibility
that the application may need to be rewritten to replace a bad tech bet. A
good example was that around 2011 MongoDB was really taking off. HN routinely
had articles about migrating to Mongo and how it was the new hotness. It turns
out that MongoDB had a number of problems including dropping data regularly :/

Those of us who had to deal with debugging issues like that, and potentially
porting away from Mongo, are a bit gun-shy about adopting new databases,
especially without the ability to call someone up and chat about what issues
they encountered after running the tech on an application supporting a million
users.

~~~
twunde
Ah, and of course I found the essay I wanted to reference when I wrote this:
[https://mcfunley.com/choose-boring-technology](https://mcfunley.com/choose-boring-technology)

When you're new everything is new so you're willing to try a bunch of new
tech. You haven't seen people get burned (repeatedly) on using a whole bunch
of new tools at the same time.

------
muzani
A lot of new tech, especially frameworks, isn't actually much better. It's
just cleaner and neater.

Proponents argue that this means fewer bugs and less time to transfer
information to another developer, but the overall cost to learn and implement
exceeds the savings. These frameworks also tend to be less flexible, which
increases the costs further.

Older devs are much less likely to make mistakes, whether it's Java or SQL,
and the new technologies designed to reduce mistakes by tripling code size are
less appealing.

It's different from, say, C vs. assembly, or Java vs. C, where there were
substantial gains.

------
cjbprime
I guess I'll throw myself out as a hopefully counter-exampling data point: got
a CS degree 18 years ago, concentrated on C, kernels, systems for ten years,
now writing mostly React/React Native frontends and web backends. The tools
are amazing. I don't miss having to compile and reboot between every attempt
at a kernel change.

~~~
t-writescode
Isn’t React pretty old now, as far as web technologies are concerned? Like,
it’s established, battle tested and has a relatively low bug count?

~~~
ironmagma
Sure, but it’s still one of the “cool” frameworks and thus something many
claim is new enough to count as a fad.

------
dragonwriter
There's certainly a correlation between age and several factors that
contribute to the absence of neophilia, notably (but not exclusively):

(1) expertise/investment in existing tech platforms: why get into something
where you'll be one of many with 1 year of experience when you're already on
a platform where you've got 20?

(2) having seen hyped products fizzle out quickly. Why get involved in
something whose industrial demand may never materialize, when you're already
invested in an established platform that - even if it were abandoned for all
new projects tomorrow - will have demand for expertise, driven by legacy
software, for longer than the remainder of your working career?

(3) Fatigue with the whole cycle of learning the basics of yet-another-stack
rather than deepening knowledge of established stacks, application domain,
engineering/analytical/process mastery that isn't tech stack specific, etc.

I'm a 40+ dev who _loves_ learning new things including tech stacks, but I
absolutely don't view "learning new tech stacks" (especially _really_ new ones
rather than established but new-to-me ones) as anywhere close to the most
career-useful use of time. It's a hobby that sometimes has nice career side-
benefits.

OTOH, for a new dev some of this isn't just missing but positively inverted: a
new tech stack is an opportunity - if it takes off, then while you'll still
have to compete with people with more _general_ experience, you won't have to
compete with people with a decade or more of _stack-specific_ experience,
whereas with an established stack there is no way to avoid that.

------
kristianp
On a related note, I was surprised to see a recent thread on HN about
recommending a mobile development platform. Most of the recommendations were
for Flutter, which I would expect is not mature enough to be a solid platform
yet. A couple of years ago it might have been react native.

The HN crowd is the other extreme: being mostly young, they are mostly only
exposed to what has been promoted recently.

~~~
Zenst
Most mobile dev work is done by individuals or small groups, and each app is
itself a short-lived product; those are the key aspects, as those factors tip
the cutting-edge gains/risk balance towards gains. The exposure is limited to
the individual app, and smaller, more nimble teams can absorb such risks.

But for publicly listed companies, such things tend to be less so, and even at
tech companies the only new tech they tend to embrace is tech they developed
themselves.

So there are avenues to learn the latest and greatest, but always keep that
risk balanced with gains.

Now even in large businesses, new tech can and does get looked at, but even
then it will be cases of new projects and areas in which there is no
alternative tech. So there will always be niches - take FPGAs and share
trading at banks.

But we live in a world that still has COBOL, C, and lesser ancients that are
still older than many of the coders who use them - like Java. Not everything
changes overnight.

I remember the early days of Java: lots of people jumped on it, it had
pedigree, and it came at a time when Sun ruled the internet roost. I recall
seeing many a fine demo, then checking and finding it ran on cutting-edge
hardware and still was nowhere near fluid - let alone on the rolled-out
desktops. So yes, cutting-edge software can often demand cutting-edge hardware
just to get the same results. Things improved in software (Java performance)
and hardware, and that fine balance came about.

Computer languages are brands in many ways, that level of trust and
understanding takes time to come about.

[EDIT ADD] Sorry, I went off on a tangent, but another important aspect is
tooling - that is one area in which you may find cutting-edge or less legacy
tech come into play, though there will always be a robust content-delivery and
auditing system in place, and that will be a wall for many things.

------
flukus
If anything I'd turn the question around and ask if younger devs were more
resistant to adopting older technology.

It certainly seems that way when tools like Electron are popular because they
allow devs to stick with the JavaScript/HTML they already know rather than
learning things like native desktop APIs, or NoSQL, where they can just insert
JSON objects rather than learn SQL. It seems rare to meet devs under 30 who
learned tried-and-proven languages like C, C++, Perl, etc., or who know how to
use a debugger outside of an IDE.
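As a small illustration of the comparison being drawn (a hedged sketch using only Python's stdlib `sqlite3` - the table and values are made up): storing a record through plain SQL is only a few lines longer than dumping a JSON object, and it buys you a queryable schema.

```python
import json
import sqlite3

# Relational style: declare a schema, insert with a parameterised query.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, age INTEGER)")
conn.execute("INSERT INTO users VALUES (?, ?)", ("ada", 36))

# Document style: the whole object goes in as one JSON blob.
doc = json.dumps({"name": "ada", "age": 36})

# With a schema, ad-hoc queries come for free.
row = conn.execute("SELECT age FROM users WHERE name = ?", ("ada",)).fetchone()
print(row[0])  # → 36
```

The JSON one-liner is undeniably quicker to write; the SQL version is what pays off once you need to query, index, or constrain the data.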

------
adventured
I suppose I'm an older dev now, approaching 40. I've been building things
online for 20+ years. When I was in my early 20s, I used to wonder about the
nature of the premise: whether it was accurate, why it would happen, and so
on.

What I've observed in myself and others is that older people become
dramatically wiser about wasted motion, effort, time. They acquire a greater
respect for the value of energy and time. Not least of all because at 40 most
things are more taxing physically than at 20. Your brain will tell you flat
out: no, I don't want to burn my valuable time on that, I don't _need_ it; I
want to do this other more important thing with my time instead and I have to
choose one or the other (having an experienced scope or overview of life and
knowing that time investments are always choosing between competing things; as
you get older you more fully realize how quickly life rolls by and that you
will not get to do everything, so these choices become quite important). By 50
it's generally going to be true that you have more good days behind you than
ahead of you; time becomes very important.

I think older devs seek to avoid unnecessarily reinventing wheels whenever
they can. They seek to avoid wasting time learning _this_ technology when
_that_ technology they're already a master of will do just as well (and
faster, they already know it). And as others have mentioned, they will
inevitably also have been burned numerous times on watching tech come and go
and having sunk time and effort into learning a thing that vanishes. You can't
afford time hits as trivially at 40 as you (seemingly) could at 20. When you
burn / waste a year at 20, it doesn't feel nearly as detrimental as if you do
it at 40.

------
throwawayblahhh
I have seen too much come and go. I've mostly settled into Java on the back
end and react front end. They have been around for a long time, and will
continue to be. You can take a project many years old and easily modernize it.

I think all of us in the industry for a while have worked on "dead end"
projects where many things were just unfixable because core libraries,
languages, and frameworks are simply obsolete. These are, by far, the worst
projects to work on.

Java is the perfect counter example. It's kinda verbose and has some annoying
shortcomings, but all the tooling is top notch. It's one of the fastest
languages out there and has the best debugging/build/monitoring/IDE tools of
any language I've used. And it's rock solid stable, we have plenty of servers
that run some hacky old stuff for years straight without a restart.

And the best part about Java is that I can find a library to do anything. I
have never once run into a technical roadblock in Java because I needed
something unusual.

Most of these attributes are lacking in new languages. Rust has crappy IDE
tools and debuggability, no good ORM, not many supported target architectures,
no run time code generation. Golang has terrible package management, no
generics, and a basic garbage collector that's fast except when it isn't.
Node/JS threading support is a joke, the standard library is so small that
most projects pull in over 100 megabytes of dependencies, and it's still a
slow memory hog compared to Java/Go/Rust/C#.

Java doesn't have anything "terrible" about it compared to the glaring
shortcomings in these newer languages. There's just been enough decades of use
that everything has been smoothed over until it's at least decent.

I bounced between Perl, PHP, C# in my early days and I'm sticking with Java
now. It's the easiest to work with. I'll pick up a new language when it's as
good as Java, so maybe Go or Rust in another 5 years.

~~~
jason0597
What do you think about Kotlin?

------
epc
Personally it's a combination of wariness and weariness.

Wary of getting onto the new–new hotness bandwagon just because it’s the
new–new thing, and weary of being pushed into a new–new thing because it’s the
new–new thing and not necessarily the best solution for the problem at hand.

Increasing my productivity is irrelevant if it decreases usability or
performance of whatever it is I’m building. If it isn’t serviceable, or
maintainable, I won’t consider it. If it’s a proprietary solution that depends
on the company surviving, I won’t consider it.

For me it’s a tradeoff of risks. Will the benefit of rewriting everything,
again, in this new language and framework pay off in reduced operational or
service costs or improved performance? If it won’t, then investing money and
time chasing after the new–new thing just seems to be a waste.

------
fortran77
No! I switched from Fortran IV to Fortran 77, and never looked back.

------
saxonww
I'd say I am. It's a constant fight - use the tool that I know and move on,
even if it requires some gymnastics to get it working, vs. evaluate something
new that might or might not cover all the necessary use cases, and might or
might not require gymnastics of its own.

You just have to take things on a case by case basis, and especially
understand that you have to think about your whole team, and also recruiting
new people.

------
Tainnor
As someone who has been in the business for more than 5 years but not nearly
long enough to be one of the "veterans", I find myself often between the two
camps.

On the one hand, the hype cycle is annoying. First, Node and NoSQL were
supposed to solve all problems; now it's GraphQL, or Go, or whatever. All the
while, perfectly reasonable tech like SQL or REST is almost never considered
anymore, even in cases where it is more suitable.

OTOH, some of the "older" people are the same when it comes to newer
technologies. Things can't be the same forever, and not every change is a
temporary fad (the internet wasn't, nor were mobile phones), and the same
people who now refuse to look at things like Kotlin or Scala should
realise that Java was new at some point too, and plenty of languages that came
before it have now all but died out.

In short, people should come up with actual arguments for/against some tech
instead of going purely by novelty (whether as a positive or a negative
aspect).

------
ncmncm
The fact is that the overwhelming majority of new techs disappear in just a
few years. Look up the "lava flow" antipattern for the effect of adopting each
new thing.

The more money spent hyping a thing, the sooner it vanishes. About the only
exception is Java, which was hyped from here to Saturn. Everyone jumped on it,
at the time, to get out from under Microsoft and other lock-ins.

Proprietary lock-in is the worst thing ever.

------
redis_mlc
Different is not better.

When you understand that, you will be an expert.

~~~
ThalesX
Perhaps “Different is not always better, but better is always different” is a
bit more realistic?

------
bobbytherobot
Assuming an older dev is also at a higher level, they are likely to be dealing
with many non-technical problems like project management and politics. As
such, their big problems to solve are less the tech itself and more all the
other issues around the tech.

------
Jemm
Software and UI suffer the same problems as fashion.

Younger people see existing solutions as OLD and gravitate towards HOT new
solutions. This creates change for change's sake, at the cost of efficiency
and usability.

Consumers do not like change unless it brings new functionality or improves
usability.

As an older programmer and ops person, I see the trends and cringe at the loss
of efficiency. Software running on resources that dwarf what we had in the 90s
is delivering worse perceived performance.

------
mbrodersen
I am 50 and have been programming since I was 11. I learn new tech all the
time. Just enough to find out if it is worth learning deeply or whether it is
just another flavour of something I already know. 95% of the time it _is_ just
another flavour of what I already know so I can safely leave it alone until I
need it (again 95% of the time I won't ever need it).

------
shoo
C.f. [http://boringtechnology.club/](http://boringtechnology.club/)

------
rurban
Not more resistant to new tech at all: they love tech, and they certainly look
very deeply at new self-proclaimed breakthrough tech.

But they are probably more resistant to the current hype train, and they
experienced the various pitfalls of hype-driven marketing. "Fake it till you
make it".

E.g., as with the current safety hype around Rust: they probably know more
about the three fundamental safeties Rust proclaims to support (note: it does
not), or that Java proclaimed memory safety for a long time and only last year
really started to remove its memory-unsafe APIs.

The maturity argument is relevant to managers only, not technicians. Older
technicians love improving approaches to build and maintain software, be it
more abstract, more elegant, shorter or just better. Older technicians love
the new CI/Deploy pipelines, simpler OO or FP paradigms, better garbage
collectors, better type systems, better concurrency approaches.

And maturity does not say that much. POSIX is pretty mature, I would say, but
has completely broken string (Unicode) and concurrency support. Blocking IO
does not help either. GCC and glibc are still a horrible mess;
Linux/Windows/FreeBSD too.

------
lonelappde
The older you are, the less value you can extract from learning new things,
and the more easily you can extract value from what you already know. This
explains the endless debate between young people and old people who bicker on
forums instead of building.

------
Zenst
Having spent from teens till 50's in tech I offer my insight.

When I started out, I was all trained up in cutting-edge tech, only to walk
into a job that did not use it. I learned why.

1) Bleeding-edge tech is just that: something will bleed. 2) Skill debt. 3)
Standards.

Those are the main three.

1) Bleeding edge - all software has bugs, and anything new needs time to
mature. When you're running a business you want your nth 9s of uptime,
stability, robustness... something that just works and does the job at hand -
everything that more mature, tried-and-trusted tech offers. Sure, a new thing
may be that rarity that just works from day one, but the odds are terrible;
most often it will not.

2) New skills will be needed, and paradigm shifts just don't happen overnight;
they're costly, and transitions are gradual, almost glacial. You also have
legacy stuff that needs support. I've worked at companies doing paradigm
shifts that ended up leaving existing employees stuck supporting the old while
new staff were brought in to play with the new toys. This saw the business
knowledge in the old team ignored, and also saw many leave, as the new team
was, well, arrogant. In the end, the old system lost traction and the new
system never worked as well as the old one, due to lots of bleeding-edge
gotchas you would never expect - like many bugs. But that was in the days of
4GLs in the late 80s. We still have COBOL today for many reasons - it just
works, does the job at hand, and has proven to do that for a long time. "Trust
is earned, never given away" - a Klingon proverb that is true for IT in the
business world.

3) Standards - this ties in with the previous two, but is equally important,
as you need to have standards, and anything new takes time to curate them. In
coding and operation, these nuances take time to perfect, and they matter if
you want to start out right with any tech a business depends upon. While
operational and some best practices will be common to all, each business will
have its own nuances, and that plays out in standards - be it naming
conventions or otherwise. I have seen odd standards that harked back to a
legacy code/system, done to keep things logical across those systems - mostly
in naming standards, data formats, many things that all add up.

But yes, it's nice to play with bleeding-edge toys; for many businesses,
though, that will not be what they want. They prefer toys that have seen every
possible accident happen, have all safety revisions in place, and are now
stable without updates every other day.

But for some businesses, being able to run with the cutting edge is their
business, or it's the case of a new project in which the risk is more easily
offset by the gains - and everybody has their own segue...

------
badrabbit
With age comes wisdom - wisdom that comes from lessons learned. Sometimes,
though, people learn the wrong lesson; younger people take more risks with
their choices and can correct the lessons wrongly accepted by older folk.

It's almost as if being older is more like playing chess, being younger is
more like playing checkers?

I don't know where I fit on this, but I don't really care about the new
language or tech if it doesn't help me reach my goals. For example, I am
perfectly fine doing everything in Python for most things, unless a different
language gives me enough of an advantage to reach my goals. A few years ago
I'd hammer things out in C, or use every other side project to try and learn
some language. I'd be happy now to do something in Go, for example, provided I
can guarantee someone other than myself will maintain it and I need really
good concurrency and network code.

------
PaulHoule
I have gone through some major change in the programming tools and methods I
use every two or three years, largely driven by the needs of the market.

I wrote highly complex Rich Internet Applications long before most people, but
I kept away from Javascript for the last few years mostly because I heard
other people complaining about the instability.

React work has come my way and I am having fun with it. Right now it seems to
me that the tools are where they should be.

I am all for developer convenience, but I am also for performance and the fact
is that the subjective performance of computers today isn't consistently
better than computers 40 years ago. The "Internet of things" won't really
happen unless we get cheap chips with high real-world performance with an
absolute minimum of bloat.

~~~
Zenst
>subjective performance of computers today isn't consistently better than
computers 40 years ago

I totally get that point, and a good way to illustrate it would be: boot up a
C64 and write a "HELLO WORLD" program, then do the same with a modern PC. A
crude comparison, but it makes the point.

