
Reflections of an “Old” Programmer - speckz
http://www.bennorthrop.com/Essays/2016/reflections-of-an-old-programmer.php
======
delinka
I'm a bit older than the author. Every time I feel like I'm "out of touch"
with the hip new thing, I take a weekend to look into it. I tend to discover
that the core principles are the same: this time someone has added another C
to MVC; or they put their spin on an API for doing X; or you can tell they
didn't learn from the previous solution and this new one misses the mark, but
it'll be three years before anyone notices (because those with experience
probably aren't touching it yet, and those without experience will discover
the shortcomings in time).

When learning something new, I find that this group implemented $NEW_THING in
a completely different way than that group did with the exact same
$NEW_THING. I have a harder time understanding how the
project is organized than I do grokking $NEW_THING. And when I ask "why not
$THAT_THING instead?" I get blank stares, and amazement that someone solved
the problem a decade ago.

Sure, I've seen a few paradigm shifts, but I don't think I've seen anything
Truly New in Quite Some Time. Lots of NIH; lots of not knowing the existing
landscape.

All that said, I hope we find tools that work for people. Remix however you
need to for your own edification. Share your work. Let others contribute.
Maybe one day we'll stumble on some Holy Grail that helps us understand
sooner, be more productive, and generally make the world a better place.

But nothing's gonna leave me behind until I'm dead.

~~~
andyjohnson0
Late forties developer here.

> Every time I feel like I'm "out of touch" with the hip new thing, I take a
> weekend to look into it.

There's my problem right away. I can't just "take a weekend" to learn some new
shiny thing. I have a partner and children who I want to be with at the
weekend. And I'd rather go climbing or hiking, or even just go out on my
bike, than learn another damn API. Twenty years ago I had evenings and
weekends to
burn. Now I don't.

~~~
toxik
I'm 26, and modulo the kids, this is how I feel as well. I've been programming
professionally for almost ten years, and I feel less and less pressure to keep
up with the latest packages on NPM or whatever the flavor of the month is.
It's much more interesting to know what has been tried before, and why that
didn't stick. Like this newLISP thing, why should we suddenly start writing
Lisp? The language is older than C, for god's sake.

~~~
progman
> why should we suddenly start writing Lisp? The language is older than C, for
> god's sake.

If so many "modern" languages still copy features from Lisp (hello C++) then
why not use the real thing?

All those nice "new" features which Python and C++ are praised for (lambdas,
closures, list comprehensions) have been available for almost 50 years. Lisp
was way ahead of its time. It just lacked the hardware power which we have
today. It is still ahead of our time. Consider Lisp macros, Genera and McCLIM.
McCLIM is an interactive GUI toolkit which has been neglected for decades.
It is being revived right now to make it available for modern Lisp
distributions. Modern Lisp just lacks one thing to be the real deal: a native
Lisp Machine.
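
Those features map almost one-to-one onto what newer languages advertise
today. A quick sketch in Python (chosen here only for readability; the
variable names are mine, for illustration):

```python
# Features often credited to "modern" languages, present in Lisp decades ago.

# Anonymous functions: Lisp's LAMBDA dates to the late 1950s.
square = lambda x: x * x

# Lexical closures: worked out fully in Scheme in the mid-1970s.
def make_counter():
    count = 0
    def increment():
        nonlocal count  # capture and mutate the enclosing binding
        count += 1
        return count
    return increment

# List comprehensions: the map/filter style Lisp used from the start
# (MAPCAR, REMOVE-IF-NOT).
even_squares = [square(x) for x in range(5) if x % 2 == 0]
```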

[https://common-lisp.net/project/mcclim/](https://common-lisp.net/project/mcclim/)

~~~
jerf
"If so many "modern" languages still copy features from Lisp (hello C++) then
why not use the real thing?"

Because if it hasn't "succeeded", for suitable definitions of "succeeded", in
50 years, then as an old fart approaching 40 myself, my assumption is that
there is a good reason. I was much more willing to believe the "everybody else is just"
stupid and can't see the obvious brilliance" reason when I was younger, but as
I've learned more, I've learned just how many factors go into a language being
successful. Lisp may have all the features, but there are also good reasons
why it has never really quite taken off.

That said, it is also true that using languages that bodge Lisp-like features
on to the side after several decades of usage is generally inferior to
languages that start with them. It's one of the reasons I'm still a big
proponent of people making new languages even though we've got a lot of good
ones. (I just want them to be _new_ languages somehow; a lot of people just
reskin an existing language with a slightly different syntax and then wonder
why nobody uses it.)

But they've all been listed before, and I don't expect me listing them here
will change anything, so I'll skip it.

(I did a quick check on Google Trends; Clojure seems to not be growing. It's
not shrinking, but it's not growing. That was Lisp's best chance for growth
I've seen in a long time, and that window has probably now closed.)

~~~
progman
The reason why Lisp is not mainstream is its power. It makes writing DSLs
extremely easy, so everyone can write their own DSL to solve a certain
problem. Such a style of writing, however, makes Lisp code unsuitable for
group work, and hence unmaintainable. It explains (imho) why there are so
many unmaintained Lisp projects.

Clojure has a different problem. It is based on the JVM infrastructure, which
was (imho) an unfortunate decision. Access to Java features is nice but
dealing with Java stack traces is not fun. Also, the usual startup time of
Clojure apps, in the range of seconds, is not acceptable (Tcl/Tk apps start
in a fraction of a second). AFAIK Clojure is also not suitable for mobile app
development. The Clojure developers should have used their own VM, or they
should have provided a Clojure/Lisp compiler for native compilation. LuaJIT
has demonstrated how incredibly fast a suitable VM can be.

~~~
optionalparens
I have felt some of the pain and gripes you have with Clojure, and while I am
not denying that some of them are very real, I am not sure you understand the
mentality and philosophy behind Clojure, at least as most of us who use it
seem to have understood it. Put succinctly, it is to balance practicality
with features to get real work done.

Clojure is quite wisely based on the JVM because the idea was not to create a
Lisp replacement or a new Lisp, but rather a practical Lisp. Some of the
historical problems with Lisp and many other languages like Smalltalk were
related to specialized hardware, development environments, and/or ecosystems.
Clojure did away with most of these concerns by attaching itself to one of the
largest existing environments.

Using the JVM was a wise decision that has/had many advantages not limited to:

\- Ability to leverage an already huge ecosystem of libraries, tools, servers,
etc. that are well-tested

\- Already battle-tested and working package system - this is severely
underestimated by some popular languages

\- Justifying its existence as a tool alongside other JVM languages within an
organization, not a full replacement

\- Existing, highly optimized JIT

\- Reasonably fast, nearly for free

Originally there were some plans to expand more to other environments, for
example the CLR, but the JVM got the most attention and in the end this was
practical.

As for some of the real drawbacks:

\- Clojure alienates Lisp zealots and people who could never grasp Lisp. IMO,
this is a stupid-person/psycho filter, so I don't see it as a drawback, but
it's worth noting.

\- Startup time, as you note. It sucks, but it has gotten better, and the
workarounds are the same ones you could traditionally use for Java apps. The
mobile dev issue you note is somewhat wrong; without going into a huge
explanation, one option is to more or less enjoy a huge part of Clojure via
ClojureScript, in other words targeting JavaScript and using things like
React Native.

\- Garbage collection. This is an issue for a lot of things on the JVM and
Clojure is no exception. You can work around it somewhat with specific coding
practices if you need to, but yeah, Clojure isn't going to be good if you
can't fathom an eventual garbage collection cycle.

\- No TCO, which is mainly a JVM issue if I remember right. This would be
great, but it's a tradeoff and Clojure has negotiated this somewhat by
providing what I feel is more readable code than when I worked in certain
Lisps.

\- Some legacy baggage in the standard lib, for example clojure.zip. Recently
they've been more brutal about what can and cannot go into the standard lib.
Every language suffers from this a bit, though.

Regarding developing its own VM, I think you again miss the point about
practicality. If Clojure did this, it would have been even more years before
its release. Moreover, comparing it to Lua is a bad example, as Lua is a very
different language (yes, I used Lua professionally). Lua achieves a lot by
really keeping things simple, and while there is merit to that, Lua leaves a
ton to be desired, which I won't get off-topic about here.

So Clojure could work better in its own VM, but then you'd lose the JVM
ecosystem along with many other things. I personally would rather have the
ability from the beginning to reach for a huge number of libraries than have
nothing but what the language's authors provide, or crude things like calling
back into C. There are many talks about all of this, many from Rich Hickey
himself. I think you really missed the point of Clojure. I am more of the
mindset that I am glad it exists and is not in a constant state of flux, so
that I can use it today, get things done, and not have it relegated to some
research language I could never justify in a workplace. And no, I am not a
Clojure zealot; I use about a dozen languages in any given year depending on
my projects and interests. There's a lot I prefer in Lisp over Clojure, but I
see Clojure as taking some lessons from various Lisps rather than trying to
be the one true Lisp.

~~~
progman
I understand why Clojure deliberately focuses on the Java ecosystem. If I
were still in Java development today, I would likely use Clojure. If I were
in web development, I would likely use ClojureScript. Currently I prefer the
Emacs/SLIME/SBCL toolbox, which is more responsive (and Nim, by the way).

My humble two cents to the Clojure team:

1) You should implement a cache mechanism ("save-image") for native code so
that at least the annoying startup time of Clojure apps is gone. I wonder why
Java doesn't support native caches to this day.

2) The weird Java stack trace problem could be solved by providing a
dedicated stack tracer which stays close to the source code. I know that this
is not possible for Java libraries, but at least Clojure stack traces should
be presented in a more convenient manner.

------
carsongross
I get where the guy is coming from, I'm right there as an old guy.

On the other hand, I think there is a bit too much fatalism in the article.
Sometimes the kids are being stupid, and they need to be told so.

The vast majority of web apps could be built in 1/10th the code with
server-side rendering and intercooler.js. All this client-side crap is wasted
effort when you are trying to get text from computer A into data-store B and
back again. It's the front-end equivalent of the J2EE debacle, but with
better logos and more attractive people.

And people are starting to wake up[1][2]. It's up to us old guys to show the
way back to the original simplicity of the web, incorporating the good ideas
that have shown up along the way _as well as_ all the good ideas[3] that have
been forgotten. Yes, we'll be called dinosaurs, out of touch, and worse.

Well so what? We're 40 now. And one of the great, shocking at first, but
great, things about that age is you begin to really, truly stop giving a fuck
what other people think.

Besides, what else are we going to do?

[1] - [https://medium.com/@ericclemmons/javascript-fatigue-48d4011b...](https://medium.com/@ericclemmons/javascript-fatigue-48d4011b6fc4#.uifr5lhm8)

[2] - [https://hackernoon.com/how-it-feels-to-learn-javascript-in-2...](https://hackernoon.com/how-it-feels-to-learn-javascript-in-2016-d3a717dd577f#.rmamjx8hw)

[3] - [http://intercoolerjs.org/2016/01/18/rescuing-rest.html](http://intercoolerjs.org/2016/01/18/rescuing-rest.html)

~~~
rlander
I can vouch for Intercooler. We're rewriting large parts of our app (it used
to be a complex Flux beast) and it is now way more maintainable and indeed
around 1/10th the code. We now keep most of our app state on the server,
instead of spread across client and server.

Of course, it is not the be-all and end-all: it solves simple interface
problems, those that shouldn't require 200MB of JS dependencies to solve.
Once the interface gets complex enough, you should use JS. We've still got a
couple of JS components, though.

~~~
ehnto
Excellent. Intercooler is simple enough that if it goes awry I will just write
my own, but if I don't have to then great.

But to further your point, Intercooler is just a tool rather than an
ideological shift in how we build web applications.

The reason I see the whole front end JS infrastructure mess as unintuitive is
precisely because my needs are served with back end code and a sprinkle of
Ajax.

If I were building a complex SPA, like an in-browser Photoshop, I might see
more use in the complex ecosystem and try to tackle it. But, from a
not-so-outside view, it still looks like a mess.

------
bsenftner
It is plain and simple, kids. I'm 52 and have been programming professionally
since the '70s, when I started writing C code and getting paid for it in 5th
grade. Our profession is writing glue code, and how it is done and what hoops
are jumped through simply do not matter: all that matters is that the final
shipping product, widget, or logical doodad works for the immediate marketing
moment.

I speak from enviable experience: game studio owner at 17, member of the
original 3D graphics research community during the '80s, operating system
team for the 3DO and original PlayStation, team member or lead on 36+
entertainment software titles (games), digital artist, developer & analyst
for 9 VFX-heavy major-release feature films, developer/owner of the
neural-net-driven 3D Avatar Store, and currently working in machine
intelligence and facial recognition.

Our profession is purposefully amateur night every single day, as practically
no one does their computational homework to know the landscape of
computational solutions to whatever they are trying to solve today. Calling
us "computer scientists" is a painful joke. "Code Monkeys" is much more
accurate. The profession is building stuff, and that stuff is disposable crap
99% of the time. That does not make it any less valuable, but it does make
the mental attitude of 90% of our profession quite ridiculous.

Drop the attitude, write code freely knowing that it is disposable crap, and
write tons of it. You'll get lazy, and before you know it, you'll have boiled
down whatever logic you write into a nice compact swiss army knife.

And the best part? Because you've stepped off the hype train, you'll have
more confidence and you'll land that job anyway. If they insist or require
you to learn and know some new framework: so what? You're getting paid to do
the same simple crap over again, just more slowly, with their required
doodad. Get paid. Go home and do what you enjoy. This is all a huge joke
anyway.

~~~
nambit
You're totally right, but then people are faced with an increasing number of
code monkeys graduating from college every year.

Then the question becomes: how do you keep your job year after year when the
number of code monkeys just keeps increasing? Some of them are shitty, but a
lot aren't.

~~~
abledon
It seems programmers are one of the most in-demand types of worker right now.
I understand the fear of being pressured out of a job... but what about every
other sector in the economy?

I think before our profession's sector becomes threatened, a whole lot of
other sectors are going to blow up in a mushroom cloud of
automation/obsolescence (caused perhaps by us!) and force the economy to
rethink jobs/food/shelter/basic necessities for everyone.

------
keithnz
I started programming when I was 7, I'm 45 next month :)

The one thing in the programming world that is almost 100% applicable to
almost every article like this (and many other topics) is... it depends.

I'm fortunate in that for almost all of my career I have spanned many
technologies, from embedded systems to the latest crazes on the web. Mostly
what becomes redundant is language syntax and framework. If your programming
career is largely centered around these, then you become redundant pretty
quickly (or super valuable, when critical systems are built with them and
then need maintenance forever).

Frameworks come and go, so if you spend a lot of time creating solutions that
shuffle data from a DB to a screen and then shuffle data back into a DB, then
a majority of your programming skills will become redundant relatively
quickly (maybe a half-life of 4 years?). But often when you are doing this,
the real skill is translating what people want into software solutions, which
is a timeless skill that has to be built over a number of projects.

If you work in highly algorithmic areas, then not a lot of your skills become
redundant, though you may find libraries evolve that solve problems you once
had to do painfully from scratch. That deep knowledge is still important.

Design: the more complex a system is to engineer (beyond what is provided to
you by a framework), the more likely you will have skills that won't become
redundant. Design knowledge is semi-timeless. My books on CGI programming
from circa the mid-nineties are next to useless, but my GoF Design Patterns
book is still full of knowledge that anyone should know. OOSC by Bertrand
Meyer is still full of relevant good ideas. My books on functional
programming from the 80s are great. The Actor model, which has its history in
the 70s, is getting appreciated by the cool kids using Elixir/Erlang.

Skills in debugging are often timeless; I'm not sure there's any technique I
wouldn't use anymore (though putting logic probes on all the data and address
lines of a CPU to find that the CPU has a bug in its interrupt handling is
not often needed now).

~~~
pjc50
_spend a lot of time creating solutions that shuffle data from a DB to a
screen then shuffle data back into a DB.... then a majority of your
programming skills will become redundant relatively quickly_

This is kind of astonishing, isn't it? When this kind of "CRUD" data
bureaucracy has been going on for decades. There's no fully general solution
yet? We're doomed to keep reinventing it regularly?

Debugging is really one of the core skills of programming that should be
explicitly taught.

~~~
crdoconnor
The tech industry has certain weird persistent prejudices. One of them is a
prejudicial attitude to "CRUD" despite its importance and relative complexity
compared to how it is perceived. Another is a fetish for unnecessary code
optimization and scalability.

The obsession for newness is another one, obviously.

------
thesmallestcat
Hm. The author works for a web/mobile development agency and uses React
Native and GWT as examples of the new and the old, respectively. I hope it
isn't news to anybody here that this sort of work is a race to the bottom and
has such turnover precisely because it's mostly being done by junior
developers. Linux systems programming arcana, for instance, doesn't
disintegrate as quickly as the ten years the author cites. That's why, after
getting into the industry as a frontend web dev, I will only do that sort of
work now as a last resort to pay the bills (the other reason is that it's
easy/boring as hell, apart from the greater opportunity for mentoring). Doing
that sort of work now feels like I am sabotaging my career.

~~~
Zyst
Twentysomething here, so you know, it's not like you don't have a point, but:

>the other reason is because it's easy/boring as hell

I legitimately think front-end development is not only very fun, it can have
some really challenging aspects. I wouldn't think there's any programming
challenge that is inherently easier because it's on the web as opposed to
something else.

Of course, I bet there are domain-specific tasks (distributed programming and
embedded hardware, to list some) that are likely harder than web development.
But I guess what I'm trying to say is: I don't appreciate you calling what I
make a living off of, and spend quite a few hours studying every week,
'easy/boring as hell'.

~~~
EpicEng
Well, sorry, but... it is. Maybe you find it interesting; that's subjective.
But there's no real technical challenge in front-end stuff. You're not
solving hard engineering problems; you're pasting together libraries other
people wrote on top of libraries other people wrote (and on it goes) and
searching Google to figure out why your opaque stack doesn't seem to be
working.

Developing a good UI is difficult, no question about it, but not for technical
reasons. Whether you resent that or not doesn't make it any less true.

>I wouldn't think there's any programming challenge that is inherently easier
because it's on web as opposed to something else.

Not because it's on the web, but because front-end work doesn't require
anything more than knowledge of your toolset and some design sense.

It's nearly all "hey, build a UI with some CRUD functionality which is
essentially the same as the 100 you've built before, but for this special
snowflake customer." Bleh.

~~~
GuiA
Honestly when I see what my friends who design airplanes and particle
accelerators are doing, I feel like we're all kind of fucking around in
software engineering, frontend or not.

~~~
nickpsecurity
Try looking into or messing with _actual engineering_ of software instead.
You'll get similarly amazing things. Here's a few:

[http://www.anthonyhall.org/c_by_c_secure_system.pdf](http://www.anthonyhall.org/c_by_c_secure_system.pdf)

[http://www.methode-b.com/wp-content/uploads/sites/7/dl/thier...](http://www.methode-b.com/wp-content/uploads/sites/7/dl/thierry_lecomte/Formal_methods_in_safety_critical_railway_systems.pdf)

[https://ts.data61.csiro.au/publications/nictaabstracts/7371....](https://ts.data61.csiro.au/publications/nictaabstracts/7371.pdf)

[http://ceur-ws.org/Vol-192/paper08.pdf](http://ceur-ws.org/Vol-192/paper08.pdf)

------
mafribe
If somebody had found himself in Edinburgh in 1986 and bumped into a tall
gentleman called Robin, who was a bit familiar with these new-fangled things
called computers, and had asked Robin what kind of programming language one
should learn to use these computer thingies, what would Robin have said? Not
sure, but maybe something along the lines of: "Well... there are many
interesting languages, and different languages are suitable for different
purposes. But if you are interested, I'm dabbling in programming language
design myself. Together with my students I've been developing a language that
we call ML; maybe you'll find it interesting. With my young colleagues Mads
and Robert, I'm writing a little book on ML. Do you want to have a look at
the draft?"

Maybe such a person would have chosen to learn ML as first programming
language. If this person had then gone on to work in programming for 3
decades, and if you'd asked this person 30 years later, i.e. today, what's new
in programming languages since ML, what might have been his answer?

Maybe something along the lines of: "To a good first approximation, there are
three core novelties in _mainstream sequential_ languages that are not in ML:

\- Higher-kinded types (Scala, Haskell).

\- Monadic control of effects (Haskell).

\- Affine types for unique ownership (Rust)."

Could I be that somebody?

~~~
catnaroek
Haskell isn't quite “mainstream”, so I'm taking the liberty to add innovations
from other “not quite mainstream” languages:

\- Hygienic macros as a scalable tool for extending and redefining languages,
and furthermore, making the extensions interoperable with each other (Racket).

\- Language support for building reliable massively distributed systems in
spite of individual node failures (Erlang).

\- General-purpose programming with growable arrays, hash tables and no other
data structures (okay, these ones are _very_ mainstream).

~~~
mafribe
Good points.

ML originally had Lisp-like macros; I'm not sure about hygiene. Note also
that one doesn't always want hygiene in meta-programming, although it is nice
to have the option of hygienic macro expansion.

I explicitly restricted the comparison to languages for sequential computing.
There has been a lot of novelty in concurrent programming.

Arrays and hash tables are data structures that you can implement as
libraries in ML, so I'd say that's not a language issue. Progress in data
structures and algorithms has been considerable.

~~~
catnaroek
> I explicitly restricted the comparison to languages for sequential
> computing. There has been a lot of novelty in concurrent programming.

Oops, yes, my bad!

> Arrays and hash tables are data-structures that you can implement as
> libraries in ML, so I'd say that's not a language issue.

Yes, but the point is that nowadays we have languages in which it's
"convenient" to design entire large applications around nothing but arrays
and hash tables. Also, that one was snark.

> Progress in data structures and algorithms has been considerable.

In CS, yes. In everyday programming, regress in data structures and algorithms
has also been considerable.

~~~
mafribe

> regress [...] has also been considerable.

Thanks to Moore's law, most programmers even get away with it. And if they
don't ... they do big data.

------
dkarapetyan
This statement is false:

> Half of what a programmer knows will be useless in 10 years.

and the rest of the article seems to be based on it, which negates much of
what is said.

Foundational knowledge does not decay. Knowing how to estimate the scalability
of a given design never gets old. Knowing fundamental concurrency concepts
never gets old. Knowing the fundamentals of logic programming and how
backtracking works never gets old. Knowing how to set up an allocation problem
as a mixed-integer program never gets old.
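
Backtracking in particular has looked the same for decades. Here is a minimal
sketch in Python (an N-queens solver; the function names are mine, for
illustration only):

```python
def solve_n_queens(n):
    """Return one placement of n queens as a list of column indices,
    one per row, or None if no solution exists."""
    cols = []  # cols[r] = column of the queen already placed on row r

    def safe(row, col):
        # A new queen conflicts if it shares a column or a diagonal.
        for r, c in enumerate(cols):
            if c == col or abs(c - col) == row - r:
                return False
        return True

    def place(row):
        if row == n:          # all rows filled: solution found
            return True
        for col in range(n):
            if safe(row, col):
                cols.append(col)
                if place(row + 1):
                    return True
                cols.pop()    # backtrack: undo the choice, try the next column
        return False

    return cols if place(0) else None
```

The shape is always the same: extend a partial solution, recurse, and undo
the choice when it leads nowhere.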

In short, there are many things that never get old. What does get old is the
latest fad and trend. So ignore the fads and trends and learn the
fundamentals.

~~~
glandium
How exactly are you contradicting that sentence you're quoting? In fact, it
seems to me you're confirming it. He didn't say that everything a programmer
knows will be useless in 10 years, but that half will. You're only enumerating
the half that won't.

~~~
combatentropy
I agree with the grandparent that the writer overstated how much will be
useless in ten years. It was almost 20 years ago that I first learned HTML,
and since then neither it nor much else I have learned has decayed: CSS,
native JavaScript, jQuery, PHP, PostgreSQL, Apache, and Bash. But I guess it
depends on what you try learning.

~~~
tobltobs
Things you have learned, but which have decayed: IE6 CSS fixes, Apache 1
configuration file syntax, PostgreSQL getOrCreate surrogates, jQuery, ...

~~~
combatentropy

> IE6 CSS fixes

I didn't really ever learn these. I stuck to simple layouts or used tables,
which is fine
([https://www.combatentropy.com/coming_back_to_the_table](https://www.combatentropy.com/coming_back_to_the_table)).

> Apache1 configuration file syntaxes

I didn't learn Apache until version 2.

> PostgreSQL getOrCreate surrogates

I never learned these. I don't know what they are.

> jQuery

This hasn't decayed.

------
smoyer
Apparently old is a matter of perspective... To me, not quite 40 is still a
young'un.

I'm over fifty and just got back from presenting at a major conference. I've
managed to stay current through 35 years of embedded systems design (hardware
and software) as well as a stretch of software-only business. It's really not
that hard if you understand that your job is to continually be learning. I
must be doing it right, because often those I'm teaching are half my age.

As an aside, I've done the management track and moved back to the technical
track when I found it unfulfilling.

~~~
tluyben2
People over 30 feel 'old' every 10 years. That's nothing new; it's not even
that 'programmers are 20-something'. People turning 30, 40, 50, 60 have all
been saying 'now I am old' while we stand to reach 90-100 (at least in
western EU), so 60 is not that old. 40 (I'm 41) is a spring chicken, and I
look forward to many years of telling my younger colleagues that the latest
thing, however interesting to learn about, is not always better.

~~~
smoyer
I never said that I was old ... One good sign is that my wife keeps telling me
to act my age!

~~~
fineline
Nobody has any experience of being older than they currently are, but
everyone has a lifetime's memory of being younger. Hence most people of all
ages feel old.

One trick I like to play on myself is to imagine I come back from twenty years
in the future. What advice would I give myself? First thing would have to be
"shut up about being old! Your life is still ahead of you."

I like the other trick too, where I imagine being visited by a teenage me and
thinking what he would say about where I'm at. It can be an awkward
conversation. Where's the Ferrari?

As a less wacky version, pay really close attention to your parents and your
kids.

~~~
tluyben2
I use that time-travel trick as well. I never thought I was old (I like
getting older so far; so many doors open that were closed before), but yeah,
things to tell your younger self: do not hurry so much. Make 10-year-plus
plans when doing things. I always hurried, thinking something would end; I
have been running companies since I was 15 and, for instance, the first
company I co-ran with my uncle made educational software for MS-DOS and later
Windows 3.11 and then Win95, etc. I was in a hurry because I thought first
that MS-DOS would go away, and then that Windows apps would go away because
of the web. The software I made then still sells well; it's now over 25 years
old... Why did I hurry/worry?

Things I thought would end, like the CMS market 16 years ago (a market my
company thrived in), didn't end. They became bigger. If I hadn't hurried, I
would have had less stress at the time and would probably be running on a
larger scale than that company is now. You cannot stand still, and for some
parts there needs to be a sense of urgency, but things don't change _that_
much in most markets. Currently I use that to tell my colleagues we need a
10-year plan, not just a 3-5 year plan.

------
iamleppert
You don't need to learn React or Angular or another framework. Spend your
time getting really good at your preferred stack. That could be a framework
or something of your own creation. Do not go to work for a company that only
wants to hire people familiar with a specific framework. It's a huge red
flag: the work will be boring and the team mediocre. More often than not
there will also be culture issues.

Great companies who have interesting projects will want to see what you've
built in the past; the technology is just a tool. They will trust you to use
the right tools for the job, and will respect you enough to let you pick those
which you prefer.

For legacy systems, it's helpful to have some experience, but it's not like
you won't be able to be effective, if you're good, given sufficient ramp-up
time.

In my experience it's far better to hire the smart, motivated engineer who can
actually get stuff done and has created high quality software before than
someone who is an expert in a specific framework.

Also, I avoid going to tech conferences about web stuff unless it's about a
legitimately new technology. A new way to organize your code, and
conventions, are not new technology; it's just some guy's opinionated way of
doing things. And most of the talks are less about conveying useful
information that will help you and more about the speaker's ego and vanity.

~~~
clifanatic
> Do not go to work for a company that only wants to hire people familiar with
> a specific framework.

So, filter out 99% of the jobs that are out there? (And 100% of the ones
outside of San Francisco)?

------
oldprogrammer52
One of the consequences of this wide-spread ageism is the amount of
unnecessary, ill-conceived, and often dangerous wheel-reinvention that
20-something hipster programmers get away with.

Exhibit A would be NoSQL. Little more than a rehash of the hierarchical and
network (graph/pointer) databases popular in the 1960s before the ascent of
relational databases, these systems enjoy increasing popularity despite few,
if any, advantages over relational databases, besides allowing 20-something
hipster programmers to avoid learning SQL and the ins and outs of a particular
relational database (like PostgreSQL), and allowing VC-backed tech companies
to avoid paying senior developers who already possess that knowledge what
they're actually worth.

If these new data stores were at least as reliable as the older relational
databases they are supplanting, it wouldn't be so bad. But they aren't.
Virtually all of them have been shown to be much less reliable and much more
prone to data loss with MongoDB, one of the trendiest, also being one of the
worst[1].

And these systems aren't even really new. They only appear that way to young
developers with no sense of history. IBM's IMS, for example, is now 50 years
old, yet it has every bit as much right to the label "NoSQL" as MongoDB does
--and amusingly, it's even categorized as such on Wikipedia.[2]

1) [https://aphyr.com/posts/322-call-me-maybe-mongodb-stale-
read...](https://aphyr.com/posts/322-call-me-maybe-mongodb-stale-reads)

2)
[https://en.wikipedia.org/wiki/IBM_Information_Management_Sys...](https://en.wikipedia.org/wiki/IBM_Information_Management_System)

------
ams6110
_To me, it seems a bit like JSPs of 15 years ago, with all the logic in the
presentation code, but I'm "old", so I assume I just don't "get it"._

No, you get it. It's the people who get excited about stuff we tried and
abandoned two decades ago that don't get it.

~~~
place1
So I'm young. What makes JSPs bad, why aren't you using them anymore, and what
are you using instead?

~~~
bboreham
I think it's the "all the logic in the presentation code" that is emphasised
as bad, and that is what you saw in typical JSP example code 15 years ago.

"Mainstream Java" then tried to sell EJBs as the answer, which was another
world of pain.
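
As a rough sketch of the anti-pattern being described (translated into Python
rather than actual JSP, with invented names and a made-up discount rule purely
for illustration), the problem is business logic buried inside the template:

```python
# Anti-pattern: a business rule (the discount) is hidden in the "view",
# so changing the rule means hunting through presentation code.
def render_price_bad(user, price):
    if user["is_member"] and price > 100:  # business logic in the template
        price = price * 0.9
    return f"<p>Total: ${price:.2f}</p>"

# Better: the rule lives in one place; the template only formats.
def member_discount(user, price):
    return price * 0.9 if user["is_member"] and price > 100 else price

def render_price_good(user, price):
    return f"<p>Total: ${member_discount(user, price):.2f}</p>"

user = {"is_member": True}
print(render_price_bad(user, 200))   # <p>Total: $180.00</p>
print(render_price_good(user, 200))  # <p>Total: $180.00</p>
```

Both render the same HTML, but only the second version keeps the rule testable
and reusable outside the page, which is roughly what MVC-style separation was
reacting against in scriptlet-heavy JSP.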

------
transfire
You know, if truth be told, we really haven't come very far. You'd probably be
surprised at just how well a modern COBOL system can operate.
([http://blog.hackerrank.com/the-inevitable-return-of-
cobol/](http://blog.hackerrank.com/the-inevitable-return-of-cobol/))

In fact, in many ways we've made things worse, because not only does the sand
keep shifting, there is now way too much sand. Young people come into the
field and they want to make their mark. So we are constantly going through
"next big thing" phases, some big like OOP, some smaller like React, only
later to realize that what seemed so very interesting was really a lot of
navel gazing and didn't really matter that much. It was just a choice, among
many.

I can only hope one day some breakthrough in C.S. will get us past this
"Cambrian Explosion" period and things will finally start to settle down. But
I am not holding my breath. Instead I am learning Forth ;)

~~~
d23
> I can only hope one day some breakthrough in C.S. will get us past this
> "Cambrian Explosion" period and things will finally start to settle down.

I genuinely do feel like we're in the stone age in this industry right now.
I've thought about it a lot, but of course, it's hard to really get to the
good ideas when you can't hop on a lot of stepping stones that will only be
found later and taken for granted.

I think a few things will happen. 100 or 200 years from now (if it's even
appropriate to think on such a timescale!), we'll have some very large scale,
stable data storage systems that people can simply rely on. A few common
development paradigms will have thoroughly been cemented in our collective
consciousness, and the programmers of the day will be essentially what
construction workers are right now, perhaps with a bit more creativity.
They'll follow plans and be put into rigid confines when programming with the
system, and it'll be scalable from a development perspective.

I haven't got much further than that. It's hard to step more than a few layers
deep on stuff like this. A lot of the rest of it depends on how things like AI
and VR and a bunch of stuff I can barely imagine will pan out. But from a
software point of view, I think we're still waiting on a bunch of 100-monkey-
style revelations.

------
minipci1321
Very surprised and, honestly, even shocked. Not sure what to think about this
post, and I am a good deal older and have been working longer too. Maybe
learning is hard for him? He has 2 degrees.

In his sig he says he likes to write about making decisions, yet not a word
about intuition, how it builds up in you over the years from that seemingly
pointless going round in circles? Very little about how we improve in
relationships with people (going from 0% skill for many of us), and accomplish
even more by getting others to pull in the right direction?

I have never wanted to do anything other than what I do. Differently, yes, but
no farming or opening a restaurant or an art gallery. Maybe that is the real
culprit?

Last thing: knowledge does not "afford increased measure of respect and
compensation". Adding value, helping people, and solving their problems does.
If you have that trail on your CV, maybe the long list of technologies is less
needed.

------
autognosis
The fundamentals of computers have not changed all that much. Every assembly
language I've learned is still valid, and their respective architectures are
still widely deployed.

I'd suggest not building a house on sand: learn the fundamentals of how
computers and programming languages work. Don't learn anything closed-source.

------
osullivj
Very sad that the author aspires to be Martin Fowler. Fowler is an adept
populariser of other people's ideas, and he does that as marketing effort for
Thought Works. AFAIK he has not originated any innovation over the last twenty
years, whether it be patterns, enterprise architecture, microservices,
generators, refactoring or agile. Basically he's a corporate shill trolling
round the conference circuit drumming up consulting gigs for Thought Works by
banging on about the latest trend. If you want to aspire to be someone in the
software world how about Brad Cox, Steve Wozniak, Guido, Carmack, or Kay,
Ingalls and Goldberg?

~~~
Chris2048
I don't see anything wrong with picking up ideas like that and promoting them,
if he didn't rename so many of them...

~~~
osullivj
And make it a little clearer that they aren't his ideas, so noobs don't get
the wrong impression.

------
d23
At 27, I still feel old in the same regard as a lot of the ways the author is
talking about. A lot of things I'm seeing reeks of being a fad. I'd rather
avoid naming any technology or framework, but my instinct has been to avoid
planting my seeds in soil that's churned up every 6 months and keep an eye
toward that which has been solid for a decade and is likely to continue. I
don't mind learning a new language -- there are a couple I'm hopeful toward
and think could be long-term winners. But I'm not about to waste grey matter
on things I suspect will be obsolete before I can even reach mastery.

------
mml
This is the first time in history there are a huge number of "old", nay,
wizened, programmers around in comparison to young ones.

Make of that what you will.

As a 40+ programmer, who knows what becomes of those who move into management,
I am seeing lots of my cohort falling back into actually making things, as a
way to preserve our hard-won value.

This makes me happy. And you whippersnappers better watch yourselves ;)

~~~
vanderreeah
As a 40+ web dev, who doesn't know what becomes of those who move into
management, I'd be interested to know: what becomes of them?

~~~
mml
They stop coding, lose their skills, get laid off at some point, and quickly
realize that getting hired in as a middle manager is a _lot_ harder than
getting hired in as a programmer.

------
dwarman
This comes up approximately annually. I used to answer at length, but now, at
69, brevity seems more productive. I am probably nearing the end of my
accidental, unplanned drunkard's-walk career, one that started in 1967 when I
was dropped (an unskilled 19-year-old college hippie drop-out) into the inside
(literally) of a mainframe and told to "make it work". I wandered subsequently
through probably every computing field, and lately do audio DSP work inside
game consoles. Inside the inside of a current SOC inside a black box.

My conclusion: there is no formula for staying relevant. Perhaps an
understanding of the roots and rapid skill acquisition, but beyond that, every
second I spent learning a new framework just because has been a second wasted.
By the time I was somewhere it was possibly relevant, it was already dead and
replaced, or I was too far ahead of the time and had to write my own.

Further, and sadly, after 50-some years in the biz, I still understand the
insides of current SOC chips, and I shouldn't - no progress has been made in
practical computing theory at all. Lots of embellishment, lots of band-aids,
nothing really different. Otherwise I would not be able to do this job.

Yes, really, brief this time. $0.01 instead of a full $.

------
jrapdx3
A good article that makes valid points about the difficulties of keeping up in
rapidly-changing fields. Since I'm "old", and have a foot in the worlds of
medicine and programming, it's apparent to me there's not that much difference
in the "aging curve" in these occupations.

The parallels include the explosion of new knowledge, or at least variations
on the old knowledge, that a practitioner needs to keep up with. In
programming it's languages and frameworks, in medicine it's discoveries (basic
science), new drugs and techniques, and aspects of the regulatory environment.
In either case a few years out of school/training it becomes daunting to keep
up.

I should add here a particular peeve: the proliferation of abbreviations and
acronyms is _way_ out of control. It's nearly impossible to read an article
without encountering an avalanche of incomprehensible ABBRs. What's worse, the
same ABBR is often used to mean entirely different things from one article to
the next. Cross-field usage is a naturally incongruous extension of the
confusion, though at times it's humorous.

What the article doesn't emphasize is that the blizzard of details to keep up
with is just one part of the experience. As years in the trenches become
decades, the value of "time in grade" becomes evident. The ability to size up
the demands of a complex problem, to have a clear idea of where to enter the
path of its management, and the calm assurance growing out of having been down
the road before are all won only by virtue of real experience.

Having done what I have for 40 years, it took me only 33 years to realize I
didn't know what I was doing, and that's when I got really good at it. Therein
is an essential wisdom that time and effort alone confer, and can't be gained
in any other way.

------
edpichler
A guy from Oracle advised me in a chat we had, in the early days of Google
Talk. His name was Matthias Weßendorf (I don't forget people I'm grateful to),
and the advice was something like this: "Study software engineering; it's more
difficult to change than technology, and you will use it for all your life."

I was lucky to get this advice 10 years ago. I did a master's degree in this
area and my life changed: my software has high quality, evolves fast, and I
sleep well every night with all the "chaos" under control. I can change
languages and development processes quickly and painlessly. I think the point
is: you have to understand the abstract concepts of technology rather than
languages or frameworks. As an example, if you understand object orientation,
new languages will appear and disappear while your abstract concept will still
be completely useful and applicable. If you study software engineering, it
doesn't matter whether you will use Scrum or RUP; you will get it fast because
you already have all the base.

------
bungie4
Mid 50's here. I've been coding professionally for about 30 years. All of this
is pretty much spot on.

More so, the ability (and desire) to learn the latest/greatest has waned. I
fear that we only have the ability to jam so much new knowledge into our
heads. At some point, we must discard the old to make room for the new.

I'm just afraid I'm gonna delete the ability to control my bowels.

------
splicer
I got a bug report from a 76 year old developer the other day.

------
artellectual
I don't think experience is replaceable. There are certain things in software
that don't change:

\- solving the problem at hand

\- solving it in the quickest time possible

\- the solution to the problem should not introduce new problems

I think for an experienced programmer such as yourself, the knowledge that you
"lose" or that "decays" doesn't actually become useless. I think it will serve
you in making better future decisions, like what you are saying now.
Realizing whether things are 'fads' or 'foundational ideas' is a big asset for
an experienced programmer.

The same way with doctors. Their tools are changing rapidly but the underlying
concept is still the same. She may not know about all the new tools but she
understands how a heart works, and because of that she can gauge if the new
tool is just a 'fad' or will it change how things are done fundamentally.

I think every career path has this in some degree or another.

------
keefe
I think it's time for this meme about engineering being a young man's game to
die. There are plenty of events in the Olympics with good competitors in their
30s and early 40s. The point being that we certainly haven't aged out of any
physical requirements once we hit our 40s. Sure, I have a little less energy
than I used to, but overall I am more productive and make fewer mistakes.

I do agree that keeping skill levels up across a long career is difficult.
Maybe these memes come up because it's a convenient excuse not to put the
effort in? It's very easy to get complacent if you are smart, get things done
and have a comfortable home life. We have to train to get that brass ring, to
stretch the analogy ;)

------
progx
"The doctor at 40 doesn't seem to be worried about discovering that all his
knowledge of the vascular system is about to evaporate in favor of some new
organizing theory. The same goes for the lawyer, the plumber, the accountant,
or the english teacher."

This is where the thinking goes in the wrong direction. All of these jobs need
basic knowledge (a developer needs it too), and all of these jobs need tools
or regulations to do the job.

A doctor needs knowledge of new medicines, new instruments. A good teacher
doesn't teach the same way for 40 years. ...

React or whatever are tools. And yes, most tools reinvent the wheel, and this
is not development-specific; it's true for many jobs.

------
dcw303
I'm about to hit 40 as well, and I completely identify with the pressure to
keep up to date with new languages and frameworks, with the fact that I've
lost memory of many things I haven't used recently, and that despite the
proliferation of the new new thing, there really aren't that many new ideas
out there.

But the thing is, I really like that I must always be learning. I just have
that kind of brain that is attracted to learning new things, so for me this
career has always been a natural fit.

In the last few years I learnt Meteor.js and built an issue tracker from
scratch; I made several attempts at gaming projects using C++, C#, Swift, and
others; I played Microcorruption and Cryptopals and Starfighter to learn a
bunch about assembly, reverse engineering, crypto, and security; I learnt Go
and built a compiler with it; and right now my attention is moving towards Lua
to do some PICO-8 games. I did all this on the side of my boring corporate
Java developer job, and for no other reason than I wanted to learn new things.
(OK, maybe I had big dreams of a startup with the issue tracker, but the
others were purely for fun.)

I'm probably never going to be well known for any of those things, and I
really haven't built up the chops to be considered an expert in any of them.
But I'm content being a dilettante. Perhaps one day I'll get exhausted of
exploring new things, but until then it's just fun to just dabble in whatever
takes my fancy.

------
kkanojia
A lot of the old guys (40 is old, eh!) I meet are in managerial/advisory
roles, and even though they won't know the underlying details of a framework,
it doesn't take them much time to understand it, because there is always
something similar from their time.

The time I spent mastering Adobe Flex, Java's Struts framework, GWT, and the
like could seem like wasted time. But in the larger scheme of things it just
made me smarter. I know what worked for them and what didn't, and that helps
me understand future frameworks better.

------
zelos
"...invest most in knowledge that is durable. My energy is better spent
accumulating knowledge that has a longer half-life - algorithms, application
security, performance optimization, and architecture"

That's the key quote, I think. There seems to be far too much focus on
'programming knowledge' being about new frameworks, languages etc. That's just
ephemera. Picking up React Native takes what, a couple of weeks? The basics
are still the same, and still far more important.

~~~
Chris2048
Too many cooks spoil the broth; if everyone shunned frameworks in favor of
rolling their own, things would be even more of a mess - except in JS, which
has all sorts of problems whichever way you go...

------
patkai
I'm also a bit older and have had a 10-year academic career followed by 10
years of software contracting. The biggest difference between academia and
current web development - not sure about "other" software development like
embedded code or systems languages - is that in academia:

0\. you went to school and did some homework (not that it's all so useful, but
at least you have a background in what you are doing, and a common denominator
/ language with your peers)

1\. before you start something new, you do a thorough review, or read a lot of
review papers, so you do know what was done before and why

2\. you do get mentorship, even if PhD students / postdocs often complain
about the lack of it. But directly or indirectly you do test your ideas on
people who know more and who have been there longer

3\. you start to send preliminary workshop / conference papers for review, and
also funding applications

4\. at this point you at least know why SQL - or whatever else - is there, and
in some cases you might even learn some humility

I guess my conclusions are trivial. Many of us have amazing technical skill,
but our education and experience are not on par. The result is a lot of waste,
of time and of quality.

------
staticelf
I understand the analogy with the doctor but I don't think it is true at all.
I don't think programming is different from most fields actually.

Perhaps concepts in programming change more rapidly than in other fields, but
technology advances elsewhere too. For example, I've heard dentists discuss
new tech and methods they use as if they were a new web framework with a
different way of thinking.

Doctors need to learn about new methodologies all the time, since science and
technology discover new shit all the time and develop new methods for finding
and fighting diseases, for example. I think most people would be extremely
disappointed if they visited a doctor who gave them medical advice that was
40-50 years old and wasn't updated with more modern medicine.

All technology is a means to an end. You don't have to learn the new tools to
complete the job if you can have the same outcome and I think many times
people are so afraid to become less relevant that they learn stuff they don't
actually need.

If you really benefit from learning something, that's when you should learn it
and use it.

------
ensiferum
This is why you let the JavaScript fanboys come and go with their "angularjs".
You can focus on tools that do not change so much. Just to name a few: C++
(slowly updated), C, POSIX, Qt, and many other native technologies that have a
good "shelf life" of at least 5-10 years with only occasional updates.

Further on, the core of computer science has even better shelf life. It
basically never expires.

Personally I split things in two: the stuff that I need to learn just _now_ to
get my current work done, and the stuff that matters in the long term. The
former can change quickly and I don't fuss about it. Whether it's buzzwords
and the latest tech gimmicks or just a technology I haven't used before, I
learn as much as I need to and as much as sticks naturally over the course of
my work, but I don't always actively try to retain it.

The latter part, however, is the real "gold". Once you know the core
computer-science stuff, you can always build on it later using whatever tools
and technologies.

------
agentultra
I find that this pace is a symptom of the Javascript culture of popularity. It
is a hallmark achievement in the career of a Javascript developer to be the
maintainer of a popular library or framework and monetize their popularity by
way of training videos, talks, books, and buy-in from companies building upon
their work.

It's not that frightening to me, a mid-30's developer, at this point. I find
the fundamentals are more important than the fads and it's relatively easy for
me at this point to separate the wheat from the chaff. Is Redux or the Elm
architecture good? Yes -- it's a left-fold over a state tree; great! I want
that.
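
The "left-fold over a state tree" observation can be made concrete in a few
lines. This is a minimal sketch in Python rather than actual Redux or Elm code
(the action names and state shape are invented for illustration): the current
state is literally a reduction of all actions so far over a pure reducer.

```python
from functools import reduce

def reducer(state, action):
    """Pure function: (state, action) -> new state, no mutation."""
    kind = action.get("type")
    if kind == "increment":
        return {**state, "count": state["count"] + 1}
    if kind == "set_name":
        return {**state, "name": action["name"]}
    return state  # unknown actions leave the state unchanged

initial_state = {"count": 0, "name": ""}
actions = [
    {"type": "increment"},
    {"type": "increment"},
    {"type": "set_name", "name": "ada"},
]

# The state tree is a left fold of the action history:
state = reduce(reducer, actions, initial_state)
print(state)  # -> {'count': 2, 'name': 'ada'}
```

Because the reducer is pure, replaying the same action list always reproduces
the same state, which is what makes things like time-travel debugging cheap in
these architectures.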

Are new things coming out constantly? Yes. Some of them are incremental
improvements. That's a good feature to have. It means there are a horde of
passionate people constantly improving the tooling and libraries available. I
wish some of my preferred languages received even a fraction of the attention
that JS gets.

We live in interesting times.

------
szines
Nice article. Thank you. We should never stop learning...

Interestingly, I also found this piece, which is about new frontend
frameworks: [https://medium.com/@edemkumodzi/how-to-choose-a-
javascript-f...](https://medium.com/@edemkumodzi/how-to-choose-a-javascript-
framework-to-learn-a265c55f1271#.ltchkty24)

About how to choose a framework... it suggests that if you are already an
experienced dev who prefers OO patterns and believes in serious computer
science, you should use Ember.js. If you are a designer, Angular is good for
you. However, if you are young and don't have any experience, go with React,
because it is easy to learn... like PHP back in the day. I'm afraid React will
be the new PHP, because we will see a full generation growing up mixing logic
with view, and they will follow that kind of pattern... :)

------
rb808
I recently went for a C++ job which I hadn't done for 10 years. Most of the
questions I was asked were the same ones we had in interviews in the 90s. It
actually felt really nice, I just wish there were more good C++ roles around -
would be nice to live in a world that doesn't radically change every few
years.

------
eikenberry
> To me, it seems a bit like JSPs of 15 years ago, with all the logic in the
> presentation code, but I'm "old", so I assume I just don't "get it".

He seems to be ignoring that his experience just paid off. It is not that his
knowledge of JSP is out of date and not useful; it is that back in the day he
learned the anti-pattern and can apply that lesson now. Programming has the
same long-term advantages as any profession. Most of what I know after 20-some
years is not any specific tech, but ways of doing things, recognizing good and
bad habits, patterns, etc. The specific techs come and go, but the real
knowledge transcends them all and builds on itself. His 3-stages graph
shouldn't be logarithmic but exponential.

------
dep_b
I don't know. I'm in this still-somewhat-hot new thing called mobile, and now
I'm supposed to understand how to debug C code and all that "old bullshit". It
even drops me into a view sometimes that's straight from my C64 assembler
cartridge, full of labeled MOV and LDA calls and all that stuff. I really wish
I had dug a bit deeper back then instead of just writing adventures in BASIC!

I don't think knowing how a computer actually works will ever go out of
fashion. Now there's the Phalcon framework for PHP, for example, full of
speedy functions written in C by smart people who actually knew what was
happening beyond the stuff they typed into their .php files.

------
markbnj
As a working developer and SRE at 55 years of age I can't help being just a
little amused at the author. I guess you really _are_ as old as you feel. The
points about the competence cycle over the span of a career are dead on, of
course.

------
rmason
Only in two professions, pro athletics and computer programming, is forty
considered old.

~~~
WildUtah
_Only in two professions, pro athletics and computer programming, is forty
considered old._

Also, forty is old in the oldest profession that those two most resemble in
their degree of exploitation and social stigma.

------
whybroke
We work in a surreal field where knowing a bit of Node.js and nothing else is
considered superior to knowing a bit of Node.js and a lot of .NET.

Obviously you may substitute any fashionable/unfashionable language pair in
the above.

------
prewett
I make a point of not learning the new framework du jour. (Back in The Day it
was the next UI framework Microsoft was putting out.) If I keep hearing about
it for a few years, I figure it might be worth looking into. I started writing
new sysadmin scripts in Python instead of Perl after I kept hearing about it,
for instance. Other than that, I tend to learn on a need-to-know basis. I feel
like that has led to little churn in my knowledge. But then, I also try to
avoid working in areas that have high churn, which has led to my experience
being in areas of low churn.

------
muzster
I remember those novice days fondly. I've observed, in the twilight hours or
when I'm playing with my kids, that I am attracted to things that make me feel
like a novice. However, this is not compatible with my daytime job, where my
paid expertise is often required. _sigh_

It would be interesting to see the graph of the careers stages with happiness
overlaid.

Source: [http://www.bennorthrop.com/Essays/2016/career-stages-
program...](http://www.bennorthrop.com/Essays/2016/career-stages-
programmer.png)

------
mti27
"And then one day you find, ten years got behind you. No one told you when to
run, you missed the starting gun..."

To the author: Hang in there, man! You're just feeling the time crunch, now
that you have kids and other responsibilities. Based on your age, it would
have been the mid-1990s when your professional career started. Back then, the
economy was a lot better and outsourcing hadn't yet taken over at large
companies. It's a little more dog-eat-dog economically, but your brain
probably still works fine. Just take a breath and keep going.

------
partycoder
Well, first of all, new shiny things are not really new at all. The principles
behind them have been around for decades.

Learning 30 different imperative languages (C, Pascal, Ada, and descendants)
might not add as much value as learning one imperative language, one
functional language, one logic programming language, one language emphasizing
concurrency (Go, Erlang), etc. Meaning: learn paradigms and high-level design
constructs, not syntax.

Try to stay in touch with new paradigms, instead of just new applications of
them.
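
As a small illustration of the point (in Python as a neutral stand-in, with a
made-up task), here is the same computation in an imperative style and a
functional style. The durable knowledge is the paradigm, not the syntax of any
one language:

```python
from functools import reduce

numbers = [3, 1, 4, 1, 5, 9]

# Imperative (C/Pascal/Ada style): mutate an accumulator step by step.
total_imperative = 0
for n in numbers:
    if n % 2 == 1:
        total_imperative += n * n

# Functional (Haskell/Erlang style): compose pure transformations.
total_functional = reduce(
    lambda acc, n: acc + n * n,
    filter(lambda n: n % 2 == 1, numbers),
    0,
)

# Same answer either way; only the way of thinking differs.
assert total_imperative == total_functional == 117
```

Once you recognize the accumulate/filter/map shapes, you can reproduce either
version in whatever language a job happens to require.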

------
euske
I recently realized that every programmer has to discover what it is like to
create an exciting new thing and watch it fade into obscurity.

i.e. Life is all about reinventing your own wheel.

------
sbt
The second piece of advice, investing in durable skills, is key. In addition
to what the author mentions, I would point out that more change happens higher
in the stack. There is relatively little change at the level of x86, C,
operating systems, and entrenched protocols. But once you start getting into
the higher-level languages, and in particular the web, the churn is much
greater. So personally I'm trying to stay away from those higher levels.

------
LeanderK
As a CS student I cannot imagine going into a profession that you can just
learn at some university and then just work in. To me it seems rather absurd
that you could stop learning. You just have to accept that there is always
something new to learn and something you know going obsolete. That's the way
life works, at least in the view of a CS student.

------
justinhj
I'm 45. Feels like there are 1000 directions I could go to improve my skills,
from soft skills to different programming models and industries. As long as
someone will pay me to, I will program for money, and when they stop paying me
I'll keep on learning and coding for fun anyway.

------
patsplat
As an "old" programmer myself, I am a bit disappointed in the takeaway that
JSX is a templating language.

State management is the more important topic and the React tool chain has some
great options for addressing it.

------
m3kw9
Depends where you work; some places will help you gradually learn new stuff.
But if you are a contractor, you need to keep up on your own. That's why they
are paid more.

------
tempodox
I roughly concur with the OP's thoughts on the matter. Which makes programming
more than just a profession to me. It is, if you will, a way of life.

------
samfisher83
Only in tech would late 30s be considered old.

------
flamelover
So get a copy of SICP; there is _almost_ nothing new to you. (Yes, I am going
to start a fire :), bite me honey)

------
nirav72
If this guy thinks he's old at just shy of 40, I must be really old at almost
45.

------
br3w5
Is this a Freudian slip? "feeling apart of a community of technical"

------
DanielBMarkham
I agree mostly with the author. The only quibble I might have is this: _We
realize that it'll require real effort to just maintain our level of
proficiency - and without that effort, we could be worse at our jobs in 5
years than we are today._

If by "worse" you mean more forgetful of the details of the latest fads? Sure.
But definitely not less able to put together solutions (not if you've been
spending your time doing that, of course).

When you're a kid and fresh into programming, everything you pick up has some
magical power to do all sorts of awesome and cool stuff you've never done
before. It's a grand adventure and you're just collecting all the trinkets you
can on the way there. You look to other programmers to see which ones have the
most potential. Whatever job comes along, you've already got the solution in
your toolkit.

Over time you begin to realize that many, many problems have been solved
hundreds of times. You note that there is an ecosystem around tools and
frameworks, and as a developer? You are a market for lots of people who want
you to use their stuff. That there's quite a bit of social signaling going on
around which languages and tools people use. I'll never forget the first time
I heard somebody say about another programmer "He's a nice guy, but he's just
a VB programmer."

Actually he was one of the best programmers I knew at the time. He programmed
in many different languages. It was just that for the work he was doing, VB
was the right tool. But that's not the way it looked to the cool kids.

Know what's sad? It's sad when you look back 10 or 20 years and remember a ton
of effort and pain you went through to fuck around with WhizBang 4.0 only to
see it replaced by CoolStuff 0.5 -- and then you realize that CoolStuff really
wasn't all that much of an improvement. And then you realize that CoolStuff is
no longer cool. And then you think of the hundreds of thousands of man-hours
coders spent mastering all of that and comparing notes with each other.
Looking down on those poor folks who never made the switch. Makes you kinda
feel like an asshole.

I think you lose a lot of detail recovery ability as you get older, no doubt.
I keep very little implementation detail active in my memory and only dig it
back out as needed. But we are communicating on this wonderful little forum
that, last I checked, was built using html _tables_! Yikes! And using a
language that's a derivative of LISP! Yet somehow the world keeps spinning
around.

I have no doubt that as you finally smarten up and focus on the important
stuff, you will appear to other, perhaps younger, programmers as losing it.
I just don't think they know what the hell they're talking about.

Meh.

~~~
tempodox
> I just don't think they know what the hell they're talking about.

Same here. But if you tell people their big wide world looks like a rather
small box from your perspective, you'll discover that killing the messenger is
very common. Even among folks who consider themselves educated and
enlightened.

------
vacri
> _The doctor at 40 doesn't seem to be worried about discovering that all his
> knowledge of the vascular system is about to evaporate in favor of some new
> organizing theory. The same goes for the lawyer, the plumber, the
> accountant, or the English teacher._

And the same is true of programming. There are still variables, arrays, syntax
errors, IDEs and so forth - the underlying algorithms don't change that much.
Lawyers and accountants _have_ to keep up to date to keep their credentials,
and doctors almost always do (but not actually always - just like some
programmers don't update their skills). Fads come and go in teaching as well.

I know less about plumbers, but there are few white-collar professionals where
you don't have to keep on top of things throughout your career. From
architects to engineers to social workers to pilots to biologists to
meteorologists to managers, things change in your profession and you need to
adapt. It's just that usually those changes don't have the ridiculous levels
of hype and fanfare that they have in our bubble (management being an
exception here as well, lest I draw the wrath of a six-sigma black belt!)

------
oldmanjay
Reading over the comments here makes me want to hug everyone and carefully
explain that complaining that you want to dedicate less time to your craft and
still get the outsized rewards you feel you deserve just for being old isn't
going to convince anyone to hire old programmers.

------
abritinthebay
Dunning-Kruger right here folks.

~~~
sctb
We detached this subthread from
[https://news.ycombinator.com/item?id=12657534](https://news.ycombinator.com/item?id=12657534)
and marked it off-topic. Please comment civilly and substantively or not at
all.

~~~
abritinthebay
Oh please, you just confirm what the industry already knows about HN

Such a joke.

------
EpicEng
Please. Reduction to absurdity isn't a valid argument. Are those folks working
at large scales? Are they tuning DBs for applications which have to handle
hundreds of thousands or millions of transactions per second? I don't imagine
you actually know what you're talking about here.

~~~
WildUtah
_Reduction to absurdity isn't a valid argument._

The Wik says Aristotle called it " _εἰς ἄτοπον ἀπαγωγή_ " and _reductio ad
absurdum_ has been considered an important and valid form of argument for at
least 2500 years. [0]

[0]
[https://en.wikipedia.org/wiki/Reductio_ad_absurdum](https://en.wikipedia.org/wiki/Reductio_ad_absurdum)

~~~
EpicEng
That's nice. Not here, though. Equating the technical complexity of learning a
UI framework with that of designing back-end systems is just silly, and you
know it.

This also isn't a form of reductio ad absurdum that fits the definition;
it's just a silly linguistic reduction that omits many important details.

~~~
orly_bookz
You're like a living version of this comic but for IT nerds...

[https://xkcd.com/435/](https://xkcd.com/435/)

(Just to be clear, in the IT version, you're _not_ standing where the
mathematician is.)

