I know many programmers feel this way, but in my humble opinion it is a fallacy, and not a very healthy one.
No-one can deny that our industry is evolving at breakneck speed, and it is an exhilarating place to be. But just because there's a new technology every week on HN doesn't mean that we are losing old ones at a similar rate.
It is perfectly possible to have done nothing but C or Java for your entire career and yet remain extremely employable. And I wouldn't be at all surprised if there are highly paid COBOL jobs still out there, nursing some vast banking-industry mainframe which is too precious to risk replacing.
In fact I'm hard-pressed to think of any programming language which I would dare declare 'dead' in an HN comment.
But even if you're a specialist in something which you feel is in decline, or for which there are newer, snazzier replacements, you've got every opportunity to learn something new, taking as long as you like to do so. There's extensive documentation for every technology under the sun available for free on the internet, and an army of friendly, helpful people willing to provide help and advice without expecting anything in return.
In fact, it's entirely possible you could even get paid to cross-train. In my own company we use RoR for which (in England at least) demand far outstrips supply. I've paid PHP developers to learn Rails, and I would consider anyone with an in-depth knowledge of any language as potentially employable.
Really, the only way an experienced developer is going to end up flipping burgers or flying a manager's desk is because they have lost the desire to learn - i.e. fallen out of love with programming in general. I believe few people work in this industry for money alone - you either love programming or you don't do it - and if you love it then you will pick up new technologies out of sheer intellectual curiosity.
Feverishly reading HN every day and feeling threatened by the emergence of every new 'next best thing' is not a good idea. I would advise anyone feeling this way to take a chill pill and remember why they took up programming in the first place.
This is, in fact, stamp collecting. Fewer and fewer engineers feel comfortable doing the basics: implementing a raw custom data structure, writing a new parser, twiddling bits on a wire, debugging segmentation faults. The new-age programmer is, in reality, a scripter who learns hundreds of different ways to do more or less the same thing.
There is no steamroller, but there is stagnation.
Actually, I wonder if the steamroller is age.
I can remember talking to a HR at a company I used to work for. I asked why they spent so much more effort on recruiting graduates and juniors than seniors.
'Seniors cost twice as much as juniors. We need them, but we only need one for every three juniors'
If that means only one in three juniors gets to be a senior, I wonder what happened to the other two. No one hires a junior with ten years' experience, so I guess they don't work as programmers anymore. I hope they are project managers. Maybe that explains why project managers are always so angry.
Yes, some also go into technical management or project management roles, but in my base of anecdata, that was either out of a genuine preference or because they realized they weren't all that good at coding. And of course, some leave the field, but this is by no means an industry where only 1 in 3 entrants has a spot 10 years later if they want one. If you're even halfway decent as a coder, you'll have a chair and the music will keep playing for you for as long as you choose to stay.
It's far, far easier than learning a new language, and it's now common knowledge that you shouldn't make hiring decisions based on what specific languages a candidate knows. I haven't seen a trend in the real world forcing people to know new frameworks as a requirement for a job. In fact, I'd say it's less common than needing to know the language.
Many people I've seen who claim to "know" a framework "know" it only from having done a weekend project in it.
I am a Java programmer and the above seems to be true. However, it does not mean that Java programmers can stop learning and code as if it were 1995. Even if you confine yourself to the Java world alone, there is enough to be learned in five lifetimes.
Many people here equate "learning something new" with "learning a cool new programming language". That is a sort of fallacy. New or improved frameworks are coming out every day. You can learn new design patterns or whole domains, or simply dedicate a few years to security, algorithms, graphics, GUI design, sound manipulation, the internals of common data formats, etc.
That being said, it is OK to skip a technology or two. Most of them will disappear in a few years anyway.
This is a great point. Java in 1995 and Java in 2014 are nearly different languages. If you add in common frameworks like Hibernate and Spring, a programmer who hasn't learned anything new since 1995 will be just as confused as a person coming to 2014 Java from another imperative language.
As much as I love learning new things, and would love a day job working with Clojure/cljs or contemporary js on regular basis, in terms of my career I potentially would have benefitted more from consolidating skills in the aforementioned ecosystems. In the valley these things may be perceived as legacy and completely outmoded, but in the rest of the world that's hardly the case.
I think this is spot-on!
Mere inflexibility or old-fashionedness is often not enough to make one unemployable. I have colleagues who insist on various odd-and-nearly-dying technologies without that having a negative impact on their work. I have my own quirks, like instantly turning off syntax highlighting whenever I have to use a new editor (sadly, the state of most embedded development ecosystems usually means new MCU = new fucking everything, oh, and on Windows!).
What I do find troubling and hope I never get is something akin to calcification, only intellectual rather than physical. I currently work for a company that has developed, among others, various computer security tools. Consequently, I have a lot of colleagues who have spent their entire careers writing software for nothing other than Windows. Some of them get very excited when they talk to me about other operating systems (I haven't used Windows, except when occasionally forced to, like a few years ago because well fuck you too Microchip!); I get to find out interesting stuff about various pieces in Windows, they find out about interesting stuff that hasn't made it to Windows (like ZFS) and we get to compare programming approaches.
Then there's the other group. The people who are absolutely 100% convinced that the only way to do something is the one they know. Unix? Phbt. Broken, because the applications keep their settings all over the fucking place in whatever format they want. Windows does it neatly and keeps them in the Registry, in a single format. You mean you read a file to find out if an Ethernet device has a link? That's really clumsy, what does a file have to do with anything? Why don't they give you a neat API? What do you mean "What happens if you want to use it from a language that doesn't have bindings for that API"? You've got C# and C++ for complicated apps, VBScript for moderate ones, PowerShell for scripting and a bunch of hipster stuff like F#. Python? There's no support for that in Visual Studio, I'm not sure it even exists.
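(For the record, the file-reading approach being mocked is just Linux sysfs, and it works from literally any language with no bindings at all. A minimal sketch - the interface name "eth0" is an assumption, substitute your own:)

```python
# Minimal sketch: check Ethernet link state the Unix way, by reading a file.
# On Linux, /sys/class/net/<iface>/carrier contains "1" when the link is up.
from pathlib import Path

def link_is_up(iface: str) -> bool:
    carrier = Path("/sys/class/net") / iface / "carrier"
    try:
        return carrier.read_text().strip() == "1"
    except OSError:  # interface missing or administratively down
        return False

if __name__ == "__main__":
    print("eth0 up?", link_is_up("eth0"))
```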
Sadly, the latter camp vastly outnumbers the former. It's not a case of I'll-just-use-the-tool-I-know-best. I see programmers I honestly respect doing that for the sake of safety: X is probably a better fit than Y for this, but I know Y inside-out and I only know the name of X. Deadline fast approaching + Y can do it as well without being bastardized = we're using Y.
No, these guys are at the other end of the spectrum: it's "Y is a programming tool, I need to program something, ergo I use Y". Technical merit is secondary as long as we can do it, and human effort is not only expendable, it's being paid for. The fact that this misguided can-do attitude is indistinguishable from the one backed by technical prowess to most people in HR and management certainly doesn't help.
I like learning new concepts. I don't like learning things that are much the same as what I'm already used to, lack 5% of what the 'previous' technology had, add 5% of what some other 20-year-old technology did (and did better) that I already know, and where "learning it" is more about rote memorization and getting used to corner cases than about gaining a new perspective.
I don't know the ratio of this type of learning to the more perspective-shifting type of learning.
Nassim Taleb has this nice heuristic that says that the expected future life of something is proportional to how long it's been around.
So you might guess that Unix will outlast even the web. I doubt that a deep knowledge of Unix will become obsolete in the lifetime of anyone alive today. We might get ubiquitous computing and networked sensors and AI, but it seems like all of those things will be running on Unix.
(note: this is of course a heuristic)
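For the curious, here's a toy simulation of that heuristic. It assumes lifetimes follow a Pareto (heavy-tailed) distribution, which is one simple model under which the Lindy effect holds exactly; the alpha value is an arbitrary assumption.

```python
# Toy model of the Lindy heuristic: with Pareto-distributed lifetimes,
# the expected remaining life of a technology grows in proportion to
# how long it has already survived. alpha=3.0 is an arbitrary choice.
import random

def mean_remaining_life(age: float, alpha: float = 3.0,
                        trials: int = 200_000) -> float:
    """Average remaining lifetime among simulated lifetimes exceeding `age`."""
    random.seed(0)
    total, count = 0.0, 0
    for _ in range(trials):
        lifetime = random.paretovariate(alpha)  # survival P(T > t) = t**-alpha
        if lifetime > age:
            total += lifetime - age
            count += 1
    return total / count

# Analytically, E[remaining | survived to `age`] = age / (alpha - 1):
# something that has already lasted twice as long is expected to last
# twice as long again.
```

Under this model, something that has survived 40 years (Unix) has a far longer life expectancy than last week's framework.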
Lisp? We're all reading HN which is written in some Lisp dialect. There are also quite a few Clojure devs in here (including me) and a few using other Lisp dialects (and participating in flamewars about what a true Lisp is or is not). Don't know if it counts ^ ^
And then not 1969 but 1976: daily Emacs user here...
I have yet to do much with Clojure, mainly because I'm not really in the JVM ecosystem. But I think its focus on immutability is a great contribution to Lisp and will last a long time. It's an idea that transcends a particular implementation.
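The core idea is small enough to sketch in a few lines of Python. (Clojure's real collections use trees rather than cons cells, so treat this purely as an illustration of structural sharing.)

```python
# Structural sharing, the trick behind Clojure-style immutability:
# "updating" an immutable list builds a new head node that shares the
# old tail, so nothing is copied and nothing is ever mutated.
from typing import NamedTuple, Optional

class Cons(NamedTuple):
    head: int
    tail: Optional["Cons"]

def prepend(value: int, lst: Optional[Cons]) -> Cons:
    return Cons(value, lst)  # the old list is left untouched

old = prepend(2, prepend(3, None))   # the list (2 3)
new = prepend(1, old)                # the list (1 2 3)

assert new.tail is old               # new shares old's structure
assert old.head == 2                 # old is still intact
```

Because nothing can ever change underneath you, sharing like this is always safe - which is exactly why the idea transcends any particular implementation.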
What changes a lot are the actual frameworks and libraries. E.g., while Java is twenty years old now, Hadoop, JAX-RS, Hibernate, etc. did not exist then. And while Java will probably still be in wide use in twenty years, those libraries and frameworks will have been replaced except in legacy code.
I think the only 'platform' that has been stable forever is UNIX. People are still using libc, POSIX (ok, SUS), etc.
For those who were paying attention in language classes it's a game of waiting for industry to catch up. A long one at that.
80s - everybody does Fortran (57)
    - plays around with Pascal (70) and BASIC (64)
    - evaluates Unix and C (73)
    - plays around with Smalltalk (80), learns OO
It seems like the pattern is to figure out which language/paradigm, invented 10 years ago, has just matured enough that everybody will be trying to use or imitate it 10 years down the line :-)
E.g. Ajax: invented ~2000, in production use at Google ~2005, used by everybody's grandmother ~2010.
It's the kind of thing that even a moderately experienced programmer can understand within 10 minutes, and then have used it successfully another 10 minutes after that.
I don't want to learn tech because I'm afraid of getting my bones crushed by a steam roller.
I would rather learn tech because of the new and interesting things I can do with it.
Depending on the size of the change, a reasonably competent programmer can pick up enough of a new programming language or framework in something between a weekend and a month of playing around with it.
However, if a company has invested years of work into a product, they can't simply switch the programming language or framework. Worst case is that they have to rewrite everything.
I'm in a position where, pretty much exactly 5 years ago, at the company I worked for, we evaluated which web framework and programming language we should switch to and rewrite the product in (it was previously written in Perl).
We decided to go with Java, as it seemed a safe bet. That was before Sun was bought and, while Java wasn't an innovation leader, the language was being properly maintained. E.g. lambdas didn't seem too far away, there was talk about replacing get/set methods with properties, etc. Then Oracle came along and drove the whole process straight off a cliff. Meanwhile, even Objective-C has both properties and blocks!
We chose the Seam Framework: open source, innovative, trying to get their stuff into the standards. The last part actually worked somewhat; however, stateful web frameworks have proven to be a loser. Especially in enterprise, back in 08/09, browser performance was terrible and you'd try to do as much as possible on the server. Today, it's the other way around.
Plus, the stateful Seam Framework works really badly in the cloud. Each server needs memory to store session state. Want to load-balance requests between servers? Want to be able to kill a server? Synchronize the state between servers and lose as much performance as you gain by adding a new server. Plus, you'll always have to buy beefy instances to have enough memory at hand. 5 years ago, cloud was much more about IaaS than PaaS, and during the hype everything was billed as ready-for-cloud while every solution had lots of initial quirks.
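The stateless alternative that won out can be sketched roughly: instead of each server holding session state in memory, sign the state and hand it to the client, so any server behind the load balancer can verify it. (This is an illustration, not how Seam works; the secret and the payload format here are made up.)

```python
# Sketch of a stateless session: the session data travels with the
# client as a signed token, so servers hold no per-user memory, any
# server can handle any request, and killing a server loses nothing.
# The secret and the JSON payload format are illustrative assumptions.
import base64, hashlib, hmac, json
from typing import Optional

SECRET = b"shared-across-all-servers"  # assumed to be distributed out of band

def issue_token(session: dict) -> str:
    payload = base64.urlsafe_b64encode(json.dumps(session).encode()).decode()
    sig = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return f"{payload}.{sig}"

def verify_token(token: str) -> Optional[dict]:
    payload, _, sig = token.rpartition(".")
    expected = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return None  # tampered with, or signed by someone else
    return json.loads(base64.urlsafe_b64decode(payload))
```

The trade-off is the opposite of the stateful one: no server memory or synchronization, but the state rides along with every request.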
We tried to make a couple of safer bets 5 years ago. We lost. As a programmer, I easily moved on to greener pastures. But migrating that product to something like Angular in the frontend or to a stateless architecture in the backend? Even if the new technology doubles the development speed and halves the maintenance effort and hosting costs, it'll take a massive investment... by the time it pays off, half of the technology is probably hopelessly outdated again.
It takes a decade to learn something inside and out, and then you can't spend all day learning which JS MVC framework has gone out of style over the weekend. If your technology lasted long enough for you to become an expert, you can probably make a living on it even if it goes out of style like COBOL. And if you are a skilled developer, you'll have no problem applying your knowledge to a new platform/language.
Those who get steamrolled are those who work but don't develop their skills. After 10 years with COBOL you will be steamrolled unless you have actually become an expert at software development, not just an expert at your job.
There is always something new to learn, something new to pick up and explore.
I have been doing this for over 20 years now, and it has been like changing my job completely every 2-4 years. If one is open to change and adaptable, there is no steamroller.
Yes. In the world of programming you have to always be learning, but you can take a break and jump back in later even if you've stayed in the same place technologically for too long. It's not as if falling behind means you need to learn everything from the intervening period. You just need to learn what matters now.
...well, they still can, you know, and this is part of the beauty of the field: you get to work with people with all sorts of backgrounds. Now, yes, you need a GitHub profile with some cool projects, some contributions to interesting projects and/or a mini-portfolio of projects that you did for free for schools/charities (and yeah, "cool" and "interesting" are in the eyes of the employer), but compared to most other technical fields, it's much easier to make it without a degree or ample experience!
This so-called steamroller is IMHO mostly just riding around in circles, and when you've been in the business long enough you start recognizing the repetitive scenery.
Staying in front of the steamroller is neither a goal nor a means to an end. It's a distraction. You're better off getting out of its path and observing, picking up only the stuff you really need - be it for the job you're doing or to keep up your market value.
But for either of those you only need one out of a dozen new shiny toys every once in a while, not all of them.
Given the amount of maintenance needed for all the polyglot legacy code bases, I would imagine that there will be plenty of work for people as they move through their own career cycle.
The OP links to a Ray Kurzweil article at the end of his piece that could be useful!
If you're smart about it, you hire programmers because of their ability to solve problems. If she used Ruby for 10 years and did great things, and you use Clojure, no matter. People can learn new things. The core ideas and competencies haven't changed much. The tools and APIs seem to have a half-life of 5 years, except for the proven winners (C, Unix, Lisp as a concept if not a specific dialect yet).
If you're a dumbass, however, you hire based on trivia. You ask for minuscule details of C++ templates or Python metaclasses or JVM internals. You let your HR write hiring specifications like "must have PhD and 5+ years in <three-year-old technology>". The trivia questions are fine interview material if the person is claiming to be an expert in that technology. They're useless at determining whether a person is generally capable.
The problem is that companies tend to fuck up in one of two opposing ways. The more common failing is the HR fuckuppery I described above, of hiring for specific easy-to-learn skills rather than actual capability. This tends to hurt older people, who (after a decade or so, when all the half-hearted programmers have dropped out) want to develop genuine competence rather than chasing every crappy new thing that pops up. That seems to be why every good programmer above 35 either wants to be an architect or data scientist or something other than a "regular ol'" programmer, i.e. an "X who programs": because the way programmers are evaluated is completely borked.
The (much less common, but still irritating) opposite end of the spectrum is the "hire ALL the talentz" attitude you see at certain very large tech companies, where they "hire generalists" but that's an excuse for a one-size-fits-all, CommodityDeveloper, internal attitude. If you're running a closed allocation shop, you really can't hire people "just because" they're talented. You have to hire to the specific role or you'll have a morale problem and possibly an HR disaster inside of 12 months.
So, really, the only way to avoid falling into one of these vicious patterns or the other is to implement open allocation. But we all knew that already.
The problem isn't that there's a steamroller. It's that it's driven by idiots who don't understand the first thing about any creative process-- closed allocation is a laughably terrible idea-- much less a constantly changing one like technology.