Don't specialize in things that depreciate as rapidly as programming languages! C++ experience is quickly getting less relevant over time, but stochastic calculus, digital signal processing, machine learning, and distributed systems aren't. Some of those specialties get more valuable; in fact, sometimes, the forces that make things like C++ less valuable actually clear the way for things like distributed systems to become more valuable.
If I were allowed to offer only a single bit of career advice, that'd be what I'd say: don't specialize in programming languages. They're a trap!
(But then, I'm not quite 40 yet, so I could be wrong).
When you're young, it can be beneficial to specialize in the "here today, gone tomorrow" technologies, because it negates the advantage of all the greybeards and lets you get exposed to levels of responsibility you wouldn't have a prayer of reaching for 10 years otherwise.
I got my first tech job (at 15!) because I'd taught myself Java, which was then brand-new, and a number of local companies needed experience with that and this weird new technology called the WWW. Parlayed that into a bunch of web experience, often as sole developer or team lead. Parlayed that into a job at Google, where I got to learn information retrieval, machine learning, distributed systems, the stuff that actually has a long shelf life.
Someone could certainly end up at the same place (skill-wise) by getting a Ph.D and then a job at Google. The thing is, if you take that path, you're also forgoing work experience, income, perspective on how the industry operates, and the opportunity to jump off into different careers if you find you like them better. My first job out of college was writing software for hedge funds, for example; it turns out I hated the financial industry, but if I hadn't, it wouldn't have been too big a leap to get a job at a hedge fund after that.
The trick is to not kid yourself into thinking that the twelfth Javascript framework you learn is still valuable. The first gives you opportunities, the second and third give you perspective. But after that, have a plan B for things you should be learning that you can't get on the web. All that Java Swing stuff I did around 2000? Basically useless. Ditto the PHP in college, and even my Django & Javascript skills are nearing the end of their shelf life. It's the stuff I've done besides that, the stuff I generally don't talk about on Hacker News, that makes me valuable.
The PhD route is different from the work route; it's not any better or worse. The PhD route allows you to pursue original research, hone written and oral communication skills, and in general allows for very different life experiences. Also, the work you can get with a PhD "can" be quite different from the work you can get without one (e.g. drawing from an ever-shrinking pool of research jobs).
> C++ experience is quickly getting less relevant over time
I hear this often on HN and elsewhere, and if I had to guess I'd also say it's true. But is there any proper evidence to support it? And in any case, it will probably still take decades before C++ is gone. Maybe you could just as well say: specialize in C++ now, and in 20 years specialized companies will be begging on their knees for you to come keep their legacy C++ stack working :]
> don't specialize in programming languages.
Depending on the language, you do need some level of specialization/mastery though, or else you might end up writing very sub-optimal or even bad code. I'd say learning how to learn a programming language also pays off. Have you seen the number of SO questions about C from people who haven't reached a certain level yet? Their programs constantly crash or don't work as expected. Not necessarily because their intent is incorrect or they are bad programmers/engineers, but more often because they lack a proper understanding of C. Have you tried approaching Matlab the same way as any other language? Oops, there goes the performance. Working in LabVIEW? It might look easy (hey, all you gotta do is draw wires, right?), but good luck building a medium-sized application with it if you don't know at least some of the ins and outs. Now you might say 'if you encounter such problems you need to pick a better language', but that is just not how it works.
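To make the C point concrete, here is a minimal sketch (my own illustration, not taken from any particular SO question) of the kind of bug that fills those threads: the programmer's intent is perfectly clear, but without a proper mental model of C's memory, the crash looks mysterious.

    #include <stdio.h>
    #include <string.h>

    int main(void) {
        /* Classic beginner bug: an uninitialized pointer owns no storage,
         * so copying into it is undefined behavior and usually a crash:
         *
         *     char *name;
         *     strcpy(name, "hello");
         *
         * Once you know a C string needs a real buffer, the fix is easy: */
        char name[16];
        strcpy(name, "hello");
        printf("%s\n", name);
        return 0;
    }

Nothing deep, but you only stop making this class of mistake once you've internalized how C handles memory, which is exactly the kind of language-specific understanding I mean.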
> Don't specialize in things that depreciate as rapidly as programming languages
C has been around since 1972 and is just as relevant as ever even as the popularity of C++ wanes. Python has been around since 1991 and is growing in popularity.
There is a trick to figuring out which technologies will stick around for the long haul, and it's a skill most developers don't even think to hone.
My career advice would be not "don't specialize in programming languages" but rather to learn to distinguish programming trends from programming fashions and invest (with training, picking the right job, etc.) accordingly.
I doubt there is going to be much call for people with experience in Mongo in 5-7 years' time, for instance, but Postgres? Hell yeah. I predict increased demand for F#, too, but not so much for, say, Java (even though it won't go away).
I am a C programmer. C is my first language. I have shipped C code. I'm pretty sure I've literally shipped C code, in shrink-wrapped boxes. I took my kids to the museum today and hung out in the cafe pushing commits while they wandered around. What was I hacking on? A C compiler.
C is less relevant now than it was in 1996. A lot less relevant. Way, way, way less relevant. In 1996, most serious software was written in C. Today, only a tiny fraction of it is.
While true, in 1996 almost everybody knew C or C++. Now you can get a CS degree without ever learning a language with manual memory management. And while C and C++ are a shrinking segment of the entire market, the market itself is growing.
I don't get it when people say Java will be less popular. Today it is, in most cases, my platform of choice. Modern Java is a damn fun development environment. You just need the right tools, like Maven, IntelliJ IDEA, Spring Loaded and so on.
I have written a lot of Java over the last 2-3 years, with Maven, IntelliJ, and Dropwizard. This after years of C++, Prolog, and Haskell. Fun is not exactly the predicate I would attach to modern Java.
Of course, this is all subjective. But I can barely imagine that people coming from more expressive statically typed languages would like Java very much.
(That's not to say that the tooling for Java isn't extremely good.)
Yes, abstracting knowledge to the point of not being able to use or think in basic primitives is often counterproductive.
I can't think of a great example; it can apply to anything. Though your larger point stands, C++ is a not-so-great example. Many languages are roller coasters or extremely specialized. In contrast, C++ is on another upswing with C++11/14/17. It's used practically everywhere, especially in graphics, systems, and game development. A point worth mentioning is that good C and C++ developers are often capable of adapting to other languages and settings quickly, since they're used to working with a lot of basic building blocks.
Worth noting that the style of C++ most common (in my experience) in game development is pretty far from what is considered "modern C++11 best practices", and the modern C++11 style tends to be frowned upon.
That's not ubiquitous, but it indicates that despite the upswing, hitching yourself to that language is easier said than done.
Ok, let's see. The languages that I have seen come and go as intense fads:
* Easel
* Telon
* Powerbuilder
* ColdFusion
* CICS (was actually taught in a "CS" course not too far from here)
* PL/I (maybe not totally dead, sadly)
* IITRAN (maybe not that intense)
To Tom's point, I know of a massive project involving 300 Telon programmers, all of whom had to be let go when the project failed. Few who invested in the above technologies are working in them today.
Even more important than what the article says: develop skill in solving business problems using your programming skills.
None of these are programming languages in the sense people normally use the term (with the exception of PL/I). They are proprietary frameworks and tools. Even PL/I was an IBM-developed language.
Proprietary frameworks, products, and tools are definitely a terrible thing to base your career on. Languages such as C, C++, Java, Ruby, etc. are a good investment.
I was maintaining a custom piece of PL/I software until 2010, when I changed jobs; as far as I know, the place is still using it. I suppose I have potential value as one of the few people (barely) under the age of 40 who have experience in it.
I would like what you say to be true. However, I have three of the base skills you mention, and I still can't get the time of day from most companies unless my experience is in their language or platform of choice.
I think it is just a reality of the current state the industry is in. Everyone seems to want developers who are already trained and experienced in what they are using so they can "hit the ground running", at least for non-junior positions. I think that is the biggest impediment for more experienced developers.
Agree and disagree. I think that learning the minutiae of a specific programming language might be an evolutionary dead end. I say "might be" because there are C++ engineers making $500k in HFT. Since we're both in the same city (Chicago) now, we both probably know some of them. C++ isn't dead yet, and it's not even close (but a mediocre C++ engineer is probably hosed).
What seems to happen as a technology fades out is that the number of jobs drops, while the pay in those jobs increases. There's a risk to it. If you continue to be qualified for those jobs in the depreciating technology, then you're solid. If you lose at the game of musical chairs, you can be screwed.
There's value in learning programming languages if you keep an eye out for the concepts that make them what they are, and those concepts are far more likely to have a future than the languages themselves. We may not need C++ in 25 years, but we are going to have low-level programming problems that require a C-like language (and fluency with assembly languages). Haskell itself may or may not be around in 25 years, but the problems it exists to solve and the value of static typing will still be relevant. Common Lisp is nearly dead, but Clojure is doing well. Languages die, but concepts hibernate at worst.
The people who will be screwed are the ones who specialized in the minutiae of Java but forgot the computer science itself: the ones who got really good at JVM configuration and Spring and Hibernate but have completely forgotten how computers really work. The really good ones will always have jobs: there'll be Java work 50 years from now, just as there is COBOL work now (and some of it pays very well). But the more average engineers who only learned The Java Way will be screwed: not good enough to get the dwindling (if well-paid) Java jobs, and with too much atrophy of their general CS knowledge to handle whatever comes next.
The C++ engineers making $500k in HFT are more due to the HFT part of their skills. As was said elsewhere, focus on solving business problems with engineering skills, not so much on the language.
Fair enough, but my general sense of it is that extremely low-latency C++ is harder to learn than the finance part of the job. Even if that's wrong, it's still what hiring managers believe.
I don't think it's ever wise to specialize in only a programming language, even if it's a really good one like Haskell. I think it's important to have a lot of language-agnostic knowledge in one's skill profile in addition to knowing at least one language very well.