That sounds rather blunt, but most organisations that aren't startups don't change technology quickly if at all. C++ has served me well for two decades; I probably ought to adopt C++14 but on the other hand my current job requires that the codebase build with a 2008 compiler.
I'm also extremely skeptical of the extent to which AI and VR are new, as opposed to incremental improvements to technology which takes it over an adoption barrier. Have you seen the 80s VR headsets? SHRDLU? The "AI winter"?
If you're worried about this stuff then it's helpful to develop a level of knowledge about it that's slightly higher than Wired but lower than actual implementation detail, in order to talk about it in interviews. You can then pick this stuff up as you go. Machine learning in particular is maths-heavy, matrix algebra in particular, and that's never going to go obsolete.
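To make the matrix-algebra point concrete, here is a minimal sketch (pure Python, no libraries, weights are made-up numbers) of the core operation behind most neural-network layers: a matrix multiply followed by a nonlinearity.

```python
# A neural-network "dense layer" is just matrix algebra:
# output = activation(input @ weights)

def matmul(a, b):
    """Multiply matrix a (m x n) by matrix b (n x p) -> m x p."""
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))]
            for i in range(len(a))]

def relu(m):
    """Element-wise max(0, x) activation."""
    return [[max(0.0, x) for x in row] for row in m]

# One sample with 3 features, and a layer mapping 3 features -> 2 outputs
x = [[1.0, 2.0, 0.5]]
weights = [[0.2, -0.1],
           [0.4,  0.3],
           [-0.5, 0.6]]

out = relu(matmul(x, weights))
print([[round(v, 2) for v in row] for row in out])  # [[0.75, 0.8]]
```

Everything fancier (convolutions, attention, backpropagation) is built on variations of this, which is why the linear algebra underneath doesn't go obsolete.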
I also agree with the commenters who are saying that you should ignore the latest flash-in-the-pan frameworks unless you really need them to get frontend gigs.
I started in this field in 1992. I've seen the comings and goings of many flash-in-the-pan technologies.
If you are a dev and are selling yourself on your skillset, ask yourself, "is this sustainable?" The answer is "no." A fifty-year-old brain simply does not absorb new technologies as fast as a 25-year-old-brain. If your plan is to continually adopt new cutting edge technologies in order to stay marketable, I politely submit you need to rethink your long term plan.
As an almost 50-year-old, I promise you that the world is not a gentle place for older devs. Plan your exit into management, a related field, or some other job altogether. If you expect to be a coder at age 50 you're going to be disappointed.
As for me, I jettisoned the "technical skill set" war decades ago. I do not sell myself on my technical skill set. I sell myself as the ultimate generalist. This comes with its own set of problems. However, it has allowed me to age more gracefully in my field, because I do not create the expectation that my primary value-added is code generation.
I'd like to simultaneously disagree and agree with you. If you were a 50 year old coder 10 years ago, then your only hope of remaining in the tech industry would be to add "Manager" to your job title. If you are a 50 year old coder today, that's still a sound direction to go, but increasingly becoming less necessary. If you'll be a 50 year old coder 10 years from now...I think you'll be ok.
Yes, at 50 you almost certainly cannot crank out as much code as you can at 20, but so what? As a coder it is important to understand that most problems in technology are not solved with more code, but less. That you will generate less code at 50 should be seen as a benefit. The advantage you have at 50 over someone who is fresh and new at 20 is that you can recognize that there is very little that's happened in the last 30 years that is genuinely new.
So, if you are 20, go ahead and spend your weekends on side projects learning the latest frameworks. As you continue to do this over the years and decades, shift to focusing more on patterns. By the time you reach 50, you might only generate half as much code as your younger colleagues, but you should be able to solve problems with a quarter of the code required, still making you twice as efficient as them.
When coding was new, code was the only metric by which to measure coders, and non-technical management types would view anyone who generated less code as less valuable. If you're not writing 500 lines of code a day, the argument goes, then you should be managing coders who can. But management and problem solving are not a completely overlapping skill set. Some engineers make good managers, but most do not.
Luckily, the more technically inclined individuals populate the ranks of company management, the more this is being recognized, and the more willing these companies are to hire the 50-year-old-coder-who-codes-less-but-solves-more-problems.
> If you were a 50 year old coder 10 years ago, then your only hope of remaining in the tech industry would be to add "Manager" to your job title.
People outside the startup bubble value delivering, regardless of age. Developers outside the startup bubble work consistently in a few areas of technology; they develop a deep personal understanding of the cookbooks, solutions, and frameworks they've personally authored, which lets them construct stable solutions that expertly address the problem at hand.
Essentially, if you remain a "coder" and you use your brain at all above simply being a coder, you will become a software scientist. I never start anything from scratch, as I have about a dozen application skeletons ready for various specific purposes, plus similar libraries I wrote, plus a knowledge of several large commercial SDKs, and developer experience in several major FOSS applications. The work that I do now would give my 20- or 30-year-old self a heart attack with its large scope, number of complex technologies, and the time frame I'm expected to deliver in. But I've been writing code for 40 years now, and I may bitch at my tools, but the work will be delivered, and it will be well written, fully documented, and so on, because anything less just creates technical debt.
If you like writing code, start acting like a scientist about it. Few developers do, and in time you will accelerate away from your peers into a truly enjoyable professional space very few seem to occupy.
I have spent the bulk of my career working in larger, team-based environments. In my experience, being able to conform to cultural norms is essential to being a high-performing team leader or member.
Many, many teams have prejudices and / or litmus tests. For example, I've spent decades doing project management. I use Gantt charts. Why? Because they are the best way to communicate timeline expectations with stakeholders. I don't use them to manage the project. But I've encountered more than one Agile team that flat-out considers the use of these charts to be anathema. You might as well crap on the rug.
That's just one example. Obviously there are also the buzzword technologies. Here's an overgeneralization you might agree with: young developers tend to naturally gravitate towards newer, less proven tech. Us older developers naturally gravitate towards more mature, established tech. This creates a source of age-related friction.
Then there's the inability of others to grasp the applicability of your experience. For example in the 1990s I built a lot of actually awesome Lotus Notes/Domino applications. Now that's a technology that, if you mention it in various meetings, will get you laughed at. However, Domino was kind of the original "noSQL" / "BigTable" world, and it turns out that my application architecture experience in the Domino world translates meaningfully into these "newer" (ha!) database technologies. But try to tell that to team members.
So I guess my comments are orthogonal to yours. I agree with pretty much everything you write, but it leaves out the important social aspect.
I've had a project manager (very insightful guy) tell me something similar before. He said, "As you get more experienced you'll (ideally) write less code and instead reuse code from libraries you've written in the past." I've always wondered how that works in practice. I'm assuming certain projects will have requirements where specific technologies are dictated in the front-end and persistence layers, depending on the kind of project you're building, so those wouldn't be as reusable; but algorithms you've written before might be reusable on the computational side of things. For instance, if you had a library that handles very advanced moving-cost calculations, you could ideally use that same library in different software forms (mobile, web, desktop, command-line).
Edit: Git might be a good example of a library that can be reused a lot and used within many kinds of applications (IE: inside web app, within IDE, from command-line, desktop app, etc)
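The reusable moving-cost library idea could be sketched like this (the function names and rates are made up for illustration): keep the calculation in one plain function with no UI or framework dependencies, and let each front-end be a thin wrapper around it.

```python
# Hypothetical "moving cost" library: the business logic lives in one
# plain function, so the same code can back a CLI, a web handler, a
# mobile backend, etc.

def moving_cost(distance_km, weight_kg, rate_per_km=1.5, rate_per_kg=0.25):
    """Estimate a moving cost from distance and cargo weight."""
    return distance_km * rate_per_km + weight_kg * rate_per_kg

def cli(args):
    """Command-line front-end: takes string arguments."""
    return f"Estimated cost: ${moving_cost(float(args[0]), float(args[1])):.2f}"

def web_handler(params):
    """Web front-end: a Flask/Django view would look much the same."""
    return {"cost": moving_cost(params["distance_km"], params["weight_kg"])}

print(cli(["120", "800"]))  # Estimated cost: $380.00
print(web_handler({"distance_km": 120, "weight_kg": 800}))
```

The point is that only the thin wrappers change per platform; the library itself carries over from project to project.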
Smart devs know this and cultivate their personal repertoire.
For example in the veeeery first days of .NET I had the opportunity to build a nice internal application for a client. Of course I didn't have a .NET library - the tech was just coming out of beta - but Microsoft was nice enough to provide a bunch of "example code" and "starter sites" to give all us noobs some patterns we could work from - at least most of the skeleton of a working, database back-ended website with a working security model.
From this .NET project it was easy to find another, and now I had a complete .NET library with solutions to all kinds of problems - how to do doc management, how to solve image resizing problems in .NET, how to talk to all kinds of heterogeneous systems using .NET, how to implement webservices, etc. because I'd built these once. With that toolbox I could easily solve all kinds of other problems with only a little creative remodelling of the skeleton app.
Then in 2007 when I jumped into OSS I got to start an all-new library building solutions on the LAMP stack. By 2009 I had a nice little library built up again.
The pattern is amenable to any software tech. Every project builds from the last one, if you do it right.
Another example of a library would be libjpeg for reading and writing the jpeg image format and dealing with all the different ways the format can be adjusted, which gets incorporated into many applications, for example ImageMagick.
Basically, a library consists of code that is intended to be reused by calling it from other code.
 It's astonishing how often this gets reinvented. Here are just Python libraries: https://pypi.python.org/pypi?%3Aaction=search&term=git
Wow, I'd only heard of libgit2 before. I can't believe how many git-related Python packages there are. Yes, git isn't a library, it's a program, like you said - good catch.
I started late (38, 41 this year) and I couldn't imagine jumping from Ruby to Rust to whatever new hotness arrives, when it's clear that there are a handful of technologies that go deep (enough) and that are being used to solve problems that require someone to be more than a coder or even a dev.
I'm in the connected-car space and the only question I'm asking myself right now is: will my current skill set (mostly iOS in Swift/Obj-C/C) allow me to build for the future of augmented reality in vehicles, or will only going down the path of C++ allow me to grow as a "software scientist" (perhaps there's a third way...)? Honestly I don't know, but I do know that I won't/can't find out if I jump into every HN rabbit hole that opens.
I have no idea what higher level language in-vehicle AR will be written in, but C++ may be overkill, or a blind alley, or both.
You might be better off learning one or more of C, Go, Rust, Python, or Java (each has its own advantages and disadvantages) over C++.
Could you elaborate more on this?
This is precisely my argument as a generalist.
I may not write code as fast or even as elegantly as I once did, but I'm much, much better at "seeing around corners" to avoid problems, and I'm much more likely to "build the right thing" due to my very broad but not deep technical base. I do not fall into the trap of "when all you have is a hammer everything looks like a nail."
> there is very little that's happened in the last 30 years that is genuinely new
This is true, but try to convince your 20something peer group of it. They are all convinced that their technical skills are revolutionary.
I studied waterfall systems engineering in 1998. But at the same time they also taught "RAD" iterative waterfall, or the precursor to Agile. I ran with RAD, and made it mine, so I've been doing something very similar to "Agile" for two decades. However, I still think that Gantt charts and top-down planning have value.
Showing that shit to the wrong 20something developer is a great way to be excommunicated.
Of course they do. What you're seeing is a backlash to the way Gantt charts have been misapplied to projects that are less well defined, and to managers who ask for estimates and then are unpleasantly surprised when an estimate isn't entirely accurate.
Anyway, if you want to provide the same sort of value but present it in a different format, consider using a network diagram or a user story map.
You think I don't know this?
I think you missed the point :(
(As an aside, in my experience, network diagrams and user-story maps pale in comparison to Gantt charts, as they do not effectively communicate timeline expectations IMO.)
Of course, that is in part by design.
If you are delivering solution X that is just an instance of iteration Y of system Z with a few customer-specific tweaks, Agile methods will work just fine in combination with Gantt charts, and teams that are engaged regularly in that sort of work aren't too likely to object.
However, as soon as a task is a known unknown (that is, a well-characterized problem with an understood though as-yet nonexistent solution) requiring new code that isn't just a variation on familiar themes (i.e. whatever the equivalent of a CRUD web application is for your industry), you start having to deal with uncertainty and risk (which humans are very bad at reasoning about). Gantt charts can make things worse because they foster a false sense of control (padding time estimates by x%, for example, feels like it helps). And if anything in your critical path is a known unknown, your timeline expectations, however consensus-driven and reasonable they may be, are likely to get smashed to smithereens, and all a Gantt chart is going to do is tell you when your carefully constructed schedule is starting to slip.
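The padding point can be made concrete with a tiny Monte Carlo sketch (the task counts, estimates, and noise distribution here are all invented for illustration): when task durations have a right tail, padding every estimate by 20% still blows the deadline in a large fraction of simulated projects.

```python
import random

random.seed(42)

# Ten sequential tasks, each estimated at 10 days. Real task durations
# are skewed: usually near the estimate, occasionally far longer.
def actual_duration(estimate):
    return estimate * random.lognormvariate(0, 0.5)  # right-tailed noise

estimates = [10] * 10
padded_deadline = sum(e * 1.2 for e in estimates)  # 20% padding "feels safe"

trials = 10_000
overruns = sum(
    1
    for _ in range(trials)
    if sum(actual_duration(e) for e in estimates) > padded_deadline
)

print(f"Padded deadline: {padded_deadline:.0f} days")
print(f"Deadline blown in {100 * overruns / trials:.0f}% of simulated projects")
```

The chart can report the slip, but the padding never removed the risk; it only moved the line you slip past.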
Throw unknown unknowns into the mix (from "we haven't yet figured out how to solve this" all the way to "are we even solving the right problem?"), and timelines become entirely fictional.
I am not aware of any chart format that incorporates this sort of risk and uncertainty well. (FogBugz, for example, which incorporates probability and risk based on an estimator's past track record, basically just sticks that data into a chart's tooltips.)
Now, if you are willing to mitigate these risks by basing your timeline around deadlines for handoffs, with the understanding that what gets handed off is a "best-effort working system in the available time" (in which case continuous integration/delivery is your best friend), that's a different matter, but that just isn't how most projects are conceived or managed. Yet.
BTW, related to all this is the conflict between feature-based and time-based approaches to software releases.
Given that you probably have much more experience than many of the people here, I'm going to assume that your advice is more sound than theirs.
Generalism is insurance, while specialism can be more profitable if the speciality you pick ends up in demand.
I've been worried sick trying to learn the things I see people commenting here or in hackernoon. I had the idea that by the time I graduate, these technologies will be trending or have a good market share.
With historical perspective, you can choose a "v1.0" or even a "v0.1" modern tool with confidence, when you have reason to believe that it's effectively "v255.0" in a lineage of discontinuous improvement that spans centuries.
There's a good chance that something else will be trending by the time you graduate. The media mood is very ephemeral. For employment purposes it's better to look at job-ad statistics (annoyingly hard to search by phrase) and make your own inferences from there; anything currently popular will take a while to sink. As with any statistics, also ask yourself what is left out.
Stackoverflow dev survey is a reasonable starting point: http://stackoverflow.com/research/developer-survey-2016
There are only a few patterns in programming: imperative, OO, functional, etc. Learn those.
There are only a few abstraction levels in problem solving: meta, business, system, physical. Learn those.
There are only a few types of patterns in ML and Big Data. Looks like it's time to learn those.
But the principle is the same. Learn the patterns of various forms of solutions, not actual languages or tech (they'll be required, of course, but they're only a prop). Be able to move between these various patterns. Then deep dive from time to time on various projects in each area.
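As a tiny illustration of the programming-patterns point, here is the same computation expressed imperatively and functionally (Python for both):

```python
numbers = [3, 1, 4, 1, 5, 9, 2, 6]

# Imperative pattern: describe the steps, mutate state as you go.
total = 0
for n in numbers:
    if n % 2 == 0:
        total += n * n

# Functional pattern: describe the result as a composition of
# transformations, no mutation.
total_fn = sum(n * n for n in numbers if n % 2 == 0)

assert total == total_fn  # same answer, different pattern
print(total)  # 4^2 + 2^2 + 6^2 = 56
```

Once you recognize the pattern, the surface syntax of whatever language the next project uses is the easy part.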
We've passed the point where a person could keep up long ago. Now it's simply about being both broad and deep at the same time. T-shaped people. If you want to make a lot of money you can be that one guy who knows everything about some tiny point -- but you'd better hope that point doesn't become obsolete in ten or twenty years. I've seen this happen far too often in tech.
Should I become an expert at one language/domain, or, should I constantly learn new things and change roles?
I've done the latter, and I don't know yet if it will have been worth while. I worry about being a "jack of all trades, master of none". Yet, as you point out, a master of one trade had better hope it doesn't become obsolete in their life time.
So my hope is that the investment in learning, and adapting, will pay off in the long haul. I can write an iOS app, I can write an android app, I can code a backend server in scala + akka, or I can write a backend server in PHP. Can I do these things as well or as quickly as a master in each domain? Certainly not.
Whether that is possible depends on the definition of "mastery" that you use, and how broadly you construe "all".
TL/DR: Even generalists have become much more specialized over time.
Consider that "web design" circa 1996 used to include project management, end-to-end development, PR, search engine placement, graphic design, copy writing/editing, information architecture, browser testing, server administration, domain registration, email configuration, etcetera, etcetera.
Over the past two decades, these tasks and many more besides have swollen to bursting, splitting into entire subdisciplines (some of which have endured, others not so much) and in some cases re-merging into new hybrids.
There is no way for anyone to maintain more than passing familiarity, much less competency, with every aspect of building and launching a non-trivial website (much less a web application). At best, we'll wing it on personal projects, or even skip major pieces entirely.
These days, what passes for a generalist is the "full-stack developer", which pretty much leaves off everything that doesn't have to do with code, or outsources tasks like design to 3rd-party services (eg. Netlify), libraries (eg. Bootstrap), themes, and so on.
I expect this trend to continue and even accelerate, so don't drive yourself nuts trying to keep up with the developments across everything you do today. I guarantee that whatever you consider to be the set of skills a generalist should understand now, it will get narrower over time (but not shorter), and we will have as many kinds of generalist as there are specialists today.
I am pretty sure that the "serverless" trend will lead to a new breed of "middleware developer" that works on various sorts of smart caching proxies, for example.
My advice is to try and be strategic in the things you drop and no longer keep up with, and keep the things that you do maintain competencies in adjacent to each other and to one or two main areas of expertise.
Another enlightening moment for me was when I was working on a hobby machine learning project, and shared my design concerns with a brilliant but very much non-ML coworker, and all of a sudden that coworker laid out the whole design in a pretty convincing detail, like he's been doing this work for years. After the initial shock from his seemingly birth-given ML skills, I noticed that he simply takes a lot of good online classes and goes through all the top ML material on the web in his spare time, even though it was irrelevant to his tech focus at the time. Well guess what, two years later he got promoted and he's making that sweet data science money, and guess where he would have been if he only focused on his old day-to-day instead.
Skills vary both in how much the market values them and in their durability. There's often a trade-off between these two characteristics. For example, half a year's worth of study in a foreign language or pure math is only somewhat valuable to the market but that value doesn't tend to decrease over the years. Learning AngularJS in 2013, on the other hand, was so highly valued by the job market that it was a great way for junior programmers with no degree to break into a software engineering career.
I believe it's best to generally focus most learning efforts on durable skills, but occasionally when there's an opening, to flop and focus 100% on an ephemeral skill that's highly valued and appears likely to be even more highly valued in the near future. After capitalizing on the opportunity, return to mostly focusing on durable skills.
So following the latest technology in detail is unnecessary.
Far more useful is just having a broad sense of what tools are available out there; it takes less time, and it's more useful since it gives you access to a broader set of tools on-demand.
Beyond technology, the things that persist are much more fundamental skills:
1. Ability to read a new code base, and ability to quickly learn a new technology.
If you can do this you don't need to worry about new technologies, since you can always learn them as needed. E.g. I just wrote a patch for a Ruby project (Sinatra) at work even though I don't really know Ruby and had never seen the codebase before. It got accepted, too.
2. Ability to figure out what the real problem is, what the real business goal is. This makes you a really valuable employee.
Technology is just a tool. More fundamental skills are your real value.
More detailed write-up on how to keep up without giving up your life: https://codewithoutrules.com/2017/01/11/your-job-is-not-your...
We're still doing new apps in Angular 1 here, because everyone knows it, we can reuse more code, we know most of its quirks and how to squeeze performance out of it, and we can get the apps out the door a lot faster. Eventually we will have a new project where we decide to use something more current, though.
The point is that trying to keep up with all bleeding edge technology is a waste of time, because it's constantly being replaced.
While at the same time you need to learn whatever technology you use at work, which may not be the latest-and-greatest.
Don't confuse advances in technology with intentional churn generated by vendors and platforms. The latter is a plague, and it doesn't only cost people money, it costs them productivity. You may be getting confused and mistaking skillset for toolset. Large companies will always be able to replace your toolset and demand you learn a whole new one, because the more you do, the more you'll be locked in to their toolset. If you can abstract out the functions being implemented and express them in different ways, you can take your skillset different places.
Whether you do that, depends on how good you are at finding niche markets. As someone who's stayed in business for ten years selling GUI-less audio plugins with no advertising and no DRM of any sort, I can tell you (1) niche markets exist and they're loyal, and (2) they're small, which is what makes them niche. :)
Do you think churn is intentional within a single vendor, e.g. to force upgrades? Could churn be a by-product of competition between vendors, e.g. AWS refactored most of enterprise computing into "low-end" services that steadily improved, but were proprietary and increased lock-in.
> The dev tools I'm using won't even work on current computers. I code on a time capsule laptop and depend on the very simplified plugin formats I've chosen (generic interface AU and VST) to remain functional.
Is the time capsule laptop for old operating systems or old hardware? Could the old operating systems work in a virtual machine?
> They'd have to break the most fundamental interfaces to kill my stuff (which doesn't make it impossible to do, just very user-hostile)
Apple tried to get rid of files (!) entirely, but they are slowly making a comeback on iOS, e.g. now you can insert an attachment within an email, with the right application plugin. Social networks have done their best to replace RSS push notifications with proprietary pubsub. WebDAV, CalDAV, and CardDAV are thankfully still supported by a few good apps.
> niche markets exist and they're loyal, and (2) they're small, which is what makes them niche.
How do you market your services/products within your niche?
The old laptop is just the most convenient sort of virtual machine. At some point it'll be easier to run a virtual time capsule laptop… however, the physical time capsule laptop is from a time before intense spyware, so there are security issues as well.
As far as niches, Airwindows doesn't market at all. It's only word of mouth, for ten years, with few exceptions (notably, Console2 got reviewed in Tape Op, a trade magazine). This is personal: I loathe getting harassed by marketers so much that I won't even email, much less advertise. I collected a list of the 'Kagi generation' customers who specifically said they wanted to be on a mailing list and hear from me that way. And then I haven't emailed anything to them for months and months :) So, effectively, my business is 'for people who hate marketing so much that they want to do business with someone who will absolutely leave them alone and not bug them'.
By definition this is a niche to starve in, but it's sincere. I really do hate most everything about marketing, so I simply will not do it. Sometimes when I have a notable post or product I leave out the patreon link on purpose :)
More like a 'new' marketing category: Trust Building Exercise. Attempt to give everything possible away, and see if social pressure can cause a lot of people to go 'yay!' and throw money. Hence the Patreon with literally no tiers above $1, with at least half the patrons at $2 to $10 of their own volition.
The trouble with that (speaking as someone who has some notion of marketing but chooses to undermine it) is, it's one of those power-law relationships where basically you have to be me to do it :) without ten years of sorta grassroots presence in the industry and a large number of successful products that perform well as software, you can't do it. You can't simply start up and have a Patreon work on those terms, even if your products are exactly as good, and this is a problem.
Solving that would be a very big deal but it's a bit beyond me for now…
0. Assume any "new" thing is worse than the "old" alternative - until proven otherwise.
1. Critically filter out hype/PR.
2. You're left with much less to learn.
3. Invest "out-of-work" time in something really valuable.
On the other hand, some fields like web development peaked a while ago; I would argue that 2012 was the high-water mark. I think it's a very precarious choice of career right now. It has been steadily going downhill since the introduction of trendy front-end frameworks that don't offer any value to the end user (including React, Angular, et al). The culture stopped being about making usable and accessible interfaces for people, and became more about "component architecture", "server-side rendering", and "tree shaking", all solving problems created by the very tools they are using.
That isn't to say that web development is dead, but I think that the future will be more specialized around certain features of the platform such as WebAssembly, WebRTC, WebGL, Web Audio, et al. And these will be more readily picked up by people with more durable skills, than those who only know the most popular front-end framework.
Speaking as a confirmed cynic, that seems overly cynical. ;-)
The problems being solved by the web framework hamster wheel are those of rising bars for usability and speed, with measurable in $$$ effects (ie. an extra 1s delay could increase shopping-cart abandonment rate by 1.5%). Which matters a lot more at web-scale.
So, these trendy frameworks are solving problems that most developers shouldn't worry about (it is premature optimization) but that matter a great deal to the companies that release them (Google: Angular and Polymer; Facebook: React and Flux; etc.). OTOH, it is tempting to tap into all the engineering effort that goes into libraries like these. You just have to know where to stop before sinking into the HammerFactoryFactory mire. ;-)
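Back-of-the-envelope arithmetic on the abandonment figure above (the traffic and order values here are hypothetical) shows why a large retailer cares about a single second of latency:

```python
# Claim from above: an extra 1s of latency adds ~1.5 percentage points
# of cart abandonment. At web scale that is real money.
daily_checkouts = 1_000_000     # hypothetical large retailer
average_order_value = 50.00     # dollars, also hypothetical
extra_abandonment = 0.015       # +1.5 percentage points

lost_per_day = daily_checkouts * extra_abandonment * average_order_value
print(f"${lost_per_day:,.0f} lost per day")      # $750,000 lost per day
print(f"${lost_per_day * 365:,.0f} per year")
```

For a side project doing a hundred checkouts a day, the same arithmetic yields pocket change, which is exactly why most developers shouldn't optimize for this first.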
As for the anxiety, turn off HN every so often and just focus on being a good engineer with your current tools. Nothing changes so fast that you can't go a few months or even a year without being in the know. When it comes time to start a new project, spend a week researching the current tools and see how they fit into your stack.
It's true technology progresses in dog years. When you are working you are not learning outside that bubble. When you are between assignments you absolutely must treat that time as a sabbatical to learn something/anything new.
By broadening your skillset through selective project engagement, you are better off than Skippy who has worked on the same application with great job security for 5 years - Skippy will not be someone you will re-encounter 10 years from now unless you are buying a used car and they happen to be the sales person. The industry is self-selective this way. The complacent "I got mine" mentality is toxic to longevity in the industry.
Let me also dispel the meme that sticking to a specialty is a desirable thing. The fact of the matter is that the ocean of legacy code grows exponentially, and there is always a need for someone who knows a legacy language or technology. This kind of career trajectory is about as desirable as cleaning out septic tanks. There's job security to be had, and you can say "Ho, ho, ho - I don't need no stinkin' newfangled whatever" all day and still be indispensable. My advice is not to be that guy/gal.
It is a much harder and a much richer experience to navigate a career in the flow of technology than to get myopically paralyzed by a desire to featherbed where you are today.
But your question is "how" to keep up. IMO, the answer is to skim lots of material and only dive in at the last, most relevant moment. The generalist is far more qualified than the specialist these days because most companies cannot afford a prima donna - they need people who can perform many jobs and serve many needs.
It not only allows me to discover the tech, it's also especially important because I refuse to make first contact with a tech by implementing it directly in a production project meant to stay around for years. The more projects I've used a new tech in before introducing it into my main project, the more comfortable I am that I didn't make gross mistakes.
For me, it's not the amount of time using a new tech that matters, it's the amount of projects I used it in (because each time, I can try a different architecture).
The following are signals that a technology is in an early part of the hype cycle:
* It has the backing of a major corporation or a startup with a marketing budget.
* There are a lot of rave articles/blog posts about building relatively simple or small self contained applications using this technology.
* There is a small but vocal contingent of users who are passionately against the new technology. Their arguments often link to bugs on the bug tracker, cite edge cases that occur under heavy usage and indicate fundamental oversights in the design as well as assumptions (and often arrogance) on the part of the architects.
* The benefits cited are either A) vague or B) explicitly aimed at beginners.
* Arguments in favor often appeal to authority (e.g. "Google uses it" or "XYZ company uses it in production") or popularity ("everybody's containerizing these days"), or cite benefits which were already possible.
* A high ratio of talk to action: the technology is talked about on Hacker News a lot but there appears to be very little real life stuff developed with it and a lot of the talk involves jamming it in where it's not really necessary.
* Sometimes I experiment with a technology for an hour or two and I see if there's anything obviously wrong with it or if the criticisms are valid.
- You should be able to move back and forth between management and technical positions. They are not mutually exclusive. You can build a skill set that allows you to do either, which gives you greater perspective and flexibility. One piece of advice I was given in college: even if you are the Director of IT, keep small (non-critical) pieces of software for yourself to work on, so you never lose touch.
- Try to work for good companies. My definition of a good company is one where you can be productive every day.
- Some skills will be helpful all your life. I learned Unix in 1989 and have used it almost every day.
- Learn the fundamentals. Data structures, algorithms, relational theory, structured programming, object-oriented programming, functional programming, networking, operating systems, theory of computation, et al.
- Understand the business domain in which you are working. That makes you extra valuable for your current company.
- Develop your soft skills. http://www.skillsyouneed.com/general/soft-skills.html
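To make the "fundamentals" point concrete: an algorithm like binary search looks the same today as it did decades ago, and it will look the same in whatever language is fashionable next year. A minimal illustration (names are my own, just for the sketch):

```python
def binary_search(sorted_items, target):
    """Return the index of target in sorted_items, or -1 if absent. O(log n)."""
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid
        elif sorted_items[mid] < target:
            lo = mid + 1   # target can only be in the upper half
        else:
            hi = mid - 1   # target can only be in the lower half
    return -1

print(binary_search([1, 3, 5, 7, 9], 7))   # -> 3
print(binary_search([1, 3, 5, 7, 9], 4))   # -> -1
```

The syntax changes between languages; the invariant (the target, if present, always lies between lo and hi) does not.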
2. New technologies being adopted doesn't mean old ones quickly disappear. Sometimes they don't even disappear slowly.
3. Area focus. If my area of expertise is networking, what do I care about VR? We can't be generalists any more than we could be 20 years ago.
4. If you feel like being a generalist, understanding & internalising (basic) principles is more important than being familiar with specific technologies
5. Critical, transversal thinking. You can weed out heaps of new technologies by understanding _how_ they fit into a system and the tradeoffs they require, before you have to become intimately familiar with them. Base your approach on tangible end-to-end measurements to understand how technologies might fit in a system; after that you'll have far fewer flavour-of-the-month frameworks to keep up with.
For example, I still do my web development in Django or Flask because they do the job, I'm pretty good at Python and most projects don't really need the concurrency Go or Elixir offer.
One of the best recent additions to my skillset was Docker... a lot of people say containers are not a must but they really made my life easier and allowed me to do cool things for clients from different industries.
It doesn't sound as cool as doing machine learning, computer vision or natural language processing, but don't let the AI/VR hype make you anxious, just focus on what you really want to do.
AI, ML and VR are all really interesting, but as we all know they are not completely new and will likely not account for the majority of future jobs.
Fundamentals are what matter, most of these "new things" are just something that you can learn with relatively limited effort if needed. Classic programming skills, analytical skills or things like the ability to reason about concurrency issues never go obsolete.
The exact tech choices don't matter that much; it's more about the overall direction (in my case analytics, in a bunch of varied sub-fields).
Although the top comment has some merit, I'd argue C/C++ is an outlier here rather than the norm.
At the moment I'm taking this course in Deep Learning
1. Decide what the program should do.
2. Decide how the program should do it.
3. Implement these decisions in code.
Only the last part is actually coding.
In other words, as a software developer, you are not paid to type. You are paid to think. And the deeper your knowledge and experience, the better tools you have to actually do that. So focus on learning step 1 and 2!
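Those three steps fit into even the smallest task. A toy illustration (the problem and function name are made up for the example): the first two steps live in the comments, and only the last one is code.

```python
# 1. What: remove duplicates from a list while preserving first-seen order.
# 2. How: walk the list once, tracking items already seen in a set;
#    emit each item only the first time it appears.
# 3. Code:
def dedupe(items):
    seen = set()
    out = []
    for item in items:
        if item not in seen:
            seen.add(item)
            out.append(item)
    return out

print(dedupe([3, 1, 3, 2, 1]))  # -> [3, 1, 2]
```

Steps 1 and 2 are where the real decisions happen (does order matter? do items need to be hashable?); step 3 is almost mechanical once they're settled.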
For me personally, I find that I generally have free rein to test out "new (to me)" technologies, and my experience helps me realise quickly whether they are going to be helpful or a bust!
Take any opportunity you can to do a "little project" in something that interests you and then apply it to problems in your work!
I've been making web apps since 1998. Last year I learned how to use CouchDB and PouchDB. Before that I used a flat file database and the built-in filesystem to manage data. I used Perl on the backend with just a bit of JS on the front end to run the apps.
I never did learn how to use MySQL/PHP. I looked at it, decided it was a butt-ugly way to make websites and apps, and admitted to myself that I wasn't qualified to design secure SQL apps and didn't want to learn, because that is a career all by itself.
So I waited for something better to come along and last year CouchDB along with PouchDB hit the mark so I spent the year learning and using them. It was worth it. With those tools I am faster and better.
The years in-between were spent getting stuff done with the tools I was good at using, not trying to learn how to use a zillion other tools to do the same things.
I looked at a lot of newer tools again last year with an eye towards what was "best". There is a lot of cool stuff out there that does some really jazzy stuff, but in the end I decided to take another look at what was "easiest".
I ended up with CouchDB, PouchDB, and JQuery. Easy to learn and incredibly rich APIs with lots of support and example code. There's more than enough in those to learn and keep up with and if I need to add something I'll look for easy ways to do that too.
The truth is, it takes time to be productive with any language or tool or framework you use. It's a scatterbrained approach to try to build software with something new every time you start a new job.
Right now there are tools being built that will make "AI/VR" easier to implement. Wait for the tools.
I believe that even though technology changes fast, the things you need to change do not move as quickly but that probably depends on the company and technology that company is using in the first place.
Where I am working, we are writing software mostly in Java, some analysis on the data with Splunk and SQL for the database.
Sure enough I had to keep up with Java development, but it does not move _that_ rapidly. Never mind the fact that the company is only now switching to Java 8.
That being said, I do like learning new things in the field, but they are not the "latest cool things". For example, now I am learning Haskell by reading books on it and doing exercises, the normal way to learn a new language afaik.
I do tend to check out things that tickle my interest, lately I have made a small app in Angular2/Dart because it sounded interesting, but by no means have I learned to use them in-depth.
Eventually I realized that, out of maybe fifty theorems, only maybe three were used to prove all the others. So I memorized those three, and worked out anything else I needed for the test on the fly.
You don't need to keep up with the Hotness Of The Week(TM). You need to know the fundamentals well, and you need to be able to learn the rest when you need it.
The technology churn in other spaces like front end development can be mitigated by learning to read really fast. I don't have deep experience with any one front end framework, but I can inhale the docs and source code pretty fast when I need to get my hands dirty with one. Again, speed reading is a skill that will never be obsolete.
As far as updating my skillset, I watch conference talks, and I built a site that has a big index of them: https://www.findlectures.com/?p=1&class1=Technology&type1=Co...
I'm a game developer who has worked in 2D social/mobile lately and am now getting into VR. Turns out, if you know how the rendering pipeline works, it's not such a foreign land after all.
The situation: my early-stage startup is fundraising right now, which can be kind of a time sink. Lots of accelerator/grant/angel applications. There's a good chance we hit our seed round, but also a good chance I'm unemployed next quarter when/if runway runs out.
In either case, I decided that AI, specifically deep learning, would be incredibly important to my career. The startup will need the expertise in the future (so I'll have to understand how to hire people with it), and should I need to find another job in a few months, this is a pretty cool field to learn and I find the work enjoyable (previously I was a data scientist, but focused more on vanilla regression and convex methods).
Therefore since November I've portioned out 20 hrs/week to the startup focusing on its fundraising and BD needs, which leaves a whole lot of other hours for skills development. Here has been roughly my curriculum:
- Mathematics review, and basic neural networks. For this I went over multivariable calculus and linear algebra, in which I've always been fairly strong, by essentially trying to derive the backpropagation derivatives for simple vanilla neural networks. Then I made sure I understood derivatives and matrix data organization for convolution, which is a key component of modern ML.
Sources: pencil and paper, and lots of Google to answer any of my questions.
Time: 1-2 weeks.
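For anyone wanting to try the same exercise: here is a minimal sketch of what a hand-derived backprop looks like once translated to code. It's pure Python, a tiny 2-2-1 sigmoid network trained with plain SGD on the OR function (chosen so convergence is reliable); all names and hyperparameters are my own, just to illustrate the chain rule mechanics.

```python
import math
import random

random.seed(0)

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Tiny network: 2 inputs -> 2 sigmoid hidden units -> 1 sigmoid output.
w1 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(2)]  # input -> hidden
b1 = [0.0, 0.0]
w2 = [random.uniform(-1, 1) for _ in range(2)]                      # hidden -> output
b2 = 0.0

# OR truth table as training data.
data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 1)]

def forward(x):
    h = [sigmoid(w1[j][0] * x[0] + w1[j][1] * x[1] + b1[j]) for j in range(2)]
    y = sigmoid(w2[0] * h[0] + w2[1] * h[1] + b2)
    return h, y

def mean_squared_loss():
    return sum((forward(x)[1] - t) ** 2 for x, t in data) / len(data)

loss_before = mean_squared_loss()

lr = 0.5
for _ in range(5000):
    for x, t in data:
        h, y = forward(x)
        # Chain rule at the output: dL/dz_out = 2(y - t) * sigmoid'(z) = 2(y - t) * y * (1 - y)
        dz_out = 2 * (y - t) * y * (1 - y)
        # Propagate the error back to the hidden layer BEFORE updating any weights.
        dz_hid = [dz_out * w2[j] * h[j] * (1 - h[j]) for j in range(2)]
        # Gradient descent updates.
        for j in range(2):
            w2[j] -= lr * dz_out * h[j]
            w1[j][0] -= lr * dz_hid[j] * x[0]
            w1[j][1] -= lr * dz_hid[j] * x[1]
            b1[j] -= lr * dz_hid[j]
        b2 -= lr * dz_out

loss_after = mean_squared_loss()
print(f"loss: {loss_before:.4f} -> {loss_after:.4f}")
```

Doing this once by hand, then checking your derivatives against running code like the above, is exactly the "pencil and paper plus Google" loop described.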
- CS231n online course: http://cs231n.stanford.edu/
Great summary of modern methods in deep learning, plus more foundational level stuff. I read through all the lecture content and made sure I could work through derivations, because for me at least this cements technical understanding. Some of them are sort of tedious, e.g. manual RNN backprop. Also this course has great and simple software examples, I read through the code to make sure I understood the numerical computation and data organization parts. I also ran a few software examples and played around with parameters for fun (and learning).
Time: 1 month.
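The convolution material in that course is easier to internalise after writing the operation out by hand at least once. A minimal 1D version in pure Python (strictly speaking cross-correlation, which is what deep learning libraries call "convolution"; the function name is mine):

```python
def conv1d(signal, kernel):
    """Valid-mode 1D cross-correlation: slide the kernel across the signal."""
    k = len(kernel)
    return [sum(signal[i + j] * kernel[j] for j in range(k))
            for i in range(len(signal) - k + 1)]

# A [1, -1] kernel computes s[i] - s[i+1], so it fires at step edges.
print(conv1d([0, 0, 1, 1, 1, 0], [1, -1]))  # -> [0, -1, 0, 0, 1]
```

The 2D image case is the same idea with a second sliding index; once this loop is clear, the "data organization" questions (padding, stride, output size) become bookkeeping rather than mystery.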
- Reading research papers (and online lectures) on applications that interest me. For this phase, I found 10 initial research papers that interested me. The topics for myself included image classification (starting with the classic 2012 Hinton paper), reinforcement learning, robotics applications, and video prediction. This step was harder; it can be like learning a new language. Not every paper is going to make sense at first. But go through enough of them and you'll build up familiarity.
Sources: can start by searching through reddit.com/r/machinelearning
Time: 2 weeks.
- Learning software frameworks. From the above step I came up with my own small sample problem related to stuff I read that I could test even on my weak laptop (remember, training these big networks requires big computing power). So in this step I started researching different frameworks, and settled on starting a small project with Keras.
Sources: google around for deep learning libraries, read up on them, see what you like, and most importantly, have a motivating sample problem that you wanna code up.
Time: 2 weeks.
- Harder problems, more software, more papers. This is where I'm at now, it's sort of like an iterative research loop where I 1) come up with new problems I want to solve, 2) learn more about the software I need to implement it, and 3) search more prior work to gain insights on how I can solve the harder problems. In particular, I've switched over to learning and using Tensorflow, and also learning how to use AWS for stronger computing. So I had to dust off some linux scripting and command line skills too. Like I said, this is fairly iterative and probably closer to "modern research" where learning from my (virtual) peers and experimentation and production are closely linked.
Time: from the last month to present.
Overall, in the last 3 months or so at about 30 hrs/week I've added an extremely powerful new skillset to my arsenal that I've been meaning to do for quite some time. I can understand 90% of all modern research in the field, and create useful software to solve data-driven problems. Completely for free as well, aside from the $0.81/hr I pay to AWS for training some networks overnight. This is the type of thing I'd have wanted from a Master's (or even PhD) program, but who wants to go back to school...
Hope this helps someone :) Remember, AI/ML is more approachable than most people think, you just need to start with a solid mathematics background. After that you'll be flying, the field is relatively quick to learn, especially if you like learning through doing.