Ask HN: With such fast changes in technology, how do you update your skillset?
114 points by ak93 on Jan 17, 2017 | 88 comments
AI/VR and, slowly, a lot more. I am getting quite anxious that my skillset might go obsolete.



To be honest, it isn't and I don't.

That sounds rather blunt, but most organisations that aren't startups don't change technology quickly if at all. C++ has served me well for two decades; I probably ought to adopt C++14 but on the other hand my current job requires that the codebase build with a 2008 compiler.

I'm also extremely skeptical of the extent to which AI and VR are new, as opposed to incremental improvements to technology which takes it over an adoption barrier. Have you seen the 80s VR headsets? SHRDLU? The "AI winter"?

If you're worried about this stuff then it's helpful to develop a level of knowledge about it that's slightly higher than Wired but lower than actual implementation detail, in order to talk about it in interviews. You can then pick this stuff up as you go. Machine learning especially is maths-heavy, matrix algebra above all, and that's never going to go obsolete.
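To make the matrix-algebra point concrete, here is a minimal sketch (Python with numpy, purely illustrative and not from the parent comment) of the kind of computation most machine learning reduces to: a least-squares fit, which is nothing more than matrix multiplication and a linear solve.

    import numpy as np

    # Toy synthetic dataset: 100 examples, 3 features (assumed values).
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 3))
    true_w = np.array([2.0, -1.0, 0.5])
    y = X @ true_w + rng.normal(scale=0.1, size=100)

    # Ordinary least squares is pure matrix algebra:
    #   w = (X^T X)^{-1} X^T y
    w = np.linalg.solve(X.T @ X, X.T @ y)
    print(w)  # recovers something close to [2.0, -1.0, 0.5]

The same operations (transpose, multiply, solve) sit under everything from linear regression to the layers of a neural network, which is why that maths does not age.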

I also agree with the commenters who are saying that you should ignore the latest flash-in-the-pan frameworks unless you really need to in order to get frontend gigs.


I completely agree with you.

I started in this field in 1992. I've seen the comings and goings of many flash-in-the-pan technologies.

If you are a dev and are selling yourself on your skillset, ask yourself, "is this sustainable?" The answer is "no." A fifty-year-old brain simply does not absorb new technologies as fast as a 25-year-old brain. If your plan is to continually adopt new cutting-edge technologies in order to stay marketable, I politely submit you need to rethink your long-term plan.

As an almost 50-year-old, I promise you that the world is not a gentle place for older devs. Plan your exit into management, a related field, or some other job altogether. If you expect to be a coder at age 50 you're going to be disappointed.

As for me, I jettisoned the "technical skill set" war decades ago. I do not sell myself on my technical skill set. I sell myself as the ultimate generalist. This comes with its own set of problems. However, it has allowed me to age more gracefully in my field, because I do not create the expectation that my primary value-added is code generation.


> Plan your exit into management, a related field, or some other job altogether. If you expect to be a coder at age 50 you're going to be disappointed.

I'd like to simultaneously disagree and agree with you. If you were a 50 year old coder 10 years ago, then your only hope of remaining in the tech industry would be to add "Manager" to your job title. If you are a 50 year old coder today, that's still a sound direction to go, but it is becoming less and less necessary. If you'll be a 50 year old coder 10 years from now...I think you'll be ok.

Yes, at 50 you almost certainly cannot crank out as much code as you could at 20, but so what? As a coder it is important to understand that most problems in technology are not solved with more code, but less. That you will generate less code at 50 should be seen as a benefit. The advantage you have at 50 over someone who is fresh and new at 20 is that you can recognize that there is very little that's happened in the last 30 years that is genuinely new.

So, if you are 20, go ahead and spend your weekends on side projects learning the latest frameworks. As you continue to do this over the years and decades, shift to focusing more on patterns. By the time you reach 50, you might only generate half as much code as your younger colleagues, but you should be able to solve problems with a quarter of the code required, still making you twice as efficient as them.

When coding was new, code was the only metric by which to measure coders, and non-technical management types would view anyone who generated less code as less valuable. If you're not writing 500 lines of code a day, the argument goes, then you should be managing coders who can. But management and problem solving are not completely overlapping skill sets. Some engineers make good managers, but most do not.

Luckily, the more that technically inclined individuals populate the ranks of company management, the more this is recognized, and the more willing these companies are to hire the 50-year-old coder who codes less but solves more problems.


I'm seeing a lot of mythic stereotypes here. I am a 52 year old full time coder, and have been actively developing since taking a college Pascal class at age 11.

> If you were a 50 year old coder 10 years ago, then your only hope of remaining in the tech industry would be to add "Manager" to your job title.

People outside the startup bubble value delivering, regardless of age. Developers outside the startup bubble work consistently in a few areas of technology; they develop deep personal understanding of the cookbooks, solutions, and frameworks they personally authored, enabling them to construct stable solutions that expertly address the problems at hand.

Essentially, if you remain a "coder" and you use your brain at all above simply being a coder, you will become a software scientist. I never start anything from scratch, as I have about a dozen application skeletons ready for various specific purposes, plus similar libraries I wrote, plus a knowledge of several large commercial SDKs, and developer experience in several major FOSS applications. The work that I do now would give my 20- or 30-year-old self a heart attack with its scope, the number of complex technologies involved, and the time frame I'm expected to deliver in. But I've been writing code for 40 years now, and I may bitch at my tools, but the work will be delivered, well written, fully documented, and so on, because anything less just creates technical debt.

If you like writing code, start acting like a scientist about it. Few developers do, and in time you will accelerate away from your peers into a truly enjoyable professional space very few seem to occupy.


I appreciate your different perspective enormously.

I have spent the bulk of my career working in larger, team-based environments. In my experience, being able to conform to cultural norms is essential to being a high-performing team leader or member.

Many, many teams have prejudices and / or litmus tests. For example, I've spent decades doing project management. I use Gantt charts. Why? Because they are the best way to communicate timeline expectations with stakeholders. I don't use them to manage the project. But I've encountered more than one Agile team that flat-out considers the use of these charts to be anathema. You might as well crap on the rug.

That's just one example. Obviously there are also the buzzword technologies. Here's an overgeneralization you might agree with: young developers tend to naturally gravitate towards newer, less proven tech. We older developers naturally gravitate towards more mature, established tech. This creates a source of age-related friction.

Then there's the inability of others to grasp the applicability of your experience. For example in the 1990s I built a lot of actually awesome Lotus Notes/Domino applications. Now that's a technology that, if you mention it in various meetings, will get you laughed at. However, Domino was kind of the original "noSQL" / "BigTable" world, and it turns out that my application architecture experience in the Domino world translates meaningfully into these "newer" (ha!) database technologies. But try to tell that to team members.

So I guess my comments are orthogonal to yours. I agree with pretty much everything you write, but it leaves out the important social aspect.


> I never start anything from scratch, as I have about a dozen application skeletons ready for various specific purposes, plus similar libraries I wrote, plus a knowledge of several large commercial SDKs, and developer experience in several major FOSS applications

I've had a project manager (very insightful guy) tell me something similar before. He said "as you get more experienced you'll (ideally) write less code and instead reuse code from libraries you've written in the past". I've always wondered how that works in practice. I'm assuming certain projects will have requirements where specific technologies would be used in the front-end and persistence layers, depending on the kind of project you are building, so those wouldn't be as reusable. But perhaps algorithms you've written before might be reusable on the computational side of things. For example, if you had a library that handles very advanced moving-cost calculations, you could ideally use that same library in different software forms (mobile, web, desktop, command-line).

Edit: Git might be a good example of a library that can be reused a lot and used within many kinds of applications (IE: inside web app, within IDE, from command-line, desktop app, etc)


Everything you build becomes your code library that you carry forward.

Smart devs know this and cultivate their personal repertoire.

For example in the veeeery first days of .NET I had the opportunity to build a nice internal application for a client. Of course I didn't have a .NET library - the tech was just coming out of beta - but Microsoft was nice enough to provide a bunch of "example code" and "starter sites" to give all us noobs some patterns we could work from - at least most of the skeleton of a working, database back-ended website with a working security model.

From this .NET project it was easy to find another, and now I had a complete .NET library with solutions to all kinds of problems - how to do doc management, how to solve image resizing problems in .NET, how to talk to all kinds of heterogeneous systems using .NET, how to implement webservices, etc. because I'd built these once. With that toolbox I could easily solve all kinds of other problems with only a little creative remodelling of the skeleton app.

Then in 2007 when I jumped into OSS I got to start an all-new library building solutions on the LAMP stack. By 2009 I had a nice little library built up again.

The pattern is amenable to any software tech. Every project builds from the last one, if you do it right.


Git is not a library, it's a program. There are, of course, libraries in many languages for using git[0] from within many other kinds of applications.

Another example of a library would be libjpeg[1] for reading and writing the jpeg image format and dealing with all the different ways the format can be adjusted, which gets incorporated into many applications, for example ImageMagick[2].

Basically, a library consists of code that is intended to be reused by calling it from other code.
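As a rough illustration of that distinction (my sketch, in Python; both calls are standard library and just for demonstration), here is a library being called in-process next to a program being run as a separate process:

    import json        # a library: code your code calls directly
    import subprocess  # used here only to launch git, which is a program

    # Library use: the json module runs inside this process.
    payload = json.dumps({"commit_count": 3})
    print(payload)

    # Program use: git runs as its own process; we only see its output.
    result = subprocess.run(["git", "--version"], capture_output=True, text=True)
    print(result.stdout.strip())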

[0] It's astonishing how often this gets reinvented. Here are just Python libraries: https://pypi.python.org/pypi?%3Aaction=search&term=git

[1] https://en.wikipedia.org/wiki/Libjpeg

[2] https://en.wikipedia.org/wiki/ImageMagick


> It's astonishing how often this gets reinvented

Wow, I'd only heard of libgit2 before. I can't believe how many git-related Python packages there are. Yes, git isn't a library, it's a program like you said. Good catch.


> start acting like a scientist.

I started late (38, 41 this year) and I couldn't imagine jumping from Ruby to Rust to whatever new hotness arrives when it's clear that there are a handful of technologies that go deep (enough) and that are being used to solve problems that require someone to be more than a coder or even a dev.

I'm in the connected-car space and the only question I'm asking myself right now is: will my current skill set (mostly iOS in Swift/Obj-C/C) allow me to build for the future of augmented reality in vehicles or will going down the path of C++ -only- allow me to grow as a "software scientist" (perhaps a third way...)? Honestly I don't know, but I do know that I won't/can't find out if I jump into every HN rabbit hole that opens.


> will going down the path of C++ -only- allow me to grow as a "software scientist" (perhaps a third way...)?

I have no idea what higher level language in-vehicle AR will be written in, but C++ may be overkill, or a blind alley, or both.

You might be better off learning one or more of C, Go, Rust, Python, or Java (each has its own advantages and disadvantages) over C++.


>If you like writing code, start acting like a scientist about it.

Could you elaborate more on this?


> As a coder it is important to understand that most problems in technology are not solved with more code, but less. That you will generate less code at 50 should be seen as a benefit.

This is precisely my argument as a generalist.

I may not write code as fast or even as elegantly as I once did, but I'm much, much better at "seeing around corners" to avoid problems, and I'm much more likely to "build the right thing" due to my very broad but not deep technical base. I do not fall into the trap of "when all you have is a hammer everything looks like a nail."

> there is very little that's happened in the last 30 years that is genuinely new

This is true, but try to convince your 20something peer group of it. They are all convinced that their technical skills are revolutionary.

I studied waterfall systems engineering in 1998. But at the same time they also taught "RAD" iterative waterfall, or the precursor to Agile. I ran with RAD, and made it mine, so I've been doing something very similar to "Agile" for two decades. However, I still think that Gantt charts and top-down planning have value.

Showing that shit to the wrong 20something developer is a great way to be excommunicated.


> However, I still think that Gantt charts and top-down planning have value.

Of course they do. What you're seeing is a backlash to the way GANTT charts have been misapplied to projects that are less well defined, and to managers that ask for estimates and then are unpleasantly surprised when an estimate isn't entirely accurate.

Anyway, if you want to provide the same sort of value but present it in a different format, consider using a network diagram[0] or a user story map[1].

[0] https://www.quora.com/What-are-the-best-alternatives-to-usin...

[1] https://www.scrumalliance.org/community/articles/2013/august...


> What you're seeing is a backlash to the way GANTT charts have been misapplied to projects that are less well defined, and to managers that ask for estimates and then are unpleasantly surprised when an estimate isn't entirely accurate.

You think I don't know this?

I think you missed the point :(

(As an aside, in my experience, network diagrams and user-story maps pale in comparison to Gantt charts, as they do not effectively communicate timeline expectations IMO.)


> they do not effectively communicate timeline expectations

Of course, that is in part by design.

If you are delivering solution X that is just an instance of iteration Y of system Z with a few customer-specific tweaks, Agile methods will work just fine in combination with GANTT charts, and teams that are engaged regularly in that sort of work aren't too likely to object.

However, as soon as a task is a known unknown (that is, a well characterized problem with an understood though as-yet nonexistent solution) requiring new code that isn't just a variation on familiar themes (i.e. whatever is the equivalent of a CRUD web application for your industry), you start having to deal with uncertainty and risk (which humans are very bad at reasoning about). GANTT charts can make things worse because they foster a false sense of control (padding time estimates by x%, for example, feels like it helps). And if anything in your critical path is a known unknown, your timeline expectations, however consensus-driven and reasonable they may be, are likely to get smashed to smithereens, and all a GANTT chart is going to do is tell you when your carefully constructed schedule is starting to slip.

Throw unknown unknowns into the mix (from "we haven't yet figured out how to solve this" all the way to "are we even solving the right problem"), and timelines become entirely fictional.

I am not aware of any chart format that incorporates this sort of risk and uncertainty well (FogBugz for example, which incorporates probability and risk based on an estimator's past track record, basically just sticks that data into a chart's tooltips)

Now, if you are willing to mitigate these risks by basing your timeline around deadlines for handoffs with the understanding that what gets handed off is "best effort working system in available time" (in which case continuous integration/delivery is your best friend), that's a different matter, but that just isn't how most projects are conceived or managed. Yet.

BTW, related to all this is the conflict between feature-based and time-based approaches to software releases.


Right, my only quibble with your previous comment was the notion that being 50 in tech means migrating to a management role. I think that advice is slowly (hopefully) becoming outdated as there is more recognition that software engineers have more value than just the SLOC/day they produce.


Ah, I see. I agree. I think "management" includes lots of quasi-technical roles, in which people and project leadership are the core skills, with one's technical base serving as a platform for managing others with highly focused technical skill sets. Hopefully you are also right that these jobs are increasingly in demand.


I completely disagree with this. I am also a few years shy of 50 and way ahead compared with 20+ year old self-styled "ninja" developers. They simply don't know what they don't know. They truly believe that they are "experts" or "ninjas" because they have read a few tutorials on the latest soon-to-be-forgotten fashionable language/framework. Or believe that framework/language X is truly something new because a "hello world" web application is easy to write. And then a year later, when things get tough and they realise that the wonder framework/language has limitations, they jump ship to yet another shiny "silver bullet" framework/language that will save them from having to actually learn the hard stuff, allowing themselves to still believe that they are "ninjas". Meanwhile, people who actually know their stuff take over the ruins of their efforts and make it work, rewriting it step by step to fix it.


And how do I know this, by the way? Because I have done it many, many times in my career: taking over a ruin built by "ninjas" and making it actually work and bug-free in production. Meanwhile the "ninjas" have jumped ship to another company, building a new ruin with some new "better" framework/language. Until of course things get tough and they again jump ship, leaving the ruin to somebody who actually knows what they are doing.


Er, one of the best developers I know is over 50. However, they consciously chose that path instead of tracking into management because they like being a principal engineer where that means they still get to work with code, but also have enough experience to anticipate pitfalls and make considered decisions.


Great reply. Interesting how your decision to become a generalist stands in direct opposition to most of the replies to this thread: https://news.ycombinator.com/item?id=13409239

Given that you probably have much more experience than many of the people here, I'm going to assume that your advice is more sound than theirs.


The truth is somewhat in the middle, I think. An experienced generalist is a very different prospect to a graduate "generalist" - in some sense, all graduates are generalists because they don't have experience. See the thread the other day about how many people with physics degrees crosstrain into tech and programming.

Generalism is insurance, while specialism can be more profitable if the speciality you pick ends up in demand.


I must add that I have undergraduate and graduate technical degrees and spent from 1983-2005 building lots and lots of apps (I started coding at ~ age 15). So I pursued "generalism" only after being a reasonably qualified (but not specialized) technologist.


I'm in complete agreement. I've been working steadily with older platforms for 20+ years. I pick up newer frameworks if needed, but as I rarely choose to work with startups, they are rarely needed. I have been working through some Deep Learning coursework... you are right that the math behind it all is not new, but the practical application of it all with readily available GPUs and libraries is starting to make it approachable to the average coder, so I feel it is time to become familiar with it. And a few such things have come along over the years... SPAs via AJAX, MVC/MVVM patterns, etc. But I don't consider a new pattern every few years to be exactly a fast pace to keep up with. And underneath it all, for us web developers, we still are working with the same basic HTTP request/response model, just doing fancier stuff with it at either end.


I have to say, as a student it is very comforting to read this.

I've been worried sick trying to learn the things I see people commenting about here or on Hackernoon. I had the idea that by the time I graduate, these technologies would be trending or have a good market share.


As a student, you can learn how to place "the new shiny" into the context of technology industry history, including academic research papers and industry war stories from teams that built successful products. When you read tech "history" books, the lens of time separates hype from lasting innovations. Also worth going back a few hundred years in engineering history [2] to look at first principles that still apply in the modern world, even though tools have changed.

With historical perspective, you can choose a "v1.0" or even a "v0.1" modern tool with confidence, when you have reason to believe that it's effectively "v255.0" in a lineage of discontinuous improvement that spans centuries.

[1] https://news.ycombinator.com/item?id=7952209

[2] https://news.ycombinator.com/item?id=8158385


Glad to be of reassurance. There's more in the world than it's possible for one person to know in a lifetime. The "last man who knew everything" died in 1829 ( https://en.wikipedia.org/wiki/Thomas_Young_(scientist) ). So we have to pick who and what we keep up with, and to what depth.

There's a good chance that something else could be trending by the time you graduate. The media mood is very ephemeral. For employment purposes it's better to look at job-ad statistics (annoyingly hard to search for as a phrase) and make your own inferences from there; anything currently popular will take a while to sink. As with any statistics, also ask yourself what is left out.

Stackoverflow dev survey is a reasonable starting point: http://stackoverflow.com/research/developer-survey-2016

Javascript frontend has been big for a while, but suffers from a tremendous amount of churn. Maybe the best solution here is what people used to do with music: find someone "cooler" than you and copy their taste.


This is a good answer. Anyways, I was going to say that hopping onto a new technology is going to be a lot less profitable than really mastering the old tech, which lots of systems are built on and will be built on, and which new talent never wants to sit down and really learn.


It may not be in the spirit of HN, but deliberately being 3 years behind on the latest technologies is a really good way to stay employable without driving yourself mad keeping up with the latest and greatest. After 3 years, the flash-in-the-pans and the duds have been winnowed out and you can just concentrate on the stuff which will earn you paying gigs.


I agree with that. I don't think it's counter to the spirit of HN. I build apps for clients with the tools I know I'm proficient with but I play and hack a lot too. Sometimes that leads to use with client software though more often it doesn't, but learning is always good.


Learn patterns and pop up the abstraction level.

There are only a few patterns in programming: imperative, OO, functional, etc. Learn those.

There are only a few abstraction levels in problem solving: meta, business, system, physical. Learn those.

There are only a few types of patterns in ML and Big Data. Looks like it's time to learn those.

But the principle is the same. Learn the patterns of various forms of solutions, not actual languages or tech (they'll be required, of course, but they're only a prop). Be able to move between these various patterns. Then deep dive from time to time on various projects in each area.
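As a tiny illustration of "same solution, different pattern" (my example, not the parent's), here is one computation expressed in two of those patterns in Python:

    prices = [19.99, 5.50, 3.25, 42.00]

    # Imperative pattern: spell out the steps, mutate an accumulator.
    total = 0.0
    for p in prices:
        if p > 5:
            total += p

    # Functional pattern: describe the transformation, no mutation.
    total_fn = sum(p for p in prices if p > 5)

    assert total == total_fn

The language is a prop, as noted above; what carries over between stacks is recognizing which pattern a problem wants.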

We passed the point where one person could keep up long ago. Now it's simply about being both broad and deep at the same time. T-shaped people. If you want to make a lot of money you can be that one guy who knows everything about some tiny point -- but you'd better hope that point doesn't become obsolete in ten or twenty years. I've seen this happen far too often in tech.


I've wrestled with this question virtually since the day I began coding professionally:

Should I become an expert at one language/domain, or, should I constantly learn new things and change roles?

I've done the latter, and I don't know yet if it will have been worthwhile. I worry about being a "jack of all trades, master of none". Yet, as you point out, a master of one trade had better hope it doesn't become obsolete in their lifetime.

So my hope is that the investment in learning, and adapting, will pay off in the long haul. I can write an iOS app, I can write an android app, I can code a backend server in scala + akka, or I can write a backend server in PHP. Can I do these things as well or as quickly as a master in each domain? Certainly not.


You say "or" but it can be "and". Best is to be an expert in one, "and" having a jack of all trades experience. I.e. you can specialize in front-end work with framework X, but doesn't mean you can't have experience with optimizing db queries. I think a big part of how to do that is just to stay curious and learn new things for the fun, while having a go to language to get things done efficiently.


> You say "or" but it can be "and". Best is to be an expert in one, "and" having a jack of all trades experience.

Whether that is possible depends on the definition of "mastery" that you use, and how broadly you construe "all".

TL/DR: Even generalists have become much more specialized over time.

Consider that "web design" circa 1996 used to include project management, end-to-end development, PR, search engine placement, graphic design, copy writing/editing, information architecture, browser testing, server administration, domain registration, email configuration, etcetera, etcetera.

Over the past two decades, these tasks and many more besides have swollen to bursting, splitting into entire subdisciplines (some of which have endured, others not so much) and in some cases re-merging into new hybrids.

There is no way for anyone to maintain more than passing familiarity, much less competency, with every aspect of building and launching a non-trivial website (much less a web application). At best, we'll wing it on personal projects, or even skip major pieces entirely.

These days, what passes for a generalist is the "full-stack developer", which pretty much leaves off everything that doesn't have to do with code, or outsources tasks like design to 3rd-party services (eg. Netlify), libraries (eg. Bootstrap), themes, and so on.

I expect this trend to continue and even accelerate, so don't drive yourself nuts trying to keep up with the developments across everything you do today. I guarantee that whatever you consider to be the set of skills a generalist should understand now, it will get narrower over time (but not shorter), and we will have as many kinds of generalist as there are specialists today.

I am pretty sure that the "serverless" trend will lead to a new breed of "middleware developer" that works on various sorts of smart caching proxies, for example.

My advice is to try and be strategic in the things you drop and no longer keep up with, and keep the things that you do maintain competencies in adjacent to each other and to one or two main areas of expertise.


Do you mind expanding a little bit on the "few types of patterns in ML and Big Data"? Do you mean models, pipelines, etc., or something different? I am studying ML in grad school and I'm seeing way more patterns than just a few, almost too many to be honest, if by patterns you mean models. I'd love to hear your view and what you meant by only a few types of patterns.


Have to do side projects. In my past life, I was getting sucked into becoming a very conservative tech stack lifer at a huge, all-encompassing company. Most people that surrounded me, even the good, hard-working ones, were 9-5 and expressed surprise and hostility to learning anything outside of the company bubble. Then one day a new, more active guy joined our team and whipped up a complete REST-based service in a week. My mind=blown. I quit for the startups, and moved on to using dozens of different stacks since and never looked back. The best, most educating moments typically happened outside of work, when you combine the patterns and observations from work with a different stack or a smart outsider friend who chimes in on your daily struggles from a surprising different angle.

Another enlightening moment for me was when I was working on a hobby machine learning project, and shared my design concerns with a brilliant but very much non-ML coworker, and all of a sudden that coworker laid out the whole design in a pretty convincing detail, like he's been doing this work for years. After the initial shock from his seemingly birth-given ML skills, I noticed that he simply takes a lot of good online classes and goes through all the top ML material on the web in his spare time, even though it was irrelevant to his tech focus at the time. Well guess what, two years later he got promoted and he's making that sweet data science money, and guess where he would have been if he only focused on his old day-to-day instead.


My current strategy has a bit of complexity and might take an entire blog post to explain clearly. The high level view is this:

Skills vary both in how much the market values them and in their durability. There's often a trade-off between these two characteristics. For example, half a year's worth of study in a foreign language or pure math is only somewhat valuable to the market but that value doesn't tend to decrease over the years. Learning AngularJS in 2013, on the other hand, was so highly valued by the job market that it was a great way for junior programmers with no degree to break into a software engineering career.

I believe it's best to generally focus most learning efforts on durable skills, but occasionally when there's an opening, to flop and focus 100% on an ephemeral skill that's highly valued and appears likely to be even more highly valued in the near future. After capitalizing on the opportunity, return to mostly focusing on durable skills.


A lot of the technology on the bleeding edge will be gone in a couple of years. AngularJS v1 used to be the next big thing, now it's obsolete. Who knows if v2 will stick around.

So following the latest technology in detail is unnecessary. Far more useful is just having a broad sense of what tools are available out there; it takes less time, and it's more useful since it gives you access to a broader set of tools on-demand.

Beyond technology, the things that persist are much more fundamental skills:

1. Ability to read a new code base, and ability to quickly learn a new technology.

If you can do this you don't need to worry about new technologies since you can always learn them as needed. E.g. I just wrote a patch for a Ruby project (Sinatra) at work even though I don't really know Ruby and never saw the codebase before. It got accepted, too.

2. Ability to figure out what the real problem is, what the real business goal is. This makes you a really valuable employee.

Technology is just a tool. More fundamental skills are your real value.

More detailed write-up on how to keep up without giving up your life: https://codewithoutrules.com/2017/01/11/your-job-is-not-your...


While Angular 1 might be 'obsolete', there's still a lot of corporations out there with tons of Angular 1 code, and most of them are not going to be upgrading anytime soon. Angular 2 is enough of a paradigm shift that it would require rewriting those apps from scratch, pretty much, and there won't be a business need to do so for another 5+ years for a lot of these companies.

We're still doing new apps in Angular 1 here, because everyone knows it, we can reuse more code, we know most of its quirks and how to squeeze performance out of it, and we can get the apps out the door a lot faster. Eventually we will have a new project where we decide to use something more current, though.


The point is not that Angular 1 is bad somehow.

The point is that trying to keep up with all bleeding edge technology is a waste of time, because it's constantly being replaced.

While at the same time you need to learn whatever technology you use at work, which may not be the latest-and-greatest.


Depends. I'm shepherding http://www.airwindows.com/ through a switch to Patreon, by expressing new DSP ideas in a context of very, very old audio plugin frameworks. The dev tools I'm using won't even work on current computers. I code on a time capsule laptop and depend on the very simplified plugin formats I've chosen (generic interface AU and VST) to remain functional. They'd have to break the most fundamental interfaces to kill my stuff (which doesn't make it impossible to do, just very user-hostile)

Don't confuse advances in technology with intentional churn generated by vendors and platforms. The latter is a plague, and it doesn't only cost people money, it costs them productivity. You may be getting confused and mistaking skillset for toolset. Large companies will always be able to replace your toolset and demand you learn a whole new one, because the more you do, the more you'll be locked in to their toolset. If you can abstract out the functions being implemented and express them in different ways, you can take your skillset different places.

Whether you do that, depends on how good you are at finding niche markets. As someone who's stayed in business for ten years selling GUI-less audio plugins with no advertising and no DRM of any sort, I can tell you (1) niche markets exist and they're loyal, and (2) they're small, which is what makes them niche. :)


> Don't confuse advances in technology with intentional churn generated by vendors and platforms. The latter is a plague, and it doesn't only cost people money, it costs them productivity.

Do you think churn is intentional within a single vendor, e.g. to force upgrades? Or could churn be a by-product of competition between vendors? E.g. AWS refactored most of enterprise computing into "low-end" services that steadily improved, but were proprietary and increased lock-in.

> The dev tools I'm using won't even work on current computers. I code on a time capsule laptop and depend on the very simplified plugin formats I've chosen (generic interface AU and VST) to remain functional.

Is the time capsule laptop for old operating systems or old hardware? Could the old operating systems work in a virtual machine?

> They'd have to break the most fundamental interfaces to kill my stuff (which doesn't make it impossible to do, just very user-hostile)

Apple tried to get rid of files (!) entirely, but they are slowly making a comeback on iOS, e.g. now you can insert an attachment within an email, with the right application plugin. Social networks have done their best to replace RSS push notifications with proprietary pubsub. WebDAV, CalDAV, CardDAV are thankfully still supported by a few good apps.

> niche markets exist and they're loyal, and (2) they're small, which is what makes them niche.

How do you market your services/products within your niche?


Yes, I do. Forcing upgrades is just another way to force sales, and that's competition. Churn happens.

The old laptop is just the most convenient sort of virtual machine. At some point it'll be easier to run a virtual time capsule laptop… however, the physical time capsule laptop is from a time before intense spyware, so there are security issues as well.

As far as niches, Airwindows doesn't market at all. It's only word of mouth, for ten years, with few exceptions (notably, Console2 got reviewed in Tape Op, a trade magazine). This is personal: I loathe getting harassed by marketers so much that I won't even email, much less advertise. I collected a list of the 'Kagi generation' customers who specifically said they wanted to be on a mailing list and hear from me that way. And then I haven't emailed anything to them for months and months :) So, effectively, my business is 'for people who hate marketing so much that they want to do business with someone who will absolutely leave them alone and not bug them'.

By definition this is a niche to starve in, but it's sincere. I really do hate most everything about marketing, so I simply will not do it. Sometimes when I have a notable post or product I leave out the patreon link on purpose :)


Haha, new marketing category: product only available for purchase in a 24-hour period that happens twice a year, or on a date decided by the roll of a dice in a youtube video, or by solving a puzzle, or .. :)


But you're still saying 'purchase', and my market sector is completely dominated by software piracy to the point where it's choking on sketchy and unreliable DRM.

More like, 'new' marketing category: Trust Building Exercise. Attempt to give everything possible away, and see if social pressure can cause a lot of people to go 'yay!' and throw money. Hence the Patreon with literally no tiers above 1$, with at least half the patrons from 2$ to 10$ on their own volition.

The trouble with that (speaking as someone who has some notion of marketing but chooses to undermine it) is, it's one of those power-law relationships where basically you have to be me to do it :) without ten years of sorta grassroots presence in the industry and a large number of successful products that perform well as software, you can't do it. You can't simply start up and have a Patreon work on those terms, even if your products are exactly as good, and this is a problem.

Solving that would be a very big deal but it's a bit beyond me for now…


3 steps program (specially crafted for the aged)

===============

0. Assume any "new" thing is worse than the "old" alternative - until proven otherwise.

1. Critically filter out hype/PR.

2. You're left with much less to learn.

3. Invest "out-of-work" time in something really valuable.


Yep, totally agree. And I think the more experience you have, the easier it gets to filter out the crap.


It depends on what you already know. I think embedded development with systems-level languages and hardware know-how is a very durable skill.

On the other hand, some fields like web development peaked a while ago; I would argue that 2012 was the high watermark. I think it's a very precarious choice of career right now. It has been steadily going downhill since the introduction of trendy front-end frameworks that don't offer any value to the end user (including React, Angular, et al). The culture stopped being about making usable and accessible interfaces for people, and more about "component architecture", "server-side rendering", "tree shaking", that solve problems created by the very tools they are using.

That isn't to say that web development is dead, but I think that the future will be more specialized around certain features of the platform such as WebAssembly, WebRTC, WebGL, Web Audio, et al. And these will be more readily picked up by people with more durable skills, than those who only know the most popular front-end framework.


> The culture stopped being about making usable and accessible interfaces for people, and more about "component architecture", "server-side rendering", "tree shaking", that solve problems created by the very tools they are using.

Speaking as a confirmed cynic, that seems overly cynical. ;-)

The problems being solved by the web framework hamster wheel are those of rising bars for usability and speed, with effects measurable in $$$ (i.e. an extra 1s delay could increase shopping-cart abandonment rate by 1.5%). Which matters a lot more at web-scale.

So, these trendy frameworks are solving problems that most developers shouldn't worry about (it is premature optimization) but that matter a great deal to the companies that release them (Google: Angular and Polymer; Facebook: React and Flux; etc.). OTOH, it is tempting to tap into all the engineering effort that goes into libraries like these. You just have to know where to stop before sinking into the HammerFactoryFactory mire. ;-)


Just like you have to fight feature creep in your products, you have to fight "shiny new tool / language / framework" creep in your skillset. Become an expert in your topic of choice and use the best tools to get it done quickly, whether it's 20 years old or two. If you spend too much time learning new tech, you won't get it done quickly, but you shouldn't force an old tool to do something just because you don't want to learn something new.

As for the anxiety, turn off HN every so often and just focus on being a good engineer with your current tools. Nothing changes so fast that you can't go a few months or even a year without being in the know. When it comes time to start a new project, spend a week researching the current tools and see how they fit into your stack.


Understand the principles behind things. Most stuff is reinventing the wheels of implementation of a much smaller base of theories.


To give you some context, I'll answer this as a [largely] life-long consultant. As such, I don't chase technology, I anticipate the trajectory of job growth. I want the project that I decide to work on next to propel me to a project that will likewise broaden and deepen the experience and skillset I have to offer.

I also try not to get trapped into working for clients whose only interest in my experience is to recreate one of the last things I did. The easiest way to kill your value is to do the same thing over and over - even twice is too much.

It's true technology progresses in dog years. When you are working you are not learning outside that bubble. When you are between assignments you absolutely must treat that time as a sabbatical to learn something/anything new.

By broadening your skillset through selective project engagement, you are better off than Skippy who has worked on the same application with great job security for 5 years - Skippy will not be someone you will re-encounter 10 years from now unless you are buying a used car and they happen to be the sales person. The industry is self-selective this way. The complacent "I got mine" mentality is toxic to longevity in the industry.

Let me also dispel the meme that sticking to a specialty is a desirable thing. The fact of the matter is that the ocean of legacy code grows exponentially and there is always a need for someone who knows a legacy language or technology. This kind of career trajectory is as desirable as cleaning out septic tanks. There's job security to be had, and you'll hear plenty of "Ho, ho, ho - I don't need no stinkin' new-fangled whatever" from those content to be indispensable that way. My advice is not to be that guy/gal.

It is a much harder and a much richer experience to navigate a career in the flow of technology than to get myopically paralyzed by a desire to featherbed where you are today. But your question is "how" to keep up. IMO, the answer is to skim lots of material and only dive in at the last most relevant moment. The generalist is far more qualified than the specialist these days because most companies cannot afford a prima donna - they need people who can perform many jobs and serve many needs.


For me, the answer is side projects. I keep playing with new ideas, and pick up new techs along the way.

It not only allows me to discover the tech, it's also especially important because I refuse to make first contact with a tech by implementing it directly in a production project meant to stay around for years. The more projects I've used a new tech in before introducing it into my main project, the more comfortable I am that I haven't made gross mistakes.

For me, it's not the amount of time using a new tech that matters, it's the number of projects I've used it in (because each time, I can try a different architecture).


I developed a checklist to spot technologies that are in an early stage of the hype cycle and avoid them.

The following are signals that a technology is in an early part of the hype cycle:

* It has the backing of a major corporation or a startup with a marketing budget.

* There are a lot of rave articles/blog posts about building relatively simple or small self contained applications using this technology.

* There is a small but vocal contingent of users who are passionately against the new technology. Their arguments often link to bugs on the bug tracker, cite edge cases that occur under heavy usage and indicate fundamental oversights in the design as well as assumptions (and often arrogance) on the part of the architects.

* The benefits cited are either A) vague or B) explicitly aimed at beginners.

* Arguments in favor often appeal to authority (e.g. "Google uses it" or "XYZ company use it in production"), popularity ("everybody's containerizing these days") or cite benefits which were already possible.

* A high ratio of talk to action: the technology is talked about on Hacker News a lot but there appears to be very little real life stuff developed with it and a lot of the talk involves jamming it in where it's not really necessary.

* Sometimes I experiment with a technology for an hour or two and I see if there's anything obviously wrong with it or if the criticisms are valid.


- Be aware of new trends. You don't have to learn everything. But pick up and try those that promise to solve real problems in your current position.

- You should be able to move back and forth between management and technical positions. They are not mutually exclusive. You can build a skill set that allows you to do either. It gives you greater perspective and flexibility. One piece of advice that I was given in college is that even if you are the Director of IT, you should leave small (non-critical) pieces of software for yourself to work on, so you never lose touch.

- Try to work for good companies. My definition of good companies are those where you can be productive every day.

- Some skills will be helpful all your life. I learned Unix in 1989 and have used it almost every day.

- Learn the fundamentals. Data structures, algorithms, relational theory, structured programming, object oriented programming, functional programming, networking, Operating systems, theory of computation, et al.

- Understand the business domain in which you are working. That makes you extra valuable for your current company.

- Develop your soft skills. http://www.skillsyouneed.com/general/soft-skills.html


Focus on the fundamentals. How do you find the right tools and libraries? How do you translate requirements into projects? How can you communicate with other people effectively? How can you learn what you need to know when you need to know it? How do you recognize sunk costs? What makes good code good code, regardless of language? What compromises should you make, and when? How do you ask the right questions?


Learn to learn, and focus on your current client/task first instead of technology. People still get paid to write COBOL.


1. Relevant skills don't change. Your ability to reason about problems never becomes irrelevant.

2. New technologies are adopted, but that doesn't mean old ones quickly disappear. Sometimes not even slowly.

3. Area focus. If my area of expertise is networking, what do I care about VR? We can't be generalists any more than we could be 20 years ago.

4. If you feel like being a generalist, understanding & internalising (basic) principles is more important than being familiar with specific technologies

5. Critical, transversal thinking. You can weed out heaps of new technologies by understanding _how_ they fit in a system and the tradeoffs they require, before you have to become intimately familiar with them. Base your approach on tangible end-to-end measurements to understand how technologies might fit in a system, and after that you'll have to keep up with a lot less than the various FOTM


Well, a critical eye is a must. Just keep away from the noise and pay attention to what would really make an improvement on your current framework/workflow.

For example, I still do my web development in Django or Flask because they do the job, I'm pretty good at Python and most projects don't really need the concurrency Go or Elixir offer.

One of the best recent additions to my skillset was Docker... a lot of people say containers are not a must but they really made my life easier and allowed me to do cool things for clients from different industries.

It doesn't sound as cool as doing machine learning, computer vision or natural language processing, but don't let the AI/VR hype make you anxious, just focus on what you really want to do.


I'm more worried about people constantly believing that every new framework is a major advancement for programming and that it's not just something that could be learned in an afternoon (e.g. React). Or about people following the latest hyped trend without learning anything and without producing much other than more hype.

AI, ML and VR are all really interesting, but as we all know they are not completely new and will not likely account for the majority of future jobs.

Fundamentals are what matter, most of these "new things" are just something that you can learn with relatively limited effort if needed. Classic programming skills, analytical skills or things like the ability to reason about concurrency issues never go obsolete.


I learned the hard way about 10 years ago what happens when your current skill set becomes obsolete. Since then I've become very focused (and lucky) and only take on really enjoyable and unique projects. That way it's easy to be excited and do a good job during a project, communicate your passion in future interviews, and transfer that enthusiasm for your previous projects to potential future employers.

The exact tech choices don't matter that much; it's more about the overall direction (in my case analytics, in a bunch of varied sub-fields).

Although the top comment has some merit, I'd argue C/C++ is an outlier here rather than the norm.


I learn new skills at the job, but at the same time I find that it's not enough. What works best for me to really get into new areas and update my skill set is to regularly take MOOCs and then try to find ways to use the new skills at my job.

At the moment I'm taking this course in Deep Learning

https://www.kadenze.com/courses/creative-applications-of-dee...


Beware of those that make it more important to follow the latest trend than to follow their business model (or that just go "what's a business model?!")


A great quote from Lambert about what software development is:

1. Decide what the program should do.

2. Decide how the program should do it.

3. Implement these decisions in code.

Only the last part is actually coding.

In other words, as a software developer, you are not paid to type. You are paid to think. And the deeper your knowledge and experience, the better tools you have to actually do that. So focus on learning step 1 and 2!


I think having a good core is more important than the "latest flashiest framework" / "New Language" / "whatever". At the end of the day it will be your experience that gets you out of the shit, not some new tool! That said, I think it is important not to let yourself go stale. I worked with a company that had about 6 devs who were doing things seriously old school; they had no interest in upgrading their skills, and for someone who likes to keep on top of the new features of languages it was painful to work with them. In the end, however, when they left that company they probably found pretty quickly that they were unemployable!

For me personally, I find that I generally have free rein to test out new (to me) technologies, and my experience helps me realise quickly if they are going to be helpful or a bust!

Take any opportunity you can to do a "little project" in something that interests you and then apply it to problems in your work!


Like pjc50, for the most part I don't. Last year was the exception.

I've been making web apps since 1998. Last year I learned how to use CouchDB and PouchDB. Before that I used a flat file database and the built-in filesystem to manage data. I used Perl on the backend with just a bit of JS on the front end to run the apps.

I never did learn how to use MySQL/PHP. I looked at it, decided it was a butt-ugly way to make websites and apps, and admitted to myself that I wasn't qualified to design secure SQL apps and didn't want to learn, because that is a career all by itself.

So I waited for something better to come along and last year CouchDB along with PouchDB hit the mark so I spent the year learning and using them. It was worth it. With those tools I am faster and better.

The years in-between were spent getting stuff done with the tools I was good at using, not trying to learn how to use a zillion other tools to do the same things.

I looked at a lot of newer tools again last year with an eye towards what was "best". There is a lot of cool stuff out there that does some really jazzy stuff, but in the end I decided to take another look at what was "easiest".

I ended up with CouchDB, PouchDB, and jQuery. Easy to learn, with incredibly rich APIs and lots of support and example code. There's more than enough in those to learn and keep up with, and if I need to add something I'll look for easy ways to do that too.

The truth is, it takes time to be productive with any language or tool or framework you use. It's a scatterbrained approach to try to build software with something new every time you start a new job.

Right now there are tools being built that will make "AI/VR" easier to implement. Wait for the tools.


Well, about the ones you mentioned specifically, AI/VR: I do not think they will be a requirement for the majority of jobs in the software industry for quite some time.

I believe that even though technology changes fast, the things you need to change do not move as quickly, but that probably depends on the company and the technology that company is using in the first place.

Where I am working, we are writing software mostly in Java, some analysis on the data with Splunk and SQL for the database.

Sure enough I had to keep up with Java development, but it does not move _that_ rapidly. Never mind the fact that the company is only now switching to Java 8.

That being said, I do like learning new things in the field, but they are not the "latest cool things"; for example, now I am learning Haskell by reading books on it and doing exercises, the normal way to learn a new language afaik.

I do tend to check out things that tickle my interest, lately I have made a small app in Angular2/Dart because it sounded interesting, but by no means have I learned to use them in-depth.


Don't until you need to. If you are planning to leave your current job, then look at job offerings in the market and learn the tech needed to get one of those jobs. And since the technology underlying whatever tech is currently fashionable hasn't changed in the last 50 years, it will be relatively easy to learn enough to get a job. The rest you learn on the job when solving specific problems. I would instead focus on learning the core tech that hasn't changed for 50+ years, including functional programming, logic programming, how a computer fundamentally works (NAND gates), etc. What you learn from that will never become obsolete. You just need to translate what you learn into whatever the latest fashionable framework/language calls it and ignore the false hype.
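On the NAND-gates point, a small sketch (Python, purely illustrative) of the classic exercise behind "how a computer fundamentally works": every other Boolean gate can be built from NAND alone.

    def nand(a: bool, b: bool) -> bool:
        return not (a and b)

    # Everything else falls out of NAND:
    def not_(a):    return nand(a, a)
    def and_(a, b): return nand(nand(a, b), nand(a, b))
    def or_(a, b):  return nand(nand(a, a), nand(b, b))
    def xor_(a, b): return and_(or_(a, b), nand(a, b))

    # Quick truth-table check.
    for a in (False, True):
        for b in (False, True):
            assert and_(a, b) == (a and b)
            assert or_(a, b) == (a or b)
            assert xor_(a, b) == (a != b)

That kind of first-principles knowledge stays useful long after any particular framework has faded.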


For example: people who learned Lisp years ago are now laughing about the hype that higher-order functions (lambdas etc.) are getting. It is ancient technology. Same when Java introduced garbage collection: yet another ancient technology. However, my argument is not that you should learn Lisp today to get a job. You should learn it to understand the fundamental technologies that future fashionable frameworks/languages will use. Same goes for type theory. Rust is also using ancient technology (linear types) that you would already have known about long before Rust showed up if you had learned type theory. The next step after Rust is of course dependent types.


When I was in college, I majored in math and physics. Only two math classes gave me trouble. One was Real Analysis. There were just too many theorems. I couldn't remember them all, and then I couldn't figure out what to use to solve the problems on the test.

Eventually I realized that, out of maybe fifty theorems, only maybe three were used to prove all the others. So I memorized those three, and worked out anything else I needed for the test on the fly.

You don't need to keep up with the Hotness Of The Week(TM). You need to know the fundamentals well, and you need to be able to learn the rest when you need it.


3D graphics math is all trig and linear algebra, with some signal processing. You don't have to know that stuff to use Unity to build VR apps, but it helps. Those skills will never, ever, be obsolete. Even deep learning is rooted in linear algebra.
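
As a minimal illustration of what that looks like in practice (numpy, my own toy example): rotating a point is just a matrix multiply, and the matrix is just trig.

    # Toy example: the trig + linear algebra underneath 3D graphics.
    import numpy as np

    def rotation_z(theta):
        c, s = np.cos(theta), np.sin(theta)
        return np.array([[c,  -s,  0.0],
                         [s,   c,  0.0],
                         [0.0, 0.0, 1.0]])

    p = np.array([1.0, 0.0, 0.0])
    print(rotation_z(np.pi / 2).dot(p))   # ~[0, 1, 0]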

The technology churn in other spaces like front end development can be mitigated by learning to read really fast. I don't have deep experience with any one front end framework, but I can inhale the docs and source code pretty fast when I need to get my hands dirty with one. Again, speed reading is a skill that will never be obsolete.


I think you're mixing two things: first, what investors are investing in (AI/VR), and second, what kind of jobs tech companies will offer in the near future. I think the second one won't change; what will change are the visions and missions of future companies. There will still be software development for a while, even if you work for an "AI" company. These are just hot keywords in 2016/2017.


I wouldn't let AI & VR worry you - that type of growth in the field means new types of jobs.

As far as updating my skillset, I watch conference talks, and I built a site that has a big index of them: https://www.findlectures.com/?p=1&class1=Technology&type1=Co...


I'm concentrating on learning stuff that fascinates me. This also happens to be computer science that won't go out of date for a while. Some of it was discovered before silicon chips! Even if it won't directly help me in my job today or in getting a new job, it will help my career in the long term by making me a better thinker and programmer.


Do not update it. Technology is actually painfully slow. Maybe you need to change technology once in ten years (such as moving from C++ to Java).


Whoa there, let's not go overboard. Implementing a decadal cadence for switching would require budgeting for about a year of reduced earnings while you build competency in the new platform/language/ecosystem (and the associated client base).


If you have the possibility then try joining larger and younger teams at work. Also: be constantly aware of the current state of technology. You can't have a deep knowledge of everything but at least knowing which problems a new technology solves helps a lot.


What is the advantage of larger and younger teams in regards to staying up to date?


You are right, I did not expand on that. Larger teams bring in a potentially wider set of skills, and your "next" is probably some younger developer's "current" already.


Side projects


Fundamentals.

I'm a game developer who has worked in 2D social/mobile lately and is now getting into VR. Turns out, if you know how the rendering pipeline works, it's not such a foreign land after all.
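
For anyone wondering what "knowing the rendering pipeline" buys you: at its heart it is transform, project, rasterize. A toy perspective projection, heavily simplified and purely illustrative:

    # Toy version of the projection step of a rendering pipeline:
    # a camera-space point becomes a pixel via the perspective divide.
    def project(x, y, z, focal=1.0, width=640, height=480):
        # Farther points (larger z) shrink toward the center of the image.
        sx = focal * x / z
        sy = focal * y / z
        # Map from [-1, 1] normalized coordinates to pixel coordinates.
        return int((sx + 1) * width / 2), int((1 - sy) * height / 2)

    print(project(0.5, 0.5, 2.0))   # lands in the upper-right quadrant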


Can confirm fundamentals. The longer I work as a developer, the more I see the importance of having a grip on the basics. For me this is MySQL, Linux, and math rather than new frameworks, languages, and build tools.


Don't hold on to the bleeding edge so tightly and you won't get cut. It's fine to ride the wave, but don't forget to polish your less specific skills.


Asking this question might help: how could I have learned what I already know in much less time? Writing down steps or rules could help narrow things down to the current scenario.


Here is how it's going for me. For background, over the last few months I've very quickly become adept in deep learning, being able to understand current research papers, read through textbooks, implement my own models with Tensorflow and other libraries, and train real models on remote servers (e.g. AWS). For reference, I have an engineering background but no formal schooling past an undergraduate degree.

The situation: my early-stage startup is fundraising right now, which can be kind of a time sink. Lots of accelerator/grant/angel applications. There's a good chance we hit our seed round, but also a good chance I'm unemployed next quarter when/if runway runs out.

In either case, I decided that AI, specifically deep learning, would be incredibly important to my career. The startup will need the expertise in the future (so I'll have to understand how to hire people with it), and should I need to find another job in a few months, this is a pretty cool field to learn and I find the work enjoyable (previously I was a data scientist but focused more on vanilla regression and convex methods).

Therefore, since November I've portioned out 20 hrs/week to the startup, focusing on its fundraising and BD needs, which leaves a whole lot of other hours for skills development. Here, roughly, has been my curriculum:

- Mathematics review, and basic neural networks. For this I went over multivariable calculus and lin alg, in which I've always been fairly strong, essentially by trying to derive the backpropagation derivatives for simple vanilla neural networks (see the first sketch after this list). Then I made sure I understood derivatives and matrix data organization for convolution, which is a key component of modern ML. Sources: pencil and paper, and lots of Google to answer any of my questions. Time: 1-2 weeks.

- CS231n online course: http://cs231n.stanford.edu/ Great summary of modern methods in deep learning, plus more foundational-level stuff. I read through all the lecture content and made sure I could work through the derivations, because for me at least this cements technical understanding. Some of them are sort of tedious, e.g. manual RNN backprop. Also, this course has great and simple software examples; I read through the code to make sure I understood the numerical computation and data organization parts. I also ran a few software examples and played around with parameters for fun (and learning). Time: 1 month.

- Reading research papers (and online lectures) on applications that interest me. For this phase, I found 10 initial research papers that interested me. The topics for me included image classification (starting with the classic 2012 Hinton paper), reinforcement learning, robotics applications, and video prediction. This step was harder; it can be like learning a new language. Not every paper is going to make sense at first. But go through enough of them and you'll build up familiarity. Sources: you can start by searching through reddit.com/r/machinelearning Time: 2 weeks.

- Learning software frameworks. From the above step I came up with my own small sample problem, related to stuff I read, that I could test even on my weak laptop (remember, training these big networks requires big computing power). So in this step I started researching different frameworks, and settled on starting a small project with Keras (see the second sketch after this list). Sources: google around for deep learning libraries, read up on them, see what you like, and most importantly, have a motivating sample problem that you wanna code up. Time: 2 weeks.

- Harder problems, more software, more papers. This is where I'm at now; it's sort of like an iterative research loop where I 1) come up with new problems I want to solve, 2) learn more about the software I need to implement them, and 3) search more prior work to gain insights on how I can solve the harder problems. In particular, I've switched over to learning and using Tensorflow, and also learning how to use AWS for stronger computing. So I had to dust off some Linux scripting and command-line skills too. Like I said, this is fairly iterative and probably closer to "modern research", where learning from my (virtual) peers and experimentation and production are closely linked. Time: from the last month to present.
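
To make the first step above concrete, here is roughly the kind of "derive it yourself" exercise I mean: a tiny one-hidden-layer network in plain numpy (a toy illustration, not code from any course, and the sizes and learning rate are arbitrary).

    # Toy one-hidden-layer network with hand-derived backprop (numpy only).
    import numpy as np

    rng = np.random.RandomState(0)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)        # XOR

    W1, b1 = rng.randn(2, 8), np.zeros(8)
    W2, b2 = rng.randn(8, 1), np.zeros(1)

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    lr = 0.1
    for _ in range(10000):
        # Forward pass.
        h = np.tanh(X.dot(W1) + b1)
        p = sigmoid(h.dot(W2) + b2)
        # Backward pass: the chain rule, starting from cross-entropy loss.
        dp = p - y                        # dL/dz2 for sigmoid + cross-entropy
        dW2 = h.T.dot(dp); db2 = dp.sum(0)
        dh = dp.dot(W2.T) * (1 - h ** 2)  # tanh' = 1 - tanh^2
        dW1 = X.T.dot(dh); db1 = dh.sum(0)
        # Gradient step.
        W1 -= lr * dW1; b1 -= lr * db1
        W2 -= lr * dW2; b2 -= lr * db2

    print(p.round(3))                     # should approach [0, 1, 1, 0]

Working those few gradient lines out by hand is basically the whole "math review" step.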
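
And for the framework step, the motivating sample problem can be tiny; something in the spirit of this Keras sketch is plenty (illustrative only, and details vary a bit between Keras versions):

    # Minimal Keras sketch: the same toy XOR problem, but letting the
    # framework handle the gradients. (Layer sizes etc. are arbitrary.)
    import numpy as np
    from keras.models import Sequential
    from keras.layers import Dense

    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
    y = np.array([0, 1, 1, 0])

    model = Sequential()
    model.add(Dense(8, activation='tanh', input_dim=2))
    model.add(Dense(1, activation='sigmoid'))
    model.compile(optimizer='adam', loss='binary_crossentropy')

    model.fit(X, y, epochs=2000, verbose=0)
    print(model.predict(X).round(3))      # should approach [0, 1, 1, 0]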

Overall, in the last 3 months or so at about 30 hrs/week, I've added an extremely powerful new skillset to my arsenal, something I've been meaning to do for quite some time. I can understand 90% of all modern research in the field, and create useful software to solve data-driven problems. Completely for free as well, aside from the $0.81/hr I pay to AWS for training some networks overnight. This is the type of thing I'd have wanted from a Master's (or even PhD) program, but who wants to go back to school...

Hope this helps someone :) Remember, AI/ML is more approachable than most people think; you just need to start with a solid mathematics background. After that you'll be flying; the field is relatively quick to learn, especially if you like learning by doing.


What is your current skillset? Maybe we can help guide you and ease the anxiety.


I'm taking a couple courses from Udacity. They have both AI and VR courses.



