If you can turn a design into code, learn to turn a spec into a design.
If you can turn a spec into a design, learn how to understand a problem and produce a spec to solve it.
If you can understand a problem, learn to talk to people and discover the problems they have so you can solve them for them.
If you can do that, learn a million other things and run your own business.
[You can also skip any of these steps if you're happy managing people to fill in the downstream aspects rather than doing it yourself.]
This is what makes waterfall hard and agile attractive. It is also what makes someone who can shift mental focus from one to the other more useful.
Personally I think of it in terms of field of view. Zooming out to bigger and bigger picture or down to finer and finer detail.
I guess what I'm saying boils down to "increase your range" - if you're a coder you'll be more useful/valuable the more you can look out. (Whether that be in terms of business sense or devops practicalities).
The best strategy is to continuously brush up on skills. Experiment and dabble with new languages and frameworks as often as your time allows!
Second, once a language/framework/toolset reaches a certain level of usage, then there will be demand for that skill in the foreseeable future. Even if the language falls out of favor for new projects, maintenance/patches/upgrades will always be needed.
I know freelance PHP developers who have to turn away work. I know freelance Rails developers who have no problem billing out at $200/hour.
If I were to make a bet now, I'd bet on Node still being widely used in 10 years, and to a lesser extent Rails and Django as well.
Personally I find Go interesting and it's something I'm hoping to pick up in the coming year. It seems like a fun language, well suited for building web services that handle lots of traffic.
Lua might also be nice to learn. It's used for scripting in a lot of games. For example: in World of Warcraft you can create your own Lua add-ons. Lua can be easily integrated into your own apps / games, since it's just a small C library. It might be a good language to learn if gaming interests you, since lots of games make use of Lua in some way.
And as someone else already mentioned in this thread: functional programming will become bigger in the future. You can use the functional programming style with .NET if you choose to learn the F# language.
I've found it difficult to find work with Java, Ruby and Python outside of London, however the rest of the country is flush with C# and .NET jobs.
Although mostly in the south east, there are well-known studios in Scotland, Ireland, and the north and midlands. Rockstar, Havok, Ubisoft, Natural Motion, Rebellion and Codemasters come to mind as a few of the bigger names with studios further from London.
however the tech requirement for AAA is almost exclusively C/C++ (with good hardware and API knowledge strongly preferred for some specialities like audio, rendering, physics, AI, networking...)
Wait, are you talking about JEE?
Things like algorithms and data structures are largely independent of language (although there are some 'exotic' things you can do in many of them).
I would however strongly suggest getting a strong background in cross-platform C/C++ code. For a very long time this has been the only really practical way to write properly platform-agnostic code, and it still is... you'll need a bit of Java to glue it into Android, and a little Objective-C might help with iOS and OS X. It also has lots of quirks from being low-level, and will give you a better understanding of why something is slow or difficult to implement in the general case, rather than just in 'language and platform X'.
I also strongly advise against following the next big thing too deeply - check it out, see what it is, learn a little about it, but don't go nuts. There is a chance that you will learn nothing of long-term value from investigating it thoroughly.
Knowledge of APIs and standard libraries is something Google can provide for you these days... understanding never will be.
My current hunch is that statistics/data science/machine learning will satisfy at least two of the above three, if not all.
I've been learning the basics of data analysis with R, much helped by the awesome RStudio. Initially this was for fun (and a tryout of MOOCs, specifically Coursera's data analysis class) but after only a few weeks I found occasion to use it at work (finding patterns in an application's response times in response to user load throughout the day).
Now I'm getting into stuff that's even more fun, specifically the reactive Web framework Shiny.
This is usually a sure sign of a good match between learner and subject matter: an iterative process where learning and applying, tightly interleaved, form a feedback loop.
Learn mathematics and algorithms not tied to any particular language. Avoid paying too much attention to the "next big thing"; that's counterproductive.
The field of computer science is too fluid right now -- and for the foreseeable future -- to expect to be able to choose a language or environment that has any serious staying power.
For example, 10-20 years from now, everyone will have to learn to write parallel algorithms to a degree not even imagined today, for lack of appropriate current hardware. That future is virtually certain, but there's no present way to even prepare for it.
So you could learn about low-level code and compilers, applications of graph and set theory, and automated modeling.
As far as I know, there have been developments for more than 10 years that aim towards better parallelising compilers. I believe the current compilers can already use some SIMD instructions when they conclude that the working set is possible to split for parallel execution. (Without programmer hinting, that is!) As the problems become better understood, I expect this trend to continue. Right now it's done by some binary compilers. The next step will likely involve using the same logic in JIT compilers and hence in some language runtimes. After that? I have no idea. Maybe going way beyond map-reduce by applying the same logical solutions to bigger data sets and allowing for more delayed execution. Instead of doing map-reduce by hand, why not have a logical "warehouse compiler" which generates these jobs and their pipelines for you automatically?
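Hand-rolled map-reduce is easy enough to sketch, which is exactly what makes generating such pipelines automatically so appealing. A toy Python version (the names and structure here are illustrative, not any particular framework's API):

```python
from functools import reduce
from concurrent.futures import ThreadPoolExecutor

def word_count(documents):
    """Map-reduce 'by hand': map each document to word counts, then merge."""
    def mapper(doc):
        counts = {}
        for word in doc.split():
            counts[word] = counts.get(word, 0) + 1
        return counts

    def reducer(acc, counts):
        for word, n in counts.items():
            acc[word] = acc.get(word, 0) + n
        return acc

    # The map phase is embarrassingly parallel; the reduce phase merges results.
    with ThreadPoolExecutor() as pool:
        mapped = list(pool.map(mapper, documents))
    return reduce(reducer, mapped, {})
```

A "warehouse compiler" would, in effect, derive the mapper/reducer split and the pipeline between them from ordinary-looking code.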
My point is that there are fields with known hard problems. As technology and theories evolve, some or perhaps even many of these fields will find new applications outside their current (possibly narrow) scope. Finding interest in them, and tinkering with the problems will expand your own knowledge about the field - as well as the practical applications. From there, applying that knowledge in other fields should become a possibility.
If you're ever-curious, just focus on the fields you are already interested in. (You will discover new ones that overlap.) Find out what's still missing, and then... let there be hacking.
> So you could learn about low-level code and compilers.
No, you could learn mathematics and algorithms (which I already said). A least-squares curve-fitting regression method works the same in all languages. A Fast Fourier Transform works the same in all languages. Quicksort works the same in all languages. These examples have in common that they are all expressed in universal mathematical notation, and they all carry out the same basic algorithms in the same way, regardless of which language they're expressed in. But all of them would need to be modified to take parallelization into account and exploit it to its fullest.
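To make that concrete, here is quicksort in Python - the partitioning logic is identical whatever syntax it's written in (a textbook sketch, not an optimized implementation):

```python
def quicksort(xs):
    """Sort a list by recursively partitioning around a pivot.
    The algorithm is the same in any language; only the syntax differs."""
    if len(xs) <= 1:
        return xs
    pivot, rest = xs[0], xs[1:]
    smaller = [x for x in rest if x < pivot]
    larger = [x for x in rest if x >= pivot]
    return quicksort(smaller) + [pivot] + quicksort(larger)
```

Parallelizing it - sorting the two partitions concurrently - is where the language-neutral algorithm meets very language- and hardware-specific concerns.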
> As far as I know, there have been developments for more than 10 years that aim towards better parallelising compilers.
That's true, but it's also true that the real challenges of parallelizing algorithms have yet to be addressed in any meaningful way, for lack of suitable hardware. One example is the problem posed by race conditions among independent processes, which is a big argument in favor of immutable variables and functional languages. But these issues have to be examined in light of specific hardware -- they can't be fully worked out in advance of the existence of the target hardware.
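A toy Python illustration of the race-condition side of this - shared mutable state invites lost updates, and serializing the update (or avoiding mutation entirely) is the fix (names are illustrative):

```python
import threading

counter = 0  # shared mutable state

def unsafe_increment(n):
    # Read-modify-write on shared state is not atomic:
    # two threads can read the same value and lose an update.
    global counter
    for _ in range(n):
        counter += 1

lock = threading.Lock()
safe_counter = 0

def safe_increment(n):
    global safe_counter
    for _ in range(n):
        with lock:  # serialize the read-modify-write
            safe_counter += 1

threads = [threading.Thread(target=safe_increment, args=(10000,))
           for _ in range(4)]
for t in threads: t.start()
for t in threads: t.join()
# safe_counter is deterministically 40000; counter would not be, in general.
```

Immutable data sidesteps the problem entirely: if nothing is written in place, there is nothing to race on.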
As the world realizes that they're relying on software for everything, and that a lot of it is horribly buggy, pure functional programming with typed languages is going to get a lot more popular.
Additionally, just immutability by itself will make programming any kind of software much easier in the multicore era.
Regarding types, people will soon realize that these are not your grandmother's (dumb) strongly typed languages. They don't need you to spell out everything for them - they have extremely good type inference - sometimes to the point of feeling almost dynamic, but without sacrificing correctness.
There might be some issues with tooling and foreign-looking terminology at the present moment - but I'm confident they will be sorted out in the next couple of years.
But most importantly, it's not just technical merits. People seem to be talking about functional programming a lot lately. Some of it is correct and some isn't, but that doesn't matter - the interest is growing, and I optimistically predict that the trend will continue.
It's probably too early to say whether types will become popular, but I certainly hope so. It would help if advocates of types distanced themselves from Java and similar languages with little or no type inference. Many people have accumulated a lot of ill will toward some of them and tend to blame the types for it. "Inferently typed" seems like a good buzzword for the distinction.
Predictions are often horribly wrong, but they're also so much fun!
These things are very abstract, to a degree where I think many people just aren't comfortable with them. Not that there's anything wrong with that, as I have a hard time with very low levels of abstraction. We've all got our comfort zone.
There is just not enough (approachable) material on the subject right now, but the amount seems to be growing every day. The perceived "coolness" factor is also rising, and even though in principle we should choose based only on technical merits, we programmers will definitely push through many more obstacles for something that is perceived as cool / fashionable / in demand.
My guess is that most programmers are simply put off by the completely alien terminology rather than the complexity or abstraction level. But this terminology won't remain alien forever - it's already entering more widespread use.
I yearn for a reasonably "right way" to develop software - for a set of solid patterns with truly good characteristics. Not a silver bullet, but at least a set of universal principles that apply to any high quality bullet. The acceleration of the current chaos only reinforces this yearning. There is just too much magic, too many approaches, too many opinions, too much rehashing of the same-old, too much cargo-culting, but not enough facts. I have a hunch that the functional languages of today may be on the verge of being able to offer that "right way". I guess I'll find out soon...
You are probably working in the wrong company. I know dozens of colleagues who know what lambda calculus and monads are.
If you want to explain monads to a software developer, call it a 'design pattern that lets you chain actions on wrapped values' rather than "an endofunctor, together with two natural transformations".
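In that "design pattern" spirit, a minimal Maybe-like wrapper can be sketched in a few lines of Python (illustrative only, not any library's API):

```python
class Maybe:
    """A wrapped value with a chainable 'then' - the design-pattern
    view of a monad: chain actions, short-circuiting on failure."""
    def __init__(self, value):
        self.value = value

    def then(self, fn):
        # If the chain has already failed (None), skip the action.
        if self.value is None:
            return self
        return Maybe(fn(self.value))

# Chain actions without checking for None at every step:
result = (Maybe("  42  ")
          .then(str.strip)
          .then(int)
          .then(lambda n: n * 2))  # result.value == 84
```

No endofunctors required: it's just "wrap the value, chain the actions, let failure flow through."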
So if you're looking for a job now, stop listening to hipsters and start looking at want ads.
Any "cool" technology today is going to take 5 years to have a significant impact on the want ads. You can try and pick one, and if you get lucky you'll be one of the few to be able to check the "5 years experience" box in 5 years time and be in a position to choose your own salary.
However, your chances of choosing it are slim -- many more technology choices wither and die than succeed.
My advice: when you have a problem, solve it with the best stack for the problem. Winners are chosen because they're good at solving problems.
If you're going to be job hunting soon, angular may be your best choice. It appears to have reached "escape velocity", and is starting to be adopted in places that actually hire people. It may or may not be uncool soon, but demand should handily outstrip supply for a few years.
Want to learn a language? Take a weekend with it, write something that's useful (even if not suitable for production or long term use). Do that often enough and you'll be mentally agile enough to pick up anything when you need to.
However, just a decade ago, a large majority of the Internet used JS primarily for form validation, which was sad. A lot of web developers were not comfortable leaving their code open for visitors to see.
I personally believe JS will continue to soar but I also believe that nobody can answer this question perfectly as nobody knows the future.
In any case, if you spend a lot of time learning any language very well, the time required to learn another language after that decreases substantially.
Then Node.js and Angular. You should be set for the next 10 years.
Outside the enterprise, .NET is sinking into irrelevance. I don't know about Xamarin, though.
Edit: FWIW, my current job has been C# / WPF since 2009, so don't take my comment about .NET irrelevance as mindless Microsoft hatred.
You're an iOS dev, so you're familiar with OOP. Why not learn a functional programming language like Erlang or Haskell? There's a lot of momentum behind FP right now, and both languages can solve problems in different ways than what you're probably used to.
If you learn only a single programming language (PL), you're limited to the scope of that language. If you learn paradigms (ideally through one PL per paradigm), you gain knowledge that is "portable" between PLs of the same paradigm. You get a boost when switching to another PL of the same paradigm: you learn it faster, looking into the PL's features rather than its basics.
Before you choose what to learn, I recommend checking out: Lisp dialects like Clojure, CL, etc.; Ruby; Go / Rust; Java.
I've been a mobile UI developer for over 10 years. The world of mobile looked very different 12 years ago. Technology changes so fast that I wouldn't make career-affecting predictions over that long a time span.
Learn to ship. It's a hard, surprisingly rare skill, and it doesn't get old.
For example, I cut my OO teeth on Smalltalk in the 90s, but Java looked fun, so some colleagues and I built game applets in our spare time. I moved straight into a serious commercial Java job. I also played a little with Flash ActionScript - it got me nowhere, but was fun nonetheless.
But to stay relevant for the next 42 years, learn how to learn.
I have to admit, I like to experiment with weirder stuff.
Haskell is awesome. Incomprehensible sometimes, but it is the only language I know of that implements interfaces for you :D
PHP is the new Haskell and LISP in terms of legacy. In 5 years, when everyone is chasing trends and NodeJS is as mainstream as finding it on free shared hosting, PHP will still be powering a massive backlog of web applications that need maintenance.
WordPress powers over 20% of the internet, is used by over half of the top 100 sites in the world, Fortune 500s use it, and the government uses it (e.g. http://www.data.gov). There are some stats here: http://en.wordpress.com/stats/ but these estimates are low because not all sites are tracked.
Despite being around for over 10 years now, it's still growing rapidly. It's widely underestimated and often derided by other developers, but it's the most popular CMS in the world and evolving into an app framework. In the WordPress community devs are the minority, as most are designers, bloggers, hobbyists, or typical users.
Don't follow the stock-market strategy of chasing the hot stocks; look at value. Remember last year when Apple's stock dipped into the $300s? That's how WordPress is: great fundamentals, a massive loyal customer base, and highly undervalued.
D3, Angular ...
If it were easy to guess the next big thing, everybody would do it ;-) A 5-10 year horizon is a very long time in computing years. Look back ten years: how many people were accurately guessing the current environment? How many of today's big things even existed ten years ago?
When I look back at my career I can't point to a single instance of seeing the next-big-thing.
I can point to lots of great things that have happened because I'm continually poking at new ideas, new processes and new bits of tech. So I'm ready to take advantage when one of those does become the next-big-thing.
I agree with a lot of things said in this thread, but my suggestion would be to focus on building relationships and seek out great people to work with. In 10 years, you'll have engineering experience AND be surrounded by talented, like-minded people. One caveat is, you have to be great yourself, because great people don't want to work with mediocre people. So really dig into whatever you're working on, try to impress yourself and try to have fun.
+1 for dirtyaura's "Learn to Ship" comment.
But the irony is no one wants a pure Java dev. It is implied you need to be good at HTML/CSS/JS, have hands-on experience with at least 2 popular databases, different web/app servers, and numerous frameworks, and also have very good admin skills. Not the shortest path to success.
Stay agile, aware, and receptive. Learn how to learn better. Don't wed yourself for life to one technology, and don't grow stubborn. Be prepared to turn the ship, quickly, when the time comes. Learn JS today? Sure - I'd argue, however, that it's far more important to be ready to alter your priorities when that Thing comes along and smashes whatever you think is important today.
Look beyond programming. There are many other intellectually and financially rewarding pursuits than just writing code. A few examples: research, project management, business development, teaching, and writing. All of these can build on your experience to date.
If we are talking about work and programming, PHP will not diminish; it is great for most glue stuff.
If we are talking about cool, cutting-edge programming: Go.
As for general trends, I would say the "next" big thing is embedded stuff and what annoying people like to call "the Internet of Things".
Suppose you're now a top-5% iOS developer. 5 years later you'll probably be top 1%. That's cool.
I can see only one hugely improbable risk in doing this: if Apple goes down, then Objective-C developers will be as valuable as toilet paper.
- Algorithms and data structures (It's very hard to find a person who can describe well how hash tables work. I interviewed many people who studied this at university).
- Programming language theory. It makes you a better programmer, since you will have a better understanding of why languages are implemented the way they are.
- Mathematics relevant to software: discrete math, statistics (data science is on the rise now), mathematical logic. These skills are timeless.
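On the hash-table interview question: the core idea fits in a short sketch - hash the key to pick a bucket, then scan that short bucket. A toy Python version (separate chaining, no resizing):

```python
class HashTable:
    """Separate-chaining hash table: hash the key to a bucket index,
    then linearly scan that (short) bucket for the key."""
    def __init__(self, n_buckets=16):
        self.buckets = [[] for _ in range(n_buckets)]

    def _bucket(self, key):
        # The hash function spreads keys across buckets, so each scan
        # stays short: O(1) average lookup despite the linear scan.
        return self.buckets[hash(key) % len(self.buckets)]

    def set(self, key, value):
        bucket = self._bucket(key)
        for i, (k, _) in enumerate(bucket):
            if k == key:
                bucket[i] = (key, value)  # overwrite existing key
                return
        bucket.append((key, value))

    def get(self, key):
        for k, v in self._bucket(key):
            if k == key:
                return v
        raise KeyError(key)
```

Real implementations add resizing when buckets grow, but this is the whole idea interviewers are asking about.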
This part is more speculative (however, I personally bet on this stuff):
- Reactive libraries. They allow you to become so much more productive. You can learn Reactive Extensions (https://rx.codeplex.com/) or one of its ports to other languages. I can also recommend the framework we developed at JetBrains: https://github.com/JetBrains/jetpad-mapper.
- Web platform. I mean WebGL, WebCL, WebWorkers, and other HTML5 APIs. The next generation of apps will be based on the web platform, and you need to be fluent in these APIs to be relevant.
- Emerging languages: Scala, Kotlin, Rust. Don't learn Go, that language is defective by design. Also don't learn Haskell, the language isn't widely used and if we take a language + popular extensions it's more complicated than C++. Haskell will probably be replaced by one of the dependently typed programming languages.
- Dependently typed programming language. Currently, they are mostly academic languages, however, they give us a promise of writing reliable software by construction without too much effort. The best language here is Agda. However, you might want to learn Coq or Idris.
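On the reactive libraries point above: the core Rx idea - push-based streams with composable operators - can be sketched in a few lines of Python (illustrative only, not the actual Rx API):

```python
class Observable:
    """Push-based stream: a source pushes items to a subscriber's
    callback; operators like map/filter compose new streams."""
    def __init__(self, source):
        self.source = source  # callable taking an on_next callback

    def map(self, fn):
        return Observable(
            lambda on_next: self.source(lambda x: on_next(fn(x))))

    def filter(self, pred):
        return Observable(
            lambda on_next: self.source(
                lambda x: on_next(x) if pred(x) else None))

    def subscribe(self, on_next):
        self.source(on_next)  # running the pipeline = subscribing

def from_iterable(xs):
    return Observable(lambda on_next: [on_next(x) for x in xs])

# Compose a pipeline declaratively, then subscribe to run it:
out = []
(from_iterable(range(5))
 .filter(lambda x: x % 2 == 0)
 .map(lambda x: x * 10)
 .subscribe(out.append))  # out == [0, 20, 40]
```

The real libraries add schedulers, error channels, and unsubscription, but the productivity win comes from this composition style.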
Things you should stay away from:
- Do not learn languages which try to make themselves easier at the cost of correctness: Go (lack of generics) and Dart (an optional, unsound type system). I think they won't work out despite the fact that Google backs them.