These are certainly all things that get a lot of chatter on the forums lately, so as a summary of the current most-hyped ideas in software development this article seems pretty well on the money.
IMHO discussions like this will be more interesting if we try to avoid loaded words like "modern" (which makes it sound like something new is inherently better) and concentrate on honestly assessing both the advantages and the limitations/dangers of emerging technologies and techniques. That way, those of us who might find these ideas useful in future can learn from those who have been using them already and make more informed decisions. It's not as if each of the ideas in this article hasn't been horribly overhyped in some advocacy, even if the idea might have genuine merit in the right circumstances.
Learn and use a modern scripting language
Learn thoroughly and embrace the philosophy of a modern version control system
Be familiar with NoSQL solutions like MongoDB and CouchDB.
Learn a functional language — or more than one.
Study agile methods and concepts.
Here is a non-trend that should be one: be more than just familiar with SQL solutions like MySQL and PostgreSQL. To me, this is the most common missing skill among software developers.
Since this isn't all that cutting-edge a list, Ajax seems to be missing. How about this one: design a website that never does a full page refresh and responds to server requests so quickly that you never see that annoying loading ring spinning in a circle.
If you're suggesting that Python is a 'good enough' functional language, and that's why the two steps can be consolidated, may I respectfully disagree. Python is either indifferent to functional concepts or actively hostile, depending on how much of a fanboy you are.
Most other scripting languages are much closer to functional than Python is.
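To put some substance behind that claim (a minimal sketch; how "hostile" Python really is remains a matter of taste): Python does ship the classic functional building blocks, but they are second-class citizens. `lambda` is limited to a single expression, and `reduce` was demoted from a builtin to the `functools` module in Python 3.

```python
from functools import reduce

nums = [1, 2, 3, 4, 5]

# map/filter exist, but idiomatic Python prefers comprehensions.
squares_fp = list(map(lambda n: n * n, nums))  # functional style
squares_py = [n * n for n in nums]             # the style Python pushes you toward

# lambda is restricted to one expression (no statements allowed),
# and reduce must be imported from functools in Python 3.
total = reduce(lambda acc, n: acc + n, nums, 0)

print(squares_fp == squares_py, total)
```

Nothing here is impossible, but the language's idioms consistently steer you away from the functional forms.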
I don't think it's necessary to learn all the new things, but you should at least know which ones could help you with which projects, and then learn them when that becomes useful.
I don't agree with one of the points, and I think at least one important one is omitted. You don't need to know any scripting language; while Ruby, Python, etc. are popular in the LAMP camp, .Net (and possibly Java) and probably a dozen other compiled languages are holding their own. There's nothing intrinsically elegant or better about scripting languages.
One major trend that's omitted is the coming importance of concurrent/multicore programming. It has become clear to many that computers aren't getting much faster, but they are getting more and more cores. To be able to utilize that power you need to design code from the ground up to take advantage of multithreading.
On the .Net side Microsoft seems very aware of this and has rolled out a number of new technologies and tool support: things like Parallel.net, PLINQ, the Reactive Framework, and most recently Async. F#, by nature of being a functional language, is very well suited to parallel processing.
No doubt other frameworks and languages will soon follow.
Try a scripting language; you might change your mind. There is a reason they are so popular, and it doesn't take long to see why once you try. Things that used to take forever to get working just seem to work on the first or second try. Am I getting smarter? Maybe, but more likely the language really is intrinsically better (I tried Python).
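For instance, the kind of task that "just works" on the first try in a scripting language — say, counting word frequencies in a piece of text — is a few lines of Python (a generic sketch, not a task from the original comment):

```python
from collections import Counter

# Count word frequencies -- the whole job in three lines.
text = "the quick brown fox jumps over the lazy dog the end"
counts = Counter(text.split())
print(counts["the"])  # how many times "the" appears
```

In many compiled languages the same job means declaring a map type, writing a tokenizing loop, and handling the insert-or-increment case by hand.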
> There's nothing intrinsically elegant or better with scripting languages.
I would argue that dynamically typed languages (or statically typed languages with Hindley-Milner inference) are intrinsically more elegant than conventional statically typed ones, since you don't have the type information cluttering up your code.
That says nothing about 'better,' of course. But certainly more elegant.
Good point on concurrent programming. Programs aren't going to get twice as fast every 18 months on their own anymore; we are going to have to learn to make use of multiple cores. Google's Go language is an obvious contender.
Wait, "modern" source control has different thinking behind it? That of making commits often, sharing what you make that's useful, and working on small, useful chunks of work?