Programmers who are interested in learning and growing will always be attracted to new languages and technologies. What's interesting about Scala is that it's not solely academic. You can learn and do useful things with it at the same time. Later, when you've learned more, you can go back and improve your old code for added beauty and performance. That's a fun process, I think.
EDIT: Using "esoteric" languages can also be a good long-term hiring strategy. It's pretty easy to carve out a place in people's minds as the "go-to" company for a given language. Google did that with Python, Twitter is doing it with Scala, and Basho is doing it with Erlang.
By the time the early 2000s hit, the people who listened to college radio no longer self-selected for people who knew a lot about music, but rather were just people who were generally anti-establishment. But they hadn't built up an especially rich musical knowledge base.
Fast forward to 2011 and languages. Now I'm almost as likely to see Clojure on a resume as C++. Yet the quality of applicants doesn't seem to have gone up all that much. TDD knowledge seems to be MUCH more prominent, but I haven't noticed an uptick in knowledge of algorithms, complexity, anything under the hood, or programming itself (as far as can be judged in a day-long interview).
In 2011 the choice of language says less about the programming ability of the person, and probably more about their personality. Much like the college radio heads weren't especially musically sophisticated (on average, but of course some were brilliant), but they all seemed to share a counter-cultural vibe (even when you went from goth to hip hop to polka).
I ask this because I am a fan of Python and Clojure. I got into both as a result of reading pg's essays on Python and Lisp many years ago. Both languages were sold to me not as an act of rebellion but as more-powerful, higher-level languages that could get the job done more quickly. I quickly found this to be the case, compared to Java which I was used to. Nowadays, I grimace when I have to use Java or C and find myself missing a language feature from Python or Clojure that could have saved me an awful lot of typing and repeating myself.
So, in my experience, was the choice of esoteric languages born of a desire to be different? No. Does it mark me out as an amazing programmer? Sadly no, not necessarily. I'd like to think, though, that it at least shows I was enthusiastic enough to learn and use languages that I get a lot of benefit from.
Not necessarily, but often so. More specifically, I think it used to signal someone who was more enthusiastic, passionate, engaged -- but now people use these languages because they think it signals those things. And I don't think it's intentional. Much like college radio, you see the guy who is leading the crew of outsiders and you see what he listens to. You see a lot of people signaling what resonates with you, and you follow that.
It's odd that C++ is one of the most denigrated languages on the planet -- yet the people I know who still mostly write C++ really do seem a lot more hardcore than those I know who do Clojure (and it's odd that I may know more people who do Clojure than C++).
To put it another way, the people doing the less mainstream languages are often putting themselves in the middle of it: "look at me -- look at the language I'm using". People doing C++ are saying, "look at my code -- look at my app".
Actually, it's a good short-term hiring strategy but a terrible long-term one. Most of these languages remain niche (Ruby, Scala, Clojure, etc.), and hiring more than a couple of people who know how to use them turns out to be a very, very tough challenge.
I have seen several companies try that and then revert back to Java just because they couldn't staff up to meet the demand.
Overall, color me still extremely skeptical of Paul Graham's original argument (which was that Lisp gave his company an advantage over the competition). Interestingly, ViaWeb was quickly switched back to a more mainstream language shortly after PG sold out. Deduce from that what you will.
That's tackling the wrong challenge. Almost none of the engineering staff at foursquare had used scala before starting here. But they're all talented engineers who can learn new skills and thought that learning scala would be interesting. We've had very few problems with people picking up the language.
Now go scale that up to 50 or 100 engineers and you understand why it's not a viable strategy.
Try it yourself: go pitch a startup and then tell the VCs and their consultants that you're going to use Scala...
I know you're going to say that they don't know their stuff and that you should be the one deciding, but it's actually the other way around: they know how to grow a business, and nobody with that knowledge will ever recommend going with non-mainstream languages and technologies.
2) We already have ~20 engineers writing scala code at foursquare. Not 50 yet, but the challenge of getting there will have nothing to do with our choice of language. If anything, it's helped with hiring so far.
3) I did tell our VCs that we'll be using scala. They didn't have a problem with it. In fact a VC on our board agreed with me that "using cutting edge technology can give a real edge in recruiting."
Basically my experience growing the engineering team at foursquare has been the exact opposite of your claims.
What makes Clojure so special that you'll never need that many?
But don't listen to me, here's Paul Graham from Beating the Averages (Footnote Edition). And ponder this question: why didn't Yahoo just rewrite the templates in C++?
 Viaweb at first had two parts: the editor, written in Lisp, which people used to build their sites, and the ordering system, written in C, which handled orders. The first version was mostly Lisp, because the ordering system was small. Later we added two more modules, an image generator written in C, and a back-office manager written mostly in Perl.
In January 2003, Yahoo released a new version of the editor written in C++ and Perl. It's hard to say whether the program is no longer written in Lisp, though, because to translate this program into C++ they literally had to write a Lisp interpreter: the source files of all the page-generating templates are still, as far as I know, Lisp code. (See Greenspun's Tenth Rule.)
PG left the equation, and when he did, a big argument in favor of keeping Lisp around did as well. Without another similarly skilled Lisp hacker to fill the void, it's easier to make a strong case for a more mainstream language so that ViaWeb could still maintain some forward velocity even if they never got to move quite as fast as they did when PG/Lisp were driving.
tl;dr - I bet that it's not Lisp that made ViaWeb tick - it was Lisp + PG.
But more seriously, I did write a search engine using only sed for a little site of mine (http://toybin.org/), just to see how far I could push sed. Its only feature is that you can have negative words, so 'blue -car' will show search results for blue and not car. Haven't yet found any hardcore sed hackers out there, though... :D
Edit: GitHub would also be another Erlang company (with a ruby frontend)
What grieves me as I read that article is that it seems like he had a perfectly good way of doing it simply and easily, but then decided to go for a much more complex and risky solution (with what _he_ describes as poor tooling) ... for what? Simply to increase the difficulty level?
Take the analogy of a backyard pool. Instead of just running up to the pool and doing a bomb (or even a belly flop), he has to climb up on the roof of the neighbour's rickety garage, where he is going to attempt a triple twist half pike helicopter/superman maneuver in order to score higher from the judges. Problem is, he's got a real risk of either missing the pool entirely or cracking his head open on the concrete.
I'm not saying we should never use new languages or techniques or tools. What annoys me is that given the choice of doing something simple, or doing something complex, he chose the complex way, and then piled on the risk, with a side order of complexity.
From the article:
"Shortly after working this out and drawing my architecture diagram (a pretty insane-looking tangle of boxes and arrows on a sheet of paper)"
Shouldn't that have been a pretty big red flag? Sure he found some superstar programmer to pull it off for him, but I can't help but think that a little bit of darwinian natural selection would have been in order here.
The annoying thing is that you see this all the time in the Enterprise. Time and time again someone with "architect" in their title goes and makes an appalling, horrible mess of the design, leaving the poor bastards at the coal face to sort it out and try to make the abominable crime against nature, reason and common sense actually work.
His point about code fashion is equally valid: what Seurat did with pointillism is impressive; printing a picture on your inkjet is not, even though it is essentially the same thing.
Coming from an imperative background, writing in a functional style has risk; coming from a functional background, imperative style has risk. You can implement the lambda calculus on a UTM and implement a UTM in the lambda calculus, so for all intents and purposes they are the same; in terms of elegance, however, they are far apart. (And there are arguments on both sides as to which is more elegant.)
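A tiny illustration of that equivalence: Church numerals embed a fragment of the lambda calculus inside an ordinary, imperative-by-default language. This is just a sketch for flavor (the helper `to_int` is only there so we can inspect the results):

```python
# Church numerals: natural numbers encoded as pure functions.
# A numeral n is "apply f to x, n times".
zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))

# Addition: m applications of f, then n more.
plus = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

# Escape hatch back into plain Python ints, for inspection only.
to_int = lambda n: n(lambda k: k + 1)(0)

three = succ(succ(succ(zero)))
print(to_int(three))              # 3
print(to_int(plus(three)(three))) # 6
```

Everything above the `to_int` line is pure lambda calculus; whether that is more or less elegant than a loop and a counter is exactly the kind of argument the parent describes.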
I tend to agree with you particularly on 'enterprise architects' but the solutions are usually a reinvention of the wheel rather than marrying two disparate technologies that actually fit quite well together. Scala or another functional language is a great fit to a message bus architecture. Doing something for the difficulty of it usually doesn't attract a lot of smart minds, putting two things together that at first one wouldn't think of but that makes sense WILL pique the interest of a talented developer because he wants to see if it actually bears fruit.
When you get down to it, pretty much any language with even a rudimentary ability to manipulate strings and connect to a socket can do messaging. But right from day one Java has had the capability of communicating with remote machines baked in.
Oh wait, I just checked, I'm wrong. The Remote and Serializable interfaces date from Java 1.1, NOT from day one. My bad. Of course, 1.1 was released Feb 19, 1997, so RMI is almost exactly 14 years old at this point.
So my conclusion is that if you've been living under a rock for exactly 14 years, you have an excuse when you say that Java can't do messaging.
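For what it's worth, the "strings and a socket" claim really is about this simple at the toy level. Here's a minimal sketch in Python of length-prefixed message framing over a socket pair -- deliberately naive, with none of the error handling, timeouts, or reconnection logic a robust system needs:

```python
import socket
import struct

def send_msg(sock, text):
    """Send one message: a 4-byte big-endian length header, then the bytes."""
    data = text.encode("utf-8")
    sock.sendall(struct.pack("!I", len(data)) + data)

def recv_exact(sock, n):
    """Read exactly n bytes (recv may return short reads)."""
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("peer closed the connection")
        buf += chunk
    return buf

def recv_msg(sock):
    """Receive one length-prefixed message and decode it."""
    (length,) = struct.unpack("!I", recv_exact(sock, 4))
    return recv_exact(sock, length).decode("utf-8")

a, b = socket.socketpair()  # two connected sockets in one process
send_msg(a, "hello, queue")
msg = recv_msg(b)
print(msg)  # hello, queue
a.close()
b.close()
```

The gap between this and a production messaging system is, of course, exactly where libraries like RMI or a real message broker earn their keep.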
Someone always comes along and says "what about $blindingly_obvious_thing" and gets massive amounts of karma for doing so, so let me just say that yes, I am aware of... bugger it, I'll do it myself and reap the karma. Muahahahaha!
...dang, there's no reply-to-my-own-message option. (For obvious reasons.) Oh well, we'll have to settle for option #2, which is that someone comes along and posts an objection which has nothing to do with any of the posts before it, and gets masses of karma for it.
If you're implementing a messaging system in your handy dandy language with sockets and strings, and you want it to be robust, you will of course need to take into account this:
or a more in-depth look at them:
cheerio, pip pip
I also like the lengthier, but most excellent, tutorial by Mark Volkmann:
Project Euler worked well for me as warm-up exercises. And I just found a Wiki of Project Euler solutions in Clojure:
http://pragprog.com/titles/shcloj/programming-clojure (which was current at the time that I read it but might be a bit out of date by now since I think it covers Clojure 1.1)
For Clojure you have very nice video introductions by Rich Hickey on blip.tv.
If you are coming from Java, as I was, the learning curve toward doing useful things is pretty shallow.
Delimited continuations in 2.8 make up for it and then some, but they're much harder to grok, imo. Python's yield gets you 80% of the same benefits with about 20% of the brain bending.
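To illustrate the "80% of the benefits" point: a Python generator is a resumable computation -- the function suspends at each `yield` and the caller decides when (and with what value) to resume it. A small sketch, using a running-average coroutine as the example:

```python
def averager():
    """A resumable computation: suspends at yield, resumes via send()."""
    total, count = 0, 0
    while True:
        # Yield the current average out; receive the next value in.
        value = yield (total / count if count else 0.0)
        total += value
        count += 1

avg = averager()
next(avg)            # prime the generator: run to the first yield
print(avg.send(10))  # 10.0
print(avg.send(20))  # 15.0
```

Unlike full delimited continuations, a generator can only be resumed once per suspension and can't be copied or re-entered -- that's the missing 20%, and also most of the missing brain bending.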
Each feature has both a description and an example. I became productive very quickly using almost exclusively this tour.
Alas. If only we could live on technical stimulation alone :(
(the last paradox thread)
"Scala is not ready yet, but when it is, it is gonna take over Java as the next big language".
Years have passed and this has failed to materialize. Maybe it will never be 'ready enough' to take over? What do you think?
As a mere mortal, I gravitate to simpler things.