I must say that I am taken aback by the title you used to link to my article. I did not mean it as a snide response to the Arc challenge. I thought the Arc challenge was pretty cool, and I replied to it with an Erlang implementation -- you can see it here http://arclanguage.org/item?id=722. If it came across as snide, then I didn't communicate it as I had intended. The reason I said it's in the "spirit" of the Arc challenge is that the Arc challenge highlights what Arc is good at, and the Erlang challenge highlights what Erlang is good at.
Interesting. I thought you were making a sly joke too.
I guess this just proves people read their biases into whatever they encounter. Like many, I thought the Arc challenge amounted to a test of how much a language was like Arc (guess what, Arc wins). So I assumed your post had to be satire.
And -- maybe the requirement of spawning a million threads doesn't seem like a cruel joke to you, but it is to anyone who has to work in a language other than Erlang.
Apparently, I should have been more sensitive to the way people would interpret this challenge and picked my words differently.
A few other languages, including Scala and Stackless Python, have lightweight concurrency capabilities. AFAIK those languages do allow you to spawn millions of processes (or actors), but they don't handle garbage collection, scheduling, IO, distributed programming and native code interfacing as well as Erlang does.
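To make that concrete, here's a rough sketch (module and function names are just mine for illustration, not from the challenge) of what spawning a pile of lightweight processes looks like in Erlang; note that reaching a million may require raising the VM's process limit with erl's +P flag:

    -module(spawn_many).
    -export([run/1]).

    %% Spawn N lightweight processes that each just wait for a stop message,
    %% then tell them all to stop. For very large N, start the VM with
    %% something like `erl +P 2000000` to raise the default process limit.
    run(N) ->
        Pids = [spawn(fun wait/0) || _ <- lists:seq(1, N)],
        io:format("spawned ~p processes~n", [length(Pids)]),
        [Pid ! stop || Pid <- Pids],
        ok.

    wait() ->
        receive
            stop -> ok
        end.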
I don't know if there's anything different you should have done.
If you discuss two different computer languages in any piece of writing on the internet, a large majority of the readers will assume you are advocating one over the other. I've seen this so many times already.
I'm guessing that very few people here on news.yc are qualified to compose the Cobol Challenge: the problem for which Cobol is uniquely qualified to provide the elegant solution.
Of course, the budget and the legacy code will have to be a very explicit part of that problem.
Meanwhile, this Erlang Challenge only tends to emphasize my impression -- which is just a stereotype, really -- that Erlang is certainly not the hundred-year language: it's a special-purpose language that is brilliant at a narrow class of problem. I'm convinced that, if I have some process that I want to spread across half a million processor cores, I should ask my company's Erlang guru to work on it (because, come on, if you can afford half a million processor cores you can afford at least one Erlang guru). What I'm not yet convinced of is that a world with N programmers who work in general-purpose languages needs more than (0.005)N Erlang gurus.
The challenge of the Erlang advocate is not to convince me, over and over, that Erlang wins its class; the challenge is to convince me that Erlang's class of problem is so important to my life that I should study Erlang rather than vascular surgery, or television repair, or other obscure technical skills that I don't know that much about.
The follow-up question then would be: how many existing problems can be converted into the kind that Erlang solves well? And how many previously impractical problems are now practical to solve? If that class ends up being bigger than you thought, you may well be limiting the number of problems you can solve easily and practically, out of the ones that will be important in the future.
I think it's important because it makes it easier for a program to scale out rather than scale up. We want speedups this way because CPUs are becoming multi-core, and to take advantage of them you have to write some kind of parallel program, since it's proven difficult to fully automate the parallelization of serial programs.
In addition, parallelized systems (not just Erlang) can be more fault tolerant and can fail more gracefully (or hobble along, if you'd like to call it that). Sensor networks are one example. Instead of a single radar to detect the environment, you have many sensors out there, and they network themselves and report what they see/hear. If some of them get destroyed, that's OK, because the system is still functioning with fewer sensors.
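For flavor, here's a minimal sketch of that "hobble along" behaviour in Erlang (the sensors module and the fake readings are all made up for illustration): a collector monitors a bunch of sensor processes and simply keeps going as individual sensors die.

    -module(sensors).
    -export([start/1]).

    %% Spawn N "sensor" processes and monitor them; collect readings and
    %% carry on with fewer sensors whenever one dies.
    start(N) ->
        Collector = self(),
        [spawn_monitor(fun() -> sensor(Id, Collector) end) || Id <- lists:seq(1, N)],
        collect(N).

    sensor(Id, Collector) ->
        Collector ! {reading, Id, rand:uniform(100)},   %% fake reading
        timer:sleep(1000),
        sensor(Id, Collector).

    collect(0) -> all_sensors_down;
    collect(Alive) ->
        receive
            {'DOWN', _Ref, process, _Pid, _Reason} ->
                collect(Alive - 1);                     %% one sensor gone; keep going
            {reading, Id, Value} ->
                io:format("sensor ~p reports ~p~n", [Id, Value]),
                collect(Alive)
        end.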
So what concrete examples of problems fall into this class that Erlang is good at? The obvious ones are the embarrassingly parallel algorithms, like genetic algorithms, neural networks, 3D rendering, and, if I'm not mistaken, FFTs. Indexing web pages is another. That said, there are problems that are inherently serial. It takes nine months to beget a baby, no matter how many pregnant women you have.
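A naive parallel map is the usual toy illustration of that class of problem in Erlang. This sketch (my own code, no error handling) just farms each element out to its own process and collects the results in order:

    -module(pmap).
    -export([pmap/2]).

    %% Apply F to each element of List in its own process; results come
    %% back tagged with a unique reference, so ordering is preserved.
    pmap(F, List) ->
        Parent = self(),
        Refs = [begin
                    Ref = make_ref(),
                    spawn(fun() -> Parent ! {Ref, F(X)} end),
                    Ref
                end || X <- List],
        [receive {Ref, Result} -> Result end || Ref <- Refs].

    %% e.g. pmap:pmap(fun(X) -> X * X end, lists:seq(1, 100)).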
Ruby and Haxe language writers are both implementing the actor model like Erlang's, if that's any indication of how important they think it is. While I don't think Erlang will be the hundred-year language, the ideas it's a poster child for will reverberate in its descendant languages for a long time to come.
Well, I have never understood the "use Erlang because CPUs are going multicore" argument. Say I write a program in C (which for some mysterious reason doesn't use threads), run it on a single core, and it finishes in 1 second. If an equivalent Erlang program takes 10 seconds on 1 core, then I will need at least 10 cores before it beats C. And that is assuming 100% of my application/algorithm is parallel.
While Erlang is a nice language, saying that it runs faster on multicores is just a lame excuse.
Given the scenario you've mentioned, sure, it doesn't make sense at all to use something like Erlang. That scenario is one where a job takes longer the more you split it up.
So it would be kinda like if I were rolling a 10-meter (radius) snowball up by myself and it took me 1 hour, but if I split the job among 9 other friends to make another 10m snowball, it would take each of us 1 hour instead of just 1/10 of an hour to roll up a tenth of the volume (a 4.6m-radius snowball).
But if doing 1/10th of the job doesn't take any longer just because I split the job 10 ways, then Erlang would be a good fit. There are problems that cater to that and those that don't; pick the ones that do for this technique. These are the embarrassingly parallel problems I mentioned earlier.
So yes, just because you have multiple cores doesn't guarantee a speedup. However, if the speed of a single CPU caps out or slows down in the future, and the number of cores per chip keeps increasing, then we programmers had better have a good way to take advantage of that in our programs. Erlang and its actor model are one way.
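For anyone who hasn't seen the actor model in Erlang terms, the basic shape is just a process that owns its own state and talks to the world only via messages. A throwaway sketch (the names are mine, purely for illustration):

    -module(counter).
    -export([start/0, incr/1, value/1]).

    %% A tiny "actor": a process holding a count, driven entirely by messages.
    start() -> spawn(fun() -> loop(0) end).

    incr(Pid) -> Pid ! incr, ok.

    value(Pid) ->
        Pid ! {value, self()},
        receive {count, N} -> N end.

    loop(N) ->
        receive
            incr          -> loop(N + 1);
            {value, From} -> From ! {count, N}, loop(N)
        end.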
Just because Erlang is very good at solving a certain class of problems, doesn't mean it's inferior at solving general computing problems.
The Erlang challenge highlights what Erlang is good at, but it's not meant to convince you that Erlang is good at everything. I've never claimed that this challenge proves you should use Erlang to solve any problem that comes your way.
Well, okay, it sounds like I have therefore interpreted the Erlang Challenge correctly: it highlights what Erlang is good at. And that's fine. There's nothing wrong with ass-kicking special-purpose programming languages, and I look forward to the day when I will actually spend ten minutes learning Erlang and be able to start discussing it instead of just marveling at it from afar.
The thing is, the Arc Challenges (I presume there will be others...) are explicitly intended to convince us that Arc is an awesome language for solving general computing problems. (Of course, they're also tools for designing Arc: If Arc proves unconvincingly awesome at any point, the plan is to twiddle its design knobs until the awesomeness returns. We will see how it goes.)
When you adopted the Arc Challenge's title and style, it left me wondering if you were aiming at the same goal. If you were, I wanted to invite you to shoot again, because you kinda missed. However, it seems that "convincing me that Erlang is good at everything" was never your goal at all, so I'm sorry to have interpreted it that way. (My interpretation may have been unduly influenced by the original title of the submission...)
I guess that you and I had different interpretations of the Arc challenge. I thought it was intended to demonstrate how easy it is to implement multi-page flows in Arc by binding closures to links and forms. I didn't think it was intended to convince people that Arc is a good general purpose language because its scope was so narrow (if the challenge were to build a search engine or an image editing tool, it would have been a different story).
By the way, although I didn't expect people to extrapolate from the Erlang challenge that Erlang is a good general-purpose language too, I do think that it is, at least for building backends for a wide range of applications. I wrote a followup to the Erlang challenge on my blog where I listed some of them.
You said it. I checked out the Erlang tutorial just now, and apparently you can't use user-defined functions in the test expression of a conditional, to prevent people from putting side-effecty functions in there? Now maybe there's some good reason for that, something to do with the mysteries of handling concurrency, but it gives me the prima-facie feeling of Java for the FP crowd.
Mostly the reason for that is to allow the compiler to optimise the order of evaluation of the conditions. If you could put functions with side-effects in there then it would be impossible to do that since the compiler would have to guarantee some kind of consistent evaluation order.
There have been proposals for some way to mark user-defined functions as side-effect free and thus allow them to be used in guards. None has had acceptance from the core Erlang developers yet.
In any case this is not usually too much of a problem. The Erlang idiom is to use 'case' instead of 'if' in most cases (no pun intended) and where a calculated value is needed then just assign it to a variable beforehand - not quite as pretty I agree but no big deal.
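For anyone following along, this is roughly what that idiom looks like (is_valid/1 here is a made-up user-defined function standing in for whatever you'd want to test):

    -module(guards).
    -export([classify/1, classify2/1]).

    %% You can't call is_valid/1 in a guard, so branch on its result with case...
    classify(X) ->
        case is_valid(X) of
            true  -> accept;
            false -> reject
        end.

    %% ...or compute the value into a variable first and use that in the if.
    classify2(X) ->
        Valid = is_valid(X),
        if
            Valid -> accept;
            true  -> reject
        end.

    is_valid(X) when is_integer(X), X > 0 -> true;
    is_valid(_) -> false.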
Well the PS3 has 9 cores, or something like that. Don't you think that says something about the computers we'll all be using, and by extension, the parallel-processing capabilities we'll have to take advantage of in the future? Personally, I'd rather not learn Erlang either, but I hope that this sort of easy parallel processing becomes one of those popular features that every language will implement sooner or later. Hm, I wonder if it should be part of Arc's core...
The "eventually everything will be massively parallel" argument is not very convincing. The Be guys made this argument 15 years ago and we're only now getting to where they were back then (2/4 processor cores). Consumers aren't that excited by more cores, and developers still don't know what to do with them. So there is massive investment in advancing the status quo.
Sony thought they could impose the PS3's multiprocessing regime on game developers through market force. But developers found the programming overhead unmanageable, which led to a less-than-compelling lineup, which failed to interest consumers, which means that now developers just work on the Xbox 360 version and port to PS3.
I was offered a job porting an existing game engine to the PS3 under the gun (the game had to ship in six months) and I would not touch it with a ten-foot pole.
"and developers still don't know what to do with them."
Excuse me, but at least in the image processing community we know very well what to do with all those extra cores. Particularly when processing 1 Gb images, and/or a couple hundred thousand 2048x2048 16-bit images.