

Concur.next — No Free Lunch - wglb
http://www.tbray.org/ongoing/When/200x/2009/11/26/No-Free-Lunch

======
gruseom
These posts are interesting, but I can't shake the feeling that something is
fundamentally wrong. There have been decades of research into parallelism. Why
are we groping in the dark like this at such an elementary level? I'm not
picking on Tim Bray (in fact I applaud his experiments). But it feels like
we're once again committing the perennial mistake of our industry: ignoring
previous work. What we ought to do, instead of naively muddling around the
multicore problem, is go back and understand what (for example) the Fortran 90
and SISAL guys did. Huge amounts of work went into these very questions before
the advent of commodity hardware caused the money to run out.

~~~
silverlake
Who's "we"? Bray has been muddling around naively because he's a regular guy
with a popular blog. Academics are well aware of the literature. I was a lousy
academic and even I can tell you the history as far back as the '70s. Frankly,
I don't think it's that hard to write most concurrent code. Pick a concurrency
pattern and stick to it. It's only in high-performance code that things get
tricky.
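
Concretely, "pick a pattern and stick to it" might mean something like a
single worker pool that owns all the concurrency, so the rest of the program
stays sequential. A minimal sketch in Python, using only the standard
library (the names here are illustrative, not from Bray's code):

```python
# One concurrency pattern, used consistently: a fixed worker pool.
# Everything outside run_pool() is ordinary sequential code.
from concurrent.futures import ThreadPoolExecutor

def process(item):
    # Stand-in for real per-item work.
    return item * item

def run_pool(items, workers=4):
    # Submit work, collect results in input order; all the
    # concurrency machinery is confined to this one function.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(process, items))
```

The point of the discipline is that callers never see threads, locks, or
queues, only a plain function from inputs to outputs.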

~~~
wglb
Well, he is not exactly a regular guy. He arguably wrote the first successful
web crawler, managed the project to bring the OED to CD-ROM, was part of two
startups, and co-authored the XML specification. This Clojure series relates
to his Wide Finder project at
<http://wikis.sun.com/display/WideFinder/Wide+Finder+Home>, which states:
"Wide Finder is an attempt to answer this question: What is a good way to
write computer programs to take advantage of modern
slow-clock-rate/many-core computers, without imposing excessive complexity
on programmers?"
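
For flavor, the Wide Finder task (tallying entries from a large web-server
log across many cores) can be sketched as a map over chunks followed by a
merge. The log format and helper names below are simplified assumptions,
not Bray's actual benchmark code:

```python
# Wide Finder in miniature: shard the log lines, tally each shard on
# its own worker process, then merge the partial counts.
from collections import Counter
from multiprocessing import Pool

def tally_chunk(lines):
    # Count requested paths in one shard of log lines ("METHOD path ...").
    return Counter(line.split()[1] for line in lines if line.strip())

def wide_find(lines, workers=4):
    # Shard by stride, map tally_chunk across workers, reduce by summing.
    chunks = [lines[i::workers] for i in range(workers)]
    with Pool(workers) as pool:
        partials = pool.map(tally_chunk, chunks)
    return sum(partials, Counter())
```

Even in this toy form, the hard part Bray's series wrestles with is
visible: the work has to be split, distributed, and recombined without
the programmer drowning in coordination details.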

I have written lots of concurrent code, and I think it is hard, as do Herb
Sutter and Donald Knuth.

