Parallel Computing: Why the Future Is Non-Algorithmic (rebelscience.blogspot.com)
15 points by Mapou on July 1, 2008 | 17 comments



A thread is an algorithm, i.e., a one-dimensional sequence of operations to be executed one at a time.

Well, it's too bad that none of these pesky algorithms can be logically decomposed into independent processes and run all at once. Like, say, every matrix multiplication ever done by a multi-pipelined GPU in the past 20 years.

This is really the worst sort of trolling. Every time I looked up from studying today, I felt like I'd read something that left me stupider than before. I mean, just look at what's become of my grammar, if nothing else...
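
To make the matrix-multiplication point above concrete: each element of the product depends only on the two input matrices, never on any other output element, so the whole computation decomposes into independent pieces. Here is a minimal CPU-side sketch using OpenMP (an illustration only, not an actual GPU kernel; compile with -fopenmp, though the code also runs correctly, just sequentially, without it):

    #include <stdio.h>

    #define N 4

    int main(void) {
        double a[N][N], b[N][N], c[N][N];

        /* Fill a and b with something deterministic; b is the identity,
         * so the product c should come out equal to a. */
        for (int i = 0; i < N; i++)
            for (int j = 0; j < N; j++) {
                a[i][j] = i + j;
                b[i][j] = (i == j) ? 1.0 : 0.0;
            }

        /* Every (i, j) iteration is independent of every other one, so the
         * runtime may execute them in any order, on any number of threads,
         * and the result is identical. */
        #pragma omp parallel for collapse(2)
        for (int i = 0; i < N; i++)
            for (int j = 0; j < N; j++) {
                double sum = 0.0;
                for (int k = 0; k < N; k++)
                    sum += a[i][k] * b[k][j];
                c[i][j] = sum;
            }

        printf("c[1][2] = %g\n", c[1][2]);   /* prints 3, same as a[1][2] */
        return 0;
    }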


He clearly doesn't even know what an algorithm is, yet he pontificates madly about how algorithms preclude parallel code. I guess he doesn't think that distributed ray-tracing involves pesky things like "algorithms" either, eh? Naw, ray-tracing is an algorithm if you run it in a single thread, but it's not an algorithm when running in parallel... at least according to his logic.

I've found that bad engineers detest threads, but good ones like them when they're appropriate for the task at hand. This is further evidence of that.



And jobeemar == Louis Savain too? http://news.ycombinator.com/submitted?id=jobeemar


Actually, naa, don't think so. Not enough "LOL"'s in his comments.


I'm not sure I understand this, but it appears to me that the consequence would be a reversal of defaults. Right now we assume that a sequence of commands is the default and parallelism has to be explicit. He proposes to make parallelism the default and sequencing explicit.

My question would be: is parallelism really the more frequent case? I mean, if we take all the instructions in a program and analyse which ones are just accidentally sequential and which ones have to be sequential to be correct, would we have more parallel pairs or more sequential pairs of instructions? (A small example of the distinction is sketched below.)

I believe we have more sequential pairs and that it'd be very cumbersome to default to parallelism. But there may be special cases where defaulting to parallelism is beneficial.
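
To make that distinction concrete, here is a toy sketch (my own illustration, not from the article): two statements that touch disjoint data are only accidentally ordered by the program text, while a statement that reads a value another statement writes is necessarily ordered after it.

    #include <stdio.h>

    int main(void) {
        int a = 2, b = 3, d = 4, e = 5;

        /* "Accidentally" sequential: these two statements share no data, so
         * their textual order is arbitrary and they could run concurrently. */
        int c = a + b;
        int f = d * e;

        /* Necessarily sequential: the second statement reads x, which the
         * first one writes, so no schedule may start it before x is ready. */
        int x = c + f;
        int y = x * 2;

        printf("%d %d\n", x, y);   /* 25 50 */
        return 0;
    }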


There has been some genuine research work in this area, with pretty much the ideas that you're suggesting. Google for data-flow computer architecture for a low-level take on the idea at the instruction level. (This guy seems to be talking about some similar concepts in places, but it's difficult to tell beneath the incoherence.)

There has also been a bunch of work (with pure-functional computations and languages, for example, as well as some weirder research with imperative languages) on writing programs that are automatically parallel, with sequential ordering imposed only when explicitly requested (a rough sketch of the idea follows below).

However, there's a big difference between actual work and the stuff this guy is spouting.
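
For a rough taste of the dataflow idea at the task level (a sketch of the concept only, not the instruction-level hardware research mentioned above), OpenMP task dependences (OpenMP 4.0 and later, compiled with -fopenmp) let the runtime derive the ordering from declared reads and writes and run everything else concurrently:

    #include <stdio.h>

    int main(void) {
        int a = 0, b = 0, c = 0;

        #pragma omp parallel
        #pragma omp single
        {
            /* Each task declares what it writes or reads; the runtime builds
             * the dependence graph and may run the two producers at once. */
            #pragma omp task depend(out: a)
            a = 1;

            #pragma omp task depend(out: b)
            b = 2;

            /* This task is ordered only because it reads a and b. */
            #pragma omp task depend(in: a, b) depend(out: c)
            c = a + b;

            #pragma omp taskwait
        }

        printf("c = %d\n", c);   /* 3 */
        return 0;
    }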


Bravo. You're the only commenter who had something interesting and intelligent to say about the article. The others are jumping up and down and foaming at the mouth. LOL. In the future, when parallel programming becomes easy and powerful parallel computers become the norm, we will find that most applications are significantly more parallel than sequential. The coming intelligent programs will have huge numbers of sensors (visual, auditory, tactile, etc...) and effectors. These naturally call for massive parallel processing.


Who keeps posting these links to rebelscience? It's just gibberish.


Yes, that blog is the work of a troll. What he's talking about has merit, though, intentionally or not.


The blog post is full of hand-waving and the mindset that "if you think this you must be stupid".

I'm extremely wary of anyone promoting the next silver bullet for writing software. Project COSA (http://www.rebelscience.org/Cosas/COSA.htm), which the author mentions, sounds very complex, and I have no idea how using such a system would simplify writing code.


Louis Savain's "arguments", which he likes to claim are "over people's heads", are mostly just examples of the Argument from Intimidation fallacy. I.e., "only an idiot would disagree with this". (Complete with overly complicated and inadequately labeled diagrams, to try to make you feel stupid, so that you'll be more susceptible to this tactic.) I don't buy it for a second. I've built logic circuits. I've written assembly code. He hates computer languages because he's full of shit; there's no polite way to put it.

Greater parallelism in hardware is a fine goal. But it's madness to write programs for processors rather than for people. If humans can't read it, then your program is crap. There are no exceptions.

We have all these processors. The task is not to make parallelism fit on a single chip. The task is to develop practices and languages that allow humans to write code for other humans, in such a way that programs also incidentally take advantage of the many (threaded, algorithmic) von Neumann machines that we have wired together in modern computers.

In just a few years, 8 or 16 (or more!) cores won't be a big deal. When you have 50 single-threaded cores, you have a different kind of computer than what we're used to today. We already have languages that take advantage of these features pretty well. They're not hugely popular yet, but they're also not being taught at universities, and we're not yet at a point where they're terribly necessary.

The COSA project is imaginary yak shaving of the worst sort. If he were half the hacker he claims to be, he'd be writing programs instead of crackpot papers. Where is all the useful error-free software that COSA makes it so easy to create? Where's the web server he built with it, or the word processor? Hell, I'd be impressed with a 4-function calculator! Show me the goods, and I'll take it seriously. I'll even join the beta and play around with the stuff.

Until then, in my view, Savain is on par with the muttering bum at the bus stop.


Exactly, thanks; you saved me from making virtually the same points. This guy really should make use of a dictionary and look up terms such as "algorithm"; he might learn some stuff.


This picture the OP includes reminds me of TimeCube:

http://bp3.blogger.com/_BXJTG_K68fU/SB6ibLP3ExI/AAAAAAAAAE0/...


You Turing worshippers are so full of hate. You sound like somebody just blasphemed against your two-bit god. LOL. Too bad for you that the parallel programming crisis is about to bring Turing down from his pedestal. Then you'll shut the hell up. I have a wide grin on my face just thinking about it.

Money always talks and no chicken-shit cult of propeller heads can stop that. LOL.


I logged in just to mod you down, due to the dull sadness I feel at your apparent sense of obligation to spew such bile.

This blog post is full of misconceptions and appears to be written by an individual who has not sufficiently delved into algorithms, parallel computing, or operations research. Similar problems have been around for decades, such as controlling the flow of tasks on parallel factory assembly lines. Guess what helps us find near-optimal flows? That's right: algorithms.
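
For what it's worth, the assembly-line point has a textbook illustration: greedy list scheduling, i.e. handing each job to the currently least-loaded line, is a classic heuristic that is provably within a constant factor of the optimal makespan, and it is very much an algorithm. A toy sketch in C, with made-up job times:

    #include <stdio.h>

    #define MACHINES 3
    #define JOBS     7

    int main(void) {
        /* Made-up job times, pre-sorted longest-first (the LPT rule). */
        int job_time[JOBS] = {8, 7, 6, 5, 4, 3, 2};
        int load[MACHINES] = {0};

        for (int j = 0; j < JOBS; j++) {
            /* Assign the job to the machine with the smallest current load. */
            int best = 0;
            for (int m = 1; m < MACHINES; m++)
                if (load[m] < load[best])
                    best = m;
            load[best] += job_time[j];
            printf("job %d (time %d) -> machine %d\n", j, job_time[j], best);
        }

        int makespan = 0;
        for (int m = 0; m < MACHINES; m++)
            if (load[m] > makespan)
                makespan = load[m];
        printf("makespan = %d\n", makespan);   /* 13 for these inputs */
        return 0;
    }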


Damn, I meant to mod down and modded up by mistake. It would be nice to be able to change your vote.



