
Microwulf: A Personal, Portable Beowulf Cluster (Breaking the $100/GFLOP Barrier) - toffer
http://www.clustermonkey.net//content/view/211/1/
======
blats
The writer makes assumptions that strike me as completely arbitrary in this
article and I feel the need to comment.

The writer quotes a "law" which states that in order for a computer system to
be balanced, the hertz of the CPU, the bytes of RAM, and the bps of I/O should
all be equal. I shrugged it off as an old quote reflecting the assumptions of
the time in which it was penned. I continued to read, expecting that he was
illustrating a point. Then I see:

"So using GigE as our interconnect, a perfectly balanced system would have 1
GHz CPUs, and 1 GB of RAM for each core."
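To make the quoted rule concrete, here's a quick sketch of the arithmetic the article is doing. The rule of thumb (often attributed to Amdahl) equates CPU cycles/s, bytes of RAM, and bits/s of I/O; the GigE figure is from the article, but the function itself is just my own illustration, not something from the source.

```python
def balanced_config(io_bits_per_sec):
    """Return the CPU clock (Hz) and RAM (bytes) the rule of thumb implies.

    The rule sets all three quantities numerically equal:
    1 CPU cycle/s and 1 byte of RAM per bit/s of I/O.
    """
    cpu_hz = io_bits_per_sec
    ram_bytes = io_bits_per_sec
    return cpu_hz, ram_bytes

# Gigabit Ethernet: 10^9 bits/s
gige = 1_000_000_000
cpu_hz, ram_bytes = balanced_config(gige)
print(f"CPU: {cpu_hz / 1e9:.0f} GHz, RAM: {ram_bytes / 1e9:.0f} GB per core")
```

With GigE as input this reproduces the article's "1 GHz CPUs, and 1 GB of RAM for each core" — which is exactly the mechanical application of the rule being questioned here.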

How can one assume that the clock of the CPU has anything at all to do with
the amount of RAM or network I/O in a "balanced" configuration? Is it a 32-bit
or 64-bit processor? How wide and fast is the front-side bus? Does it use a
minimal instruction set or a full-featured IA-style set? How long is the
pipeline? In effect, how much work does it do per unit time? Is that not the
real question?

Also, how much data can you store to and read from RAM per unit time? This
information seems at least as important for an evaluation of balance as the
total amount of RAM available. And finally, what is the workload? Will the
parameters of the computing workload not completely determine the optimal
"balance" of hardware capabilities?
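The point that the workload determines the balance can be made concrete with a back-of-the-envelope model. This is my own sketch (a simplified roofline-style bound, not anything from the article), and the machine numbers below are hypothetical:

```python
def bottleneck(flops_per_byte, peak_flops, mem_bandwidth_bytes):
    """Decide whether a workload is compute- or memory-bound on a machine.

    flops_per_byte      -- arithmetic intensity of the workload
    peak_flops          -- machine peak compute, FLOP/s
    mem_bandwidth_bytes -- memory/interconnect bandwidth, bytes/s
    """
    # Attainable FLOP/s is capped by whichever resource saturates first.
    attainable = min(peak_flops, flops_per_byte * mem_bandwidth_bytes)
    limit = "compute" if attainable == peak_flops else "memory"
    return attainable, limit

# Hypothetical machine: 8 GFLOP/s peak, 3 GB/s bandwidth.
# High-intensity workload (e.g. dense matrix multiply) vs. a
# low-intensity one (e.g. summing a vector):
print(bottleneck(flops_per_byte=16.0, peak_flops=8e9, mem_bandwidth_bytes=3e9))
print(bottleneck(flops_per_byte=0.25, peak_flops=8e9, mem_bandwidth_bytes=3e9))
```

The same hardware is compute-bound for one workload and memory-bound for the other, which is why a single universal "balance" ratio can't be right.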

I am commenting here as a reality check on my own thoughts about this article.
Am I missing something or is this guy totally off the mark with this "law" and
computing balance equation?

~~~
reitzensteinm
It's totally off the mark in terms of absolute numbers, especially when you
take into account the huge variety of uses for a machine... my machine, for
instance, isn't really bottlenecked by a 52mbps network connection (not to
mention 4mbps internet), but I'd still like more than a 3GHz dual core. A
whitebox in a Google render farm probably WOULD be noticeably slower with 100
megabit compared to 1 gigabit on a MapReduce.

The thing is, though, the 'law' was created in the 60s, and it's stayed mostly
order-of-magnitude correct since... so that's quite impressive in itself.

