
The False Hope Of Apple's Snow Leopard - raganwald
http://whydoeseverythingsuck.com/2008/06/false-hope-of-apples-snow-leopard.html
======
hunterjrj
A lot of words, not a lot of content. I'll sum up the article for all who
haven't read it: "Though I haven't seen what Apple will be offering with Grand
Central, I am making an assertion that the fundamental problems found in
multi-core computing can't be solved by anybody. Ever."

Does this remind anyone else of a certain Bill Gates quote regarding 640k
being enough for anyone?

~~~
wmf
No, it doesn't. There are fundamental laws governing parallelization of code,
and Apple can't beat them:

<http://news.ycombinator.com/item?id=222279>

And that's not even getting into the speed of light.
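A minimal sketch of one such law, Amdahl's law (the 95% figure below is illustrative, not from the linked thread):

```python
def amdahl_speedup(parallel_fraction, cores):
    """Amdahl's law: overall speedup when only part of a program parallelizes."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# Even if 95% of a program parallelizes perfectly, speedup saturates
# near 1 / (1 - 0.95) = 20x, no matter how many cores you add.
for cores in (2, 4, 16, 1024):
    print(cores, round(amdahl_speedup(0.95, cores), 2))
```

The serial 5% dominates long before you run out of cores, which is the point: no OS layer can change that fraction.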

------
rcoder
The way I read it, this article is a rant taking Apple to task for wasting
their energy on a pervasive multicore strategy when, in the author's opinion,
multicore doesn't offer a compelling story for improving performance.

This leads to two obvious questions. First, why exactly does Apple get singled
out here? As I see it, the CPU vendors are the ones pushing multicore; OS
developers are simply trying to make the best lemonade they can from the
lemons that are stalled core speeds.

Second, what the hell is this "core thread" Hank talks about? Re-entrant, SMP-
capable OS kernels have been around for a long time. If the major commercial
OS vendors are just now getting around to implementing such scalability, fine;
that doesn't mean the problem is unsolvable, though.

~~~
hank777
No, I am not taking Apple to task for anything. And I am, I hope, not ranting.
I am simply saying that people should not overestimate what will come of
Grand Central, or any multi-core abstraction layer, or multi-core as a
broadly helpful performance-increasing strategy.

From one of my other commenters:

Multicore is definitely over-promised. I agree with what you wrote and
recommend what Donald Knuth had to say on this topic
(<http://www.informit.com/articles/article.aspx?p=1193856>):

"I might as well flame a bit about my personal unhappiness with the current
trend toward multicore architecture. To me, it looks more or less like the
hardware designers have run out of ideas, and that they’re trying to pass the
blame for the future demise of Moore’s Law to the software writers by giving
us machines that work faster only on a few key benchmarks!"

~~~
raganwald
"No, I am not taking Apple to task for anything"

I just want to share with you that when I read the title, I thought you meant
something like "The false expectations Apple is setting with Snow Leopard."

After I read the article, I came away with the thought that what you were
saying is "The false hopes people are clinging to when they read about Snow
Leopard."

------
tptacek
I'm confused. Who thought the promise of Snow Leopard was that it was going to
solve concurrent programming? You'd think if it was that big a deal, they'd
have named it Liger.

Apple has done a release like Snow Leopard already: 10.1. This is supposed to
be the release that fixes all the broken stuff and optimizes the platform for
X86 and (more importantly) the X86 chipset and peripherals on Macs. 10.1
worked beautifully.

If this writer had "false hopes" about Snow Leopard, he acquired them from
somewhere other than Apple.

------
stcredzero
The author is using a Mythical Man-Month idea about how fast people can solve
problems and applying it to computer programs and algorithms. Sorry, but that
doesn't wash. You can't automatically analyze the basic blocks and dataflow of
a human researcher's brain, but compilers can do this to programs.

What's been missing is a high-level approach to parallelism. This is exactly
where Apple has a core competency: software tools. I think they stand a good
chance of producing something useful. It won't be universally applicable, but
it doesn't need to be. If it makes parallelism easier, that will be a
meaningful success.
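For what it's worth, "making parallelism easier" at the library level can look as simple as a data-parallel map; a toy Python sketch (the `square` function is just a stand-in for any independent computation):

```python
from multiprocessing import Pool

def square(x):
    # Stand-in for any side-effect-free, independent unit of work.
    return x * x

if __name__ == "__main__":
    with Pool() as pool:                      # one worker process per core by default
        results = pool.map(square, range(8))  # the library splits and gathers the work
    print(results)  # [0, 1, 4, 9, 16, 25, 36, 49]
```

The programmer declares *what* is independent; the runtime decides *how* to spread it across cores, which is presumably the kind of tooling Apple is good at.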

------
pavelludiq
The problem is that high clock speeds come at the cost of heat and power
consumption. So it may be possible to solve this by finding a new material for
making processors; it's only a matter of time until someone figures it out. I
read an article two years ago about using diamond in chips to solve exactly
the heat problem. The catch was that it was going to be expensive, so maybe if
someone thinks of a cheap alternative to diamond we can get some very good
results. This may take a year or ten years, and I don't know if it's even
going to happen, but until then multi-core CPUs are a good alternative.

------
richcollins
"The problem is that most algorithms and program logic cannot be made to run
better across many processors"

Evidence for this?

I was hoping to hear him talk about the Von Neumann Bottleneck
([http://en.wikipedia.org/wiki/Von_Neumann_architecture#Von_Ne...](http://en.wikipedia.org/wiki/Von_Neumann_architecture#Von_Neumann_bottleneck)),
a problem faced by serial and parallelizable algorithms.
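A back-of-envelope way to see that bottleneck is the ratio of arithmetic to memory traffic; the machine numbers below are made up for illustration:

```python
# Roofline-style back-of-envelope: FLOPs per byte moved.
# A kernel below the machine's "balance point" is memory-bound,
# so extra cores mostly contend for the same memory bus.
peak_flops = 50e9      # hypothetical: 50 GFLOP/s across all cores
mem_bandwidth = 10e9   # hypothetical: 10 GB/s shared memory bus
balance = peak_flops / mem_bandwidth  # FLOPs per byte needed to keep cores busy

# Summing an array of 8-byte doubles: 1 add per 8 bytes loaded.
intensity = 1.0 / 8.0
memory_bound = intensity < balance
print(memory_bound)  # True: the bus, not the core count, is the limit
```

For a kernel like this, doubling the cores does nothing once the bus is saturated, which is exactly the Von Neumann bottleneck in multicore form.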

------
stcredzero
Intel is already on this with a C/C++-like language. Apple stands a good
chance of making something like it easier to use.

[http://www.informationweek.com/news/software/development/sho...](http://www.informationweek.com/news/software/development/showArticle.jhtml?articleID=208403616)

