

Faster Chips Are Leaving Programmers in Their Dust - pixcavator
http://www.nytimes.com/2007/12/17/technology/17chip.html?ex=1355547600&en=61dfe25f3006afd7&ei=5090&partner=rssuserland&emc=rss

======
iamelgringo
OK, so maybe I'm missing this whole multi-core dilemma, but I don't see that
big of a problem. I just think that people are asking the wrong question. The
question isn't how I can write software for multi-core chips; it should be
"what problems are solved best with multiple cores?"

There are categories of "embarrassingly parallel" problems that have been
solved for years using multiple cores: video rendering, 3D graphics rendering,
etc... In short, anything that does repetitive processing on large datasets.
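
To make that concrete, here's a minimal sketch of the embarrassingly parallel
pattern using Python's multiprocessing module. The per-frame function and the
toy dataset are stand-ins, not anything from the article:

    from multiprocessing import Pool

    def process_frame(frame):
        # Stand-in for real per-item work (rendering, filtering, ...);
        # each frame is handled independently, with no shared state.
        return sum(frame) / len(frame)

    if __name__ == '__main__':
        frames = [[i, i + 1, i + 2] for i in range(1000)]  # toy dataset
        with Pool() as pool:  # one worker process per core by default
            results = pool.map(process_frame, frames)
        print(results[:5])

Because no iteration depends on any other, adding cores scales this almost
linearly.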

Now, people are upset because we're not going to be able to improve the
software for the average user with multiple cores, and they are correct if
they want word processing and email to get better with multiple cores. The
question we need to be asking is this: how can embarrassingly parallel
problems make user software better?

How can we use multiple streams of video in software? How can 3D rendering
improve my software? Or, what sort of very large datasets can I process with
multiple cores on the desktop?

The companies that answer those questions (e.g., GOOG) are the companies that
will make a lot of money in the next decade.

~~~
mechanical_fish
I agree.

At least half of the "dilemma" is pure, unadulterated marketing. In this
article, the marketer is Microsoft, which is trying to convince end users that
there's some kind of big problem, one which no mere mortal can comprehend,
that is somehow preventing our expensive new Vista machines from being any
better than the XP boxes they replaced. It certainly has nothing to do with
Microsoft's incompetence, nor with their slavish devotion to Hollywood-
approved, mind-mangling, box-breaking DRM. And it's certainly not their
insistence on foisting incompatible proprietary crap like IE on the industry.
No, it must be a Fundamental Problem of Computer Science that is holding us
back. Naturally, this problem can only be solved by the big academic brains
that work for... Microsoft!

The email example is a dead giveaway... it's laugh-out-loud funny:

"In the future, Mr. Mundie said, parallel software will take on tasks that
make the computer increasingly act as an intelligent personal assistant."

Ah, the intelligent personal assistant -- it's the application of the future,
and it always will be.

How do we know that intelligent email processing is not being held up by the
lack of suitable coding techniques for eight-way parallel processors? Because
I have a dual-core processor right now, and it spends the night contemplating
its digital navel and counting to 2^64 by fives for fun. If there was
something smart it could be doing with my email, why isn't it working on it
right now? I am _drowning_ in unused processor cycles.

~~~
jgrahamc
+1 Insightful

The Microsoft personal assistant example is just cracking me up. Firstly,
Microsoft has been selling the whole 'automated assistant' thing to us for
YEARS. Clippy is just one hideous example.

Secondly, automated inbox processing has been around for _years_. My POPFile
open source program is now over 5 years old, and there are older examples than
that (I was doing automated email sorting in the late 90s, and others did it
before me). So multi-core machines are what's holding this back? What a joke.

Thirdly, it mentions features (looking at who I correspond with) that are
already available (see Xobni and others). And automatic response systems are
also around to deal with customer service.

Sorry for the rant, but two pages of fluff about multi-core processors, plus
some freaked-out speculation about email processing that's been available for
a long time, is too much.

How about talking about something interesting, like parallel-aware languages
(Occam, Erlang, ...)?
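
For the curious, the core idea in those languages is message passing between
lots of cheap processes instead of shared memory. Here's a rough analogue of
Erlang's spawn-and-collect pattern, sketched with Python's multiprocessing
(the names and numbers are mine, purely illustrative):

    from multiprocessing import Process, Queue

    def worker(task, out):
        # Each worker is an isolated process; the only communication
        # is the message it sends back on the queue.
        out.put(task * task)

    if __name__ == '__main__':
        out = Queue()
        procs = [Process(target=worker, args=(n, out)) for n in range(8)]
        for p in procs:
            p.start()
        results = [out.get() for _ in procs]  # collect one reply per worker
        for p in procs:
            p.join()
        print(sorted(results))

In Erlang the processes are far lighter and send/receive are built into the
language, which is the whole point.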

------
geebee
Nice article, but what a strange title. Faster chips are leaving programmers
in their dust? Huh? I wasn't aware that I was competing with CPU speed. All
this time, I thought more processing speed expanded my possibilities. After
all, my entire field (mathematical optimization) would be pretty useless
without lots and lots of processing power.

I talked to a dude with an OR background, and he said that in the 80s people
laughed at the notion of using this kind of math to solve business problems.
Processing power increases the opportunities for programmers, without
question.

Anyway, fun to read, just a very strange take on the relationship between
programmers and cpu speed.

~~~
wmf
The point is that more processor speed expands your possibilities _only if you
are smart enough to actually use it_. Lazy (dare I say "blub") programmers
_are_ being left in the dust.

(Of course, this isn't the first time some programmers have resisted change. I
remember a great rant about how if you program in assembly and never hit in
the cache, the Pentium 4 was no faster than the Pentium III and thus useless.
That guy probably quit programming entirely when multicore arrived.)

------
pchristensen
Maybe those extra cores can be used to do a better job of remembering,
learning, and anticipating what users will do. This could pre-load or unload
applications or components based on historical usage, rearrange toolbars to
have the most commonly used commands, and in general make the kinds of
improvements to computing that the squiggly underlines did for
spell checking last decade. Right now computers are largely deterministic and
users have to learn how the computer works. That's why users hate computers.
Some of that pain could be taken away. I think this will benefit OSes, thick
clients, and installable programs more than the web.

~~~
jgrahamc
Magic toolbar rearrangement is a world of pain. That's like having my SO
notice that I always look for my keys before I go out the door, so she moves
them to the table near the front door, even though I habitually put them in
the vide poche near my desk.

~~~
pchristensen
I'm not saying that's what I would want, but have you ever watched a non-
techie use the computer? Just looked over their shoulder and watched them?
It's horrifying! I can only watch my wife for about 2 minutes before I start
belting out things like "You don't need to double click links on the web!" or
"If you double click on an app, you have to give it a couple seconds to load
or else you're making the problem worse!"

What a smart OS would do is watch you for a while, notice what you do, and
then make a decision based on the statistics. Let's say 85% of the time your
keys are in the vide poche and you find them immediately, and the other 15% of
the time they're somewhere else and it takes you 5 minutes to find them. Then
it would warp them to the right place (the place your actions showed was
right), with a note and an option to "always do this". I shudder to think of
MSFT's implementation of this (Clippy), but if a smart company along the lines
of Humanized did it, and used statistics instead of some clever algorithm
(didn't we just read about Google doing that?), it could be a godsend for
average users.
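
A minimal sketch of the kind of statistics I mean, with made-up numbers and
names (an illustration, not a design):

    # Hypothetical usage log: where the "keys" actually ended up each day.
    observations = ['vide_poche'] * 85 + ['elsewhere'] * 15

    def suggest_default(observations, threshold=0.8):
        # Pick the most common location, but only suggest automating
        # ("always do this") once we're confident enough.
        best = max(set(observations), key=observations.count)
        confidence = observations.count(best) / len(observations)
        return best if confidence >= threshold else None

    print(suggest_default(observations))  # -> vide_poche

The point is that the rule comes from observed behavior plus a confidence
threshold, not from a programmer guessing what users want.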

------
chaostheory
"Indeed, a leading computer scientist has warned that an easy solution to
programming chips with dozens of processors has not yet been discovered."

Weird, I thought it was called Erlang.

~~~
cellis
Erlang is not easy. At least, not for me, a mere mortal.

~~~
chaostheory
OK, let me elaborate: it is "easier" to do parallel programming in Erlang
compared to other languages.

~~~
cellis
I know this. I watched a video on this whole argument not long ago... the link
is floating around on YC somewhere.

~~~
chaostheory
Well then, why complain?

------
mattmaroon
The article seems to be a retread of the same thing that's been printed every
day for 20 years. "New processors are going to make all these new things
possible." And yet there's still no computer holding a reasonably intelligent
conversation with you if you call Dell customer support. Actually, there isn't
even a human doing that.

------
mynameishere
If just a small portion of an application is non-parallelizable, then a
million cores won't help you.

~~~
jgrahamc
I think it's time for a refresher on Amdahl's Law.

<http://en.wikipedia.org/wiki/Amdahl's_law>

The non-parallel portion limits the available speedup.
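
For reference, with a fraction P of the work parallelizable across N cores,
Amdahl's Law bounds the speedup:

    S(N) = 1 / ((1 - P) + P / N)

So even if 95% of a program parallelizes perfectly, the speedup is capped at
1 / (1 - 0.95) = 20x no matter how many cores you throw at it, which is
exactly the point being made above.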

------
gills
PR.

~~~
chaostheory
I don't know why you got modded down - this is a submarine PR article for
MS...

~~~
gills
How do I tell if it was modded down?

Maybe I should explain my terse comment. But first, I agree that it's a
submarine PR piece for Microsoft. How many software companies are cited in
that article? How many technologies are mentioned and/or explored? In short it
says "Multi-core processors have potential, and [only] Microsoft has the best
minds trying to exploit that potential."

Parallelism has been available in less local forms for decades:
supercomputers, clusters, and more recently grid computing. These aren't new
problems for computer science; they are only new problems for desktop software
developers who have always assumed a simpler processor architecture.

So imagine a program that could identify parallelizable sequences of
instructions in other programs and split the work accordingly. Such a program
might be possible, but it appears to be a very hard problem (it's even a hard
problem for humans). Vista is definitely NOT this program.
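
A toy illustration of why that analysis is hard, in Python (the loops are
contrived, but the distinction is real):

    a = list(range(10))
    b = [0] * len(a)

    # Parallelizable: every iteration is independent, so the work
    # could safely be split across cores.
    for i in range(len(a)):
        b[i] = a[i] * a[i]

    # Not parallelizable as written: iteration i reads what iteration
    # i - 1 just wrote (a loop-carried dependence; it's a prefix sum).
    for i in range(1, len(a)):
        a[i] = a[i] + a[i - 1]

    print(b, a)

An automatic tool has to prove the absence of dependences like the second
loop before it can split up the first kind, and in real programs full of
pointers and shared state that proof is usually out of reach.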

~~~
chaostheory
"How do I tell if it was modded down?"

Your first comment had -1 points on it.

------
DarrenStuart
Surely if MS could make it so threads were processed by spare cores
automatically, then this would make life easier.

Servers could use the extra cores to process more connections and lessen the
need for more servers. I guess virtualization could be used too.

------
edw519
"...we at least have a nucleus of people who kind of know what's possible and
what isn't..."

A surefire recipe for missing the boat.

Honestly, which would you rather have: more power in your hand, or a faster
pipe to the rest of the world?

