
With a little imagination, you can come up with countless ideas on how to use a thousand cores for each of the applications you listed. I've listed a few, which are likely a bunch of crap, but I'm just one person. There are millions and millions of other minds coming up with ideas better than these every day.

And who are you to say that today's parallel algorithms for solving things like these are optimal? To me, it certainly seems worthwhile to consider possible alternate parallelization models.

> My email client is pretty much I/O-bound.
> My word processor is perfectly well able to keep up with my typing speed.

Better speech-to-text? Better prediction (so you have to type less)? Better compression for attachments? Better grammar checking?
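On the compression point in particular, here's a minimal sketch (my own illustration, not from the original discussion) of how an attachment could be compressed in independent chunks so each chunk runs on its own core, in the spirit of tools like pigz. The chunk size and the use of zlib are arbitrary assumptions.

    import zlib
    from concurrent.futures import ProcessPoolExecutor

    CHUNK_SIZE = 1 << 20  # 1 MiB per chunk; purely an illustrative choice

    def compress_chunk(chunk):
        # Each chunk is compressed independently, so chunks can run on separate cores.
        return zlib.compress(chunk, 9)

    def parallel_compress(data, workers=8):
        chunks = [data[i:i + CHUNK_SIZE] for i in range(0, len(data), CHUNK_SIZE)]
        with ProcessPoolExecutor(max_workers=workers) as pool:
            return list(pool.map(compress_chunk, chunks))

Per-chunk compression costs a little ratio and needs chunk framing to decompress, but the point is only that the work scales with the number of cores.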

> My web browser is largely I/O-bound, except on pages that do stupid things with JavaScript.

It's clear that web sites are slowly taking on a role similar to desktop apps, so there will be a million applications that can put hefty computational power to use.

> Compiling things can take a while, but that's already easily parallelized by file.

Better optimizations? Things like evolutionary algorithms for assembly generation come to mind, which could use any conceivable number of cores. Better static analysis?
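To make the evolutionary-search idea concrete, here's a rough sketch (my own hedged example; `mutate` and `benchmark` are hypothetical stand-ins for "perturb the instruction sequence" and "score the result"). The point is just that every candidate's fitness can be evaluated on a separate core, so the search scales with however many cores you throw at it.

    import random
    from concurrent.futures import ProcessPoolExecutor

    def mutate(candidate):
        # Hypothetical mutation: swap two instructions at random.
        c = list(candidate)
        if len(c) >= 2:
            i, j = random.sample(range(len(c)), 2)
            c[i], c[j] = c[j], c[i]
        return c

    def benchmark(candidate):
        # Hypothetical fitness: pretend shorter sequences are faster.
        return float(len(candidate))

    def evolve(seed, generations=50, population=1000):
        best = seed
        with ProcessPoolExecutor() as pool:  # uses as many cores as the machine has
            for _ in range(generations):
                candidates = [mutate(best) for _ in range(population)]
                scores = pool.map(benchmark, candidates)  # fitness checks run in parallel
                _, best = min(zip(scores, candidates), key=lambda p: p[0])
        return best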

> I'm told image/video processing can be slow, but there are already existing algorithms that work on huge numbers of cores (or on GPUs).

GPUs are typically not as good as CPUs at integer math (among other things, they aren't as deeply pipelined), so integer-heavy image and video work could conceivably get much faster still on a machine with many CPU cores.
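As a quick sketch of what "integer work across many CPU cores" might look like (again, my own toy example, not anything from the article): threshold a grayscale image by handing each worker its own horizontal band of rows.

    from concurrent.futures import ProcessPoolExecutor

    def threshold_band(rows, cutoff=128):
        # Pure integer work; bands don't overlap, so they parallelize cleanly.
        return [[255 if px >= cutoff else 0 for px in row] for row in rows]

    def threshold_image(image, workers=8):
        # `image` is a list of rows of 0-255 ints; split it into one band per worker.
        band = max(1, len(image) // workers)
        bands = [image[i:i + band] for i in range(0, len(image), band)]
        with ProcessPoolExecutor(max_workers=workers) as pool:
            out = []
            for part in pool.map(threshold_band, bands):
                out.extend(part)
        return out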

> Recalcing very large spreadsheets can be slow, but that should be rather trivial to parallelize (along the same lines as image processing).

There are a ton of things that could happen here with massive computational power. Spreadsheets that use machine learning techniques to predict things? More powerful models?
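And on the "trivial to parallelize" point, a toy sketch (my own hedged example, with a made-up cell/formula representation): cells at the same dependency level don't depend on each other, so each level can be recalculated as one parallel batch.

    from concurrent.futures import ThreadPoolExecutor

    # cell -> (cells it depends on, function computing its value from earlier results)
    SHEET = {
        "A1": ([], lambda vals: 2.0),
        "A2": ([], lambda vals: 3.0),
        "B1": (["A1", "A2"], lambda vals: vals["A1"] + vals["A2"]),
        "B2": (["A1"], lambda vals: vals["A1"] * 10),
        "C1": (["B1", "B2"], lambda vals: vals["B1"] - vals["B2"]),
    }

    def recalc(sheet):
        values, remaining = {}, dict(sheet)
        # ThreadPoolExecutor is just for illustration; a real engine would use
        # processes or native threads to get true parallelism past the GIL.
        with ThreadPoolExecutor() as pool:
            while remaining:
                # Every cell whose inputs are already computed forms one parallel batch.
                ready = [c for c, (deps, _) in remaining.items()
                         if all(d in values for d in deps)]
                batch = list(pool.map(lambda c: sheet[c][1](values), ready))
                for cell, val in zip(ready, batch):
                    values[cell] = val
                    del remaining[cell]
        return values

    print(recalc(SHEET))  # {'A1': 2.0, 'A2': 3.0, 'B1': 5.0, 'B2': 20.0, 'C1': -15.0}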

> So isn't the article pretty much garbage?

Come on, people have been coming up with novel uses for our ever-increasing computational power for well over a hundred years now (from mechanical computers to octo-core desktop machines). Why would they suddenly stop now?



