
Obama sets $126M for next-gen supercomputing - Husafan
http://www.computerworld.com/s/article/9209918/Obama_sets_126M_for_next_gen_supercomputing
======
mikerhoads
Maybe the next generation of supercomputers can figure out how to pay off the
debt this funding creates.

~~~
svlla
My reaction is more like... $126m? That's it?

"IBM won a $244 million DARPA contract in November 2006 to develop a petascale
supercomputer."

~~~
okaramian
Similarly the prosthetic mind controlled arm took 100 million dollars to
develop:

[http://healthland.time.com/2011/02/11/coming-soon-the-100-mi...](http://healthland.time.com/2011/02/11/coming-soon-the-100-million-dollar-robotic-arm/)

Meg Whitman could have either had a sweet robotic arm or a super computer for
the cost of a failed political campaign.

~~~
mrtron
I remember her cost per vote acquired was something like $8?

That kind of spending has an anti-democratic vibe to it.

------
drallison
The budget proposal is for more than $126M split between the Office of Science
($91M) and the National Nuclear Security Administration ($36M). For the first
time this is called "exascale"; in the past it was known as "extreme".
Advanced computing gets $465M at DOE, of which this is part. The actual
numbers are all speculation as the budget is the work of Congress and not the
straw man proposed by the President.

Exascale computing presents major technical problems. See the short note by
Peter Kogge in IEEE Spectrum,
[http://spectrum.ieee.org/computing/hardware/nextgeneration-s...](http://spectrum.ieee.org/computing/hardware/nextgeneration-supercomputers/0),
and the 2008 DARPA report,
[http://www.er.doe.gov/ascr/Research/CS/DARPA%20exascale%20-%...](http://www.er.doe.gov/ascr/Research/CS/DARPA%20exascale%20-%20hardware%20\(2008\).pdf).

The problems in scaling traditional supercomputers into the exascale range are
many and include, for example, the fact that current computers require about
1.5 to 4 MW per petaflop (MW/PF).

The US used 61 billion kilowatt-hours of power for data centers and servers in
2006. That's 1.5 percent of all US electricity use, and it cost the companies
that paid those bills more than $4.5 billion. This is expected to double by
2011.
[http://spectrum.ieee.org/computing/hardware/nextgeneration-s...](http://spectrum.ieee.org/computing/hardware/nextgeneration-supercomputers/0)
A back-of-the-envelope computation says that if we were to simply scale
today's petaflop computers up to a single exascale cluster, the power
consumption would be 4 GW, which works out to roughly 1% of all US electricity
used in 2006.
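The scaling arithmetic here can be sketched in a few lines (assuming the high
end of the 1.5-4 MW/PF range cited above):

```python
# Back-of-the-envelope scaling of supercomputer power draw, using the
# figures cited above (assumption: the high end, ~4 MW per petaflop).
MW_PER_PETAFLOP = 4.0
PETAFLOPS_PER_EXAFLOP = 1000

power_gw = MW_PER_PETAFLOP * PETAFLOPS_PER_EXAFLOP / 1000  # MW -> GW
print(f"Naively scaled exascale power draw: {power_gw:.0f} GW")

# Energy if that draw ran continuously for a year, for comparison with
# the 61 billion kWh all US data centers used in 2006.
HOURS_PER_YEAR = 8766
kwh_per_year = power_gw * 1e6 * HOURS_PER_YEAR  # GW -> kW, then kWh
print(f"Annual energy: {kwh_per_year / 1e9:.0f} billion kWh")
```

Run continuously for a year, that single hypothetical machine would draw about
35 billion kWh, more than half of what every US data center and server
combined used in 2006.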

Clearly computation on this scale is not a priority or the real money is
elsewhere.

------
bioh42_2
_Exascale systems are 1,000 times more powerful than the Tianhe-1A, the
Chinese supercomputer that was recently ranked as the world's fastest._

Alright, who here watched Watson on Jeopardy and now thinks whatever
1,000-times-more-powerful computer they build will have strong AI?

~~~
pjscott
The heavily statistical techniques used in Watson won't turn into a strong AI
just by throwing more computing power at them, any more than I can turn into a
horse by walking on my hands and knees while snorting periodically.

[http://www.scribd.com/doc/13863110/The-Unreasonable-Effectiv...](http://www.scribd.com/doc/13863110/The-Unreasonable-Effectiveness-of-Data)

(Wow, that analogy ended up a little more disturbing than I'd intended.)

------
burgerbrain
I love how the only justification of why the government might need such a
computer, or why this spending might be a good idea, is "it will be bigger
than China's".

~~~
MichaelSalib
A lot of the government's supercomputer funding is much more practical than
that: it's about running simulations of nuclear weapon assemblies. During the cold
war, nuclear weapons were not designed for long lifetimes or to be upgraded
after construction. Most of our nuclear stockpile is or will soon be past its
design lifespan. We can't just replace them with new weapons because of the
Comprehensive Test Ban Treaty (yeah, it hasn't been ratified, but no
administration is excited about breaking it and effectively telling everyone
on Earth that the US has no interest in sticking with the NPT).

So we have weapons assemblies that are degrading every day and we need to
ensure that they work (i.e., they detonate when instructed to and more
importantly fail to detonate when NOT commanded to). Physical inspection and
retrofitting is extremely difficult because the assemblies were not designed
for that. So the only option left is massive simulation. Which is what
Lawrence Livermore and the other national labs do. China's got nothing to do
with it. Without supercomputers, the US lacks an effective nuclear deterrent
(or at least, confidence that it has an effective deterrent).

~~~
burgerbrain
_"(or at least, confidence that it has an effective deterrent)"_

Well, that's the key, isn't it? The only effective way to use nuclear weapons
is to "not" use them. They could just as easily release a few press releases
saying "oh yeah, we checked it out and these things totally still work", and
almost as if by magic, they _would_ continue to work.

~~~
MichaelSalib
But no one would believe that. It just wouldn't be credible. This isn't a new
problem; it has been building for decades. Components age. Materials fracture.
Seals crack. Everyone knows that and you can't wish that knowledge away.

I mean, the national labs that are charged with ensuring the safety and
reliability of the nuclear stockpile are not some tiny 5-man operation in the
middle of nowhere; we're talking about thousands of people working an hour
away from San Francisco, very much plugged into the local university and tech
scene.

~~~
burgerbrain
It's not like they are publishing in detail all of their findings currently.
For all we _really_ know, they _already_ are bluffing, and just using all the
money for hookers, blow, and blackjack.

