
Japanese supercomputer blisters 10 quadrillion calculations per second - coondoggie
http://www.networkworld.com/community/blog/japanese-supercomputer-blisters-10-quadrillio
======
patio11
Let me hum a few bars: suppose you were sitting on an enormous pile of money
(borrowed from your citizens, but they don't really get irked about that), you
had an industry which was almost universally popular (one of the triumphs of
marketing technology in your country), its primary beneficiaries were
important constituencies, and you could justify almost any amount of
expenditure by saying it was for National Greatness And The Advancement Of
Science (TM). You might spend an awful lot of money on white elephant
projects, regardless of whether that was an efficient method of achieving the
stated goals of the program.

For the benefit of previous employers: no, I'm not thinking of our projects here; I have America's NASA in mind.

~~~
anigbrowl
Processing masses of seismic data and reorganizing Japan's energy
infrastructure with or without its existing nuclear fleet seem like the sort
of problems that would benefit from ultra-large computational capacity, to
name but two possibilities.

------
ChuckMcM
I find these announcements interesting, perhaps more so from a technology
point of view, but they do seem a bit like putting a V8 engine in a Mini
Cooper just to show off. They offer the following possible "uses":

 _Analyzing the behavior of nanomaterials through simulations and contributing
to the early development of such next-generation semiconductor materials,
particularly nanowires and carbon nanotubes, that are expected to lead to
future fast-response, low-power devices._

 _Predicting which compounds, from among a massive number of drug candidate
molecules, will prevent illnesses by binding with active regions on the
proteins that cause illnesses, as a way to reduce drug development times and
costs (pharmaceutical applications)._

 _Simulating the actions of atoms and electrons in dye-sensitized solar cells
to contribute to the development of solar cells with higher energy-conversion
efficiency. Simulating seismic wave propagation, strong motion, and tsunamis
to predict the effects they will have on human-made structures; predicting the
extent of earthquake-impact zones for disaster prevention purposes; and
contributing to the design of quake-resistant structures._

 _Conducting high-resolution (400-m) simulations of atmospheric circulation
models to provide detailed predictions of weather phenomena that elucidate
localized effects, such as cloudbursts._

Which sound a bit squishy. I'd be interested to know what China's super has
done since they showed pictures of it. Clay Dillow over at the Popular Science
blog has been putting out snippets on various supercomputers under titles like
"what are you working on now?" and it seems like a whole lot of nothing.

Franklin (#27) - [http://www.popsci.com/technology/article/2011-10/day-life-
su...](http://www.popsci.com/technology/article/2011-10/day-life-
supercomputer-nerscs-franklin-oct-27-2011) Climate change modelling.

iForge (unranked) - [http://www.popsci.com/technology/article/2011-11/what-are-
yo...](http://www.popsci.com/technology/article/2011-11/what-are-you-doing-
today-iforge) Doing fluid dynamics (this is probably the most common thing)

Roadrunner (#10) -
[http://www.popsci.com/technology/article/2011-10/roadrunner-...](http://www.popsci.com/technology/article/2011-10/roadrunner-
what-are-you-working-today) Super sekrit weapons stuff. Which seems to be the
most common use. (Is there like some sort of open source nuke simulator or
something?)

Anyway, there are probably a dozen machines PopSci has looked at and I really
am trying to get a sense of how they pay for themselves.

~~~
hugh3
I don't get it. If materials modeling, drug discovery, seismology and
atmospheric modeling sound "squishy" to you, what on Earth _wouldn't_ be
squishy?

Most of those machines are shared between a bunch of researchers working on a
bunch of different problems. There's some great stuff going on, some semi-
great stuff going on, and probably some fairly useless stuff going on, too.
I've used many of 'em myself. In terms of science done per dollar spent, big-
arse supercomputers are certainly far more efficient than most publicly-funded
research.

~~~
ChuckMcM
Fair enough: non-squishy is the return on investment Amazon gets from its AWS
clusters. I don't really expect a monetary return on pure research
(although it does happen) and I don't see a lot of papers rolling by in
Science, Nature, or Xarchiv where the work required the use of one of the top
500. Nor do I see announcements from Pfizer or Merck saying "This new
simulation allowed us to track down this drug in record time which treats
condition X and _because we're more efficient we can charge less for it and
still make a huge profit._ "

The shuttle CFD work that Ames did came out with a lot of info on their use of
the Cray supercomputer to do the modelling; that is an example of something I
would expect to see.

However, as others have pointed out, you can get a machine with performance
close to that of a Cray-2 today off the shelf.

Schlumberger has one doing oil-field seismic analysis; I understand the value
proposition there. But there are many, many machines on the T500 list where I
just wonder, "OK, why do they have that?"

~~~
hugh3
_I don't see a lot of papers rolling by in Science, Nature, or Xarchiv where
the work required the use of one of the top 500_

There are many papers in Science and Nature based on work from supercomputers.
And if by Xarchiv you mean arXiv, the proportion is bloody huge. Just try
searching for "NERSC" for instance and you'll find that NERSC alone leads to a
zillion publications per year.

 _there are many many machines on the T500 list where I just wonder, "Ok why
do they have that?"_

And you take the fact that you personally don't know what they're being used
for as evidence that they're not doing anything worthwhile?

Most of these machines aren't used for one big project, they're used by
hundreds of researchers for hundreds of small projects.

~~~
ChuckMcM
_"And you take the fact that you personally don't know what they're being used
for as evidence that they're not doing anything worthwhile?"_

That was not what I said. Clearly, if some organization ponies up a few
hundred million to build one of these systems and signs up to feed it power,
they think whatever it is doing is "worthwhile." I would love to understand
that, so I posed the question.

I think I mentioned that I understand the NERSC vision, sharing supercomputers
around for science. And yes, there are lots of citations about their
facilities, and they have six such machines on their homepage [1].

My question is: if you aren't a government or university, what value would you
derive from having a supercomputer on the T500 list? (patio11's excellent
response notwithstanding.) Off-list, someone pointed out that Boeing uses
their computer for structural analysis of airplanes made of carbon fiber. That
seems like a perfectly reasonable use; one plane like the 787 is probably
$5-6B of sales over its lifetime.

[1] <http://www.nersc.gov/users/computational-systems/>

~~~
hugh3
Well, there aren't _that_ many computers on the top500 that aren't owned by a
government or university. I'm too lazy to look through the whole 500, but just
in the top 100 let me count.

Airbus has one at 29. I think it's clear what they use that for.

Vestas Windsystems in Denmark has number 53. Presumably they're optimizing
their wind turbines.

IBM has a couple, which they largely run like shared research facilities.

And that's pretty much it for commercial research facilities in the top500.

------
phren0logy
It "blisters" them? Presumably that problem is most noticeable with a fresh
boot.

------
pcvarmint
This K machine is a months-old story. Why the sudden jump in news stories on
it?

~~~
wmf
It was just upgraded.

------
rorrr
I stopped being impressed by supercomputers like 5 years ago. These days it's
all about money. You can buy as many CPUs and as much RAM as you want, and
connect them all together.

I'm much more impressed by the advances in CPU architecture. SSDs have been
amazing so far. Networking has been improving too.

~~~
cabacon
What impressed you about the supercomputers of 5 years ago that no longer
impresses you about today's supercomputers? The Cray approach
([http://www.nccs.gov/wp-
content/uploads/2011/03/UserMeeting20...](http://www.nccs.gov/wp-
content/uploads/2011/03/UserMeeting2011.pdf)) and the IBM approach
(<http://en.wikipedia.org/wiki/Blue_Gene#Blue_Gene.2FQ>) both seem more
interesting than just connecting together as many CPUs/RAM as you can afford.

There are interesting OS challenges for the compute nodes, interesting
hardware challenges on interconnect, interesting filesystem challenges,
interesting programming challenges to manage the parallelism, competing
architectural approaches between CPUs and GPU-accelerated clusters ... it's a
pretty interesting space!
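
To give a feel for the last of those challenges: the basic pattern HPC codes
scale across thousands of nodes (usually with MPI) is decompose, compute
locally, then reduce. Here is a purely illustrative toy of that pattern using
Python threads; everything here is a hypothetical sketch, not real HPC code:

```python
from concurrent.futures import ThreadPoolExecutor

def partial_sum_of_squares(chunk):
    # Each worker computes a local result over its own slice of the data.
    return sum(x * x for x in chunk)

def sum_of_squares(data, workers=4):
    # Decompose: strided slices keep the chunk sizes balanced.
    chunks = [data[i::workers] for i in range(workers)]
    # Map: run the local computation on every chunk concurrently.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        partials = pool.map(partial_sum_of_squares, chunks)
    # Reduce: combine the partial results into the global answer.
    return sum(partials)
```

On a real machine the hard parts are exactly what's hidden here: how the
chunks are laid out in memory, how partial results cross the interconnect,
and what happens when one of ten thousand workers fails mid-reduce.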

------
toblender
After I found out about bitcoin, it's the first thing I think about when they
come out with faster hardware.

~~~
wmf
K doesn't have GPUs, so it sucks for Bitcoin mining. In general,
supercomputers contain extra stuff that isn't useful (and thus isn't
profitable) for mining.
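
The mismatch is the workload: mining is a brute-force search for a nonce
whose double SHA-256 hash meets a difficulty target, all integer hashing,
which GPUs do far better per dollar than the floating-point hardware
supercomputers are built around. A toy sketch of the search loop (with
difficulty simplified to leading zero bytes; real mining compares against a
256-bit target):

```python
import hashlib

def double_sha256(data: bytes) -> bytes:
    # Bitcoin hashes block headers with two rounds of SHA-256.
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def mine(header: bytes, zero_bytes: int) -> int:
    # Brute-force a nonce until the hash starts with `zero_bytes` zero bytes.
    prefix = b"\x00" * zero_bytes
    nonce = 0
    while True:
        digest = double_sha256(header + nonce.to_bytes(4, "little"))
        if digest.startswith(prefix):
            return nonce
        nonce += 1
```

Every iteration is independent, so the winner is simply whoever can run the
most hashes per watt, and a general-purpose supercomputer loses that race.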

------
michaelcampbell
How long does it take to boot Windows?

