
IBM unveils computer fed by 'electronic blood' - a_w
http://www.bbc.co.uk/news/science-environment-24571219
======
eksith
I can't be the only one who thought of Ghost in the Shell while reading this.
If anything, this is a way to explain away the presence of white 'blood' in
the cyborgs of the series.

I can understand the power issue, but I wonder why 3D chips aren't pursued
more. It can't be just the cooling since liquid cooling (as mentioned) is in
use. Maybe inefficiencies in power distribution and distance to adjacent
junctions? If so, parallel execution should help with some of the delays.

~~~
jessaustin
I love GitS, but when I see "white blood" I think of Bishop and the other
androids from the Alien series.

------
IronJustice
From the end of the article:

"But all of the above will not get electronics down to the energy-efficiency
of the brain.

"That will require many more changes, including a move to analogue computation
instead of digital.

"It will also involve breakthroughs in new non-Turing models of computation,
for example based on an understanding of how the brain processes information."

What exactly does all this mean? How would moving from digital to analogue
computation improve energy efficiency? And isn't the whole question of Turing
vs. non-Turing models of the brain still up for serious debate?

~~~
wmf
Analog computing is dramatically more energy-efficient than digital because it
needs less hardware. For example, you can store an analog value in a single
capacitor, while storing a digital value requires a multi-bit register built
from multiple transistors per bit. Analog multiplication can be done with an
amplifier, while digital multiplication requires thousands of transistors. The
tradeoff is that digital computation is exact, while analog computation
contains noise which may accumulate through the computation.
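A toy sketch of that accumulation effect, with a purely illustrative 1% error per analog operation (the figure and the 20-step chain are made up for demonstration):

```python
import random

def analog_multiply(x, y, noise=0.01):
    # Each analog operation picks up a small relative error
    # (the 1% figure here is purely illustrative).
    return x * y * (1 + random.uniform(-noise, noise))

exact = 1.0
noisy = 1.0
for _ in range(20):                  # chain 20 multiplications by 1.1
    exact = exact * 1.1              # digital: exact at every step
    noisy = analog_multiply(noisy, 1.1)

print(f"digital result: {exact:.6f}")
print(f"analog result:  {noisy:.6f}")
```

The digital chain lands on exactly 1.1^20 every run; the analog chain lands somewhere nearby, and the spread grows with the length of the chain.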

~~~
IronJustice
So is there a particular difficulty in overcoming this noise, or is it more
just unexplored territory?

~~~
ehsanu1
Definitely not unexplored territory; analog computation came first. Notably,
the way you program an (electronic) analog computer is by adjusting its
circuitry, which is not exactly the friendliest programming model.

The problem with noise seems to be repeatability. With enough noise,
reliability becomes a huge problem. Given the same inputs and the same analog
circuit, you want the same answer every time, don't you?

You cannot "overcome this noise" by removing it. The perfection of materials,
process, and environment required for that would be on the order of an
experiment like this:
[https://www.simonsfoundation.org/quanta/20131010-neutrino-ex...](https://www.simonsfoundation.org/quanta/20131010-neutrino-experiment-intensifies-effort-to-explain-matter-antimatter-asymmetry/)

We overcame the noise via digitization. In fact, there is obviously still
noise in our current digital computers, since the components within them are
fundamentally analog, but digital circuits quantize the analog signals,
interpreting them as 1s and 0s despite their analog nature. (This is
simplified, and I know next to nothing about digital circuit design.)
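A minimal sketch of that quantization step, with made-up voltage values (a real circuit also has defined noise margins and multiple logic levels):

```python
def to_bit(voltage, threshold=0.5):
    # A digital gate discards analog noise by comparing against a
    # threshold: anything above reads as 1, anything below as 0.
    return 1 if voltage >= threshold else 0

# Noisy analog voltages that nominally encode the bits 1, 0, 1, 1
# (illustrative values on a 0-1 volt scale).
noisy_voltages = [0.93, 0.12, 0.78, 1.04]
bits = [to_bit(v) for v in noisy_voltages]
print(bits)  # [1, 0, 1, 1]
```

The noise on each voltage is thrown away at every stage, so it never accumulates, which is exactly what a long analog chain cannot do.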

~~~
flumbaps
I think the main potential in analogue computing is to create complex networks
of feedback loops where different regions of stability correspond to different
machine states. I've seen models of neural network memory where the
interconnection of neurons works like a combination of a symmetric linear
transform and an amplifier followed by vector normalisation. The transform
maps the sensory input into a reduced dimensional space (where each dimension
corresponds to a possible memory). The reduced vector is amplified via the
neural response function, and then it's transformed back to the sensory input
vector space through an inverse to the original transform. That creates a
feedback loop where (because of how the neural response function works)
whatever the input is, the system converges to a vector that corresponds to
exactly one of the memory vectors. It basically picks out and amplifies the
closest memory to the sensory input.
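A toy numerical sketch of that transform-amplify-inverse feedback loop, with made-up orthonormal memory vectors and a simple cubic nonlinearity standing in for the neural response function:

```python
import numpy as np

# Two stored "memory" vectors (rows), chosen orthonormal for simplicity.
memories = np.array([
    [1.0, 0.0, 0.0, 0.0],
    [0.0, 1.0, 0.0, 0.0],
])

def recall(sensory, steps=10):
    x = sensory / np.linalg.norm(sensory)
    for _ in range(steps):
        reduced = memories @ x      # transform into the reduced memory space
        amplified = reduced ** 3    # toy "neural response": boosts the largest component
        x = memories.T @ amplified  # inverse transform back to sensory space
        x = x / np.linalg.norm(x)   # normalise, closing the feedback loop
    return x

# A noisy input closest to the first memory converges to that memory.
noisy_input = np.array([0.9, 0.35, 0.1, 0.05])
print(np.round(recall(noisy_input), 3))  # [1. 0. 0. 0.]
```

Each pass around the loop widens the gap between the strongest memory component and the rest, so the system settles on exactly one stored vector, as described above.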

That kind of system is a huge simplification, but similar things could be done
with analogue computing. In particular, I think probabilistic computing could
be done by setting up network feedback loops corresponding to underlying
Bayesian networks, where stable points correspond to highest likelihood
parameterisations. (I may actually do some work in this direction next year,
because it's pretty cool stuff.)

------
LAMike
Could you imagine the graphics that are possible with a one petaflop computer?
Simulation status right?

I'm praying we (humanity) don't nuke ourselves to death in some stupid
WWIII-type situation before we get there.

~~~
devx
Carmack said we need 5 petaflops for truly realistic graphics.

But I'm not even sure if he considered 4k+ resolutions, 120+ FPS, and 3D when
he said that, because all of those may play a role too in the future if we
want "Matrix-like" graphics in our virtual reality goggles or holodecks. So we
might still need hardware orders of magnitude more powerful than that.

~~~
cma
If you can track the eye with low enough latency, you only need to render at
high resolution in the tiny foveal region, and at low resolution everywhere
else.
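A back-of-envelope estimate of the potential savings, using assumed numbers (a ~2 degree fovea in a ~100 degree field of view, periphery at quarter linear resolution):

```python
# Assumed figures: the fovea covers roughly 2 degrees of a
# ~100-degree horizontal field of view.
fov_deg = 100.0
fovea_deg = 2.0

# Fraction of the frame needing full resolution (same ratio on both
# axes, for simplicity).
full_res_fraction = (fovea_deg / fov_deg) ** 2

# Render the periphery at 1/4 linear resolution, i.e. 1/16 of the pixels.
peripheral_cost = (1 - full_res_fraction) / 16

savings = 1 - (full_res_fraction + peripheral_cost)
print(f"full-res region: {full_res_fraction:.2%} of the frame")
print(f"approximate pixel savings: {savings:.1%}")
```

Even with generous margins around the fovea the savings stay above 90%, which is why the eye-tracking latency is the hard part, not the rendering.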

------
leberwurstsaft
Sweet, I attended a talk about this at our university this spring. The one big
question was about the enormous energy density in these cubes. The energy must
get into them one way or another, which seemed quite impossible to do. But I,
for one, hope this leads to some big leaps in performance that previously
seemed out of reach for the foreseeable future.

------
jcampbell1
Is it just me, or is this article written by someone who is clueless? I read
the whole thing and wasn't able to extract a single piece of genuine
information.

Either someone is gaming HN for artificial upvotes, or I need to find a
different forum.

~~~
alan_cx
This article was not written for anyone who has even an average understanding
of the subject. It is there to give the most simple, general overview of an
interesting development. It is not supposed to be anything more than that. It
is there for shelf stackers, van drivers, policemen, accountants, solicitors,
carpenters, mechanics: in fact, anyone but people with any clue to start
with. People with a clue are doing themselves down by reading it in the first
place. They should already know all about it, and not be blindsided by a BBC
piece.

No, the person who wrote it is not clueless; they have written it for the
clueless.

And that is absolutely NOT a value judgement on the "clueless".

The vast majority of people very understandably do not have any clue about
this niche science. Why on earth should they? After all, does being a whiz
with Java or something suddenly mean they should even begin to have any
understanding of this? Why should it? So why should a bricklayer have any
idea at all? All the average "clueless" want is a very, very basic overview,
such that they can essentially say, "oh wow, that's cool. Oi Dave, look at
this". And if they are even slightly inspired, they will go off and drill
down to the proper scientific detail elsewhere, using something revolutionary
like Google.

If you want any more than that, then I'm sorry, it's not the remit of the BBC.
And nobody even close to knowing the basics of this sort of stuff would be
getting their information from the BBC. Like I have said, they will already be
very clued up, right? And anyone in the middle will not even read the article
fully. They will get enough information to go to a more thorough source.

I get really frustrated with people who seem clueless themselves about what
the BBC is there for, and about who articles like this are aimed at. It is
always easy for some expert to slag off BBC articles, in the way you have,
when they were never written to stand up to peer review or some such high
standard. I imagine an "expert" could easily pick holes in literally every
single article the BBC has ever published.

What is really depressing is that you have obviously read enough to be
interested. But instead of being interested enough to search for more
information, you come here to lay into the author. Is it really that much
easier to complain than to research? Does the BBC have to spoon-feed everyone,
on every subject, at every level? No. It's a broadcaster, not a collection of
all the world's best universities. They let you peek in; the rest is up to
you.

------
devx
While this may be a huge improvement in computing power per unit of space
compared to today's classical computers, since nature tends to "compute" stuff
so much more efficiently than our PCs, it still feels like a pretty
transitional phase to me versus quantum computers.

We need to learn how to compute at the sub-atomic level with mind-blowing
efficiency (trillions of trillions more operations for insignificant power
consumption). Once we learn how to do that, we'll be the masters of our
galaxy, or potentially even the universe.

------
webhat

      The art of liquid cooling has been demonstrated by Aquasar
      and put to work inside the German supercomputer SuperMUC
      which - *perversely* - harnesses warm water to cool its
      circuits.
    

How is it perverse to cool hot circuits with warm water?

~~~
wmf
The efficiency of cooling is proportional to the temperature difference (delta
T), so it's much better (for the computer) to cool a computer with cold water.
But how did you produce that cold water? Probably with some energy-intensive
refrigeration process. Thus, AFAIK, it's more efficient overall to use a
higher inlet temperature (presumably with a higher flow rate).
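A rough illustration of the flow-rate trade-off using the standard heat-transfer relation Q = m_dot * c_p * delta_T; the rack power and temperatures below are made-up figures, not SuperMUC's actual specifications:

```python
# Heat carried away by a water loop: Q = m_dot * c_p * delta_T,
# where c_p of water is about 4186 J/(kg*K).
C_P_WATER = 4186.0  # J/(kg*K)

def flow_needed(q_watts, inlet_c, outlet_c):
    """Mass flow rate (kg/s) needed to remove q_watts of heat."""
    return q_watts / (C_P_WATER * (outlet_c - inlet_c))

q = 100_000.0  # a hypothetical 100 kW rack

# Chilled water: a large delta-T means low flow, but producing 15 C
# water takes an energy-hungry chiller.
chilled = flow_needed(q, inlet_c=15, outlet_c=35)

# Warm water: a smaller delta-T needs more flow, but no chiller, and
# the ~45 C outlet is warm enough to reuse for building heat.
warm = flow_needed(q, inlet_c=40, outlet_c=45)

print(f"chilled inlet: {chilled:.2f} kg/s, warm inlet: {warm:.2f} kg/s")
```

The warm-water loop needs roughly four times the flow here, but pumping water is usually far cheaper than refrigerating it, which is the point of designs like SuperMUC's.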

------
adamnemecek
"Their vision is that by 2060, a one petaflop computer that would fill half a
football field today, will fit on your desktop."

Will it be able to run Crysis?

But seriously, I'd be even more interested in how these computers will be
programmed. Any wild speculations?

~~~
AlexanderDhoore
Abstraction will be the answer. We will need very, very high level domain
specific "programming languages".

Problem is: I feel like we are abstracting software more slowly than computers
are getting faster. Is JavaScript really that much more abstract than machine
code (assembly)? I think not. In the long run, what we do now are baby steps.

We are still telling computers exactly what to do. Every step needs to be
spelled out. And we're doing it with text files...

~~~
7952
Think about the huge amount of computing power currently used to run gmail on
millions of web browsers, servers, and apps. It could be viewed as one big
computer program that just happens to run different parts on different
devices. The benefit of separating it into parts is that it keeps each part of
the program close to the data it needs. A massively parallel system would have
to divide data into chunks that are separated by latency and bandwidth
limitations in a similar way. The language abstractions are less important
than the unavoidable physical limitations. Surely the event-driven approach of
JavaScript is a good solution to this latency problem?

------
soundwave
More info here: [http://www.research.ibm.com/cognitive-computing/neurosynapti...](http://www.research.ibm.com/cognitive-computing/neurosynaptic-chips.shtml)

------
X4
Wow, that looks insanely close to what I remember from Terminator.

Compare yourself: [http://imm.io/1iQis](http://imm.io/1iQis)

------
chaosmatic
Does anyone know much about the practicalities of the chemistry behind this?
I've almost finished studying redox in high school.

