
Rambus, Microsoft Put DRAM into Deep Freeze to Boost Performance - RmDen
https://www.nextplatform.com/2017/05/06/rambus-microsoft-put-dram-deep-freeze-boost-performance/
======
ChuckMcM
Heh, I thought "wait a minute, didn't IBM do this 20 years ago, and wasn't it a
total flop?" And yes, it's one of the same people involved.

Cryogenic computing wasn't a bust because the technology didn't work. It does.
It was a bust because silicon _really_ hates to transition between cryogenic
temperatures and room temperature. If you transition it quickly (say you pull
a card out of the liquid nitrogen and start working on it) it will crack as it
warms up unevenly. As a result you needed anywhere from 20 to 48 hours to get
a card from 'cryo' temp to 'room' temp, and while you could cool a bit faster
it still took longer than just dumping it in LN2. So repairs and maintenance
were multi-day affairs. Compare that to a modern AWS, Google, or Azure data
center where a system fails, a tech can skate out to it with a new
motherboard, pull the old one, put the new one in, and poof you're back on
line in under 30 minutes.

As a result cryo computers either had to have failure rates that were so low
that a repair that required transitioning the hardware through a
cold/warm/cold cycle rarely happened, or you had to have enough extra hardware
to support your base load while part of it was slowly going through the
cold/warm/cold cycle.

I don't know how much IBM spent, but it was a _lot_ and they never cracked
that nut.

~~~
rbanffy
As long as the gains outweigh the overhead of having extra capacity to
compensate for the longer repair cycles, it's still a win.

You can make a unit operate at cryogenic temperatures and, when it fails,
power it down and manage its cooling long enough that it can be removed
safely. When repaired, just re-add it and cool it down until you can safely
bring it online.

~~~
ChuckMcM
Agreed, and for 25 years IBM could not make that equation balance out. They
tried a _lot_ of different ideas. You can see some of their patents for
examples. (For a fun time, go to patents.google.com and search for 'cryogenic
ibm'; there are _lots_ of patents.)

------
JohnBooty
> “Something not many people are talking about is playing the temperature
> card,” Bronner says [...] "All the talk I hear is about stretching Moore’s
> Law in the datacenter, such as using GPU accelerators. Those will work, but
> it’s a one-time shot. You use it once and you are done.”

But isn't "the temperature card" something you only get to play once as well?
Once you've adopted supercooled computing, where do you go from there?

(This guy knows there's such a thing as absolute zero, and that you can't just
keep getting colder, right?)

Sorry, I guess I still have antipathy towards Rambus since the 1990s. =)

~~~
MichaelBurge
Not sure about the physics, but supercooled chips might use superconductors.
Those would have zero resistance, so they'd dissipate hardly any heat. And
since modern chips are so heat-constrained, that could change the design
entirely.

------
frik
Is Rambus still a thing? I remember their vendor lock-in RAM from the
Pentium 4 era. Later there was some bad PR and there were lawsuits, and the
PC world moved on.
[https://en.wikipedia.org/wiki/Rambus](https://en.wikipedia.org/wiki/Rambus)

~~~
djsumdog
Wow, I didn't know about the patent trolling:

[https://en.wikipedia.org/wiki/Rambus#Patent_Lawsuits](https://en.wikipedia.org/wiki/Rambus#Patent_Lawsuits)

I just remember the Intel/Rambus boards that all had to be recalled because
of Rambus issues (IIRC, the issue wasn't with the RAM itself but with the
boards. Many of the boards were given to employees, who took the RAM and
chucked the boards).

~~~
yuhong
Yea, I think you are probably referring to how i820 RDRAM boards had to be
limited to two slots. This kind of signal integrity issue isn't limited to
RDRAM.

------
davidgerard
Is this the same Rambus that ventured into patent trolling via standards
bodies? If so, I'm surprised they didn't change their name.

~~~
exhilaration
Yeah, it's a blast from the past. I was surprised to see them on the HN front
page.

------
petra
Why would cooling offer such a big performance improvement for DRAM? Even if
it improves transistor switching speed, wire delay shouldn't be affected, and
that is a significant component of latency, right?

~~~
Neliquat
I would imagine the cooling itself isn't so much speeding it up as making it
reliable at higher clock speeds.

------
jdonaldson
If they were smart, they'd advertise this with : "Tech Ops HATES him! Check
out this one weird trick that increases your memory speed 200%!!!"

