
Servers Too Hot? Intel Recommends a Luxurious Oil Bath - theklub
http://www.wired.com/wiredenterprise/2012/08/servers-too-hot-intel-recommends-a-luxurious-oil-bath/
======
cnvogel
This of course looks very cool, but to me it does not seem practical beyond
edge cases, even though the technology is not new in any way: high-powered
transformers, such as the ones you find in power plants, have been oil-filled
basically forever.

The practical problems, on the other hand, are not to be ignored: the oil
needs to be kept rather clean, otherwise it will lose its good insulation
properties. Hence you need sealed containers or purification devices; in
industrial-scale deployments this means keeping an eye on the chemical
composition of your coolant through regular chemical analysis.

Connectors and cables need to be really oil-tight, otherwise the oil will
creep out along any cables leaving the closed vessel (even if they rise
above the oil surface)!

It's not an efficient technology in terms of the amount of coolant used: you
put everything into the oil bath, both the high-energy-density components
(CPU, graphics card, maybe chipsets, fast RAM) and everything else,
regardless of energy consumption. Imagine a full data center: you'd basically
need thousands of cubic meters of pure, constantly filtered oil... For the
good old Cray computers often used as an example it was different: there,
computing was spread out over a vast number of logic gates that deposited
their waste heat across a really large volume.
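A quick back-of-the-envelope sketch of that volume claim (the server count and per-server oil volume below are made-up illustrative figures, not numbers from the article):

```python
# Rough sanity check of "thousands of cubic meters" for a big
# data center. Both inputs are assumptions, not sourced figures.
servers = 100_000          # hypothetical large data center
litres_per_server = 30     # hypothetical oil needed per submerged 1U server

total_m3 = servers * litres_per_server / 1000  # 1 m^3 = 1000 L
print(f"{total_m3:.0f} cubic meters of oil")   # prints: 3000 cubic meters of oil
```

So even with modest per-server volumes, a large facility does land in the thousands-of-cubic-meters range.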

The examples I have seen (case modders being a good example) were also rather
generous with container size and amount of coolant, so that enough circulation
and cooling was provided by natural convection alone. "Professionally", one
would use this technology for a reason, hence in devices packed even more
densely than usual blade centers, and there I expect issues with forced
circulation of warm/hot oil, just as we have hot/cold air distribution
nowadays.

A technology like currently deployed water-cooled systems, where the majority
of heat is collected by water at the concentration points (mainly the CPU) and
air takes care of the (much reduced) remainder, seems much more sensible to me.

~~~
sigkill
This may be only partially relevant, but I'm reading about transformer
protectors and I can see all sorts of things going wrong here. In
transformers, extreme heat causes arcing in the oil and consequently fire
and/or explosion. I don't know how that'd work here.

~~~
mistercow
Arcing is a problem with open air transformers too, and one of the functions
of oil in oil-filled transformers is to prevent arcing. Probably this property
simply breaks down at very high temperatures, but since arcing isn't generally
a problem for computers in the first place, it shouldn't be a problem
regardless of oil temperature.

~~~
lutusp
> Arcing is a problem with open air transformers too, and one of the functions
> of oil in oil-filled transformers is to prevent arcing.

That's part of the reason for oil-filled transformers. The other reason is to
cool the heat-generating parts by relying on a liquid's tendency to create
convection loops between the warmer and cooler areas.

<http://en.wikipedia.org/wiki/Transformer_oil>

A quote: "Transformer oil or insulating oil is usually a highly-refined
mineral oil that is stable at high temperatures and has excellent electrical
insulating properties. It is used in oil-filled transformers, some types of
high voltage capacitors, fluorescent lamp ballasts, and some types of high
voltage switches and circuit breakers. Its functions are to insulate, suppress
corona and arcing, and to serve as a coolant."

~~~
mistercow
Right, that's why I said "one of".

~~~
lutusp
I wasn't being critical, only expanding your point.

------
rdwallis
I built a mineral oil PC a couple of years ago.

I epoxied the motherboard mount from an old computer case to some wood and
then stuck the wood to the inside of a cheap fish tank. I installed the
computer sans hard drive and then poured in 12 liters of mineral oil that I'd
bought from a vet (vets use it as a horse laxative).

It resulted in a perfectly silent system, but because I didn't cover the top
it was also a really effective fly trap. Within a few months there were a
bunch of dead flies floating at the top of the tank. Also, the weight made it
a real effort to move around, and no matter how careful I was, there was
always a mess when I needed to swap out or adjust a part.

Still it was a lot of fun to build and I used it as my primary machine for
about a year.

------
mistercow
It's fascinating that I heard about homebrewers doing this kind of thing
around a decade ago (although I believe they were keeping the hard drives
unsubmerged rather than sealing them up), yet it's only now being considered
"cutting edge" as a commercial option.

~~~
sliverstorm
It's just a reminder that "commercially viable" is not the same thing as
"possible".

~~~
mistercow
Yet I wonder what really has changed to make it commercially viable now.
Pumping fluids around isn't a technology that has advanced significantly,
after all. Is it just that Moore's law has ceased to apply to heat and power
usage? Or is it that CPU manufacturers finally got around to testing, and
falsifying, the long-assumed-true hypothesis that servers need to be kept in
ultra-cool rooms in the first place?

~~~
sliverstorm
I spent some time reflecting on this, and I think it's because the bottleneck
has changed over the years. For a while, I/O was the real challenge for
servers. But hard drives have improved a ton, and many-drive systems have
improved a lot as well. Memory had its day, but its price has plummeted.

Today, the limiting factor everyone is focusing on is core density, and cores
generally produce more heat than any other component aside from high-end
graphics and displays.

Of course, power consumption and thus heat has been continually improving, but
not by orders of magnitude. Core density, on the other hand, _has_ been going
up by a few orders.

As a result, you've got to move more heat, and because density is what's going
up, you've also impaired airflow.

Take this with a grain of salt though, it's just speculation based on what
I've seen & heard.

------
w1ntermute
Reminds me of a BioShock case mod from a few years ago[0]. From what the
creator said, it doesn't even sound that expensive.

0:
[http://www.reddit.com/r/gaming/comments/dmrpv/bioshock_miner...](http://www.reddit.com/r/gaming/comments/dmrpv/bioshock_mineral_oil_media_center_case/)

------
amalag
I really cannot understand why this is not more prevalent. The cost savings of
not having air conditioning for a datacenter should be well worth any mess of
oil.

~~~
ars
There are lots of problems.

First the standard vertical rack wouldn't work - you would not be able to
remove any blades from the rack since they are submerged.

You would not be able to lift the entire rack out of the tub: you might not
have enough height, and worse, the other computers in that rack would
probably burn up from the sudden loss of cooling.

So you would have to switch to horizontal racks, so that you could lift one
blade out without affecting the others. Of course that wastes tons of space.
If you want lots of servers you would need multiple racks stacked one above
the other; but then how would you service the upper ones?

And you are wasting space again - you need air height above each rack.
Normally in a vertical server, the gap you need to remove the server is the
same gap you need to walk there, so no extra wasted space.

And then there are weight issues - a typical 21.5 inch x 19 inch x 8 feet rack
would weigh about half a ton just from mineral oil! (Not including any metal.)
Good luck building something strong enough to hold lots of them, and forget
mounting them one above the other.

All of this can be designed around, but it's not a simple matter - for now you
would need to make it custom.

~~~
PayUpPal
Blades in huge datacenters are not serviced one by one. Typically you'd wait
until say 20% of the blades fail and only then fix and upgrade the whole rack
at once.

------
webwielder
Midas Green Tech, anyone? <http://www.midasgreentech.com>

------
rwmj
According to the article, Intel recommends removing the grease between the
processor and the heat sink. Question: Why does it need a heat sink if it's
being cooled by oil?

~~~
lutusp
> Why does it need a heat sink if it's being cooled by oil?

The heatsink couples the heat source (the electronics) to the oil. The metal
heatsink has much higher thermal conductivity than the oil. The advantage of
the oil is that it circulates and carries heat away to another heatsink that
ultimately dissipates heat to the air.
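To put rough numbers on that conductivity gap (the k values below are textbook figures, and the geometry is a made-up illustration, not from the comment):

```python
# Conductive thermal resistance R = L / (k * A) of a flat slab,
# comparing copper with stagnant mineral oil over a CPU-sized patch.
# k: copper ~400 W/(m K), mineral oil ~0.13 W/(m K) (textbook values).
def thermal_resistance(thickness_m, k, area_m2):
    """Resistance in K/W of a slab of given thickness and conductivity."""
    return thickness_m / (k * area_m2)

area = 0.03 * 0.03                                  # 30 mm x 30 mm patch
r_copper = thermal_resistance(0.003, 400.0, area)   # 3 mm of copper
r_oil = thermal_resistance(0.003, 0.13, area)       # 3 mm of still oil
print(f"copper: {r_copper:.4f} K/W, oil: {r_oil:.1f} K/W")
```

The copper path is thousands of times less resistive, which is why the heatsink stays: conduction gets heat out of the die and into metal, and the oil's job is to carry it away by circulating, not by conducting through a stagnant layer.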

------
elorant
I recently saw a video about a similar implementation that won first place at
an annual conference for PC modders in Germany. It looked pretty awesome; the
whole PC box was immersed in a greenish liquid. You can see it at the end of
the video, after 8:33.

[http://www.youtube.com/watch?v=HMduN09xCgs&feature=playe...](http://www.youtube.com/watch?v=HMduN09xCgs&feature=player_embedded)

------
evoxed
Interesting that Supermicro is already moving toward putting these on the
market. I hadn't heard of them until the other day with the Etsy setup, but
this looks very promising for future datacenters large and small...

------
achristensen
Check out LiquidCool Solutions too. I think this type of technology could very
well be the standard in the future. <http://www.liquidcoolsolutions.com>

------
gary4gar
No risk of short-circuits?

~~~
raverbashing
No. Mineral oil as they're using it, with the required specs, is an even
better insulator than air.

