
Too Hot for Humans, But Google Servers Keep Humming - cd34
http://www.datacenterknowledge.com/archives/2012/03/23/too-hot-for-humans-but-google-servers-keep-humming/
======
twelvechairs
This is a very interesting case. As an architect, though, I can't help
putting in my 2 cents: cooling isn't simply a binary choice.

Firstly, the best thing to do environmentally would be to use the heat
positively, connecting it into a centralised heat-distribution network for
other users (for instance, other industries, public heated pools, or even
residences).

Secondly, even without power, there is a lot you can do to maximise the
heat loss of a building: e.g. design to capture prevailing winds and
utilise them to help heat escape, or internal arrangements to keep cooler
'microclimates' around areas used by maintenance staff and
temperature-critical server elements. I am not sure what the design
process for these buildings is, but I would bet that this sort of design
doesn't get much thought.

Thirdly, you can also actively cool buildings without 'chillers'
(refrigeration units), such as pumping earth-cooled water through the building
(as 'chilled beams'). This is actually likely to be quite cost-effective for
large projects like this, and much more environmentally friendly than burning
coal.

~~~
Drbble
Why would you bet that professionals with huge budgets and cost-conscious
customers wouldn't think about how to solve the problem they're paid to
solve? If they don't, you could be a billionaire by selling your ideas.

~~~
twelvechairs
I think the big clients (Google, Facebook, et al.) are getting smarter
very quickly now (see for instance cd34's link above). It also requires
not just one person, but a whole team of smart architects, engineers and
environmental experts to figure this out for every individual server
centre.

Also, some of the big successes (heat sharing, using water from outside
sources, etc.) require government involvement, which can be a minefield
in itself (note: Google seems to have much better success with these
government schemes in Europe than in the USA).

------
tripzilch
> During these periods, the temperature inside the data center can rise above
> 95 degrees.

:-/ 95 degrees _what_? They don't use Fahrenheit in Belgium, you know (nor
does most of the rest of the world, for that matter).

For anyone wondering, 95F is exactly 35 degrees Celsius.
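
The conversion is simple arithmetic, for anyone who wants to check it
(a quick Python sanity check):

    # Standard Fahrenheit-to-Celsius conversion: C = (F - 32) * 5/9
    def f_to_c(f):
        return (f - 32) * 5 / 9

    print(f_to_c(95))  # 35.0, exactly 35 degrees Celsius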

Also, it would probably have been more relevant to mention the equivalent
Belgian and EU institutes instead of OSHA ... seriously, I don't expect
for a second that the regulations would be the same. Roughly, maybe, but
it just doesn't make sense to mention OSHA at all if the servers aren't
in the USA!

~~~
sneak
As someone who spent his first pair of decades in the USA, it's easy to forget
that most Americans view the whole world as if it were the USA. It's a habit
that seems normal in that environment.

They also frequently say "the world" when they mean "the whole country".

------
DanielBMarkham
Is it just me, or did this article strike a dystopian tone? Computer
centers of the future, perhaps achieving truly useful AI capabilities,
consisting of unimaginable numbers of racks sweltering in heat so intense
that people can't survive in the same room with them? Sounds like
something Dante would have written if he were alive in 2012.

~~~
karamazov
I don't think so - to me, the tone seems informative: "hey, did you know that
data centers might not need all that cooling? here's an example."

Plus, humans can work for about 80-90 minutes at close to 100 F (albeit not
comfortably)[1], so we're not really talking about conditions that would kill
people. One can extrapolate, of course, but it's quite possible that the
servers would break down before the humans do.

[1] <http://www.ncbi.nlm.nih.gov/pubmed/9183079>

~~~
nullflux
We also seem to get along fine in space, under water, in low oxygen
environments, in irradiated environments, in freezing environments...

The point of this is: who cares if the data center is warm/hot? It's just
as hot inside a mascot suit at a theme park, so they build cooling
mechanisms into the suits. Tools exist for a reason, and it is somewhat
ironic that we prematurely assume that our creations require the same
environment for "comfortable" survival that we do. The animistic
tendencies we have may not just be cultural.

~~~
sliverstorm
It's not like we get along fine in space et al because we put up with
discomfort. Have you actually considered your examples? Space is survivable in
part because the suit keeps you warm. Under water is much the same way; stay
in the water long enough and you need a drysuit or wetsuit to stay alive.
Freezing environments? You wear seal furs.

The point is, talking about how we can survive in freezing environments does
NOT demonstrate that climate control (or rather, local microclimate control
around your body) is just creature comfort.

~~~
nullflux
I guess I didn't make my point clearly. What I was saying was: "Who cares
what the temperature in the data center is? We'll just put some guy in a
cold suit if that's more efficient." Humans usually adapt when it's in
their favor to do so.

------
amalag
I wonder how much money companies waste by keeping their server rooms as cold
as a refrigerator. Some IT guy who thinks servers should be cold probably told
them that.

~~~
Zaak
For a long time it was common knowledge that computers last longer if you
keep them cold. Only in the past few years has evidence emerged that
moderately high temperatures do not appreciably shorten equipment life.

~~~
jordanb
It's also conceivable to me that it used to be more true in the past than it
is now. Computer equipment used to be incredibly fragile.

~~~
reitzensteinm
Also, heat generated per server was rising until recently.

Since air conditioning cost is proportional to heat generated, it could
well have made financial sense in the 80s and 90s to aggressively cool
servers. Not nearly as much heat had to be displaced, and the cost of
each server was higher.
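
As a rough sketch of that proportionality (every number below is an
illustrative assumption, not from the article), the modern PUE metric
makes the overhead easy to estimate:

    # Rough annual cooling/overhead cost using the PUE metric
    # (power usage effectiveness = total facility power / IT power).
    it_load_kw = 1000        # IT equipment draw, kW (assumed)
    pue = 1.5                # assumed PUE
    price_per_kwh = 0.10     # electricity price, USD (assumed)

    overhead_kw = it_load_kw * (pue - 1)   # cooling + other overhead
    annual_cost = overhead_kw * 24 * 365 * price_per_kwh
    print(f"${annual_cost:,.0f} per year")  # $438,000 per year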

From then on, institutional inertia and infectious repetitis (a brilliant
term coined by Amory Lovins) kick in, and it just becomes The Way data
centers are built.

Until a bunch of cool kids like Google and Facebook show up on the block
who don't care about the old rules quite so much, and start to question
the conventional wisdom.

~~~
DanBC
Energy costs have risen quite a lot. Cooling is a lot more expensive.

Google (see links elsewhere in this thread) have a lot of equipment, and a lot
of data about that equipment, and a lot of smart people to turn that data into
information.

------
bengl3rt
Did anyone else think of the scene in Sunshine where Chris Evans gets stuck
under the computer that's being lowered into fluid and freezes to death?

The mechanism for making "the hot aisle" temporarily habitable fails, you
get stuck on/in something and can't get out in time... Life imitates art.

~~~
cperciva
I think you're overstating the dangers a bit. A hot aisle might be
uncomfortable and not a place you'd want to spend many hours, but it isn't
going to kill you quickly. If by some series of amazing coincidences you
managed to get stuck in one with no way to call for help, you could just start
unplugging servers until someone came to investigate and rescue you.

~~~
cd34
Years ago a data center outside DC in Silicon Alley had an A/C failure. Our
equipment was reporting temps of 140F on intake. We were allowed 10 minutes
in, one person doing work, another observer, both required to have squirt
water bottles and a two-way radio. We were instructed not to carry anything
and to use a cart. We were told not to remove equipment that weighed more than
a few pounds.

I pulled 10 drives so we could run them to another data center as a 'just in
case'. The guy I was with passed out about 60' from the door. The air temp
outside was ~72F and they had portable hurricane fans blowing air into the
data center.

In that ten minutes, I drank 32 ounces of water. I remember that because I
went to pick him up, kicked my water bottle by accident and it fell over
because it was empty.

He spent 30 minutes in an ambulance and about three hours in a hospital
from 10 minutes' exposure. It took me about two hours to recover to the
point where I felt I could eat.

Don't lose respect for heat. :)

~~~
dennisgorelik
Saunas run at 70C - 100C.

<http://en.wikipedia.org/wiki/Sauna>

You had only 60C (140F).

Fainting after just 10 minutes at 60C seems too fast.

BTW, the most likely reason your friend spent time in the ambulance and
hospital is that your employer wanted to be on the safe side, not that
your friend really needed it.

~~~
sliverstorm
The air in a sauna is very very humid, right? And the air in a server room
will typically be very very dry.

In a hot dry climate, you do lose water much faster.

~~~
phamilton
I don't think that's right.

When overheated, you sweat. Sweat leads to evaporation, which leads to your
skin cooling.

In humid environments, our sweat does not evaporate as well. This leads to
less cooling, which leads to more sweating and thus more water loss.

~~~
marvin
Anyone who's been in a sauna can verify this. High humidity feels hotter.
You can demonstrate this by throwing water onto the sauna heater: the
sauna will instantly feel a lot hotter.
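
The standard NWS heat-index regression puts rough numbers on this. A
minimal sketch in Python (the Rothfusz fit is only meant for roughly 80F
and above, and the coefficients are the commonly published ones):

    # NWS Rothfusz heat-index regression: apparent ("feels like")
    # temperature from dry-bulb temperature (deg F) and relative
    # humidity (%). Valid roughly for T >= 80F.
    def heat_index(t, rh):
        return (-42.379 + 2.04901523*t + 10.14333127*rh
                - 0.22475541*t*rh - 6.83783e-3*t*t
                - 5.481717e-2*rh*rh + 1.22874e-3*t*t*rh
                + 8.5282e-4*t*rh*rh - 1.99e-6*t*t*rh*rh)

    print(heat_index(95, 20))  # ~91F: dry air feels cooler than it is
    print(heat_index(95, 80))  # ~134F: humid air feels far hotter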

------
Smudge
Did anyone read the last two paragraphs? Seems the article wasn't quite done
being edited:

> Before entering the hot aisle, a technician uses a supply trigger, typically
> a switch located outside the hot aisle, to activate the SmartAire T units.
> Cool air then enters the hot aisle until a comfortable temperature is
> established. SmartAire T units maintain this temperature until the
> technician completes the assigned work and deactivates the units,
> eliminating any need for rest periods and increasing productivity.

> Before entering the hot aisle, a technician can use a supply trigger –
> typically a switch located outside the hot aisle – to activate the SmartAire
> T units. Cool air then enters the hot aisle until the temperature reaches a
> comfortable level.

------
jballanc
I honestly have to wonder why we don't have robots doing the majority of these
tasks. Obviously, a robot couldn't troubleshoot detailed or obscure problems,
but for things like provisioning a server, rerouting around bad network
hardware, replacing drives, NICs, etc. they should be good enough. CERN
already uses robots for the LHC tape backup system. I can't see that this
would be that much more complicated.

~~~
adpowers
Humans are very easy to train compared to robots, and aren't that expensive.
I'd imagine each facility is laid out slightly differently, with different
generations of hardware. To make the robot work well you'd either have to
program in each hardware variation or make the software sufficiently advanced
to understand the differences itself and adapt. I imagine the cost/benefit
wouldn't pan out when you look at how many datacenter techs you replace with
how many software and hardware engineers.

Humans are great at adapting, so if you throw a new enclosure at them or a new
motherboard design which has the CPU sockets in a different location, they'll
be able to learn the new system in 30 minutes.

A former coworker of mine trained to work with factory robots, but he realized
that there wasn't much of a current job market for it because humans are still
way cheaper (and will be for some time). Instead he went into software.

------
awongh
I can't help but think about the logical extreme of this type of trend,
which is to put the server rack itself outside. What components would you
need to weather-seal or protect against the elements? There are fans and
disks with moving parts, and cable connections you would have to seal,
but beyond those changes what would you have to do? Maybe it would be
even more efficient...

------
antonyme
One aspect of this approach they don't mention is how it affects the MTBF of
the equipment. Many components suffer increased failure rates when run at a
consistently higher temperature. While it may be relatively safe to run
machines in a range higher than is comfortable for humans, I bet there's an
upper limit past which components really start to fail in numbers.
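
The usual first-order model for that temperature sensitivity is the
Arrhenius acceleration factor. A minimal sketch (the activation energy
here is an assumed typical value for silicon, not measured data):

    import math

    K_BOLTZMANN_EV = 8.617e-5  # Boltzmann constant, eV/K

    # How much faster components age at an elevated temperature,
    # per the Arrhenius model. Ea = 0.7 eV is a commonly assumed
    # activation energy; treat it as illustrative.
    def acceleration_factor(t_use_c, t_stress_c, ea_ev=0.7):
        t_use_k = t_use_c + 273.15
        t_stress_k = t_stress_c + 273.15
        return math.exp((ea_ev / K_BOLTZMANN_EV)
                        * (1 / t_use_k - 1 / t_stress_k))

    print(acceleration_factor(20, 35))  # ~3.9x faster aging at 35C vs 20C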

~~~
mvgoogler
We've published a few papers about this that studied (among other things) the
effect of temperature on DRAM and hard-drive failure rates. We watch our
machines pretty closely. If you're a data geek, Google's a fun place to be.

Disk Study: <http://research.google.com/archive/disk_failures.pdf>

DRAM Study: <http://www.cs.toronto.edu/~bianca/papers/sigmetrics09.pdf>

~~~
antonyme
Very interesting! Thanks for the links, I'll check out the papers.

------
1point2
In the picture it sure looks like a row of evaporative coolers, though
there was no mention of them in the article, so maybe I'm wrong.

~~~
cd34
Free-air cooling uses evaporative coolers; they are not running chillers.
The Belgium facility also has a water treatment plant bringing in water
from an industrial canal for its coolers, and thus runs on 100% recycled
water.

In Finland, they run heat exchangers with water from the Baltic Sea, again, a
chiller-less data center.

Document from Google that talks about their cooling:
<http://www.google.com/about/datacenters/inside/efficiency/cooling.html>
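
For the curious, a direct evaporative cooler can only push supply air
toward the outdoor wet-bulb temperature. A minimal sketch of the standard
effectiveness relation (the numbers are illustrative assumptions, not
Google's):

    # Direct evaporative cooling: supply temperature approaches the
    # wet-bulb temperature, scaled by the cooler's effectiveness
    # (0.85 is a typical assumed value for a good media cooler).
    def supply_temp_c(t_dry, t_wet, effectiveness=0.85):
        return t_dry - effectiveness * (t_dry - t_wet)

    print(supply_temp_c(30.0, 18.0))  # 19.8C supply air on a 30C day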

------
DanBC
I'm interested in how much "efficiency" can be built into new data centres.

Is it possible to use the waste heat for other purposes? Would painting the
roof white make any difference?

~~~
mjwalshe
At one of BT's datacentres one hot summer, we played water hoses on the
roof to increase cooling via evaporation.

------
ajasmin
If it gets too hot for humans, can't we use robots instead?

~~~
justincormack
The other option is not to fix stuff, and just leave failed hardware in
place. Various people have done this; it all depends on costs.

~~~
abalashov
Hope that mindset does not come to pervade the nuclear power industry. :-)

------
PaulHoule
These guys are wimps.

I used to work in an academic building with an ancient HVAC system in
which temperatures in my office would soar to 105 degrees in April,
between having no AC, sun shining in the big windows, plus the heat
dissipation from 5 humans and 20 computers. My desk included a Dell
Windows machine, two old IBM machines running Linux, a Linux laptop, a
Sun Ultra 10, a "Pizza Box" 32-bit SPARC machine, and two Sun Rays.

(Even if they couldn't charge enough tuition to get a decent A/C system, at
least I had access to a pool of last year's hardware.)

We never evacuated... You see, that's why the U.S. is #1 -- people in any
other country would bail out at 95, but we stay the course. ;-)

