
Google throws open doors to its top-secret data center - Hurdy
http://www.wired.com/wiredenterprise/2012/10/ff-inside-google-data-center/
======
fourspace
I had the pleasure of helping to build and manage these facilities, both
hardware and software, for 5 years. It's nice to see some of Google's real
innovations reach the public eye. Some of the smartest folks I ever worked
with at the company build absolutely mind blowing tech that the outside never
has the opportunity to see or appreciate.

In fact, while much of the content in the article has been written about
before, it's still probably 2-3 years or more behind where Google actually
is. I left in 2010 and didn't read about anything I had not experienced.

~~~
jedberg
> In fact, while much of the content in the article has been written about
> before, it's still probably 2-3 years or more behind where Google actually
> is. I left in 2010 and didn't read about anything I had not experienced.

What if they just plateaued and didn't really go beyond what you had done when
you were there, and this is totally accurate to today?

~~~
mvgoogler
As someone who just recently transferred out of Platforms[1] to another group
at Google, I can assure you that the technology has not plateaued.

[1] Platforms is the group at Google that designs and builds the technology
that goes into the data-centers.

~~~
enneff
I concur. The platforms team is the only group at Google that consistently
amazes me quarter after quarter.

------
sounds
Single page article: (note: HN guidelines suggest always submitting the
single-page article)

[http://www.wired.com/wiredenterprise/2012/10/ff-inside-
googl...](http://www.wired.com/wiredenterprise/2012/10/ff-inside-google-data-
center/all/)

~~~
ImprovedSilence
There should be a guide of hacks for how to view articles on a single page.
Yeah, Wired is easy, there's a link, but often it calls for a little php
knowledge, or viewing in "print" mode, or something random. It's not always
completely obvious, but a list of tricks to try might be handy...

~~~
dailo10
Have you tried the "Auto Pager" Chrome extension? It's pretty handy...

------
DanBC
It's a shame that heat is just dumped outside most of the time.

EDIT:

The article talks about Google's impressive technical achievements. But
there's a lot of energy that's wasted in industry. I don't mean "used
inefficiently" (although that's bad too); I mean actually wasted.

I used to work at a tiny electronic sub-contracting factory. The morning shift
would arrive, turn on the air compressor (2 KW), the reflow ovens (10 KW and
12 KW); and the other machines (about 7 KW).

But they'd do that even if the machines were not going to be running. All
these KW were being used for no reason at all. And the machines are pretty
inefficient anyway. (One of the owners thought powered machines looked more
impressive. Energy costs were included in the rent so there was no incentive
to think about when the machines were on or off. )

Counting that waste across all the tiny factories in the world, and including
all the waste in offices - it's quite a lot.

~~~
philip1209
I've often wondered if Google would heat its headquarters with a data
center...

~~~
dredmorbius
Few office buildings require heat, even in cold-weather climates. The residual
heat from lighting, office equipment, bodies, etc., generally has to be
_removed_, not augmented.

Not always the case, but I can assure you that in California, office heating
demands are very, very low.

~~~
jrockway
I heard secondhand that this is even the case in Chicago in the winter. (In my
apartment in a highrise there, I never ran my heat. The building was naturally
75 degrees. On some days, I even turned on the AC to get it down to a more
comfortable 72!)

~~~
dredmorbius
Why not open a window (assuming they open)?

~~~
jrockway
The building was pressurized such that opening a window only resulted in air
leaving through the window.

------
VikingCoder
<http://www.google.com/about/datacenters/inside/streetview/>

~~~
knowaveragejoe
Amusing easter eggs throughout. One guy has rick roll videos playing on both
his computers.

~~~
svdad
Love the storm trooper.

~~~
dredmorbius
Not very effective security considering the R2 unit rolling past him.

~~~
jrockway
That wasn't the droid he was looking for.

------
rpearl
There are some photos, such as
[https://www.google.com/about/datacenters/gallery/images/_300...](https://www.google.com/about/datacenters/gallery/images/_3000/IDI_018.jpg)

I wonder why they've mirrored the image (the left side is quite clearly the
right side flipped--take a look at the machine identifier labels). What's
being hidden?

~~~
reledi
The blue LEDs in the picture you linked to indicate that the servers are
running smoothly. [1] It's possible that some servers were faulty at the time
the picture was taken. More likely, it's to make the image look perfect.

[1] <http://www.google.com/about/datacenters/gallery/#/all/12>

------
ims
Single page version: [http://www.wired.com/wiredenterprise/2012/10/ff-inside-
googl...](http://www.wired.com/wiredenterprise/2012/10/ff-inside-google-data-
center/all/)

------
javajosh
Hi Google Platform people. Very nice work. As you may know, Randall Munroe (of
xkcd fame) has recently started a feature called "What If" on his site. I
would like to pose a question to you along those lines:

What if Google were tasked with building an orbiting datacenter? How about a
Dyson ring, or sphere? How would you do it?

If we were to use all matter in the solar system for commodity linux hardware,
how much gmail storage would I get? How many flops? And what sorts of
computation could you do on this monster?

Please answer! This should be fun...

~~~
alxv
Space would be a terrible environment for building a datacenter. The main
goals of a datacenter are to make computation as cheap, as fast, and as
reliable as possible. Having the datacenter orbit the Earth would not help us
accomplish any of these goals.

First off, building a datacenter in space would not be cheap. It costs around
$25,000 to send a kilogram of equipment into geostationary orbit. [1] So let's
assume we were to use the Dell PowerEdge C1100. Each server costs $14,000 and
weighs 18 kg. [2] This means that for each server sent into orbit, you could
buy 32 extra ones on Earth.
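The back-of-envelope works out like this (a sketch, using only the launch and
hardware figures above):

```python
# Launch economics of one server to GEO, per the figures cited above.
LAUNCH_COST_PER_KG = 25_000  # USD per kg to geostationary orbit [1]
SERVER_MASS_KG = 18          # Dell PowerEdge C1100 [2]
SERVER_PRICE = 14_000        # USD per server

launch_cost = LAUNCH_COST_PER_KG * SERVER_MASS_KG  # $450,000 just to launch it
extra_servers_on_earth = launch_cost // SERVER_PRICE
print(extra_servers_on_earth)  # 32
```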

Then there is the issue of cooling. Although outer space is really cold, its
vacuum prevents the heat generated by the machines from being dissipated
quickly. Controlling the temperature of such a datacenter would be a very
interesting engineering challenge.

And then how would you power this datacenter? Converting the excess heat back
into electricity could be an interesting option. But most likely, it would
need a lot of solar panels. This would make the datacenter cheap to run once
built, but the upfront costs would be enormous.

And we haven't talked about speed and reliability yet. Since the signal would
need to travel about 35,000 km from the geostationary orbit to reach us,
communications between Earth and the datacenter would have significant delays.
Even at the speed of light, the minimum round trip time would be about 250
milliseconds if we ignore all other possible sources of delay.
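The latency floor is easy to sanity-check (a sketch; uses the standard GEO
altitude of ~35,786 km rather than the rounded 35,000 km above):

```python
# Minimum round-trip time to a geostationary datacenter: the signal must
# travel up and back down at the speed of light, ignoring all other delays.
C = 299_792_458           # speed of light in vacuum, m/s
GEO_ALTITUDE_M = 35_786_000

rtt = 2 * GEO_ALTITUDE_M / C  # seconds, one request/response round trip
print(round(rtt * 1000))       # ~239 ms
```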

The hostile space weather would also make it pretty hard to run servers
reliably. Radiation would destroy electronics, cause bits to flip randomly,
and do all kinds of fun stuff to the equipment.

But... anyhow! Let's assume anyway that by some magical work of science and
Google engineering, we figure out ways to manufacture a datacenter directly in
space for almost nothing by mining the Moon, discover some amazing
thermoelectric generators with near-100% efficiency, and build space shields
that block almost all radiation.

So back to our previous example: a high-performance PowerEdge gives us up to
about 300 GFLOPS of computing power, 192 GB of RAM, and 12 TB of storage.

Now if we were to convert the total mass of the Moon (7.34767309 × 10²² kg)
into one monstrous datacenter, this would give us about 4.0 × 10²¹ servers.
That would give us a whopping 1.2 billion yottaFLOPS (or, put differently,
1.2 × 10³³ FLOPS) of compute madness, 0.8 billion yottabytes of RAM, and 49
billion yottabytes of storage. This monster would consume the equivalent of
about 1% of the Sun's total power output.
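For anyone who wants to check the madness (a sketch using only the per-server
specs above; decimal GB/TB assumed):

```python
# Convert the Moon's mass into PowerEdge C1100s and total up the specs.
MOON_MASS_KG = 7.34767309e22
SERVER_MASS_KG = 18
GFLOPS_PER_SERVER = 300
RAM_GB_PER_SERVER = 192
STORAGE_TB_PER_SERVER = 12

servers = MOON_MASS_KG / SERVER_MASS_KG                      # ~4.1e21 servers
flops = servers * GFLOPS_PER_SERVER * 1e9                    # ~1.2e33 FLOPS
ram_yottabytes = servers * RAM_GB_PER_SERVER * 1e9 / 1e24    # ~0.8 billion YB
storage_yottabytes = servers * STORAGE_TB_PER_SERVER * 1e12 / 1e24  # ~49 billion YB
```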

[1]:
[http://www.futron.com/upload/wysiwyg/Resources/Whitepapers/S...](http://www.futron.com/upload/wysiwyg/Resources/Whitepapers/Space_Transportation_Costs_Trends_0902.pdf)
[2]:
[http://www.dell.com/us/enterprise/p/poweredge-c1100/pd#TechS...](http://www.dell.com/us/enterprise/p/poweredge-c1100/pd#TechSpec)

~~~
Axsuul
The heatsink could be exposed to the space environment for cooling.

~~~
henrikschroder
A heatsink works because there is some sort of medium that absorbs heat from
the sink and carries it away. On Earth, we use air for this, sometimes with
the help of a fan.

If you stick a heatsink on equipment in space, there's no air to move the heat
away, since space is mostly empty. You'll bleed off some heat through infrared
radiation, but that's not going to be enough.

~~~
Axsuul
That's right, oops!

------
mseebach
It's a nice piece, but nothing new in it, and most certainly no doors were
thrown anywhere.

~~~
Hurdy
Well, it's the first time that there are photos from inside. There's a bigger
gallery here: <http://www.google.com/about/datacenters/gallery/>

~~~
bruceboughton
That link seems to throw Chrome 15 into an infinite redirect loop (without
cutting off after n redirects).

Maybe they don't want you to see inside.

~~~
Hurdy
I'm really curious: What's the reason that you are still using Chrome 15?

~~~
bruceboughton
Corporate firewall blocks auto-update channel.

~~~
eco
It's pretty amusing that the corporate firewall is making your system far less
secure. There have most likely been hundreds of security issues fixed since
Chrome 15.

------
Tipzntrix
They have a team causing water leaks and stealing hardware to test their
disaster recovery. That is some serious penetration testing.

~~~
riobard
Isn't that just a simulation, not for real?

~~~
Tipzntrix
From the tone of the article, it doesn't seem so. They did say the protest was
placated with imaginary pizza though. Hmm, like that would work :P

------
seiji
Google should one-up Amazon and get into the Datacenter As A Service market.
Service segments: normal cages (I'd rather lease cages from Colorful Pipes,
Inc than Equinix), pay-n-go turnkey same-hardware in 3 georedundant locations,
and lease-by-rack in multiples of 10 pre-populated racks (racks specified as
compute-only or storage-only with 10G interconnects between racks).

~~~
cracell
It's doubtful that they could directly compete as is. Amazon has done well
with their services because they eat their own dog food. From everything I
read Bezos basically forced them to build this system and consume it for
Amazon's own needs. Google has never taken this approach with their APIs and
the difference shows very clearly when you consume these products.

~~~
sses
I assume you're referencing stevey's rant, and if so you're conflating two
issues. Steve was talking about the use of APIs between services at the
application level, not the API for the datacenter/cloud as a platform.

~~~
seiji
I think cr is talking about how you get "baby bigtable" and "baby cloudscale"
that are copies of internal services, but not what core platforms use
themselves.

------
brown9-2
In regards to the disaster testing:

 _How did Google do this time? Pretty well. Despite the outages in the
corporate network, executive chair Eric Schmidt was able to run a scheduled
global all-hands meeting. The imaginary demonstrators were placated by
imaginary pizza._

How does one decide what will placate imaginary demonstrators? Who calls them
off?

~~~
ubercore
For the purposes of tests like that, they probably just wanted to see that
"reasonable action was taken", which will (hand-waving) probably take care of
most instances of that type. In the event of real demonstrators, it would just
be the opening salvo of damage control, but it's too hard to predict how a
crowd of angry people would react past the first move.

------
Loic
I'm starting to get annoyed with the "a power efficiency of 2 is the standard
in datacenters" claim. My servers are hosted in a datacenter with a global
efficiency of 1.15, proven after more than a year in operation. Announcing
that Google is doing 1.2 is simply announcing something wrong, and I suppose
Google is very happy with this number being provided to the press. It means
that some competitor will use it as "Google is the best, they do 1.2, we are
at 1.3, we are not too bad", where I bet Google is now near 1.1 or less (they
operate without cooling in Belgium, for example).

~~~
rryan
[http://www.google.com/about/datacenters/efficiency/internal/...](http://www.google.com/about/datacenters/efficiency/internal/index.html)

You're right -- those PUE numbers from the article were talking about their
PUE at the time. Google's 2012 average PUE across all facilities was 1.12/1.13
with a minimum PUE of 1.09/1.10.

Also, Google puts enormous care into the process of calculating PUE, since
it's kind of a black art and if you aren't careful you'll leave out some
aspect of your operation that will mislead you into thinking your PUE is lower
than it is.
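For reference, the metric itself is just a ratio (a sketch with made-up
numbers, not Google's):

```python
# PUE = total energy entering the facility / energy delivered to IT equipment.
# A perfect facility would be 1.0; everything above that is cooling, power
# conversion losses, lighting, etc.
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    return total_facility_kwh / it_equipment_kwh

# e.g. 1,120 kWh drawn for every 1,000 kWh reaching the servers
print(pue(1120, 1000))  # 1.12
```

The "black art" part is deciding what counts in the numerator and denominator
(e.g. whether substation losses or shared office space are included).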

------
francov88
Really cool article - it would be amazing to walk through that facility...
love the Google-coloured pipes in the pictures.

------
stock_toaster
It is unfortunate (for the rest of us) that datacenter tech is such a
competitive advantage for Google. If they were able to share their
breakthroughs more readily with others, imagine how much less of the "1.5% of
all power globally" datacenters could be using.

------
dredmorbius
When they say that supercomputing is essentially a plumbing problem ...
looking at these photos, no kidding.

------
bhauer
All caveats about chrome-dev aside, I find it amusing that this site's
navigation does not work in Chrome v24.0.1297.0. Had to use Aurora to view it.

Maybe Google really is Sun v2 ("We are the dot in dot-com" == "Where the
Internet lives").

------
wilfra
Good read, but most of that is not new information. I read a lot of it in a
book about Google over two years ago. The last ~1 page was new, though.

~~~
waqf
Officially announcing things that "everybody knows" already can still make a
difference. It means that you can ask Google executives about those things in
public appearances and they'll at least acknowledge the question even if they
refuse to answer substantially.

------
Fando
An incredible article!

------
no_script
Seriously. I can't believe they require JavaScript to view all this eye-candy
and server porn.

<http://www.google.com/about/datacenters/gallery/>

I thought GWT was designed to "compile" rendered pages for a wide variety of
browsers and permutations of configurations?

The pictures are very pretty, but that's really awful of them to release a PR
site like this, and force users into using JavaScript.

Unforgivable.

