
Google is investing $3.3B to build clean data centers in Europe - yibllesqueak
https://techcrunch.com/2019/09/20/google-is-investing-3-3b-to-build-clean-data-centers-in-europe/
======
rabscuttler
Bravo to Google for this show of corporate leadership. Their pledge to
purchase renewable energy to match their real-time demand really is
impressive, especially the note that it must create new renewable generation
(discussed in the Google blog announcement, not this article).

It is also prudent, as these long-term power purchase agreements (PPAs) with
new renewable generators will be very competitive, probably much cheaper than
wholesale power prices, and be fixed into the future, avoiding price
fluctuations from e.g. future gas supply shocks.

~~~
cameronbrown
I wonder, would it be feasible for them to use wind power to run
time-insensitive batch jobs in the daytime, and crank down the processing
power on these jobs when there's less green energy around?
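
A minimal sketch of that idea (the function name and the simple linear
policy are my own hypotheticals, not anything Google has described):

```python
# Hypothetical sketch: scale deferrable batch capacity with the share of
# grid generation that is currently renewable.

def batch_capacity(renewable_fraction: float,
                   min_capacity: float = 0.2,
                   max_capacity: float = 1.0) -> float:
    """Fraction of batch-processing capacity to run right now.

    A small floor (min_capacity) keeps time-insensitive jobs from
    stalling entirely when the wind isn't blowing.
    """
    if not 0.0 <= renewable_fraction <= 1.0:
        raise ValueError("renewable_fraction must be in [0, 1]")
    return min_capacity + (max_capacity - min_capacity) * renewable_fraction

# Windy afternoon: run batch jobs near full tilt.
windy = batch_capacity(0.9)    # 0.2 + 0.8 * 0.9 = 0.92
# Calm night: crank them down to the floor.
calm = batch_capacity(0.0)     # 0.2
```

A real scheduler would work against job deadlines and forecast grid
carbon intensity, but the shape of the control loop is the same.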

~~~
cheez
Google has done insane optimization of power consumption at its data centers.
If there is something you can think of, they will either have done it already
or will be trying it.

~~~
Zenst
For many a datacenter, balancing the loads across the three-phase power
supply is one of, if not the, biggest factor in the power bill. Yet for many
a tech, it's an area they're inclined to overlook.

Then there's auditing the quality of the power (clean sine wave vs. noise
spikes), and the factor that many UPSs output a square wave. Though that's
more about the stability, and indeed longevity, of the kit you plug in, which
appreciates those nuances.

It is one big rabbit hole, and I can imagine the likes of Google having
dedicated teams focused just on power optimisation, as the dividends at that
scale more than pay for the team.
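
For a concrete sense of the balancing point, here's a toy check using the
common "maximum deviation from the mean" unbalance formula (the readings
are made up):

```python
def phase_unbalance_percent(loads_amps):
    """Percent unbalance across three phase-load readings (in amps):
    maximum deviation from the mean, divided by the mean."""
    if len(loads_amps) != 3:
        raise ValueError("expected exactly three phase readings")
    mean = sum(loads_amps) / 3.0
    return 100.0 * max(abs(load - mean) for load in loads_amps) / mean

# Racks landed unevenly on phase A: mean is 100 A, worst deviation 20 A,
# so this reports 20% unbalance.
unbalance = phase_unbalance_percent([120.0, 90.0, 90.0])
```

Facilities teams track exactly this kind of number, since sustained
unbalance wastes capacity and can show up in demand charges.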

~~~
bt848
When you deploy a datacenter at this scale you contract for an amount of
power and you're basically obligated to use it. The trick is to get more
compute out of the same amount of power.

By the way, if you still have AC UPS systems, that is probably your #1
problem, and you definitely will not benefit from some fancy machine-learning
thing. Just get rid of most of your AC systems. For example, you may benefit
from adopting the Open Rack 48V-to-point-of-load scheme, which uses an
in-rack DC UPS.

[https://www.opencompute.org/files/External-2018-OCP-Summit-Google-48V-Update-Flatbed-and-STC-20180321.pdf](https://www.opencompute.org/files/External-2018-OCP-Summit-Google-48V-Update-Flatbed-and-STC-20180321.pdf)

~~~
Zenst
Totally - I was thinking more of in-house company data centre affairs: big,
but not your cloud or Google-sized operations. Doing one AC-to-DC conversion,
instead of all that per-system overhead and the associated heat, and
centralising it, pays for itself.

The cut-off for that level of work, I'd say, is when you design your own
servers rather than just speccing at the vendor level; up to that point, it's
still vendor off-the-shelf. Though it's been a while, and it may be an option
some vendors now offer at lower scales these days.

------
nabla9
The Hamina data center is close to Russia without being in Russia. I guess
that's the closest Google is comfortable being while serving Russian
customers.

In the north data centers can double as heating power plants when coupled with
district heating. It's unfortunate that Google puts the excess heat into the
sea. Yandex has a big data center in Mäntsälä. Their cooling water is used for
heating the area. There are also data centers in Helsinki where heat is used
in district heating.

~~~
davedx
Hah, that's amazing. I remember when I did bitcoin mining (only 1 GPU),
noticing it was getting noticeably warmer upstairs in our house and thinking
that the bigger miners could actually use the heat to heat buildings.

Great to hear this is being put into action. Computing -> heat.

~~~
bigbaguette
Following this logic, I often thought electric heaters are a waste of energy
since they perform no computation.

But a French company is having a go at this:
[https://www.qarnot.com/computing-heater_qh-1/](https://www.qarnot.com/computing-heater_qh-1/)

~~~
mehrdadn
Confused, could you explain? A heater that puts its energy into something
other than heating is just being inefficient at heating, right? Shouldn't you
want it to put 100% of its energy into heating?

~~~
Sharlin
To a very good approximation, 100% of the energy used by a contemporary
computer is turned into heat. The amount of negentropy generated by the
computations is utterly dwarfed by the increase in entropy in the computer's
surroundings. Computers are space heaters that happen to do some calculations
as a side effect.
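
Landauer's principle puts a number on this: the thermodynamic minimum for
erasing one bit is k·T·ln 2, about 3e-21 J at room temperature, while real
silicon spends many orders of magnitude more per bit operation (the per-bit
figure below is an assumed ballpark, not a measured value):

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # room temperature, K

# Thermodynamic minimum to erase one bit (Landauer's principle):
landauer_j_per_bit = k_B * T * math.log(2)    # ~2.9e-21 J

# Assumed ballpark for what real hardware spends per bit operation:
actual_j_per_bit = 1e-11

# Fraction of the energy budget the computation itself "needs";
# essentially everything else ends up as heat.
fraction = landauer_j_per_bit / actual_j_per_bit   # ~3e-10
```

So under these assumptions, less than a billionth of the energy is
thermodynamically attributable to the computation; the rest is the space
heater.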

~~~
rightbyte
I wonder if WD and friends do regenerative braking on the hard drives.

~~~
zepearl
:) Regenerative braking when the arm of the HDD starts getting near the target
track.

------
octosphere
Does anyone else find everything that Google does these days aggressively
competitive? As a one-man startup guy, I feel everything I do pales in
comparison to what Google is doing, and I feel nothing more than /eclipsed/
by them. I have found myself wondering "if you can't beat them, join them" on
many occasions. Should I just merge into Google one day and stop my worrying?

~~~
satanspastaroll
You are correct: Google is trying to cover every sector it can, and offers
tools to join it even at a loss. Once it owns everything, it can start
raising prices, dictating web standards, and so on.

~~~
dantheman
Your worst case scenario has never happened.

~~~
satanspastaroll
I'm quite sure a monopoly is not that alien a concept

~~~
dantheman
A monopoly that keeps prices artificially low for years, drives everyone out
of business, and then raises prices to make up for the years of lost profit --
I'd like to hear one example; it's never happened as far as I know.

Most monopolies have been government-granted - railroads through free land,
AT&T by creation, etc.

~~~
satanspastaroll
I believe Rockefeller's Standard Oil was dissolved as an illegal monopoly.

It used its massive, over-90% share of the market (and its vertical supply
chain) to deter competition, and made deals with other suppliers to make the
inertia of competing a herculean task.

~~~
spc476
While a company that controls the majority of a market may be the textbook
definition of a monopoly, it's not the legal definition in the US. US law
recognizes that a company may naturally (and legally) gain a majority market
share (even 100%) and be left alone. It's only when a company with such
control over a market uses it to leverage control over _another_ market that
it becomes illegal under US law.

Standard Oil owned 90%+ of the oil refineries in the US (they never did drill
for oil), yet that wasn't why Standard Oil was broken up. They were broken up
because they were using their influence over oil to control the train
industry, specifically train transport of oil. [1]

Microsoft wasn't threatened with breakup over their operating-system monopoly
on desktop computers, but because they were using _that_ influence in an
attempt to control the web-browser industry.

[1] And ironically, it was the _breakup_ of Standard Oil that made John D.
Rockefeller the richest man in the world. He was also responsible for saving
the world's population of whales. [2]

[2] Half-serious here. Whale oil was big business, until Rockefeller made
petroleum products cheap enough to supplant whale oil as a product.

------
tpmx
Working link:

[https://techcrunch.com/2019/09/20/google-is-investing-3-3b-to-build-clean-data-centers-in-europe/](https://techcrunch.com/2019/09/20/google-is-investing-3-3b-to-build-clean-data-centers-in-europe/)

~~~
M2Ys4U
God I hate Oath's awful privacy modal.

~~~
tantalor
> God I hate Oath's awful privacy modal.

What is "Oath"?

~~~
seszett
It's Verizon's media subsidiary (formed from AOL and Yahoo), and the entity
behind the full-page GDPR consent popups that block access to many websites,
like TechCrunch and Tumblr.

------
sha666sum
Non-shortened url: [https://techerati.com/news-hub/google-3-billion-data-centres-data-centers/](https://techerati.com/news-hub/google-3-billion-data-centres-data-centers/)

------
senko
TLDR from the TechCrunch article:

> This [...] is in addition to the $7 billion the company has invested since
> 2007 in the EU, [...] today’s announcement was focused on Google’s
> commitment to building data centers running on clean energy.

------
toper-centage
Did you know [Ecosia](www.ecosia.org) is carbon negative? Google should
follow suit if they really care.

------
ashildr
Not in the UK I guess...

------
grezql
This is not relevant to the article, but why are there so many full-screen
newsletter popups on American websites?

I have never once entered my email into such a form. The first thing I do
when I see one is close it, and I assume it's a low-budget website or
something.

~~~
josephpmay
Newsletters can be incredibly profitable for media organizations because they
1) get people to return to your content on a regular basis and 2) allow for
more targeted advertising.

I personally hate this trend, but almost everybody's doing it these days.

~~~
umeshunni
and (3) are vastly cheaper than other forms of advertising, since they don't
involve paying any platform provider for the ads.

------
amos19870630
These data centers will collect large amounts of personal data. I believe
that personal-privacy protection needs to receive greater attention in the
future.

~~~
jasonvorhe
This is not on topic.

~~~
spookthesunset
These “omg privacy!”-flavored off-topic posts seem predictably standard for
anything with “Google” in the title.

------
cedivad
Oh, so that's what those Google job listings for datacenter-builders were
for...

------
auslander
I wonder what has a bigger impact on the climate: driving a car daily, or
feeding Google and the adtech industry with all your data, which has to be
stored and constantly processed.

------
chvid
Will these data centers still be subject to mass data collection by the US
government (under arrangements similar to the PRISM program described in the
Snowden revelations)? Does it matter at all that these centers are placed
outside the US?

~~~
bt848
This is basically the biggest nit ever, but the Snowden revelations didn't
reveal anything, and nothing was described in them. Snowden exfiltrated a
bunch of PowerPoints that neither he nor you nor I can understand out of
context. The most supportable conclusion you can glean from Snowden's slide
decks is that US intelligence agencies were able to intercept traffic on
undersea cables.

~~~
1-6
Fiber is encrypted you know...

~~~
jasonvorhe
The links weren't back then. Google had plans to encrypt them but hadn't
started yet; they prioritised it after the leaks. Now they are encrypted.

------
rndgermandude
They might be (a lot) less bad than other data centers, but unless these
Google data centers have a neutral carbon footprint they are not "clean", and
unless they have a negative footprint they are not "environmentally
friendly".

~~~
jonas21
Google's datacenters are already carbon neutral. This is about going one step
further and using locally-generated renewable energy to power the datacenters.

~~~
rndgermandude
No, they are not. They use energy, a lot of it. They want to run their data
centers off of renewable, "carbon-free" energy sources and have made some
great strides towards that. But they aren't there yet.

On top of that, renewables aren't carbon neutral per se, either. And the
hardware you put into those things isn't carbon-free either.

Now you might argue that the services Google is able to provide with them
warrant the environmental costs, but that's another question entirely. And
you might argue that Google did a lot of things to reduce their footprint
(and they did), but it's hardly close to zero, let alone negative.

~~~
breakyerself
Renewables aren't carbon neutral "yet" is how I would put it.

~~~
rndgermandude
Indeed, but the point is, it isn't carbon neutral yet.

------
m0zg
What'd be better for the environment is to not build anything at all, and to
force people to use existing resources more efficiently through higher
pricing, rather than buying more resources to run at 1/30th of the speed a
real programming language would allow. Not a realistic proposition, but let's
not get taken in by the narrative that a multi-megawatt datacenter can
somehow be as green as they claim, once you consider the downstream effects.

------
badrequest
At Google Next in April, I attended a talk by the C-level in charge of
Google's datacenters, and they bragged about using machine learning to manage
and anticipate their power draw, and about the efficiencies that brought.
They also bragged about how hard they're trying to make these centers run on
renewables.

When an audience member asked if they would be releasing the models that help
them manage power, the speaker quickly changed the topic, which I felt told
me all I needed to know about the sincerity of their motivations.

~~~
dxbydt
I did a bunch of work on managing/anticipating the power draw. (
[https://news.ycombinator.com/item?id=13747481](https://news.ycombinator.com/item?id=13747481)
). Something nobody talks about is how grossly wasteful & inefficient Hadoop
is. There's a bunch of cited papers
([https://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.58...](https://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.580.5685&rep=rep1&type=pdf)
) that study power consumption of Hadoop nodes, but nothing on how to actually
reduce the consumption.

So here's something you can personally do to make a dent in this problem.
Most of the Hadoop jobs that you write will involve some statistical summary
over a dataset: find the total, or the mean, median, 90th quantile, whatever.
Writing a Hadoop map-reduce job is the single worst way to do this. Almost
always, you can sample, say, 1000 points, get a kernel density estimate via
Parzen windows & then use a table. All quantiles, order statistics, functions
of order stats... these can be hand-computed for several univariate
distributions & their location-scale families, and your real-life data can
easily be bounded above and below by these estimates. So you can get to >95%
accuracy just by hand-calculating. I can go into details if you like, but I
suspect most of you already know how to do this.
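
To illustrate the sampling half of the idea (this toy uses a plain
empirical quantile on the sample rather than the full Parzen/KDE
machinery, and made-up lognormal data standing in for a real dataset):

```python
import numpy as np

rng = np.random.default_rng(42)

# Stand-in for a dataset you'd otherwise scan with a full map-reduce job.
data = rng.lognormal(mean=3.0, sigma=1.0, size=1_000_000)

# What the full job would compute:
exact_p90 = np.quantile(data, 0.90)

# Estimate the same quantile from a sample of just 1000 points:
sample = rng.choice(data, size=1000)
approx_p90 = np.quantile(sample, 0.90)

rel_err = abs(approx_p90 - exact_p90) / exact_p90
# The sampled estimate typically lands within a few percent of the exact
# answer, at a vanishingly small fraction of the compute.
```

The KDE/table step buys extra accuracy and smoothness on top of this, but
even the crude sample quantile already avoids touching 99.9% of the data.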

~~~
smueller1234
You're exactly right. Hadoop and its ecosystem are... a very poor replica of
the system whose predecessor the respective Google papers were written about.
There are glaring efficiency issues, like not supporting more complex
encodings (last I used HDFS, at least, it could only do full replication), or
the query engines on top not implementing sampling in their aggregates out of
the box, which the Google tools do.

