
DeepMind in “very early stage” talks with National Grid to reduce UK energy use - rbanffy
https://arstechnica.co.uk/information-technology/2017/03/deepmind-national-grid-machine-learning/
======
sweezyjeezy
The DeepMind datacenter project was very interesting, but a lot of the ML
people I spoke to were quite dubious about how much of it was genuinely down
to new AI/neural networks, and how much of it was Google PR to justify how
much they spent on DeepMind.

> DeepMind trained a neural network to more accurately predict future cooling
> requirements, in turn reducing the power usage of the cooling system by 40
> percent.

But when you look at the DeepMind blog post
([https://deepmind.com/blog/deepmind-ai-reduces-google-data-
ce...](https://deepmind.com/blog/deepmind-ai-reduces-google-data-centre-
cooling-bill-40/)), it looks like the 40% figure is measured against a baseline
of doing nothing. So the question is: is this really something you need an AI
research powerhouse like DeepMind for, or is it something a regular data
science team could do?

~~~
wheelerwj
I was going to say, I feel like this could be done with Excel and some
historical data combined with weather forecasting.

That being said, even if ML gets you a 0.5% advantage over spreadsheet math,
that is a non-trivial amount of savings on a national scale.
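To illustrate the kind of "spreadsheet math" being described: a minimal sketch of a regression baseline, fitting cooling/energy usage against temperature on made-up historical data. All numbers here are hypothetical, purely for illustration:

```python
import numpy as np

# Hypothetical historical data: daily mean temperature (degrees C) and
# cooling energy use (MWh). Real data would come from meters and forecasts.
temps = np.array([5.0, 10.0, 15.0, 20.0, 25.0, 30.0])
usage = np.array([12.0, 15.0, 21.0, 30.0, 41.0, 55.0])

# Fit a simple quadratic: cooling load grows faster than linearly with
# temperature, but any low-order polynomial works as a crude baseline.
coeffs = np.polyfit(temps, usage, deg=2)

def predict_usage(temp_forecast):
    """Predict energy usage from a temperature forecast (MWh)."""
    return float(np.polyval(coeffs, temp_forecast))

print(round(predict_usage(22.0), 1))
```

The point is not that this beats a neural network, but that a plausible baseline is a few lines of standard curve fitting, so the interesting question is how much a more sophisticated model improves on it.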

~~~
iagovar
What I'm wondering is why I can't find anything about ML applied to finding
and curating data, which is the most tedious part of data science. That would
be an interesting way of using ML without the fuzzy stuff.

~~~
SolaceQuantum
There are a few current projects, particularly in database research. I don't
know how many of them use ML in the traditional understanding. Current
projects I know of are Wrangler, Mimir, Katara, MayBMS (in no particular
order).

~~~
iagovar
Did you try any of them? Wrangler seems like a tool similar to OpenRefine to
me.

------
javiermaestro
Heh. If only they mandated replacing the majority of the crappy wooden,
single-pane windows in most houses - the ones that let all the heat out and
the wind and noise in - with normal, double-glazed windows with good
insulation, they would cut the energy footprint in HALF, I'm sure...

~~~
poooogles
>crappy wooden

I'm not sure where you got the gripe against wood from. It lasts longer than
uPVC given maintenance every 5 years, looks better, is more environmentally
friendly and has similar thermal properties. Oh and it doesn't go yellow.

~~~
javiermaestro
Nothing against wood, really. But if you've been to the UK, you should agree
that the unmaintained, not-properly-fitted, wooden frames that they use in
most old windows are just crap. They let in huge amounts of wind, cold, etc,
through the unsealed seams. Anything that improved that would be 10 times
better, regardless of the yellowing, etc.

~~~
grey-area
Most wooden sash windows (I assume you're talking about wooden sash) were
properly fitted at the time, but are Victorian or Edwardian and therefore over
100 years old. They've done pretty well considering their heritage. New wooden
sash windows with double glazing are better IMO than plastic windows - as the
parent noted, plastic doesn't do well long term when exposed to light.

~~~
javiermaestro
So then, we all agree that the old (ancient) and unmaintained wood frames in
the UK are crap. Cool! :)

Really, I couldn't care less if they replaced the frames with wood or
whatever, as long as they made sure that they were thermally good and avoid
wasting all of the heat...

~~~
grey-area
No.

------
poooogles
On a similar note, for anyone that hasn't seen it you can get live grid
metrics from
[http://www.gridwatch.templar.co.uk/](http://www.gridwatch.templar.co.uk/).

~~~
yunolisten
what is this, a website for ants? It needs to be at least twice as big!

Seriously though, that's quite an unreadable interface.

~~~
ljf
It looks great on a 22 inch screen, and likely even better on a nice 50 inch
monitor - but on a normal laptop screen it is indeed nuts.

------
Pitarou
There’s a huge conspiracy theory brewing about the mandatory rollout of smart
meters. They’re going to absolutely freak out when they learn that Google’s
DeepMind will be watching them.

------
bboreham
I like these guys who are already running a "virtual power station" which can
smooth out the peaks by switching off non-vital power loads:

[http://www.openenergi.com/](http://www.openenergi.com/)

With Machine Learning, natch: [http://www.openenergi.com/virtual-power-
station-with-big-dat...](http://www.openenergi.com/virtual-power-station-with-
big-data/)

------
nzjrs
This title is missing one of the following phrases: "suggests they can",
"proposes to", or my personal favourite, "something something AI solves all
your problems".

Come on folks, we can be better than this.

~~~
dang
We added "very early stage" to the title since that's what the National Grid
is quoted as saying in the (you're right, extremely frothy) article.

------
redcalx
I doubt this will use deep learning though. DL works best at detecting
patterns in large quantities of data, whereas the time series data here is
limited to just a few years... and the system has changed over those years,
and continues to change significantly. I'd guess that 10% is the cost of
ensuring supply, i.e. spinning up generators as contingency - any attempt to
shave savings from that may increase the risk of power outages. Doing so while
maintaining the same power outage risk is the goal here, but I think it would
be hard to prove that some new clever strategy carries the same risk levels.
In this respect 10% sounds ambitious to me.

~~~
halflings
I know a couple of people working on using deep learning for financial time
series. It can be beneficial even with relatively limited data (a couple of
years at high granularity).

I also think the 10% figure is quite ambitious; especially given that they are
at an early stage of negotiations.

~~~
londons_explore
It might be reducing transmission losses by 10%.

I.e. trying to prefer supply near users to reduce losses in long transmission
cables.

The whole thing seems rather tricky though, because the entire grid right now
is run on a market based approach, where suppliers and users bid for the right
to sell/use power every half hour. The ML in that case would have to be given
to every company to make smarter bids.
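For context, the half-hourly mechanism described above can be sketched as a toy merit-order clearing: sort supplier bids by price and accept them until demand is met. This is a simplification with hypothetical bids, not how the actual GB balancing mechanism settles:

```python
def clear_market(bids, demand_mw):
    """Toy merit-order clearing for one half-hour settlement period.

    bids: list of (supplier, price_per_mwh, capacity_mw) tuples.
    Returns the accepted (supplier, dispatched_mw) pairs and the
    marginal clearing price set by the last accepted bid.
    """
    accepted, remaining, price = [], demand_mw, 0.0
    for supplier, bid_price, capacity in sorted(bids, key=lambda b: b[1]):
        if remaining <= 0:
            break
        take = min(capacity, remaining)  # dispatch only what is needed
        accepted.append((supplier, take))
        remaining -= take
        price = bid_price
    return accepted, price

# Hypothetical bids for one settlement period
bids = [("wind", 10.0, 300), ("gas", 45.0, 500), ("coal", 60.0, 400)]
dispatch, clearing_price = clear_market(bids, demand_mw=600)
print(dispatch, clearing_price)  # wind fully dispatched, gas sets the price
```

The comment's point follows from this structure: any ML-driven improvement would show up as smarter bids by individual participants, not as a single central optimizer.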

~~~
lostboys67
And reducing the use of standby gensets, which is very expensive, not to
mention polluting.

------
grandalf
Predicting peaks in supply and demand for energy is essentially turning the
utility into a hedge fund.

I think it's worth considering the overall characteristics of the "improved"
algorithm, since the ML optimizations are likely analogous to leveraging based
on an overfitted predictive model, and the objective of a power grid is
resilience as well as efficiency.

Also, depending on how you define efficiency, it may be considered beneficial
to shut down a coal plant and spin up a few hundred windmills, even if the
resulting price per kilowatt hour increases by 5%.

~~~
_800MHz_
I work in the industry and we've been "predicting peaks in supply and demand
for energy" for years.

You're 100% correct about the "resilience" and "efficiency" part, and we
currently have software for that as well (Google "Real-Time Contingency
Analysis"), but the real improvements to be made in this area would be things
that generally fall under the "smart grid" buzzword. The ability to
automatically switch transmission lines in and out of service during an
emergency event, better and/or automatic control of frequency and voltage (which
we do have currently, but could be further automated), and
reaction/recovery/restoration after a relay incident (for example, an
automatic "Blackstart" after a voltage collapse) would all fall under the kind
of improvements needed to push electric grid technologies to the next level,
in my opinion.

~~~
grandalf
Very interesting! Thanks for the clarification. Would you say that the ML/AI
aspect is "easy" once the automation is built? Or does the automation offer
limited value without fairly sophisticated algorithms?

------
KCFforecast
I don't want to give any hints to DeepMind, since I am inclined to think that
in this field an expert assessment can be better than DeepMind's advice. Just
to pose a few questions as examples of the kind of knowledge involved in these
predictions: since energy generation and demand depend a lot on weather
conditions, do they have any state-of-the-art machine learning model for
forecasting weather? Can they predict the evolution of energy prices? Can they
measure the impact of new improvements in reducing the cost of renewable
energy, or the impact of Brexit on the energy market? Are they first-class
experts in time series, or do they simply apply current technologies like
Prophet or Hyndman's R packages for time series? How can they argue that
technologies applied to the game of Go, like reinforcement learning, can be
successfully applied to forecasting? Perfect game simulations let you generate
almost unlimited data; the real-world market and weather supply only a very
limited amount of information to a model, so how can they justify applying big
data techniques to a small-data world? I hope some of these questions can be
addressed; unfortunately, while writing this I haven't read the post yet.

~~~
computerex
Reinforcement learning is used in spam detection and control amongst other
things. You can do data augmentation and also do transfer learning.

~~~
KCFforecast
In this concrete case, how do you do transfer learning? What source domain do
you have experience in that transfers to the domain of energy? Also, the naive
Bayes algorithm can be used in spam detection and usually gives good results -
is RL such a great tool for spam filtering when there is only moderate data?

------
timthorn
If you're in Cambridge (UK), Demis Hassabis (CEO of DeepMind) will be giving a
talk to CSAR as part of the Cambridge Science Festival at 7:30 this evening:
[http://www.sciencefestival.cam.ac.uk/events/towards-
general-...](http://www.sciencefestival.cam.ac.uk/events/towards-general-
artificial-intelligence)

------
MrQuincle
This is about prediction of supply. In other words, becoming a weather
forecasting organization, mainly for wind and clouds/shadow.

Demand is predictable, so there's little to gain there.

How big is the market for providing excess demand? If I can turn on the AC or
charge electric cars in the millions, what can I earn as a company?
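As a back-of-envelope answer to that question, the gross revenue of a demand-response aggregator is roughly fleet size times shiftable energy per vehicle times the peak/off-peak price spread. All figures below are illustrative assumptions, not market data:

```python
# All numbers are illustrative assumptions, not real market figures.
n_cars = 1_000_000       # electric cars under the aggregator's control
shiftable_kwh = 10       # kWh per car shifted off-peak per day
price_spread = 0.05      # price difference per kWh, peak vs off-peak (GBP)

# Gross daily revenue before hardware, billing, and settlement costs
daily_revenue = n_cars * shiftable_kwh * price_spread
print(f"{daily_revenue:,.0f} per day")  # prints "500,000 per day"
```

So at million-car scale even a small price spread adds up, which is presumably why this question matters to aggregators.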

------
kaino128
I don't know much about electric grid engineering - are there opportunities
for an ML approach to increase efficiency in ways other than the sort of
better supply forecasting implied by this article?

The article does mention the losses involved in long distance transmission,
but surely traditional approaches can already yield fairly well optimised
planning for improving this sort of efficiency?

(Finally, this article seems quite light on details to me & doesn't mention a
source Google press release or anything like that with more specifics. Maybe
just better supply forecasting could yield bigger benefits than I imagine...)

~~~
DataDisciple
DM is far from the first company to provide ML solutions to revolutionize
utilities.

There are so many opportunities to increase the efficiency of our electric
networks. Forecasting demand is not something they do well, but more
importantly they could improve Demand Response and Energy Efficiency programs.
Oh, and most of the techniques they use to prevent and stop theft are a joke
(that's a $6B/yr problem in the US).

The utilities have not been forced to innovate. They won't innovate on their
own because there are no customers at risk - no competition. (Aside from smart
meters and the main benefit from that was that they no longer had to pay for
meter readers.)

There is a WORLD of opportunity for utilities to become more efficient, but
they will not do it on their own. Our regulators need to force them to
innovate.

Kudos to DM for this work, but I will be more impressed if they can actually
get a major utility to implement these solutions.

~~~
_800MHz_
"Forecasting demand is not something they do well, but more importantly they
could improve Demand Response and Energy Efficiency programs."

\- Source?

"The utilities have not been forced to innovate. They won't innovate on their
own because there are no customers at risk - no competition."

\- Also, what data can you provide to back this up? I work in the industry,
and I can tell you that innovation will depend largely on the type of energy
market that the utility operates in, whether or not they are a vertically
integrated company, regulated or unregulated, IOU or POU, as well as a ton of
other variables. So while it may be true that not every single company is
innovating, to say generally that they "won't innovate" or that there is "no
competition" is simply wrong, as well as spreading incorrect information about
the industry.

I'd check this out if you're interested in learning more about the utility
industry:
[https://www.osti.gov/scitech/biblio/15001013](https://www.osti.gov/scitech/biblio/15001013)

------
jgamman
“We think there’s no reason why you can't think of a whole national grid of a
country in the same way as you can the data centres." means that you haven't
thought about it at all.

A grid has all sorts of actors involved, a whole bunch of whom are there
purely to make money (manage risk, gamble... tom-a-to/tom-ar-to) - bolting an
AI onto a complex system and then letting people !#$!@# with it to get their
hedge contracts into the money is practically a guaranteed scenario.

------
dboreham
Or run some TV ads telling people to turn their heat down and hang up their
laundry..

------
macu
What is 'UK energy use'? The title should tell what the article is about.

~~~
bboreham
It refers to the generation of electricity. "National Grid" implies that,
since that is the name of the system of wires that carries electricity
throughout the UK. But only if you have the necessary context.

------
chatmasta
I interned at Numenta [0] in 2012. Numenta is building an open-source machine
intelligence product [1] based on the human brain. Specifically, the
algorithms are based on the theory of "hierarchical temporal memory" (HTM) [2]
as described by Numenta founder Jeff Hawkins [3] in his book _On Intelligence_
[4]. The basic idea is that the neocortex has a generalized learning framework
that acts on generalized input from all five senses. For example, one study
enabled blind people to "see" a ball in front of them and grasp it with their
hands, by encoding the image of the ball onto electrical impulses on taste
buds, using a device in contact with the tongue. I can't find a link to the
original study, but here's an article that describes one such device. [5] The
conclusion drawn from this is that, even though blind people have lost their
sight, their neocortex is able to understand inputs from _any_ sensory source
(in this case, tastebuds). The Numenta algorithms generalize this idea to feed
time series data into a virtual "neocortex" that builds custom ML models for
each data source and chooses the best one (a form of online learning that
eliminates the need for data scientists to create custom models for each data
source).

Because the Numenta algorithms are optimized for time-series data, their areas
of strength are prediction and anomaly detection. When I worked there (5 years
ago, they've made a lot of progress since then), one of their main "case
studies" was energy usage in large buildings. Jeff Hawkins gave the keynote
presentation at Strangeloop 2012 where he discussed this specific application
of the algorithms, lowering energy bills in factories by predicting the next
day's usage in advance. [6]

My understanding is that the algorithms worked particularly well for energy
consumption data (e.g. energy drops at night on weekdays, drops on weekends,
spikes in meeting room X at 10-11am every day, etc). I would not be surprised
at all if DeepMind is able to capture savings using similar methods.
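As a rough illustration of the problem class described above (not Numenta's actual HTM algorithm): a minimal online anomaly detector keeps rolling statistics over recent points and flags values whose deviation is far outside the recent norm. A sketch under those assumptions:

```python
import statistics

def online_anomalies(series, window=5, threshold=3.0):
    """Flag indices whose value deviates from the rolling mean of the
    previous `window` points by more than `threshold` rolling std-devs.
    A crude stand-in for the prediction-error idea behind HTM-style
    anomaly detection; not Numenta's algorithm.
    """
    flagged = []
    for i in range(window, len(series)):
        recent = series[i - window:i]
        mean = statistics.fmean(recent)
        std = statistics.pstdev(recent)
        if std > 0 and abs(series[i] - mean) > threshold * std:
            flagged.append(i)
    return flagged

# Hypothetical half-hourly energy readings with one obvious spike
readings = [10, 11, 10, 12, 11, 10, 50, 11, 10, 12]
print(online_anomalies(readings))  # prints [6]
```

The appeal of HTM-style systems is doing this kind of thing online, per data stream, without hand-tuning a model per source; this sketch only shows the shape of the problem, not that capability.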

[0] [https://www.numenta.com/](https://www.numenta.com/)

[1] [https://www.numenta.org/](https://www.numenta.org/)

[2]
[https://en.wikipedia.org/wiki/Hierarchical_temporal_memory](https://en.wikipedia.org/wiki/Hierarchical_temporal_memory)

[3]
[https://en.wikipedia.org/wiki/Jeff_Hawkins](https://en.wikipedia.org/wiki/Jeff_Hawkins)

[4]
[https://en.wikipedia.org/wiki/On_Intelligence](https://en.wikipedia.org/wiki/On_Intelligence)

[5] [http://www.theplaidzebra.com/device-allows-blind-people-
see-...](http://www.theplaidzebra.com/device-allows-blind-people-see-tongues/)

[6] [https://www.numenta.com/blog/2012/10/22/jeff-hawkins-at-
stra...](https://www.numenta.com/blog/2012/10/22/jeff-hawkins-at-strange-
loop/) (link to video at bottom of blogpost). You need to login to download
the PDF slides, so I made an imgur album of the relevant slides to this
discussion: [https://imgur.com/a/5ULak](https://imgur.com/a/5ULak)

~~~
deepnotderp
You're aware that numenta is considered a joke in the serious machine learning
community, right?

~~~
chatmasta
Yes, and quite unjustifiably so in my opinion, which is why I sourced all
those links in my comment. From what I've seen, Numenta is very good at a
specific class of problems (time series prediction and anomaly detection).

Feel free to underestimate them. It's really not my problem.

And do pompous, abrasive comments like yours really add anything to the
discussion? Why don't you elaborate on why they are a "joke" to serious
intellectuals like yourself.

~~~
deepnotderp
Because they continuously claim they're "ahead of everyone" and that everyone
else is stupid and their glorious HTM will beat everyone... and then they
proceed to get destroyed by convnets... and then move the goalposts to anomaly
detection only... and then get beaten on their own heavily rigged benchmark
dataset.

------
dpandey
Honestly, why is this news? :)

------
br1n0
In photo #4 you can see a man pointing at a pretty insecure computer running
Windows XP :) XP shouldn't be used anymore because there are no more security
updates - or am I wrong?

The direct link of the image is: [https://cdn.arstechnica.net/wp-
content/uploads/sites/3/2017/...](https://cdn.arstechnica.net/wp-
content/uploads/sites/3/2017/03/National-Grid-workers-in-the-control-
room_highres-980x653.jpg)

~~~
JamesMcMinn
The image has been used in a number of articles, the oldest of which I was
able to find is from 2013 - I would assume the image is even older than that.

------
employee8000
They will reduce energy usage by 10% and then raise prices because they aren't
making enough money. It happened in Canada with Ontario Hydro, and in
California with water.

