
The Google Brain Team – Looking Back on 2017 - amaks
https://research.googleblog.com/2018/01/the-google-brain-team-looking-back-on.html
======
jacksmith21006
My favorite of these is the Jeff Dean paper on using NNs for database indexes,
then doing the inference on TPUs. Really looking forward to seeing the
difference in power required when using a TPU versus a CPU with a traditional
approach.
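For context, the core idea of that paper can be sketched in a few lines: treat the index as a model from key to position in a sorted array, predict a position, then correct with a bounded local search. A minimal sketch (assumption: a plain linear fit stands in for the small neural nets the paper actually trains):

```python
import bisect

# Toy "learned index": fit position ~ a*key + b over a sorted key array,
# then correct the model's prediction with a bounded local scan.
keys = sorted([3, 7, 9, 15, 22, 28, 40, 41, 57, 63])

n = len(keys)
mean_k = sum(keys) / n
mean_p = (n - 1) / 2
cov = sum((k - mean_k) * (i - mean_p) for i, k in enumerate(keys))
var = sum((k - mean_k) ** 2 for k in keys)
a = cov / var
b = mean_p - a * mean_k

def lookup(key, err=3):
    # Predict a position, then scan a small window around the guess.
    guess = min(n - 1, max(0, round(a * key + b)))
    lo, hi = max(0, guess - err), min(n, guess + err + 1)
    for i in range(lo, hi):
        if keys[i] == key:
            return i
    # Fall back to binary search if the error window missed.
    i = bisect.bisect_left(keys, key)
    return i if i < n and keys[i] == key else -1
```

On well-behaved key distributions the model lands close enough that the local scan, not a full binary search, does most lookups.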

------
pkaye
Seems like the AutoML project, if successful, will result in the loss of at
least the lower-end AI jobs.

~~~
riku_iki
Is it open source?

------
dspoka
The most promising area here to me seems like AutoML. The promise of the new
machine learning was that we would get to move away from tedious feature
engineering and everything would work and be simple. It may have become
simpler, but training/debugging new DL models is still painful, causing the
focus to move to extensive hyperparameter search. AutoML may become the next
step in abstraction, where we design single models/algorithms that are able to
build viable networks for many tasks/purposes.
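The "extensive hyperparameter search" part is easy to make concrete: sample configurations from a space and keep the best-scoring one. A minimal random-search sketch (assumptions: the search space, and a stand-in `score` function in place of a real train/validate cycle):

```python
import random

# Hypothetical hyperparameter space; AutoML-style systems search over
# architectures too, but the loop has the same shape.
SPACE = {
    "lr": [1e-4, 1e-3, 1e-2],
    "layers": [2, 4, 8],
    "units": [64, 128, 256],
}

def sample(rng):
    """Draw one random configuration from the space."""
    return {k: rng.choice(v) for k, v in SPACE.items()}

def score(cfg):
    # Stand-in for training a model and measuring validation accuracy;
    # here it arbitrarily prefers lr=1e-3, 4 layers, 128 units.
    return (-abs(cfg["lr"] - 1e-3)
            - abs(cfg["layers"] - 4) / 10
            - abs(cfg["units"] - 128) / 1000)

def random_search(trials=50, seed=0):
    """Return the best-scoring configuration among `trials` random samples."""
    rng = random.Random(seed)
    return max((sample(rng) for _ in range(trials)), key=score)
```

Neural architecture search replaces the fixed `SPACE` with a generator of network structures and the random sampler with a learned controller, but the evaluate-and-keep-the-best loop is the same.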

~~~
tramGG
I'm really excited by AutoML as well, it seems like one of the few next level
advancements. One of the authors of AutoML is presenting in SF in Feb at the
decentralized ai summit can't wait.

------
sabujp
The best one is "I'm too busy for romance":
[https://google.github.io/tacotron/publications/tacotron2/ind...](https://google.github.io/tacotron/publications/tacotron2/index.html)

