
Hardware for Deep Learning, Part 3: GPU - che_shr_cat
https://blog.inten.to/hardware-for-deep-learning-part-3-gpu-8906c1644664
======
jaimex2
Tesla apparently designed its own chip specifically for neural-net vision because
GPUs were not fast enough - including Nvidia's hardware, which is currently
being put in their cars.

Musk mentioned on their last investor call that they got a 10x boost from doing so:

"I'm a big fan of NVIDIA, they do great stuff. But using a GPU, fundamentally
it's an emulation mode, and then you also get choked on the bus. So, the
transfer between the GPU and the CPU ends up being one of the constraints of
the system"

"And as a rough sort of whereas the current NVIDIA's hardware can do 200
frames a second, this is able to do over 2,000 frames a second and with full
redundancy and fail-over."

Peter Bannon led the development.
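
To put rough numbers on the "choked on the bus" point: even ignoring compute, PCIe bandwidth alone caps how many raw frames you can shuttle between CPU and GPU. A back-of-envelope sketch (the bandwidth, frame resolution, and camera count below are all illustrative assumptions, not Tesla's actual specs):

```python
# Rough upper bound on camera frames/sec a PCIe 3.0 x16 link can move
# between host and GPU. All numbers are assumptions for illustration.
PCIE3_X16_BYTES_PER_SEC = 15.75e9   # ~15.75 GB/s usable bandwidth (assumed)
FRAME_BYTES = 1280 * 960 * 3        # one 1280x960 RGB frame (assumed resolution)
NUM_CAMERAS = 8                     # multi-camera rig (assumed)

frames_per_sec = PCIE3_X16_BYTES_PER_SEC / (FRAME_BYTES * NUM_CAMERAS)
print(f"Bus-limited ceiling: ~{frames_per_sec:.0f} frame sets/sec")
```

With those assumptions the bus alone caps you in the mid-hundreds of frame sets per second, which is the same ballpark as the "200 frames a second" figure - so a design that keeps inference on-chip and avoids the round trip plausibly buys a large multiple.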

~~~
che_shr_cat
Many companies have designed or are designing their own chips. The most
prominent example is Google's TPU.

There will be more articles in this series on FPGAs, ASICs, and so on.

~~~
godelmachine
Eagerly waiting for the ones on FPGAs and ASICs.

~~~
che_shr_cat
The original plan was this:

[https://blog.inten.to/hardware-for-deep-learning-current-sta...](https://blog.inten.to/hardware-for-deep-learning-current-state-and-trends-51c01ebbb6dc)

~~~
godelmachine
[http://www.cs.cornell.edu/courses/cs6787/2017fa/Lecture11.pd...](http://www.cs.cornell.edu/courses/cs6787/2017fa/Lecture11.pdf)

------
asparagui
Great article!

~~~
che_shr_cat
Thanks!

