
Nvidia Unifies AI Compute with “Ampere” GPU
https://www.nextplatform.com/2020/05/14/nvidia-unifies-ai-compute-with-ampere-gpu/
======
mark_l_watson
"On single precision floating point (FP32) machine learning training and
eight-bit integer (INT8) machine learning inference, the performance jump from
Volta to Ampere is an astounding 20X."

Incredible. I just have a 1070 in a Linux laptop, which really helped my
personal deep learning projects (before I retired, I had the compute I
needed).

No way to justify the purchase, but I would like an Ampere-based system at
home. My workload has changed a lot since retiring: I train less and mostly
use pre-trained NLP models.

~~~
votepaunchy
> No way to justify the purchase, but I would like an Ampere-based system at
> home.

These will be available on-demand on every major cloud. Isn't it sufficient
to have a $20K GPU available in seconds for a dollar or so per hour?

