
Tesla GPU Accelerator Bang for the Buck, Kepler to Volta - jonbaer
https://www.nextplatform.com/2018/03/05/tesla-gpu-accelerator-bang-buck-kepler-volta/
======
sxp
> thanks in part to the growing demand of Nvidia’s Tesla cards in those
> markets but also because cryptocurrency miners who can’t afford to etch
> their own ASICs are creating a huge demand for the company’s top-end GPUs.

This part is incorrect. Unless you need functionality specific to workstation
& server GPUs, it can be an order of magnitude cheaper to buy a high-end
gaming GPU. Cryptominers normally use motherboards with 12+ PCIe slots,
1x-to-16x riser adapters, and consumer cards:
[https://www.youtube.com/watch?v=B6gB7GYkkiA](https://www.youtube.com/watch?v=B6gB7GYkkiA)

The same performance/dollar math applies to anyone else who needs to do GPU
processing and doesn't need:

* double-precision math

* high-bandwidth connection between the GPU and motherboard

* a setup that doesn't violate EULAs

* a warranty

In my testing with raytracers, there is around a 5-15x improvement in perf/$
with consumer cards.
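To make the perf/$ claim concrete, here's a toy calculation. The prices and relative-performance numbers below are purely hypothetical placeholders, not my benchmark data:

```python
# Illustrative perf-per-dollar comparison (hypothetical numbers).
def perf_per_dollar(relative_perf, price_usd):
    return relative_perf / price_usd

# Assumed: a consumer card with ~1.0x the raytracing throughput of a
# datacenter card that costs over 10x as much.
consumer = perf_per_dollar(1.0, 700)     # e.g. GeForce-class card
datacenter = perf_per_dollar(1.2, 9000)  # e.g. Tesla-class card

ratio = consumer / datacenter
print(ratio)  # roughly 10x perf/$ advantage for the consumer card
```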

~~~
mmrezaie
In data centers, nvidia seems to want to limit usage of consumer products [1].

[1]
[https://news.ycombinator.com/item?id=15983587](https://news.ycombinator.com/item?id=15983587)

~~~
ericd
>* a setup that doesn't violate EULAs

This was a reference to what you're talking about.

~~~
mmrezaie
Yes, but I am not sure Nvidia is enforcing it yet. At least on Linux, though,
you have to apply some patches to enable passthrough for consumer GPUs.

------
w0utert
One thing I don't fully understand is why these accelerators all have
relatively small memory sizes. Is it purely because of cost?

The most expensive Tesla accelerator I see in this list has 16 GB, which
sounds like a lot, but it's 'only' twice as much as the consumer-level GPU I
have in my PC at home, and it seems like a pretty paltry amount for large-
scale simulations. I'm thinking about volumetric modeling applications, for
example, where storing a dense voxel grid as a volume texture would easily run
over 30 GB even for quite modest grid size/resolution (e.g. 2K x 2K x 2K
grid). It's exactly these kinds of things where a GPU accelerator could shine,
without the need to implement sparse sampling solutions or acceleration data
structures that would be complex and severely impact performance on GPU
architectures.
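The 30 GB figure checks out with simple arithmetic, assuming one float32 per voxel:

```python
# Back-of-the-envelope memory for a dense 2K^3 voxel grid.
voxels = 2048 ** 3                 # ~8.6 billion cells
bytes_per_voxel = 4                # assuming one float32 per cell
total_gb = voxels * bytes_per_voxel / 1024 ** 3
print(total_gb)  # 32.0 GB -- double the largest Tesla in the list
```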

~~~
Const-me
> that would be complex and severely impact performance on GPU architectures

Agreed about the complexity. However, adaptive grids can be a huge win in
terms of performance: instead of simulating all 8G grid nodes, the GPU
simulates far fewer nodes, concentrated in the areas of large gradients /
higher frequencies of the simulated fields.

Here’s an old SpaceX presentation about their GPU simulations of rockets:
[https://www.youtube.com/watch?v=vYA0f6R5KAI](https://www.youtube.com/watch?v=vYA0f6R5KAI)
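A minimal sketch of the idea, using a hash map keyed by cell coordinates so only "active" (high-gradient) cells are stored. The 0.1%-active assumption is purely illustrative, not SpaceX's actual scheme:

```python
# Sparse/adaptive grid sketch: store only active cells instead of the
# full dense 2048^3 volume. Numbers are hypothetical.
import random

n = 2048                           # dense grid would hold n**3 cells
random.seed(0)

# Assume ~10k cells sit in high-gradient regions worth simulating.
active = {}
for _ in range(10_000):
    cell = (random.randrange(n), random.randrange(n), random.randrange(n))
    active[cell] = 0.0             # field value stored per active cell

dense_cells = n ** 3
sparse_cells = len(active)
print(dense_cells // sparse_cells)  # memory/work saved: ~10^5 x
```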

~~~
w0utert
That was very interesting and helpful, thanks!

------
bcheung
I was shocked at how difficult it was to get a 1080 Ti at the local store.

Pardon my ignorance, but with all the demand for graphics cards for
cryptocurrency mining, why hasn't a viable ASIC become commonplace, so that
miners aren't grabbing up every available graphics card they can find, making
it hard for non-miners to get anything and driving up prices?

I'm hoping that mining moves more to ASICs and deep learning gets more
desktop-based "tensor" options. This is a very strange market right now, with
companies trying to appeal to everyone instead of specializing, and not really
serving anyone well.

~~~
cbg0
If the coin you're mining goes bust you can still sell the GPU and recoup some
of the investment. With ASICs there's not much of a market for that.

~~~
mrep
Also, most coins are deliberately designed to be ASIC-resistant, because their
developers do not want mining to become centralized.

~~~
glup
I've heard this elsewhere, but it wasn't clear to me what being ASIC-resistant
actually entails.

~~~
ansible
The algorithms used for Ether and some other altcoins require more RAM than is
economical to put on current ASIC implementations. This was done specifically
to deter the use of ASICs.
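The usual trick is a "memory-hard" hash: each hash depends on many pseudo-random reads from a large dataset, so fast hashing needs lots of cheap RAM rather than dense custom logic. Here's a toy sketch in that spirit, simplified and not the real Ethash algorithm:

```python
# Toy memory-hard hash: the result depends on pseudo-random reads
# from a big table, so an ASIC without that much RAM gains little.
# Illustrative only -- real Ethash uses a multi-gigabyte DAG.
import hashlib

DATASET = [hashlib.sha256(i.to_bytes(8, "big")).digest()
           for i in range(1 << 16)]       # toy table, ~2 MB

def memory_hard_hash(nonce: bytes, rounds: int = 64) -> bytes:
    h = hashlib.sha256(nonce).digest()
    for _ in range(rounds):
        idx = int.from_bytes(h[:4], "big") % len(DATASET)
        h = hashlib.sha256(h + DATASET[idx]).digest()  # random read
    return h
```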

------
zeristor
Can the demand for bitcoin ever be sated?

Given the CAPEX of the GPUs and the OPEX of the electricity, someone has
probably worked out the rates of return for each of the cryptocurrencies.

If there's an expectation that the virtual currencies will soar in value, why
wouldn't people buy all the hardware they can, perhaps paying up to two or
three times the cost of mining?
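The rate-of-return calculation is straightforward in principle; all the numbers below (card price, power draw, electricity cost, daily mining revenue) are made-up placeholders:

```python
# Hypothetical mining payback period (illustrative numbers only).
capex = 700.0           # GPU price, USD
power_w = 200           # card power draw, watts
kwh_price = 0.10        # electricity, USD per kWh
revenue_per_day = 3.0   # assumed USD of coin mined per day

opex_per_day = power_w / 1000 * 24 * kwh_price   # 0.48 USD/day
profit_per_day = revenue_per_day - opex_per_day
payback_days = capex / profit_per_day
print(payback_days)  # ~278 days before the card pays for itself
```

As long as the expected payback period beats the coin's perceived risk, demand for cards keeps growing.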

