
Ask HN: How many GPUs do you train deep models on? - kiske
I've noticed that most of my colleagues will only use up to 4 GPUs when training deep models (TensorFlow & PyTorch).

Why is that?

Do you train on multiple machines?
======
chanchar
It's always going to be a tradeoff between time and money: with more GPUs, a
run takes less wall-clock time. And if you plan to do this long term, owning
your own hardware means you're not locked into a cloud provider like AWS or
GCE.
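The time/money tradeoff above can be put into a back-of-envelope estimate. A minimal sketch (all numbers here are hypothetical: the 100 base hours, the $3/GPU-hour price, and the 0.9 scaling efficiency are assumptions, not measurements):

```python
def train_time_and_cost(base_hours, n_gpus, price_per_gpu_hour, efficiency=0.9):
    """Estimate wall-clock hours and total cost of one training run.

    base_hours: hypothetical single-GPU training time.
    efficiency: assumed near-linear multi-GPU scaling factor (optimistic).
    """
    # A single GPU gets no scaling penalty; multi-GPU scales sub-linearly.
    speedup = max(1.0, n_gpus * efficiency)
    hours = base_hours / speedup
    # Total cost is billed per GPU-hour across all GPUs.
    cost = hours * n_gpus * price_per_gpu_hour
    return hours, cost

# Hypothetical run: 100 single-GPU hours at $3 per GPU-hour.
for n in (1, 4, 8):
    hours, cost = train_time_and_cost(100, n, 3.0)
    print(f"{n} GPU(s): {hours:6.1f} h, ${cost:7.2f}")
```

Under this assumption, wall-clock time keeps dropping as you add GPUs while total cost rises only modestly, which is why the decision usually comes down to how much a faster iteration loop is worth to you.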

------
wskinner
Setting up infrastructure is hard, and the costs add up quickly. I don't want
to spend cloud GPU money to debug my crappy TensorFlow code.

