
Show HN: GPU-backed Linux desktops in the cloud - DTE
https://www.paperspace.com/ml
======
MichaelBurge
I've considered setting up my own GPU server to reduce costs, but if you've
got P100s for $200/month, that means you're paying off the hardware in about
4 years? I can't compete with that. Buying my own hardware was much easier to
justify against AWS, since they're more expensive, so the payoff period is
shorter. I was waiting on Google Cloud's pricing to make any decision there:

[https://cloud.google.com/gpu/](https://cloud.google.com/gpu/)
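As a sanity check on that payoff estimate, a quick back-of-the-envelope calculation (the card prices here are assumptions, not from the thread; a Tesla P100 reportedly listed in roughly the $6,000-$9,000 range at launch):

```python
# Back-of-the-envelope payback estimate for buying your own P100
# vs. renting at the $200/month quoted above.
# Card prices are assumed figures, not from the thread.
monthly_rental = 200                        # $/month
card_cost_low, card_cost_high = 6000, 9000  # assumed P100 price range, $

payback_low = card_cost_low / monthly_rental    # months to recoup cheap estimate
payback_high = card_cost_high / monthly_rental  # months to recoup high estimate

print(f"Payback: {payback_low / 12:.1f} to {payback_high / 12:.1f} years")
```

At the high end that works out to just under 4 years, which matches the "paying off the hardware in 4 years" figure above.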

The only thing that concerns me is that your GPU+ instances only have a single
P100, while someone like Google promises to let you attach up to 8 to a single
machine. So if I wanted a single powerful machine for experimental work, the
cloud providers are more expensive. But I'd have the same problem with buying
my own hardware, because those cards are expensive.

If you have only a single GPU, have you done any performance testing comparing
consumer GPUs like the GTX 1080 with the commercial GPUs? I believe two
advantages of the commercial ones are 1. better interconnects between multiple
GPUs and 2. better floating-point performance at the precisions used in deep
learning. Advantage #1 seems like it wouldn't matter with only a single GPU. I
think AWS only has K80s, so that's in your favor.

What motherboards/RAM/CPU/etc. are you using? If my estimate is right and you
are pricing for a 4-year payoff just for capex, listing all of the hardware
would make it an easy sell.

------
evervevdww221
This looks great, and great execution! What GPU do I have? How much video memory?

I knew about Paperspace the day it came out of YC.

At the time, we were working on a competing product, but we eventually gave up
on the idea: business-wise, Amazon is too expensive for this kind of service,
making the price uneconomical for end users.

I don't know much about the technical details of Paperspace, but I believe we
did better at resource sharing and cost management. It was still a tough sell,
though, even for enterprise users.

~~~
DTE
We are using NVIDIA Quadros and the new P100s, both in "pass-through" mode, so
you get the whole GPU. The economics of running this kind of business on top
of AWS definitely don't make sense (we tried for a while). Ultimately we
decided to build out our own datacenters (currently one in CA and one in NY),
which gives us a huge advantage in that we can really optimize the hardware
and also control costs. The end result is that we can offer an instance that
is twice as powerful as Amazon's g2.xlarge for about half the price.
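Taken at face value, those two multipliers compound. A quick sketch using only the ratios claimed above (no absolute prices assumed):

```python
# Relative price-performance implied by the claim above.
# These are ratios from the comment, not measured benchmarks.
perf_ratio = 2.0    # "twice as powerful" as a g2.xlarge
price_ratio = 0.5   # "about half the price"

perf_per_dollar = perf_ratio / price_ratio
print(f"~{perf_per_dollar:.0f}x the performance per dollar")
```

That is, a 2x performance edge at half the price implies roughly 4x the performance per dollar, if the claim holds.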

------
rco8786
any plans/thoughts on Windows support?

~~~
DTE
Yup, we can do GPU-backed Windows too! It should be publicly available in
the interface in the next week or two; we are hammering out some frontend
changes to go with it. You can also email support [at] paperspace.com and we
can set you up with one today.

