
Launch HN: Snark AI (YC S18) – Distributed Low-Cost GPUs for Deep Learning - davidbuniat
Hi HN,

We are Sergiy, Davit and Jason, founders of Snark AI (https://snark.ai). We provide low-cost GPUs for deep learning training and deployment on semi-decentralized servers.

We started Snark AI during our PhD programs at Princeton University. As deep learning researchers we constantly ran into a shortage of GPU resources. Renting GPUs on the public cloud didn't fit our budget, and purchasing GPU cards was difficult -- at the time, crypto-miners were buying up much of the supply. We then found out that GPU mining profits lag far behind public cloud GPU prices.

On top of that, we figured out that there's a way to run neural network inference and crypto-mining simultaneously without hurting the mining hash rate. This is a little counterintuitive, but anti-ASIC hashing algorithms are designed to be extremely memory-intensive, which leaves a good chunk of the CUDA cores idle. We can use that leftover compute to run neural network inference very cost-efficiently, which can be a lifesaver for large-scale inference tasks. More in our blog: http://snark.ai/blog

At the same time, we provide low-cost raw hardware access for neural network training. We aim to be up to 10 times cheaper than on-demand instances on the public cloud, undercutting preemptible/spot instances by up to 2x. When a GPU is idle, our algorithms switch it to mining to reduce costs. Try it out at https://lab.snark.ai with 10 hours of free GPU time. We made hardware access very simple: a single command line after `pip3 install snark`. More usage information: https://github.com/snarkai/snark-doc

We are also working on a hub for neural networks, similar to Docker Hub. It is still a work in progress, but you can take a look at a couple of examples at https://hub.snark.ai/explore.

We would love to get your feedback on what the experience of training deep networks on our platform and then deploying them is like.
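A toy way to see why memory-bound mining can leave headroom for compute-bound inference (a roofline-style sketch; the utilization fractions are made up for illustration, not Snark's actual scheduler or measurements):

```python
# Roofline-style toy: a GPU has separate memory-bandwidth and compute budgets.
# Anti-ASIC mining algorithms saturate memory bandwidth but not the CUDA cores;
# inference is largely compute-bound, so the two can overlap.
# All numbers below are hypothetical fractions of each budget.

MINING = {"mem_bw": 0.90, "compute": 0.30}
INFERENCE = {"mem_bw": 0.05, "compute": 0.60}

def can_colocate(a, b):
    """Two workloads fit on one GPU if no shared budget is oversubscribed
    (ignoring contention and latency effects, which matter in practice)."""
    return all(a[k] + b[k] <= 1.0 for k in a)

print(can_colocate(MINING, INFERENCE))  # prints True
print(can_colocate(MINING, MINING))     # prints False: bandwidth oversubscribed
```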
======
karimf
If you guys are searching for potential early customers, maybe you can hang
around on the Fast.ai course forum[0]. There are a lot of people who are just
trying DL for the first time, like me, and are looking for the cheapest way
possible to get started experimenting. One of your selling points is the
cheaper price, so maybe it would work out well.

[0] [http://forums.fast.ai/](http://forums.fast.ai/)

~~~
corporateslaver
Please stop marketing fast.ai on hacker news

~~~
cookingrobot
Why? It's really good. I was going to make the same suggestion, but to compete
with Paperspace they should offer an image that comes with all the course
dependencies preinstalled.

~~~
merkleforest
Thanks for the suggestion! Snark AI will offer a pod type with all fast.ai
course dependencies installed and easy jupyter notebook access.

------
whoisjuan
"3-5 times cheaper than public cloud"... That's very vague. What's the actual
minimum price per hour?

~~~
davidbuniat
Thanks for asking. It is cheaper than a preemptible K80 instance at $0.135/h on
Google; the roughly equivalent P106 is $0.095/h on Snark.

~~~
ipsum2
That's definitely not 3-5 times cheaper.

~~~
sgrove
I _think_ snark isn't pre-emptible, they're just giving _the best possible
price_ on GCE (steelman argument, which is impressive), but it is a bit
confusing.

From
[https://cloud.google.com/compute/pricing](https://cloud.google.com/compute/pricing)
it looks like the non-preemptible K80 pricing is $0.45 USD per GPU per hour.

Can someone from Snark correct me if I'm wrong?

~~~
davidbuniat
Yep, totally agree!
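Plugging in the prices quoted in this subthread, the multiple depends entirely on which GCP price you compare against:

```python
snark_p106 = 0.095           # $/h, Snark P106 price quoted above
gcp_k80_preemptible = 0.135  # $/h, GCP preemptible K80
gcp_k80_on_demand = 0.45     # $/h, GCP on-demand K80

# Ratio vs preemptible: ~1.4x; vs on-demand: ~4.7x.
print(f"{gcp_k80_preemptible / snark_p106:.1f}x cheaper than preemptible")
print(f"{gcp_k80_on_demand / snark_p106:.1f}x cheaper than on-demand")
```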

------
minimaxir
So what _is_ the pricing? The website has a login-wall to access the pricing.

~~~
Fede_V
I'm sure there is a practical reason why companies do this, but it drives me
crazy and makes me want to ignore what could potentially be a really cool
idea. Please make the pricing transparent and viewable without a registration.

~~~
davidbuniat
Yeah, you are right @Fede_V, sorry about that; we just made it public:
[https://lab.snark.ai/pricing](https://lab.snark.ai/pricing)

------
thisisit
> we figured out that there's a way to run Neural Network inference and
> crypto-mining simultaneously without hurting mining hash rate

I understand that you figured out the technical side of things, but I'm curious
about the human-nature side. Let's say you currently have both NN work and
mining running. From a cryptocurrency price point of view, two things can
happen:

a. Prices go up - Won't the incentive shift towards mining? How do you guys
plan to handle such situations?

b. Prices go down - In this case, the intuitive outcome is that GPU pricing
gets cheaper. But then, will you pass on such benefits to your customers?
Because the whole point is being able to make a steady income from the GPUs,
mining or not.

~~~
davidbuniat
Let's agree that if the mining price goes up, it will still always be cheaper
than public cloud. Otherwise, you could run mining on AWS/GCP and be
profitable. In this scenario, we are still able to offer a cheaper price to
customers and a higher payout to miners, even by just binary switching,
assuming the gap is not negligible.

If the price goes down, customers will be able to set a lower price. As long as
GPU holders' profit margins are high enough given electricity and maintenance
costs, they will do the compute.

As the marketplace matures, pricing of mining, deep learning, rendering and
other tasks will be driven by the market. At Snark AI we are working towards
creating this marketplace to provide optimal benefits to all parties.

~~~
thisisit
> Lets agree that if the mining price goes up, it will be always cheaper than
> public cloud

I am confused. Let's say your price is $x while public cloud is $x+y. When
cryptocurrency prices rise, the incentive is to increase _your_ price to $x+y,
if that is the profit point. In which case you will no longer be cheaper than
public cloud; any difference will be negligible. So I am not sure you can
claim to be _always cheaper than public cloud_.

~~~
davidbuniat
Following your notation, consider our new price will be $x+y', where y' is the
mining price difference.

You are right my claim that y'<y is slightly weak (was based on "gap is not
negligible" assumption, see below).

"gap is not negligible" \- means if y' gets near to y, then y will get even
higher and there will be always a market gap, which I think you disagree with.

Based on your suggestion, on extreme scenario I would soften my claim to
y'=<y, without us making profit. :)
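The x / y / y' argument in this subthread can be written out as a toy pricing rule (illustrative numbers only; x, y, y' follow the thread's notation, and the cap encodes the softened y' <= y claim, not Snark's actual pricing formula):

```python
def snark_price(base_cost, cloud_markup, mining_opportunity_cost):
    """Price a GPU-hour at base cost x plus the miner's forgone mining
    revenue y', capped so it never exceeds the public cloud price x + y."""
    return min(base_cost + mining_opportunity_cost, base_cost + cloud_markup)

x = 0.05  # hypothetical base cost per GPU-hour ($)
y = 0.40  # hypothetical public cloud markup, so cloud charges x + y = $0.45

# Mining boom: forgone mining revenue y' exceeds y, price is capped at cloud's.
assert snark_price(x, y, 0.60) == x + y
# Normal case: y' < y, so the marketplace undercuts the cloud.
assert snark_price(x, y, 0.05) < x + y
```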

------
TekMol
Funny, I wanted to take a look at snark.ai to see if it can be used to run
WebGL inside a browser.

For some reason, the snark.ai homepage brings my laptop to its knees though.
Do I need a GPU cluster to see it?

Anyhow: Can I use snark.ai to run WebGL in a browser?

~~~
davidbuniat
Haha, you could joke that we use the GPUs on our landing page to rent out to
others. :D On a serious note, the animations are a bit heavy; we need to
optimize them.

Regarding WebGL: that is actually an interesting point, we would like to know
more about the use case.

~~~
TekMol
WebGL: I mean can I install a desktop environment on your machines, start a
browser in it and then WebGL will be fast because it has access to a fast GPU?

~~~
davidbuniat
@TekMol, we can give it a try together to see whether the streaming speed is
enough to run your WebGL application smoothly.

------
hartator
We might be an early customer at SerpApi. [1]

In your pricing, you say you are selling, for example, the P106 at $0.095/h,
but in your explanation you say you use idle cycles to mine crypto (or the
reverse, idle cycles to process ML tasks). When I rent a P106, do I have full
access to the cores or just partial access?

[1] [https://serpapi.com](https://serpapi.com)

~~~
davidbuniat
Great! Once you rent a P106 you have full access to the GPU, and we don't run
mining or ML tasks on it.

If you want to deploy large-scale computation and significantly reduce your
costs, we can help by running mining at the same time, with your consent. This
only applies to deep learning inference.

------
mbajkowski
Reading through your blog entry: "When the data to be processed is sensitive,
Snark Infer will dispatch the task only to Privacy Shield Verified GPU
providers." \- can you shed some light as to a) what a privacy shield verified
GPU provider is? b) how data sensitivity is specified or determined? (I did
not see either mentioned in the docs yet)

~~~
davidbuniat
Glad you asked this. What we mean is that for privacy-sensitive data
processing (e.g. face recognition), we are eager to work closely with clients
and hardware providers to ensure security and compliance with regulations such
as Privacy Shield, GDPR, etc. If you have such a use case, it would be great
to get in touch.

------
paulschraven
Are you planning to add cards manufactured for ML, such as the NVidia Titan V
(110 teraflops)?

~~~
davidbuniat
Yes, we are currently focusing on large-scale compute, but we are planning to
add more variety in hardware.

~~~
scottlegrand2
Hope you're not planning on putting any of those fancy Titan Vs in a
Datacenter! Rumor has it that ever since Baidu installed 100,000 GeForce GPUs
in a Datacenter, they've made that against their EULA (unless you're mining
cryptocurrencies which is apparently A-OK).

[http://www.datacenterdynamics.com/content-tracks/servers-storage/nvidia-updates-geforce-eula-to-prohibit-data-center-use/99525.fullarticle](http://www.datacenterdynamics.com/content-tracks/servers-storage/nvidia-updates-geforce-eula-to-prohibit-data-center-use/99525.fullarticle)

~~~
davidbuniat
Good catch! We don't own the hardware ourselves, but we are reviewing the EULA
with our partners.

------
hwoolery
Serious question (as opposed to a snarky one): what happens when/if the
cryptocurrency market takes a large downturn? If your pricing is based on
using miners' spare cycles, Snark.ai would have to raise prices, right?

~~~
davidbuniat
In case of a crypto downturn, renting GPUs will be even cheaper. It will
actually be a bigger win for our users and for us.

~~~
hwoolery
Assuming the miners still want to rent out the hardware and not sell it :)
Even then, you'd have two parties both taking slices of the profit, which
makes it hard to compete in a practical manner against Amazon/Google.

~~~
merkleforest
Yeah, exactly. Hopefully the AI computation market will grow big enough in
time that crypto-miners can pivot into the more profitable business of GPU
cloud for AI instead of selling their hardware in a crypto downturn. In the
long term, Snark AI wants to create a marketplace where all qualified GPU
providers can bid for the best market price, which will benefit everyone.

------
wsu718
Really interesting.

Is storage persistent? Will files get deleted when I stop a machine?

~~~
davidbuniat
Thanks for asking, that's a good point! Compared to spot/preemptible
instances, your files are persistent. You can stop your running pod, add more
GPUs and then continue training your model.

------
wei2012
Graphics card prices will drop very soon due to the fall in bitcoin prices.

~~~
davidbuniat
That's what we count on as well.

------
harias
Are you the same guy:
[https://www.reddit.com/r/gpumining/comments/86ofw2/rent_out_...](https://www.reddit.com/r/gpumining/comments/86ofw2/rent_out_your_gpu_compute_to_ai_researchers_and/)

~~~
1996
Then it is a copycat of nicehash.

I wonder how they plan to undercut nicehash?

~~~
lozaning
I understand them to be totally different services. How would someone with ML
training data on hand use NiceHash to get any work done? Is Snark set up to do
mining? As I see it, they serve totally distinct and separate markets.

------
technologia
I don't know if I overlooked it, but it doesn't seem possible to delete my
account if I so choose.

~~~
davidbuniat
Just shoot us an email at support@snark.ai, we will handle it :)

~~~
avarun
I'm not personally super invested in this, but you might want to check to see
if that violates GDPR.

~~~
methyl
It doesn't, if they do it after request.

~~~
whoisjuan
Doesn't GDPR mandate that you provide a self-serve way to delete your
account?

~~~
geoelectric
You might be confusing it with the new California law (which would apply to
snark.ai, per their own terms doc) that requires any auto-renewing service
that can be signed up for online to offer an online way to cancel.

[https://www.perkinscoie.com/en/news-insights/california-updates-its-auto-renewal-law.html](https://www.perkinscoie.com/en/news-insights/california-updates-its-auto-renewal-law.html)

I don't think that extends to full account deletion, though; it's more about
stopping recurring charges.

