
Train Your Machine Learning Models on Google’s GPUs for Free - rbanffy
https://hackernoon.com/train-your-machine-learning-models-on-googles-gpus-for-free-forever-a41bd309d6ad
======
blensor
They even allow installing Python modules via pip.

And they already include OpenCV:

    
    
      import cv2
      print(cv2.__version__)
    
      '3.4.0'
    

So I guess this is an incredibly awesome gift for many more than just ML devs

After playing some random Python module bingo, here are some of the most useful
preinstalled libraries:

    
    
      sklearn 0.19.1
      skimage 0.13.1
      bs4 0.4.6  (BeautifulSoup4)
      scipy 0.19.1
      seaborn 0.7.1
      PIL 4.0.0
      keras 2.1.5
      nltk 3.2.1
      zmq 16.0.4
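
The "module bingo" above can be automated: try importing a list of candidate
names and report whichever versions stick. A minimal sketch (the candidate
names are just examples, and modules without a `__version__` attribute report
"unknown"):

```python
import importlib

def module_version(name):
    """Import a module by name and return its version string,
    "unknown" if it exposes no __version__, or None if absent."""
    try:
        mod = importlib.import_module(name)
    except ImportError:
        return None
    return getattr(mod, "__version__", "unknown")

# Probe several candidates at once instead of guessing one at a time.
for name in ("sklearn", "skimage", "bs4", "nltk", "not_installed_xyz"):
    print(f"{name}: {module_version(name)}")
```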

------
davidkuhta
Good to know, thanks!

Quick summary: Google Colaboratory (Colab) provides a Jupyter notebook
environment with up to 12 hours continuous access to an Nvidia K80 for free.

[https://research.google.com/colaboratory/faq.html](https://research.google.com/colaboratory/faq.html)

------
spaceandthyme
This is great - especially for small projects and learning. However, there are
a few limitations to be aware of if you’re thinking about using it for real
projects.

First is memory. I’ve seen reports of the practical limit being around
500 MB. Call me a spoiled millennial noob, but the vast majority of my real-
world ML projects use a lot more - especially when working with image data.

Second is getting data into your notebook. Most ML is data-heavy and you want
a fast way of working with it. For example, the Google Landmark challenge on
Kaggle has nearly half a terabyte of _unaugmented_ images just for the test
set. You could easily push several terabytes for that one-off challenge alone
if you were not careful.

Note: in many cases it is possible to work with much less data by resizing and
preprocessing the images as you pull them down, but that has problems too.
You can also use Google Drive, but it’s not ideal for large datasets.
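
The resize-as-you-download idea can be sketched with PIL, which is already
included: cap each image's longest side before it ever accumulates in memory.
The 256-pixel cap and JPEG quality are arbitrary choices here, and fetching
the raw bytes from a URL is deliberately left out:

```python
from io import BytesIO
from PIL import Image

def shrink(image_bytes, max_side=256):
    """Decode an image and cap its longest side, returning JPEG bytes.
    Downsizing on the fly keeps the notebook's modest host RAM in check."""
    img = Image.open(BytesIO(image_bytes)).convert("RGB")
    img.thumbnail((max_side, max_side))  # in-place, preserves aspect ratio
    out = BytesIO()
    img.save(out, format="JPEG", quality=85)
    return out.getvalue()
```

In practice you would wrap this around something like
`urllib.request.urlopen(url).read()` and write the result straight to disk,
so full-resolution images never pile up in RAM.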

Related:

[https://www.reddit.com/r/MachineLearning/comments/84532y/n_g...](https://www.reddit.com/r/MachineLearning/comments/84532y/n_google_colab_gives_you_free_usage_of_a_k80_gpu/)

[https://stackoverflow.com/questions/48750199/google-colabora...](https://stackoverflow.com/questions/48750199/google-colaboratory-misleading-information-about-its-gpu-only-5-ram-available/49225460)

~~~
singularity2001
[[deleted]]

~~~
londons_explore
They give you 12 GB of GPU RAM.

The 500 MB is simply the host RAM for the Python runtime. As long as you aren't
converting everything to numpy arrays, you'll be fine.
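
The "don't hold everything in one numpy array" advice usually means feeding
the model from a generator, so only one batch occupies host RAM at a time. A
minimal sketch, where the `load` callable is a stand-in for whatever decodes a
single sample (e.g. an image file):

```python
def batches(paths, batch_size=32, load=lambda p: p):
    """Yield lists of loaded samples one batch at a time, so only
    batch_size samples ever sit in host memory at once."""
    batch = []
    for p in paths:
        batch.append(load(p))
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:  # final partial batch
        yield batch
```

This is roughly the pattern Keras (also bundled in this environment) consumes
via its generator-based training APIs, with `load` doing the per-image
decoding and normalization.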

~~~
singularity2001
fantastic, thanks!

------
ris
> for Free — Forever

Nothing is _ever_ free forever.

------
Dibbles
I'm assuming they have an eye on all your data, or will do so in the future?
What's the catch?

------
singularity2001
Just so that you appreciate this outstanding gift: the equivalent compute time
would quickly be worth several dozen dollars, if not hundreds, on Amazon
(plus setup time).

This is truly one of the most remarkable giveaways since … the Internet. Of
course by doing this they want to find talent and hook you, but I see no harm
in this bait-and-switch scheme.

TL;DR: wow! free supercomputer time!

~~~
singularity2001
What we need now is some peer-to-peer or community-driven learning so that
people can collaborate on creating the best speech recognition model, etc.

------
solarkraft
Something makes me think this offer won't be this free for long.

------
nartz
What prevents Ethereum mining on this if it's free?

~~~
rorosaurus
I would assume it's against the TOS and you'd run the risk of losing your
Google Account if you got caught.

~~~
pharrington
Do you have a URL for the Google Colaboratory TOS? Maybe I'm blind, but my
cursory search failed.

