
Learning@Home - decentralized training of huge neural networks
http://learning-at-home.github.io
======
justheuristic
Can you train a huge neural network without a supercomputer? Imagine you want
a GPT-3-sized model, but instead of a $10⁸ GPU cluster you've got support from
thousands of volunteers across the world - gamers, research labs, and small
companies. What kind of system would let them work together despite internet
latency, packet loss, and hardware failures?

We at Learning@home are building just such a system. Together, we want to turn
large-scale deep learning from private experiments behind closed doors into a
decentralized peer-to-peer activity in which everyone can participate.

Let's build the BitTorrent of deep learning :)

