
Deep learning has a size problem - nnx
https://heartbeat.fritz.ai/deep-learning-has-a-size-problem-ea601304cd8
======
sigmaprimus
This article confused me. The author emphasizes the time and power required to
train a model, which granted is substantial, but then talks about phones and
microcontrollers, which are less powerful but far more numerous than the
machines currently used to develop these models.

Isn't that the point? The powerful machines create the models through
training and data processing, and those models are then loaded onto the less
powerful units for inference.
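To make the train-then-deploy split concrete: one common step between the two is shrinking the trained weights before shipping them to the small device, e.g. post-training int8 quantization. Below is a minimal pure-Python sketch of the idea (the function names and toy weights are illustrative, not any real framework's API):

```python
# Sketch of post-training weight quantization: a model trained in float32 on
# a powerful machine has its weights mapped to int8 before deployment to a
# phone or microcontroller. Illustrative only, not a real framework API.

def quantize_int8(weights):
    """Map float weights onto the int8 range [-127, 127] with one scale."""
    max_abs = max(abs(w) for w in weights)
    scale = max_abs / 127 if max_abs else 1.0
    q = [round(w / scale) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from the int8 values."""
    return [v * scale for v in q]

weights = [0.12, -0.503, 0.998, -0.25]   # pretend trained float32 weights
q, scale = quantize_int8(weights)
approx = dequantize(q, scale)
# Each weight now needs 1 byte instead of 4 (~4x smaller on disk and in RAM),
# at the cost of a rounding error of at most scale/2 per weight.
```

The training hardware never shrinks; only the artifact it produces does, which is why the same model can be expensive to create yet cheap to run.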

