I get it that for any serious use you'd want a GPU, but for learning and toying around you might want to be able to run and debug code on your freakin' MacBook! Is that too much to ask? (Some of us do code in IDEs, not in notebooks + vim on a server, and we'd want at least our test suite to be able to run locally ffs!)
(Also, hopefully they've gotten rid of the Lovecraftian architecture with methods that can mutate an object's class [?!] - I understood the practical appeal and why they did it, but as a software engineer with sympathy for functional programming it almost made me want to barf :|)
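For anyone who hasn't run into the pattern: here's a minimal, hypothetical sketch of what "a method mutating an object's class" looks like in Python. This is not fastai's actual code, just the general technique (reassigning `self.__class__` at runtime):

```python
class Trainer:
    """Plain trainer with a generic summary."""
    def summary(self):
        return "generic trainer"

    def freeze(self):
        # The surprising part: instead of setting a flag, the method
        # swaps the *class* of this instance in place, so every other
        # method dispatch on the object changes too.
        # (Hypothetical illustration, not fastai's real implementation.)
        self.__class__ = FrozenTrainer


class FrozenTrainer(Trainer):
    """Same object, new class: summary() now reports frozen state."""
    def summary(self):
        return "frozen trainer"


t = Trainer()
print(type(t).__name__)  # Trainer
t.freeze()               # mutates t's class in place
print(type(t).__name__)  # FrozenTrainer
print(t.summary())       # frozen trainer
```

Handy for retrofitting behavior without wrapping objects, but it makes the type of a value depend on its history, which is exactly what tends to upset people with functional-programming sympathies.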
Anyway, fastai is awesome for learning and experimenting, keep up the good work! I just hate that it's so obnoxious to use and learn for anyone with a more traditional software engineering background...
It is not fastai's fault though.
But in fastai 1.0 it was all bundled together in one big ball of yarn, with everything ultimately depending on some data loading classes that depended on the GPU driver, etc.
Anyway, it was really bad architecture and dev practices in the codebase I was working on, though - the tested behavior would probably not have matched the production one 100%... I don't blame fastai much for not supporting a broken workflow, but I prefer more barebones and less opinionated frameworks, i.e. using TF or PyTorch directly, since sometimes you really need to get that "broken" thing running in production before you work on a refactored version of it :P Fastai seems very research-oriented and opinionated.
I'll definitely look into fastai 2.0 though :)
Every year it gets more accessible to a wider audience. Soon there will probably be frameworks that hide the complexity completely and you can just say "here's a massive dataset, train a conversation bot (or a cat pic classifier) on it, go." But we're not quite there yet.
Synchronizing local and remote code shouldn't take much time, but it's still at least a few seconds on the critical path for the run->fail->fix->rerun loop.
VSCode's remote mode might be worth a try for people with such a setup.