Isn't this just "transfer learning"? Surely there has to be a better way than the "momma bird pukes into baby bird's mouth" type of training.

No. Transfer learning usually means reusing the same NN model (e.g. GPT-3 checkpoints being retrained on GitHub code and then called 'Codex'), or possibly some sort of distillation/sparsification approach. This is about auto-generating training data, which may not even be meant for a neural net at all.
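
To make the distinction concrete, here is a toy sketch (not from the thread; the tiny nn.Linear "models", the random data, and the decision-tree student are made-up stand-ins): in (a) the same network's weights are loaded and fine-tuned, while in (b) a model is only used to produce labeled examples, which could just as well train a non-neural learner.

    # (a) Transfer learning vs. (b) auto-generated training data -- toy sketch.
    import torch
    import torch.nn as nn
    from sklearn.tree import DecisionTreeRegressor

    # --- (a) Transfer learning: start from a "pretrained" checkpoint, fine-tune ---
    pretrained = nn.Linear(8, 1)                      # stand-in for a big pretrained model
    torch.save(pretrained.state_dict(), "ckpt.pt")    # e.g. the GPT-3 checkpoint

    finetuned = nn.Linear(8, 1)
    finetuned.load_state_dict(torch.load("ckpt.pt"))  # same architecture, same weights
    opt = torch.optim.SGD(finetuned.parameters(), lr=1e-2)
    x_new, y_new = torch.randn(64, 8), torch.randn(64, 1)   # the new domain (e.g. code)
    for _ in range(100):                              # keep training the *same* net
        opt.zero_grad()
        loss = nn.functional.mse_loss(finetuned(x_new), y_new)
        loss.backward()
        opt.step()

    # --- (b) Auto-generated training data: the model only labels examples ---
    with torch.no_grad():
        x_synth = torch.randn(1024, 8)                # inputs we sample / make up
        y_synth = pretrained(x_synth)                 # teacher supplies the targets

    # The consumer of that data need not be a neural net at all:
    student = DecisionTreeRegressor(max_depth=5)
    student.fit(x_synth.numpy(), y_synth.numpy().ravel())

In (a) the knowledge moves by copying weights into the same architecture; in (b) it moves through a dataset, so the downstream learner is unconstrained.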
