Sequence-predicting RNNs are basically unsupervised, in that they can learn from lots of raw, unlabelled data. And they learn useful internal representations which can be adapted for other tasks. There is a lot of old work on unsupervised learning rules for RNNs, including recurrent autoencoders and history compression. A minimal sketch of the idea is below.
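
A rough illustration of the point (my own sketch, not from any particular paper): train an RNN to predict the next token of raw sequences, so the "labels" are just the data shifted by one step, and then reuse the hidden states as learned representations. The model, sizes, and random data here are placeholders.

    import torch
    import torch.nn as nn

    vocab_size, embed_dim, hidden_dim = 100, 32, 64

    class NextTokenRNN(nn.Module):
        def __init__(self):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, embed_dim)
            self.rnn = nn.GRU(embed_dim, hidden_dim, batch_first=True)
            self.head = nn.Linear(hidden_dim, vocab_size)

        def forward(self, tokens):
            h, _ = self.rnn(self.embed(tokens))  # hidden state at each step
            return self.head(h), h               # next-token logits + representations

    model = NextTokenRNN()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()

    # Stand-in for raw unlabelled sequences (e.g. text as token ids).
    batch = torch.randint(0, vocab_size, (8, 20))
    inputs, targets = batch[:, :-1], batch[:, 1:]  # target = next token

    logits, hidden = model(inputs)
    loss = loss_fn(logits.reshape(-1, vocab_size), targets.reshape(-1))
    loss.backward()
    opt.step()

    # `hidden` can now serve as features for a separate downstream task.

No human labelling is needed anywhere; the prediction target comes from the sequence itself, which is what makes this "basically unsupervised".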


