Hacker News

Backprop is such a fundamental part of neural networks that I am surprised anyone can complain about having to know how it works. It is true that once you have grokked the principle, implementing it for more than two layers is pretty tedious. I would argue, however, that without having done the tedious work at least once you cannot truly understand it.
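To make the point concrete, here is a minimal sketch of doing that "tedious work" by hand: a two-layer network (one input, one sigmoid hidden unit, one linear output, squared loss) with the backward pass written out term by term via the chain rule. All names and the architecture are illustrative choices, not from any particular library; the finite-difference check at the end is the standard way to verify a hand-derived gradient.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def forward_backward(x, y, w1, w2):
    # Forward pass
    z1 = w1 * x                      # hidden pre-activation
    h = sigmoid(z1)                  # hidden activation
    yhat = w2 * h                    # linear output
    loss = 0.5 * (yhat - y) ** 2

    # Backward pass: apply the chain rule from the loss inward
    dloss_dyhat = yhat - y
    dloss_dw2 = dloss_dyhat * h
    dloss_dh = dloss_dyhat * w2
    dloss_dz1 = dloss_dh * h * (1.0 - h)   # sigmoid'(z1) = h * (1 - h)
    dloss_dw1 = dloss_dz1 * x
    return loss, dloss_dw1, dloss_dw2

# Numerical gradient check: the tedium pays off when finite
# differences agree with the analytic gradient.
x, y, w1, w2 = 0.5, 1.0, 0.3, -0.7
loss, g1, g2 = forward_backward(x, y, w1, w2)
eps = 1e-6
num_g1 = (forward_backward(x, y, w1 + eps, w2)[0]
          - forward_backward(x, y, w1 - eps, w2)[0]) / (2 * eps)
print(abs(g1 - num_g1) < 1e-7)
```

Each extra layer adds another round of the same chain-rule bookkeeping, which is exactly where the tedium (and the understanding) comes from.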


