
This reminds me of dropout regularization, where you improve a neural network by removing random neurons from it.

See https://en.wikipedia.org/wiki/Dropout_(neural_networks) and https://youtu.be/u4alGiomYP4?t=33m53s
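A minimal numpy sketch of the idea (the 0.5 rate and array shape are illustrative assumptions, not from the comment):

    import numpy as np

    rng = np.random.default_rng(0)
    activations = rng.standard_normal((4, 8))  # one batch of hidden-layer outputs

    p_drop = 0.5                                # probability of dropping each unit
    mask = rng.random(activations.shape) >= p_drop

    # "Inverted" dropout: zero out dropped units and rescale the survivors
    # so the expected activation is unchanged.
    dropped = activations * mask / (1.0 - p_drop)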




Dropout is only used while training a neural network, and the neurons are turned off, not removed.
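To illustrate, here is a sketch in PyTorch (the framework choice is my assumption; the thread names none): nn.Dropout zeroes activations only while the module is in training mode and is a no-op in eval mode, so units are masked, never deleted.

    import torch
    from torch import nn

    drop = nn.Dropout(p=0.5)
    x = torch.ones(1, 6)

    drop.train()    # training mode: units randomly zeroed, survivors rescaled by 1/(1-p)
    print(drop(x))  # e.g. tensor([[2., 0., 2., 0., 0., 2.]])

    drop.eval()     # evaluation/inference mode: dropout does nothing
    print(drop(x))  # tensor([[1., 1., 1., 1., 1., 1.]])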


Well, the brain is constantly trained and used. :)




