
There are many kinds of neural networks, each with its own representational capabilities.

+ Reservoir computing (ESN, LSM). The recurrent dynamics are quenched (fixed at random initialization) and only a readout is trained (see the first sketch after this list).

+ Adaptive resonance theory. Addresses catastrophic forgetting and allows a system to learn from a single example.

+ Bottleneck networks. Force networks to represent things in a compressed form, almost like making up their own symbols.

+ Global workspace theory. Winner-take-all mechanisms that allow modules to compete.

+ Polychronization. Izhikevich shows how conduction delays make dynamic, time-locked representations possible.

+ Attractor networks. Use of dynamical systems theory to have populations of neurons perform computational tasks (see the second sketch after this list).
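
To make the reservoir computing point concrete, here is a minimal echo state network sketch in numpy. The task (one-step-ahead prediction of a sine wave), the reservoir size, spectral radius, and ridge parameter are all assumptions chosen for illustration; the point is only that the input and recurrent weights stay fixed ("quenched") and just the linear readout is fit.

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical toy task: predict the next value of a sine wave.
    T = 1000
    u = np.sin(np.arange(T + 1) * 0.1)           # input sequence
    inputs, targets = u[:-1], u[1:]              # one-step-ahead prediction

    n_res = 200                                  # reservoir size (assumed)
    spectral_radius = 0.9                        # keep dynamics near the edge of stability

    # "Quenched" part: random input and recurrent weights, fixed after initialization.
    W_in = rng.uniform(-0.5, 0.5, size=(n_res,))
    W = rng.normal(size=(n_res, n_res))
    W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))

    # Run the reservoir and collect its states.
    x = np.zeros(n_res)
    states = np.zeros((T, n_res))
    for t in range(T):
        x = np.tanh(W @ x + W_in * inputs[t])
        states[t] = x

    # Only the linear readout is trained (ridge regression), not the reservoir.
    washout = 100                                # discard the initial transient
    X, y = states[washout:], targets[washout:]
    ridge = 1e-6
    W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y)

    pred = X @ W_out
    print("train MSE:", np.mean((pred - y) ** 2))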
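And for attractor networks, a minimal Hopfield-style sketch (one common instance of the idea, not the only one): a few patterns are stored with Hebbian weights, and a corrupted cue settles into the nearest stored pattern as an attractor. The network size, number of patterns, and corruption level are arbitrary choices for illustration.

    import numpy as np

    rng = np.random.default_rng(1)

    # Hypothetical setup: store a few random bipolar patterns, then recover one
    # from a corrupted cue by letting the network settle into an attractor.
    n = 100                                      # number of neurons
    patterns = rng.choice([-1, 1], size=(3, n))  # stored memories

    # Hebbian weights: sum of outer products, no self-connections.
    W = sum(np.outer(p, p) for p in patterns) / n
    np.fill_diagonal(W, 0)

    # Corrupt the first pattern by flipping 20% of its bits.
    state = patterns[0].copy()
    flip = rng.choice(n, size=n // 5, replace=False)
    state[flip] *= -1

    # Asynchronous updates: each neuron aligns with its local field until stable.
    for _ in range(10):
        for i in rng.permutation(n):
            state[i] = 1 if W[i] @ state >= 0 else -1

    overlap = np.mean(state == patterns[0])
    print("overlap with stored pattern:", overlap)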

The claim that neural networks are too fragile is a bit too general.



