+ Reservoir computing (echo state networks, liquid state machines). Computes with a fixed, randomly wired recurrent network; the "quenched" dynamics are never trained, only a linear readout is (ESN sketch after the list).
+ Adaptive resonance theory. Addresses catastrophic forgetting and lets a system learn from a single example (fuzzy-ART sketch below).
+ Bottleneck networks. Forcing a network to push its input through a narrow layer makes it learn compressed representations, almost as if it invents its own symbols (autoencoder sketch below).
+ Global workspace theory. Winner-take-all mechanisms let modules compete for a shared workspace whose contents are broadcast back to all of them (workspace sketch below).
+ Polychronization. Izhikevich showed how axonal conduction delays make reproducible time-locked (but not synchronous) spike patterns possible as dynamic representations (delay-coincidence sketch below).
+ Attractor networks. Using dynamical-systems theory to make populations of neurons perform computational tasks (Hopfield sketch below).
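
A minimal echo state network in NumPy for the first item: the recurrent weights stay quenched at their random values and only a ridge-regression readout is fit. Reservoir size, spectral radius, and the sine-prediction task are illustrative choices, not canonical ones.

```python
import numpy as np

rng = np.random.default_rng(0)

# Fixed ("quenched") random reservoir: only W_out is ever trained.
n_res, n_in = 200, 1
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))   # scale spectral radius below 1

def run_reservoir(u):
    """Drive the reservoir with input sequence u, collect its states."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W_in @ np.atleast_1d(u_t) + W @ x)
        states.append(x.copy())
    return np.array(states)

# Toy task: predict the next value of a sine wave one step ahead.
t = np.arange(0, 60, 0.1)
u = np.sin(t)
X = run_reservoir(u[:-1])
y = u[1:]

# Ridge-regression readout (the only trained parameters); discard
# the initial transient ("washout") before fitting.
wash, ridge = 50, 1e-6
Xw, yw = X[wash:], y[wash:]
W_out = np.linalg.solve(Xw.T @ Xw + ridge * np.eye(n_res), Xw.T @ yw)

print("train MSE:", np.mean((Xw @ W_out - yw) ** 2))
```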
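A compact fuzzy-ART sketch for the second item, assuming inputs in [0, 1] with complement coding; the vigilance rho, choice parameter alpha, and learning rate beta are illustrative values. A novel input that matches no existing category spawns a new one on the spot (one-example learning), and with beta = 1 weights only ever shrink, which is what protects old categories from being overwritten.

```python
import numpy as np

class FuzzyART:
    """Minimal fuzzy ART: fast, stable category learning."""

    def __init__(self, dim, rho=0.75, alpha=1e-3, beta=1.0):
        self.dim = dim        # raw input dimension
        self.rho = rho        # vigilance: how strict a match must be
        self.alpha = alpha    # choice parameter
        self.beta = beta      # learning rate (1.0 = fast/one-shot)
        self.w = []           # one weight vector per category

    def _code(self, x):
        x = np.asarray(x, dtype=float)
        assert x.shape == (self.dim,)
        return np.concatenate([x, 1.0 - x])   # complement coding

    def learn(self, x):
        i = self._code(x)
        # Rank existing categories by the choice function.
        order = sorted(
            range(len(self.w)),
            key=lambda j: -np.minimum(i, self.w[j]).sum()
                          / (self.alpha + self.w[j].sum()))
        for j in order:
            match = np.minimum(i, self.w[j]).sum() / i.sum()
            if match >= self.rho:             # resonance: update the winner
                self.w[j] = (self.beta * np.minimum(i, self.w[j])
                             + (1 - self.beta) * self.w[j])
                return j
        self.w.append(i.copy())               # mismatch: new category
        return len(self.w) - 1

art = FuzzyART(dim=2)
print(art.learn([0.1, 0.1]))   # 0: first example creates a category
print(art.learn([0.12, 0.1]))  # 0: a close input resonates with it
print(art.learn([0.9, 0.9]))   # 1: a novel input gets its own category
```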
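A bare-bones linear bottleneck autoencoder for the third item, assuming toy 2-D data that really lives near a 1-D line; the width-1 hidden layer forces a compressed code. Hyperparameters are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 2-D points that mostly vary along a single direction.
z = rng.normal(size=(500, 1))
X = z @ np.array([[2.0, 1.0]]) + 0.05 * rng.normal(size=(500, 2))

d, k = 2, 1                              # input dim, bottleneck width
W_enc = rng.normal(scale=0.1, size=(d, k))
W_dec = rng.normal(scale=0.1, size=(k, d))

lr = 0.05
for _ in range(3000):
    H = X @ W_enc                        # compressed code (the "symbols")
    X_hat = H @ W_dec                    # reconstruction
    E = (X_hat - X) / len(X)             # mean-squared-error scale
    g_dec = H.T @ E                      # d(loss)/dW_dec
    g_enc = X.T @ (E @ W_dec.T)          # d(loss)/dW_enc
    W_enc -= lr * g_enc
    W_dec -= lr * g_dec

print("reconstruction MSE:", np.mean((X @ W_enc @ W_dec - X) ** 2))
```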
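A toy global-workspace loop for the fourth item, with hypothetical vision/audition/memory modules (names, scores, and messages are invented): each module bids with a salience score, a winner-take-all step picks one, and the winner's message is broadcast back to every module.

```python
# Hypothetical modules: each returns (salience, message) given the
# current broadcast.
def vision(broadcast):
    return 0.8, "edge detected on the left"

def audition(broadcast):
    return 0.3, "low hum"

def memory(broadcast):
    # Salience rises when the broadcast is relevant to stored content.
    boost = 0.9 if broadcast and "edge" in broadcast else 0.0
    return 0.2 + boost, "similar edge seen before"

modules = [vision, audition, memory]
broadcast = None
for step in range(3):
    bids = [m(broadcast) for m in modules]
    salience, message = max(bids)      # winner-take-all competition
    broadcast = message                # winner is broadcast to all modules
    print(f"step {step}: workspace = {broadcast!r}")
```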
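A toy illustration of the delay idea behind polychronization (nothing like Izhikevich's full spiking model): a downstream neuron fires only when spikes emitted at different times arrive together, because its afferents have different conduction delays. Delay values and threshold are arbitrary.

```python
# Two input neurons project to neuron C with different axonal delays.
delays = {"A": 5, "B": 1}      # conduction delays in time steps
threshold = 2                  # C needs two coincident arrivals to fire

def c_fires(spike_times, horizon=20):
    """Return the times at which C fires, given input spike times."""
    arrivals = {}
    for src, times in spike_times.items():
        for t in times:
            arrivals[t + delays[src]] = arrivals.get(t + delays[src], 0) + 1
    return [t for t in range(horizon) if arrivals.get(t, 0) >= threshold]

# Synchronous firing of A and B does NOT drive C (arrivals miss) ...
print(c_fires({"A": [3], "B": [3]}))     # -> []
# ... but the time-locked pattern "A at t, B at t+4" does: both
# spikes arrive at C at t+5.
print(c_fires({"A": [3], "B": [7]}))     # -> [8]
```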
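And the textbook attractor example for the last item, a small Hopfield network: Hebbian storage of binary patterns, then asynchronous updates that let a corrupted cue relax back into the stored attractor. Pattern count and sizes are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)

# Store two random +/-1 patterns with the Hebbian outer-product rule.
n = 64
patterns = rng.choice([-1, 1], size=(2, n))
W = sum(np.outer(p, p) for p in patterns) / n
np.fill_diagonal(W, 0)

# Corrupt one pattern in 10 positions, then let the dynamics settle.
state = patterns[0].copy()
flip = rng.choice(n, size=10, replace=False)
state[flip] *= -1

for _ in range(5 * n):                   # asynchronous updates
    i = rng.integers(n)
    state[i] = 1 if W[i] @ state >= 0 else -1

# With only 2 patterns in 64 units, recovery is reliable.
print("recovered pattern 0:", np.array_equal(state, patterns[0]))
```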
Saying that neural networks are too fragile is too sweeping a claim to be useful as stated.