Another solid step forward in deep neural networks from DeepMind.
They took inspiration from neuroscience-based theories about the consolidation of previously acquired skills and memories in mammalian and human brains: connections between neurons are less likely to be overwritten if they were important in previously learnt tasks. This mechanism is known as “synaptic consolidation”.
The result is a neural network model that can learn several tasks in sequence without overwriting what was previously learnt, a well-known limitation of standard neural networks called “catastrophic forgetting”.
The new algorithm is called “Elastic Weight Consolidation” (EWC).
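The core idea can be sketched as a quadratic penalty that anchors each weight to its value after the old task, scaled by how important that weight was (estimated via the Fisher information). This is only an illustrative toy in NumPy; the function name and the sample values are my own, not from the paper:

```python
import numpy as np

def ewc_penalty(theta, theta_star, fisher, lam=1.0):
    """EWC regularizer: (lam / 2) * sum_i F_i * (theta_i - theta*_i)^2.

    fisher approximates each weight's importance on the old task;
    a large F_i makes that weight "stiff" (hard to move away from
    theta_star), while unimportant weights stay free to change.
    """
    return 0.5 * lam * np.sum(fisher * (theta - theta_star) ** 2)

# Toy example (values are made up): the first two weights mattered
# for task A, so drifting on them is penalized far more than on the third.
theta_star = np.array([1.0, -2.0, 0.5])   # weights after learning task A
theta      = np.array([1.2, -2.0, 1.5])   # current weights while on task B
fisher     = np.array([10.0, 10.0, 0.1])  # per-weight importance estimates
print(ewc_penalty(theta, theta_star, fisher))  # → 0.25
```

In training on the new task, this penalty would simply be added to the new task's loss, pulling important weights elastically back toward their old values.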
All the details can be found in their latest PNAS paper.