Overcoming catastrophic forgetting in neural networks

Another solid step forward in deep neural networks from DeepMind.

They took inspiration from neuroscience-based theories about the consolidation of previously acquired skills and memories in mammalian brains: connections between neurons are less likely to be overwritten if they were important for previously learnt tasks. This mechanism is known as “synaptic consolidation”.

The result is a neural network model that can learn several tasks sequentially without overwriting what was previously learnt (a well-known limitation of standard neural networks, called “catastrophic forgetting”).

The new algorithm is called “Elastic Weight Consolidation” (EWC).
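In rough terms, EWC adds a quadratic penalty that anchors each weight to its value after the previous task, scaled by how important that weight was (estimated via the diagonal of the Fisher information matrix). Here is a minimal NumPy sketch of that penalty term; the function name, the toy values, and the choice of λ are illustrative, not taken from the paper:

```python
import numpy as np

def ewc_penalty(params, old_params, fisher, lam=1000.0):
    """EWC regularizer: moving a weight away from its post-task-A value
    costs more when its (diagonal) Fisher information is high.

    penalty = (lam / 2) * sum_i F_i * (theta_i - theta_A_i)^2
    This is added to the loss of the new task during training.
    """
    return 0.5 * lam * np.sum(fisher * (params - old_params) ** 2)

# Toy example (illustrative values): two weights, the first one
# was important for task A, the second one was not.
old = np.array([1.0, -0.5])      # weights after learning task A
fisher = np.array([10.0, 0.01])  # diagonal Fisher estimates
new = np.array([1.1, 0.5])       # candidate weights while learning task B

penalty = ewc_penalty(new, old, fisher)
```

Even though the second weight moved ten times farther than the first, almost all of the penalty comes from the small move of the important weight, which is exactly the “elastic anchoring” effect the paper describes.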

All the details can be read in their latest PNAS paper.

Published by

Davide Madrisan

Linux Developer, DevOps & Automation Engineer. Passionate about Cloud, Web and Data Science.
