Continual and One-Shot Learning Through Neural Networks with Dynamic External Memory

Benno Lüders, Mikkel Schläger, Aleksandra Korach, Sebastian Risi

Research output: Conference article in proceedings › Research › peer-review


Training neural networks to quickly learn new skills without forgetting previously learned skills is an important open challenge in machine learning. A common problem for adaptive networks that can learn during their lifetime is that the weights encoding a particular task are often overwritten when a new task is learned. This paper takes a step toward overcoming this limitation by building on the recently proposed Evolving Neural Turing Machine (ENTM) approach. In the ENTM, neural networks are augmented with an external memory component that they can write to and read from, which allows them to store associations quickly and over long periods of time. The results in this paper demonstrate that the ENTM is able to perform one-shot learning in reinforcement learning tasks without catastrophic forgetting of previously stored associations. Additionally, we introduce a new ENTM default jump mechanism that makes it easier to find unused memory locations and therefore facilitates the evolution of continual learning networks. Our results suggest that augmenting evolving networks with an external memory component is not only a viable mechanism for adaptive behaviors in neuroevolution but also allows these networks to perform continual and one-shot learning at the same time.
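The core idea of the abstract can be illustrated with a minimal sketch of an ENTM-style memory tape. This is a hypothetical simplification: the class name, the Euclidean content-similarity threshold, and the all-zero test for "unused" locations are assumptions for illustration only; the actual ENTM uses an evolved controller and richer addressing operations. It shows how a "default jump" to an unused location lets new associations be stored without overwriting old ones.

```python
import numpy as np

class ExternalMemory:
    """Sketch of an ENTM-style growing memory tape with a single head
    (a simplified illustration, not the paper's implementation)."""

    def __init__(self, vector_size=4):
        self.vector_size = vector_size
        self.tape = [np.zeros(vector_size)]  # tape grows as needed
        self.head = 0

    def write(self, vector):
        # Store a vector at the current head position.
        self.tape[self.head] = np.asarray(vector, dtype=float)

    def read(self):
        # Return the vector at the current head position.
        return self.tape[self.head].copy()

    def content_jump(self, key, threshold=0.5):
        # Move the head to the stored vector closest to `key`
        # (Euclidean distance is an illustrative choice here).
        dists = [np.linalg.norm(v - np.asarray(key, dtype=float))
                 for v in self.tape]
        best = int(np.argmin(dists))
        if dists[best] <= threshold:
            self.head = best
            return True
        return False

    def default_jump(self):
        # Default jump: move the head to an unused (here: all-zero)
        # location, so new writes never clobber stored associations.
        for i, v in enumerate(self.tape):
            if not v.any():
                self.head = i
                return
        self.tape.append(np.zeros(self.vector_size))
        self.head = len(self.tape) - 1

# Store one association, jump to a fresh slot, store another,
# then recall the first by content without having overwritten it.
mem = ExternalMemory(vector_size=3)
mem.write([1, 0, 0])
mem.default_jump()          # head moves to an unused location
mem.write([0, 1, 0])
mem.content_jump([1, 0, 0]) # recall the first association
```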
Original language: English
Title of host publication: European Conference on the Applications of Evolutionary Computation
Number of pages: 16
Publication date: 2017
ISBN (Electronic): 978-3-319-55849-3
Publication status: Published - 2017
Series: Lecture Notes in Computer Science


