Continual Lifelong Learning in Natural Language Processing: A Survey
Abstract
Continual learning (CL) aims to enable information systems to learn from a continuous data stream across time. However, it is difficult for existing deep learning architectures to learn a new task without largely forgetting previously acquired knowledge. Furthermore, CL is particularly challenging for language learning, as natural language is ambiguous: it is discrete, compositional, and its meaning is context-dependent. In this work, we look at the problem of CL through the lens of various NLP tasks. Our survey discusses major challenges in CL and current methods applied in neural network models. We also provide a critical review of the existing CL evaluation methods and datasets in NLP. Finally, we present our outlook on future research directions.