FoCL: Feature-oriented continual learning for generative models

Authors:

Highlights:

Abstract

In this paper, we propose Feature-oriented Continual Learning (FoCL), a general continual learning framework for generative models. Unlike previous works that address catastrophic forgetting by introducing regularization in the parameter space or the image space, FoCL imposes regularization in the feature space. Our experiments show that FoCL adapts faster to distributional changes in sequentially arriving tasks and achieves state-of-the-art performance for generative models in task incremental learning. We discuss how regularization spaces can be combined to boost performance in different use case scenarios, e.g., tasks with high variability in the background. Finally, we introduce a forgetfulness measure that fairly evaluates the degree to which a model suffers from forgetting. Interestingly, analysis of the proposed forgetfulness score also suggests that FoCL tends to mitigate forgetting on future tasks.
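To make the distinction between regularization spaces concrete, below is a minimal PyTorch-style sketch of feature-space regularization with generative replay, in the spirit of the abstract. It is an illustration under stated assumptions, not the paper's actual objective: `generator`, `old_generator`, `feature_extractor`, and `lambda_feat` are hypothetical names, and the true FoCL loss may differ.

```python
# Hypothetical sketch: feature-space regularization for generative replay.
# Not the paper's implementation; names and loss form are assumptions.
import torch
import torch.nn as nn

def feature_space_replay_loss(generator, old_generator, feature_extractor,
                              z, task_loss, lambda_feat=1.0):
    """Penalize drift, in feature space, between the current generator and a
    frozen snapshot trained on previous tasks (pseudo-rehearsal)."""
    with torch.no_grad():
        replay_samples = old_generator(z)          # replayed "past" data
        target_feats = feature_extractor(replay_samples)
    current_feats = feature_extractor(generator(z))
    # Match features rather than raw pixels (image space) or weights
    # (parameter space) -- the distinction the abstract draws.
    feat_loss = nn.functional.mse_loss(current_feats, target_feats)
    return task_loss + lambda_feat * feat_loss
```

Matching activations of a fixed feature extractor, rather than pixels, is what would let such a scheme tolerate nuisance variation (e.g., backgrounds) while still anchoring the generator to previously learned tasks.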

Keywords: Catastrophic forgetting, Continual learning, Generative models, Feature matching, Generative replay, Pseudo-rehearsal

Article history: Received 6 May 2020, Revised 9 April 2021, Accepted 23 June 2021, Available online 1 July 2021, Version of Record 14 July 2021.

DOI: https://doi.org/10.1016/j.patcog.2021.108127