
GAN Memory with No Forgetting

2020 
Seeking to address the fundamental issue of memory in lifelong learning, we propose a GAN memory that is capable of realistically remembering a stream of generative processes with no forgetting. Our GAN memory is based on recognizing that one can modulate the "style" of a GAN model to form perceptually-distant targeted generation. Accordingly, we propose to do sequential style modulations atop a well-behaved base GAN model, to form sequential targeted generative models, while simultaneously benefiting from the transferred base knowledge. Experiments demonstrate the superiority of our method over existing approaches and its effectiveness in alleviating catastrophic forgetting for lifelong classification problems.
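The core idea of style modulation can be illustrated with a minimal sketch: a frozen base model produces feature maps, and each task stores only a small set of per-channel scale/shift ("style") parameters that transform those shared features. This is a simplified, hypothetical illustration of the general technique, not the authors' exact architecture; the names `style_modulate`, `base`, and `task_styles` are assumptions for this example.

```python
import numpy as np

def style_modulate(features, gamma, beta):
    """Per-channel style modulation: scale and shift frozen base-model
    features with lightweight task-specific parameters.

    features: array of shape (channels, height, width)
    gamma, beta: arrays of shape (channels,)
    """
    return gamma[:, None, None] * features + beta[:, None, None]

# Frozen "base" feature maps, shared across all tasks (hypothetical values).
base = np.ones((4, 2, 2))

# Each task keeps only its own small style parameters; the base is never
# overwritten, so earlier tasks are not forgotten.
task_styles = {
    "task0": (np.ones(4), np.zeros(4)),        # identity: reproduces base
    "task1": (np.full(4, 2.0), np.full(4, 0.5)),
}

out0 = style_modulate(base, *task_styles["task0"])
out1 = style_modulate(base, *task_styles["task1"])
```

Because the base model is frozen and each task adds only its own modulation parameters, adding a new task cannot interfere with previously learned ones, which is the mechanism that avoids catastrophic forgetting.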
References: 114
Citations: 5