Continual Learning with Laplace Operator Based Node-Importance Dynamic Architecture Neural Network

2021 
In this paper, we propose a continual learning method based on node-importance evaluation and a dynamic architecture model. Our method identifies important nodes according to the value of the Laplace operator at each node. Due to the anisotropy of the important nodes, a sparse sub-network for each specific task can be constructed by freezing the weights of the important nodes and separating them from the unimportant nodes, which reduces catastrophic forgetting. We then add new nodes to the network to prevent the existing nodes from being exhausted after continually learning many new tasks, and to lessen negative transfer effects. We evaluated our method on the CIFAR-10, CIFAR-100, MNIST, Fashion-MNIST and CUB200 datasets, and it achieves superior results compared to other traditional methods.
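The node-importance step can be illustrated with a minimal sketch. The helper names below (`laplacian_importance`, `split_nodes`) and the use of a discrete 1-D Laplacian over each node's activation series are illustrative assumptions, not the paper's actual implementation:

```python
def laplacian_importance(activations):
    """Score each node's importance as the mean absolute discrete
    1-D Laplacian of its activation series across inputs.
    (Hypothetical proxy for the paper's Laplace-operator criterion.)"""
    scores = []
    for node_acts in activations:  # one activation series per node
        # Second-difference stencil: f(i-1) - 2*f(i) + f(i+1)
        lap = [node_acts[i - 1] - 2 * node_acts[i] + node_acts[i + 1]
               for i in range(1, len(node_acts) - 1)]
        scores.append(sum(abs(v) for v in lap) / max(len(lap), 1))
    return scores

def split_nodes(scores, threshold):
    """Partition node indices into important (to be frozen for the
    current task) and unimportant (left trainable) sets."""
    important = [i for i, s in enumerate(scores) if s >= threshold]
    unimportant = [i for i, s in enumerate(scores) if s < threshold]
    return important, unimportant
```

In this sketch, a node whose activations fluctuate sharply across inputs gets a high Laplacian score and is frozen, while flat-responding nodes remain trainable for future tasks.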