An Incremental Scheme with Weight Pruning to Train Deep Neural Network

2018 
Deep neural networks have achieved state-of-the-art results on many machine learning tasks. In the traditional setting, models are trained on a dataset prepared in advance. In real-world scenarios, however, training data are often collected incrementally, with new samples and new classes added to the training set over time. Since standard training with stochastic gradient descent suffers from catastrophic forgetting when a model is retrained on new data, in this paper we propose a new scheme for training deep neural networks incrementally. We first train the deep model on the original dataset with weight pruning; then, on the newly added training data, we train the previously pruned weights while keeping the previously trained core weights unchanged. Experiments on MNIST demonstrate that our method is efficient and can even outperform training from scratch on the whole dataset in the traditional manner.
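The abstract outlines a two-stage procedure: magnitude-based pruning fixes a "core" subset of weights after the first stage, and only the complementary (formerly pruned) weights are updated on new data. The following is a minimal sketch of that idea, assuming a PyTorch implementation; the architecture, the `keep_ratio` pruning level, the gradient-masking trick, and the `fake_batches` stand-in for real MNIST loaders are all illustrative assumptions, not the authors' published code.

```python
# Sketch of pruning-based incremental training (illustrative, not the paper's code).
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(784, 300), nn.ReLU(), nn.Linear(300, 10))

def magnitude_masks(model, keep_ratio=0.5):
    """Per-layer 0/1 masks marking the largest-magnitude ('core') weights."""
    masks = {}
    for name, p in model.named_parameters():
        if p.dim() > 1:  # prune weight matrices only, leave biases alone
            k = int(p.numel() * keep_ratio)  # number of weights to keep
            thresh = p.abs().flatten().kthvalue(p.numel() - k).values
            masks[name] = (p.abs() > thresh).float()
    return masks

def train(model, batches, epochs=3, grad_mask=None):
    """Plain SGD; grad_mask zeroes the gradients of frozen weights."""
    opt = torch.optim.SGD(model.parameters(), lr=0.05)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for x, y in batches:
            opt.zero_grad()
            loss_fn(model(x), y).backward()
            if grad_mask is not None:
                for name, p in model.named_parameters():
                    if name in grad_mask:
                        p.grad *= grad_mask[name]
            opt.step()

def fake_batches(n=20, batch=32):  # hypothetical stand-in for MNIST loaders
    return [(torch.randn(batch, 784), torch.randint(0, 10, (batch,)))
            for _ in range(n)]

# Stage 1: train on the original data, prune, and fine-tune the core weights.
old_data, new_data = fake_batches(), fake_batches()
train(model, old_data)
core = magnitude_masks(model, keep_ratio=0.5)
with torch.no_grad():  # zero out the pruned weights
    for name, p in model.named_parameters():
        if name in core:
            p *= core[name]
train(model, old_data, grad_mask=core)  # update core weights only

# Stage 2: on new data, train only the formerly pruned slots; masking the
# core weights' gradients keeps them exactly unchanged under plain SGD.
free = {name: 1.0 - m for name, m in core.items()}
train(model, new_data, grad_mask=free)
```

Gradient masking is used here because, with momentum-free SGD and no weight decay, zeroing a weight's gradient freezes it exactly, which matches the abstract's requirement that the core-part weights remain unchanged during the incremental stage.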