Internal node bagging: a layer-wise ensemble training method

2019 
When training neural networks, regularization methods are needed to avoid overfitting. Dropout is a widely used regularization method, but its working principle remains inconclusive and it does not work well for small models. This paper introduces a novel view of dropout as a layer-wise ensemble training method: each feature in a hidden layer is learned by multiple nodes, and the next layer integrates the outputs of these nodes. Based on this understanding of dropout, we propose a new neural network training algorithm named internal node bagging, which explicitly forces a group of nodes to learn the same feature during the training phase and combines these nodes into one node during the testing phase. This means that more parameters can be used during training to improve the fitting ability of a model while keeping the model small during testing. Experiments on three datasets show that the algorithm significantly improves the test performance of small models.
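To make the core idea concrete, the following is a minimal sketch of a training-time layer in which each output feature is backed by a small group of internal nodes whose pre-activations are averaged (with dropout-like noise on individual nodes) before the nonlinearity, and a test-time step that collapses each group into a single node. The function names (`forward_train`, `merge_for_test`) and parameters (`group_size`, `drop_p`) are illustrative assumptions, not taken from the paper; averaging pre-activations is one simplifying choice that makes the test-time merge exact, and the paper's actual formulation may differ.

```python
import numpy as np

def init_layer(n_in, n_hidden, group_size, rng):
    """Training-time layer: each of the n_hidden output features is backed by
    `group_size` internal nodes, each with its own weight vector and bias."""
    W = rng.normal(0.0, 0.1, size=(n_hidden, group_size, n_in))
    b = np.zeros((n_hidden, group_size))
    return W, b

def forward_train(x, W, b, drop_p, rng):
    """Average the pre-activations of each node group, randomly dropping
    internal nodes, then apply the nonlinearity once per group."""
    pre = np.einsum('hgi,i->hg', W, x) + b      # pre-activation of every internal node
    mask = rng.random(pre.shape) > drop_p       # dropout-like noise on internal nodes
    kept = np.maximum(mask.sum(axis=1), 1)      # avoid dividing by zero if a group is fully dropped
    pooled = (pre * mask).sum(axis=1) / kept    # average over surviving nodes in each group
    return np.maximum(pooled, 0.0)              # ReLU on the pooled value

def merge_for_test(W, b):
    """Collapse each group into one node by averaging its weights and biases;
    exact here because the group is combined before the nonlinearity."""
    return W.mean(axis=1), b.mean(axis=1)

def forward_test(x, W_merged, b_merged):
    """Test-time forward pass with the merged (small) layer."""
    return np.maximum(W_merged @ x + b_merged, 0.0)

# Example usage: train-time layer with 4 internal nodes per feature,
# merged into a plain dense layer of the same output width for testing.
rng = np.random.default_rng(0)
W, b = init_layer(n_in=8, n_hidden=16, group_size=4, rng=rng)
x = rng.normal(size=8)
h_train = forward_train(x, W, b, drop_p=0.5, rng=rng)
Wm, bm = merge_for_test(W, b)
h_test = forward_test(x, Wm, bm)
```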