Regularization of Neural Network Using DropCoadapt

2020 
Complex co-adaptation between hidden neurons leads to overfitting, which prevents effective tuning of all parameters, especially in deep neural models. Dropout has played an essential role in tackling this problem; however, standard dropout suppresses neurons blindly. In this paper, we present DropCoadapt, a technique that assigns dropout rates according to co-adaptation pressure. It first identifies co-adaptation clusters among the hidden units. We define co-adaptation pressure as the density of co-adaptation within each cluster. The dropout rate of each neuron is then updated according to its co-adaptation pressure. Experimental results on the MNIST and CIFAR-10 datasets confirm that DropCoadapt obtains better performance and outperforms other methods.
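The abstract does not give formulas, but the idea of pressure-dependent dropout rates can be sketched as follows. This is a hypothetical reconstruction: `coadaptation_pressure` proxies the paper's "co-adaptation pressure" by the density of strongly correlated unit pairs around each hidden unit, and the pressure-to-rate mapping (`p_min`, `p_max`) is an assumed linear interpolation, not taken from the paper.

```python
import numpy as np

def coadaptation_pressure(activations, threshold=0.5):
    """Estimate a per-unit co-adaptation pressure from hidden activations.

    Hypothetical proxy: the fraction of other units whose activation is
    strongly correlated with this unit (|corr| > threshold).
    activations: array of shape (n_samples, n_units).
    """
    corr = np.corrcoef(activations, rowvar=False)  # (n_units, n_units)
    np.fill_diagonal(corr, 0.0)                    # ignore self-correlation
    strong = np.abs(corr) > threshold              # strongly co-adapted pairs
    return strong.mean(axis=1)                     # density per unit

def dropcoadapt_mask(activations, p_min=0.1, p_max=0.7, rng=None):
    """Per-unit inverted-dropout mask: higher pressure -> higher drop rate."""
    rng = rng or np.random.default_rng(0)
    pressure = coadaptation_pressure(activations)
    p_drop = p_min + (p_max - p_min) * pressure    # map pressure to a rate
    keep = rng.random(activations.shape[1]) >= p_drop
    return keep / np.maximum(1.0 - p_drop, 1e-8)   # inverted-dropout scaling

# Usage: units 0 and 1 are near-duplicates (strongly co-adapted),
# so they receive higher pressure and thus higher drop rates.
rng = np.random.default_rng(42)
h = rng.standard_normal((256, 4))
h[:, 1] = h[:, 0] + 0.05 * rng.standard_normal(256)
mask = dropcoadapt_mask(h)
```

Under this sketch, a redundant pair of units is dropped more aggressively than independent units, which matches the paper's stated goal of targeting dropout at co-adapted clusters rather than suppressing neurons uniformly.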