Structural adaptation for sparsely connected MLP using Newton's method

2017 
In this work, we propose a paradigm for constructing a sparsely connected multi-layer perceptron (MLP). Using the Orthogonal Least Squares (OLS) method during training, the proposed approach prunes hidden units and output weights according to their usefulness, yielding a sparsely connected MLP. We formulate a second-order algorithm that yields a closed-form expression for the hidden-unit learning factors, thereby minimizing the number of hand-tuned parameters. The usefulness of the proposed algorithm is further substantiated by its ability to distinguish two combined datasets. On widely available datasets, the proposed algorithm's 10-fold testing error is shown to be lower than that of several other algorithms. Inducing sparsity in a fully connected neural network, pruning of hidden units, Newton's method for optimization, and orthogonal least squares are the subject matter of the present work.
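The abstract names two mechanisms without detailing them: ranking hidden units by an OLS usefulness criterion before pruning, and a Newton-style closed-form learning factor. Below is a minimal sketch of both, not the authors' implementation; it uses the classic OLS error-reduction ratio for subset selection and a second-order expression for a single scalar learning factor. All function and variable names are illustrative, and the paper's exact criterion and per-unit learning factors may differ.

```python
import numpy as np

def ols_rank_hidden_units(H, t):
    """Rank hidden units by an OLS error-reduction criterion.

    H : (N, Nh) matrix of hidden-unit activations over N training patterns.
    t : (N,) desired output vector.
    Returns unit indices ordered from most to least useful; pruning would
    drop units from the tail of this ordering.
    """
    N, Nh = H.shape
    remaining = list(range(Nh))
    Q = []        # orthogonal basis built from the units chosen so far
    order = []
    for _ in range(Nh):
        best_err, best_j, best_q = -np.inf, None, None
        for j in remaining:
            q = H[:, j].astype(float).copy()
            for qk in Q:                       # Gram-Schmidt against chosen units
                q -= (qk @ H[:, j]) / (qk @ qk) * qk
            denom = q @ q
            # Error reduction contributed by this unit's orthogonal component
            err = 0.0 if denom < 1e-12 else (q @ t) ** 2 / denom
            if err > best_err:
                best_err, best_j, best_q = err, j, q
        order.append(best_j)
        remaining.remove(best_j)
        Q.append(best_q)
    return order

def newton_learning_factor(g, Hg):
    """Closed-form scalar learning factor from a second-order expansion.

    For the update w <- w - z * g, the locally quadratic error
    E(z) ~ E(0) - z g^T g + (z^2 / 2) g^T H g is minimized at
    z = (g^T g) / (g^T H g), where Hg is the Hessian-vector product H @ g.
    """
    curvature = g @ Hg
    return (g @ g) / curvature if curvature > 1e-12 else 0.0
```

After ranking, the output weights for the retained units can be re-solved by ordinary least squares, which is what makes the OLS ordering a natural pruning criterion here.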