Extending MLP ANN hyper-parameters Optimization by using Genetic Algorithm

2018 
Optimizing the hyper-parameters of a multi-layer perceptron (MLP) artificial neural network (ANN) is not a trivial task, and even today the trial-and-error approach is widely used. Many works have already applied the genetic algorithm (GA) to this optimization search, including MLP topology, weights, and bias optimization. This work proposes adding hyper-parameters for weight initialization and regularization, to be optimized simultaneously with the usual MLP topology and learning hyper-parameters. It also analyses which hyper-parameters are most correlated with classification performance, allowing a reduction in the search space that decreases the time and computation needed to reach a good set of hyper-parameters. Results achieved with public datasets reveal an increase in performance when compared with similar works. Moreover, the hyper-parameters related to weight initialization and regularization rank among the top 5 most relevant hyper-parameters for explaining accuracy on all datasets, showing the importance of including them in the optimization process.
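A minimal sketch of the kind of search the abstract describes: a GA evolving a genome that encodes topology, learning, weight-initialization, and regularization hyper-parameters together. All names, ranges, and the stand-in fitness function below are illustrative assumptions, not taken from the paper; a real run would train an MLP with each genome and use its validation accuracy as fitness.

```python
import random

random.seed(0)

# Hypothetical search space: each gene is one MLP hyper-parameter.
# The grouping mirrors the abstract: topology, learning, weight
# initialization, and regularization are optimized simultaneously.
SPACE = {
    "hidden_units": (4, 128),        # topology
    "learning_rate": (1e-4, 1e-1),   # learning
    "init_scale": (1e-3, 1.0),       # weight initialization
    "l2_lambda": (1e-6, 1e-1),       # regularization
}

def random_genome():
    return {k: random.uniform(lo, hi) for k, (lo, hi) in SPACE.items()}

def fitness(genome):
    # Stand-in for validation accuracy: a smooth surrogate with a known
    # optimum. Replace with "train MLP, return accuracy" in practice.
    return -((genome["learning_rate"] - 0.01) ** 2
             + (genome["l2_lambda"] - 0.001) ** 2
             + (genome["init_scale"] - 0.1) ** 2)

def crossover(a, b):
    # Uniform crossover: each gene comes from either parent.
    return {k: random.choice((a[k], b[k])) for k in SPACE}

def mutate(genome, rate=0.2):
    # Resample each gene within its bounds with a small probability.
    out = dict(genome)
    for k, (lo, hi) in SPACE.items():
        if random.random() < rate:
            out[k] = random.uniform(lo, hi)
    return out

def ga(generations=30, pop_size=20, elite=2):
    pop = [random_genome() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        nxt = pop[:elite]  # elitism: carry the best genomes forward
        while len(nxt) < pop_size:
            p1, p2 = random.sample(pop[:pop_size // 2], 2)
            nxt.append(mutate(crossover(p1, p2)))
        pop = nxt
    return max(pop, key=fitness)

best = ga()
```

Reducing the search space, as the paper suggests, would correspond to dropping low-relevance keys from `SPACE`, shrinking the genome the GA must explore.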