Soft-Tempering Deep Belief Networks Parameters Through Genetic Programming

2019 
Deep neural networks have been widely adopted in recent years, primarily on account of their outstanding performance in tasks such as object, image, face, and speech recognition. However, such complex models usually require large-scale datasets for training; otherwise, they can overfit and fail to produce consistent results on unseen data. Another problem with deep models concerns their hyperparameter setting, which is application-dependent and may demand an experienced user and considerable effort to calibrate. In this paper, we apply an evolutionary optimization technique, Genetic Programming, to hyperparameter selection for Deep Belief Networks: the terminal nodes encode the hyperparameters of the model, and the function nodes combine them through mathematical operators. Experimental results on distinct datasets show that Genetic Programming can outperform state-of-the-art results obtained by other meta-heuristic techniques, making it a compelling alternative to them.
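To make the idea concrete, the following Python sketch evolves expression trees whose terminal nodes hold candidate hyperparameter values and whose internal nodes apply arithmetic operators, as the abstract describes. This is a minimal illustration under stated assumptions, not the authors' implementation: the fitness function dbn_fitness, the clipping range, and the regeneration-based reproduction scheme are all placeholders. In the paper, fitness would instead come from training a Deep Belief Network with the decoded hyperparameters and measuring its reconstruction error.

    # Minimal GP-style hyperparameter search sketch (illustrative only).
    import random
    import operator

    def safe_div(a, b):
        # Protected division, a common GP convention to avoid ZeroDivisionError.
        return a / b if abs(b) > 1e-9 else 1.0

    FUNCTIONS = [operator.add, operator.sub, operator.mul, safe_div]

    def random_terminal():
        # Terminal nodes encode candidate hyperparameter values (assumed range).
        return random.uniform(0.0, 1.0)

    def random_tree(depth):
        # Grow a random expression tree: tuples are function nodes,
        # floats are terminal nodes.
        if depth == 0 or random.random() < 0.3:
            return random_terminal()
        fn = random.choice(FUNCTIONS)
        return (fn, random_tree(depth - 1), random_tree(depth - 1))

    def evaluate(tree):
        # Recursively evaluate a tree to a single numeric value.
        if isinstance(tree, tuple):
            fn, left, right = tree
            return fn(evaluate(left), evaluate(right))
        return tree

    def clip(x, low=1e-5, high=1.0):
        # Map the tree's raw output into a valid hyperparameter range.
        return min(max(x, low), high)

    def dbn_fitness(learning_rate):
        # Hypothetical stand-in: the real fitness would train a DBN with the
        # decoded hyperparameters and return its reconstruction error.
        return abs(learning_rate - 0.1)

    def evolve(pop_size=30, generations=20, max_depth=4):
        population = [random_tree(max_depth) for _ in range(pop_size)]
        for _ in range(generations):
            ranked = sorted(population,
                            key=lambda t: dbn_fitness(clip(evaluate(t))))
            survivors = ranked[: pop_size // 2]
            # For brevity, refill the population with fresh random trees;
            # a full GP would use crossover and subtree mutation instead.
            children = [random_tree(max_depth) for _ in survivors]
            population = survivors + children
        best = min(population, key=lambda t: dbn_fitness(clip(evaluate(t))))
        return clip(evaluate(best))

    if __name__ == "__main__":
        print("best learning rate found:", evolve())

The tree representation keeps the search space expressive while remaining cheap to evaluate; only the fitness call is expensive, since each evaluation would train one DBN configuration.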