Higher order ANN parameter optimization using hybrid opposition-elitism based metaheuristic

2021 
Over the past two decades, nature-inspired optimization algorithms have dominated the solution of many complex problems, and most are hybridized with other intelligent techniques to demonstrate their effectiveness. Among these algorithms, the last decade has seen substantial research on Teaching–Learning Based Optimization (TLBO) and its variants, along with advances across many engineering domains. In this research, a variant of the TLBO technique is integrated with a Functional Link Artificial Neural Network (FLANN) to classify nonlinear data. With suitable parameter adjustments, the proposed model classifies the data efficiently. For learning, the Gradient Descent method is adopted to obtain the optimal weight units of the neural network. Simulation results reveal that the proposed hybrid approach is superior to other competitive methods on the considered performance measures.
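The abstract does not detail the paper's opposition-elitism variant, but the two building blocks it names are standard. The sketch below is a minimal, assumed illustration: a common trigonometric functional expansion for a FLANN input layer, and one canonical TLBO iteration (teacher phase plus learner phase). All function names, the expansion order, and the fitness interface are illustrative choices, not the authors' implementation.

```python
import numpy as np

def flann_expand(x, order=3):
    # Trigonometric functional expansion, a common FLANN scheme (assumed here):
    # each feature x_i maps to [x_i, sin(k*pi*x_i), cos(k*pi*x_i)] for k=1..order.
    feats = [x]
    for k in range(1, order + 1):
        feats.append(np.sin(k * np.pi * x))
        feats.append(np.cos(k * np.pi * x))
    return np.concatenate(feats, axis=-1)

def tlbo_step(pop, fitness):
    """One canonical TLBO iteration (minimization); no opposition/elitism here."""
    n, d = pop.shape
    scores = np.array([fitness(p) for p in pop])
    teacher = pop[scores.argmin()]           # best learner acts as the teacher
    mean = pop.mean(axis=0)
    # Teacher phase: shift the class toward the teacher, away from the mean.
    tf = np.random.randint(1, 3)             # teaching factor in {1, 2}
    cand = pop + np.random.rand(n, d) * (teacher - tf * mean)
    cand_scores = np.array([fitness(p) for p in cand])
    improved = cand_scores < scores
    pop = np.where(improved[:, None], cand, pop)
    scores = np.where(improved, cand_scores, scores)
    # Learner phase: each learner moves relative to a random peer.
    for i in range(n):
        j = np.random.randint(n)
        while j == i:
            j = np.random.randint(n)
        step = pop[i] - pop[j] if scores[i] < scores[j] else pop[j] - pop[i]
        cand_i = pop[i] + np.random.rand(d) * step
        cand_fit = fitness(cand_i)
        if cand_fit < scores[i]:             # greedy acceptance
            pop[i], scores[i] = cand_i, cand_fit
    return pop, scores
```

In the paper's hybrid setting, the fitness passed to `tlbo_step` would be a FLANN classification error evaluated on the expanded features, with gradient descent refining the weights; here any callable objective (e.g. a sphere function) works for demonstration.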