A Neighborhood Regression Optimization Algorithm for Computationally Expensive Optimization Problems

2020 
Expensive optimization problems arise in diverse fields, and the high cost of each function evaluation poses a serious challenge to global optimization algorithms. In this article, a simple yet effective optimization algorithm for computationally expensive optimization problems is proposed, called the neighborhood regression optimization algorithm. For a minimization problem, the proposed algorithm applies a regression technique over a neighborhood structure to predict a descent direction. The descent direction is then used to generate new potential offspring around the best solution found so far. The proposed algorithm is compared with 12 popular algorithms on two benchmark suites with up to 30 decision variables. Empirical results demonstrate that the proposed algorithm shows clear advantages on unimodal and smooth problems, and is better than or competitive with the peer algorithms in overall performance. In addition, the proposed algorithm is efficient and strikes a good tradeoff between solution quality and running time.
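The core idea described above can be illustrated with a minimal sketch: fit a linear regression on a neighborhood of already-evaluated points to estimate a descent direction, then sample offspring around the best solution along that direction. This is an assumption-laden toy version, not the authors' exact method; the objective `sphere`, the step sizes, and the helper `neighborhood_descent_step` are all hypothetical choices made for illustration.

```python
import numpy as np

def sphere(x):
    # Hypothetical expensive objective (smooth, unimodal) used as a stand-in.
    return float(np.sum(x ** 2))

def neighborhood_descent_step(points, values, best, sigma=0.1, n_offspring=5, rng=None):
    """One sketched iteration: fit f(x) ~ c + g.x by least squares on the
    neighborhood of evaluated points, take -g as the predicted descent
    direction, and sample offspring around the best solution along it."""
    rng = np.random.default_rng() if rng is None else rng
    # Linear least-squares regression over the neighborhood: design matrix [1, x].
    A = np.hstack([np.ones((len(points), 1)), points])
    coef, *_ = np.linalg.lstsq(A, values, rcond=None)
    g = coef[1:]                              # estimated gradient of the fit
    d = -g / (np.linalg.norm(g) + 1e-12)      # unit descent direction
    # Offspring: best + random step along d, plus a small Gaussian perturbation.
    steps = rng.uniform(0.0, 1.0, size=(n_offspring, 1))
    noise = 0.01 * sigma * rng.standard_normal((n_offspring, best.size))
    return best + steps * sigma * d + noise

rng = np.random.default_rng(0)
dim = 5
points = rng.standard_normal((12, dim))          # previously evaluated solutions
values = np.array([sphere(p) for p in points])
best = points[np.argmin(values)]
children = neighborhood_descent_step(points, values, best, rng=rng)
```

On a smooth unimodal landscape the regression gradient tends to point roughly toward the optimum, which is consistent with the abstract's observation that the method does best on unimodal and smooth problems.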