Feature Based Algorithm Configuration: A Case Study with Differential Evolution

2016 
Algorithm configuration remains an intricate problem, especially in the continuous black-box optimization domain. This paper empirically investigates the relationship between continuous problem features (measuring different problem characteristics) and the best parameter configuration of a given stochastic algorithm over a set of test functions; here, the original version of Differential Evolution over the BBOB benchmark. This is achieved by learning an empirical performance model from the problem features and the algorithm parameters. This performance model can then be used to compute an empirically optimal parameter configuration from feature values. The results show that reasonable performance models can indeed be learned, resulting in a better parameter configuration than a static parameter setting optimized for robustness over the whole benchmark.
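To make the approach concrete, below is a minimal sketch of the feature-based configuration idea described in the abstract: a regression model is trained on (problem features, algorithm parameters) pairs labelled with observed performance, and is then queried over a grid of candidate parameter settings for a new feature vector. The use of a random forest, the two hypothetical landscape features, the synthetic performance data, and the (F, CR) parameter grid are all illustrative assumptions, not the paper's exact experimental setup.

```python
# Sketch of a feature-based empirical performance model (assumptions noted above).
import numpy as np
from itertools import product
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Training data: rows of (problem features, DE parameters) -> observed performance.
# Here: 2 hypothetical problem features and 2 DE parameters (scale factor F, crossover rate CR).
n_samples = 500
problem_features = rng.uniform(0.0, 1.0, size=(n_samples, 2))   # stand-in landscape features
de_params = rng.uniform(0.0, 1.0, size=(n_samples, 2))          # (F, CR)
# Synthetic performance values (lower is better), standing in for measured target precision.
performance = ((de_params[:, 0] - 0.5 * problem_features[:, 0]) ** 2
               + (de_params[:, 1] - problem_features[:, 1]) ** 2
               + rng.normal(0.0, 0.01, n_samples))

# Empirical performance model: maps (features, parameters) to predicted performance.
X = np.hstack([problem_features, de_params])
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, performance)

def best_config(features, f_grid, cr_grid):
    """Return the (F, CR) pair the learned model predicts to perform best
    on a problem described by `features`."""
    candidates = np.array(list(product(f_grid, cr_grid)))
    queries = np.hstack([np.tile(features, (len(candidates), 1)), candidates])
    preds = model.predict(queries)
    return candidates[np.argmin(preds)]

# Query the model for a new problem's feature vector.
grid = np.linspace(0.0, 1.0, 21)
print(best_config(np.array([0.8, 0.3]), grid, grid))
```

In the paper's setting, the training data would come from running Differential Evolution with many parameter settings on the BBOB functions and computing their feature values, rather than from synthetic data as in this sketch.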