Feature selection and kernel specification for support vector machines using multi-objective genetic algorithms

2015 
Support Vector Machines (SVMs) have proven popular for classification problems. Several tuning parameters must be specified before an SVM can be fitted. Genetic algorithms (GAs) have been used as the optimization algorithm for selecting these parameters, but most applications have excluded the selection of a kernel function. GAs have a further extension, the multi-objective GA, in which multiple criteria are specified and the fitness of candidate solutions is determined by their level of dominance. The application of a multi-objective GA to SVMs is demonstrated, with prediction error, the number of variables and the number of support vectors as the optimization criteria. The kernel function, kernel parameters and cost parameter (C) form part of the member definition of the GA. Benchmark and simulated data sets are used to show how this approach provides a range of solutions that are trade-offs among the various optimization criteria. For the standard GA, where prediction error is used as the fitness criterion, fitness has to be determined from a validation set or through cross-validation to guard against overfitting. In the multi-objective approach, the number of variables and the number of support vectors are part of the optimization, and the possibility of using the full training dataset without cross-validation is discussed.
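
To make the setup concrete, the following is a minimal sketch (not the authors' implementation) of a multi-objective GA over SVM configurations. It assumes scikit-learn's SVC, a member encoded as a feature mask plus a kernel choice and the parameters C and gamma, a held-out validation split for prediction error, and a simple steady-state dominance-replacement scheme rather than a full NSGA-II.

```python
# Illustrative sketch: multi-objective GA over (features, kernel, C, gamma) for an SVM,
# scoring each member by (prediction error, number of variables, number of support vectors).
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
KERNELS = ["linear", "rbf", "poly"]

# Benchmark data stands in for the paper's data sets (assumption for this sketch).
X, y = load_breast_cancer(return_X_y=True)
X = StandardScaler().fit_transform(X)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3, random_state=0)
n_features = X.shape[1]

def random_member():
    # Member = [feature mask, kernel index, log10(C), log10(gamma)]
    mask = rng.random(n_features) < 0.5
    if not mask.any():
        mask[rng.integers(n_features)] = True
    return [mask, rng.integers(len(KERNELS)), rng.uniform(-2, 3), rng.uniform(-4, 1)]

def evaluate(member):
    mask, k, logC, logg = member
    clf = SVC(kernel=KERNELS[k], C=10**logC, gamma=10**logg)
    clf.fit(X_tr[:, mask], y_tr)
    err = 1.0 - clf.score(X_val[:, mask], y_val)          # prediction error
    return (err, int(mask.sum()), int(clf.n_support_.sum()))

def dominates(a, b):
    # a dominates b if it is no worse on all criteria and strictly better on one.
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def mutate(member):
    mask, k, logC, logg = member
    mask = mask.copy()
    flip = rng.random(n_features) < 0.1                   # flip a few feature bits
    mask[flip] = ~mask[flip]
    if not mask.any():
        mask[rng.integers(n_features)] = True
    if rng.random() < 0.2:                                # occasionally swap the kernel
        k = rng.integers(len(KERNELS))
    return [mask, k, logC + rng.normal(0, 0.3), logg + rng.normal(0, 0.3)]

pop = [random_member() for _ in range(40)]
fits = [evaluate(m) for m in pop]

for step in range(200):
    child = mutate(pop[rng.integers(len(pop))])
    cfit = evaluate(child)
    # Steady-state replacement: the child displaces a member it dominates, if any.
    for i, f in enumerate(fits):
        if dominates(cfit, f):
            pop[i], fits[i] = child, cfit
            break

# Report the non-dominated (Pareto) front of (error, #variables, #support vectors).
pareto = [f for f in fits if not any(dominates(g, f) for g in fits if g is not f)]
print(sorted(set(pareto)))
```

The printed front illustrates the trade-offs discussed in the abstract: members with fewer variables or fewer support vectors survive alongside lower-error members, rather than a single best configuration being returned.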