Automated Optimization of Non-linear Support Vector Machines for Binary Classification

2018 
The support vector machine (SVM) is a popular classifier that has been used to solve a broad range of problems. Unfortunately, its applicability is limited by the computational complexity of training, which is \(O(t^3)\), where \(t\) is the number of vectors in the training set. This limitation makes it difficult to find a proper model, especially for non-linear SVMs, where hyperparameter optimization is needed. Nowadays, as datasets grow both in size and in the number of features, this issue becomes a relevant limitation. Furthermore, with a growing number of features, many of them may be redundant or noisy, which degrades the performance of the classifier. In this paper, we address both of these issues by combining a recursive feature elimination algorithm with our evolutionary method for model and training set selection. Together, these steps reduce both the training time and the classification time of the trained classifier. We also show that the model obtained using this procedure performs comparably to models determined with other algorithms, including grid search. The results are reported over a set of well-known benchmark datasets.
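For orientation, the kind of pipeline the abstract alludes to (recursive feature elimination followed by hyperparameter tuning of a non-linear SVM) can be sketched with off-the-shelf components. The sketch below is only a grid-search baseline of that pipeline; it does not reproduce the paper's evolutionary model and training set selection, and the dataset and parameter grid are illustrative assumptions.

```python
# Baseline sketch (not the paper's evolutionary method): SVM-RFE feature
# selection followed by a grid search over RBF-SVM hyperparameters.
from sklearn.datasets import load_breast_cancer
from sklearn.feature_selection import RFE
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Illustrative benchmark dataset; any binary classification set would do.
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

pipe = Pipeline([
    ("scale", StandardScaler()),
    # A linear SVM supplies the feature weights that RFE ranks and prunes.
    ("rfe", RFE(SVC(kernel="linear", C=1.0), n_features_to_select=10)),
    # The non-linear (RBF) SVM is then trained on the reduced feature set.
    ("svm", SVC(kernel="rbf")),
])

# Assumed hyperparameter grid for the non-linear SVM.
param_grid = {
    "svm__C": [0.1, 1, 10, 100],
    "svm__gamma": [1e-3, 1e-2, 1e-1, 1],
}

search = GridSearchCV(pipe, param_grid, cv=5, n_jobs=-1)
search.fit(X_train, y_train)
print("best params:", search.best_params_)
print("test accuracy:", search.score(X_test, y_test))
```

Because grid search retrains the SVM for every parameter combination, its cost scales with the \(O(t^3)\) training complexity noted above, which is the motivation for the evolutionary model and training set selection proposed in the paper.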