Relaxing support vectors for classification

2014 
We introduce a novel modification to standard support vector machine (SVM) formulations: a limited amount of penalty-free slack that reduces the influence of misclassified samples and outliers. We show that free slack relaxes support vectors and pushes them towards their respective classes, hence the name relaxed support vector machines (RSVM). We present theoretical properties of the RSVM formulation and develop its dual formulation for nonlinear classification via kernels, and we show the connection between the dual RSVM and the dual of the standard SVM formulation. We provide error bounds for RSVM, show that it is stable and universally consistent, and show that its error bounds are tighter than those for the standard SVM. We also introduce a linear programming version of RSVM, which we call RSVMLP. We apply RSVM and RSVMLP to synthetic data and benchmark binary classification problems, and compare our results with standard SVM classification results, showing that relaxing influential support vectors may lead to better classification. We develop a two-phase method called RSVM2 for multiple instance classification (MIC) problems, in which RSVM formulations are used as classifiers, and we extend the two-phase method to the linear programming case as RSVMLP2. We demonstrate the classification characteristics of RSVM2 and RSVMLP2 and compare their results with those of other SVM-based MIC methods on public benchmark datasets, showing that both RSVM2 and RSVMLP2 are faster and more accurate.
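
A plausible sketch of the formulation the abstract describes, assuming the penalty-free slack enters the standard soft-margin primal as a budgeted per-sample term; the budget lambda and the split of the margin violation into penalized slack xi_i and free slack eta_i are inferred from the abstract, not taken from the paper:

\begin{aligned}
\min_{w,\,b,\,\xi,\,\eta}\quad & \tfrac{1}{2}\|w\|^{2} + C \sum_{i=1}^{n} \xi_i \\
\text{s.t.}\quad & y_i \left( w^{\top} x_i + b \right) \ge 1 - \xi_i - \eta_i, \quad i = 1, \dots, n, \\
& \sum_{i=1}^{n} \eta_i \le \lambda, \qquad \xi_i \ge 0, \quad \eta_i \ge 0.
\end{aligned}

Because the eta_i carry no cost, the optimizer spends the budget lambda on the most violated constraints, which pulls the corresponding support vectors back toward their classes; setting lambda = 0 recovers the standard soft-margin SVM. The sketch below solves this assumed primal for the linear case with cvxpy; the function name and parameters are illustrative only:

import numpy as np
import cvxpy as cp

def fit_relaxed_svm(X, y, C=1.0, lam=1.0):
    """Linear classifier with penalized slack xi and a budget of free slack eta."""
    n, d = X.shape
    w = cp.Variable(d)
    b = cp.Variable()
    xi = cp.Variable(n, nonneg=True)   # penalized slack, as in standard SVM
    eta = cp.Variable(n, nonneg=True)  # penalty-free slack, limited in total
    margin = cp.multiply(y, X @ w + b)
    constraints = [margin >= 1 - xi - eta,  # relaxed margin constraints
                   cp.sum(eta) <= lam]      # assumed total free-slack budget
    cp.Problem(cp.Minimize(0.5 * cp.sum_squares(w) + C * cp.sum(xi)),
               constraints).solve()
    return w.value, b.value

# Example: two Gaussian classes; a larger budget lam relaxes more outliers.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 1, (20, 2)), rng.normal(2, 1, (20, 2))])
y = np.hstack([-np.ones(20), np.ones(20)])
w, b = fit_relaxed_svm(X, y, C=1.0, lam=2.0)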