Robust multiclass least squares support vector classifier with optimal error distribution

2020 
Abstract Robust least squares support vector regression (RLSSVR), which minimizes the variance and mean of the global modeling errors, has achieved excellent performance in handling outliers. However, directly generalizing RLSSVR to binary classification problems is easily misled by outliers, because the differences in the modeling errors of the different classes are not considered. To address this issue, a robust least squares support vector classifier (RLSSVC) with optimal error distribution is proposed. RLSSVC minimizes the mean and variance of the modeling errors class-wise, thereby accounting for the differences in the modeling errors between classes. Binary classification problems are considered first; a variance analysis indicates that the variance of the modeling errors of RLSSVC is smaller than that of RLSSVR. Building on its effectiveness for binary classification, RLSSVC is naturally generalized to multiclass classification problems by introducing multiple error-adjusting factors. A robustness analysis provides a theoretical guarantee of the robustness of RLSSVC, showing that RLSSVC assigns smaller weights to training instances with larger errors and larger weights to training instances with smaller errors. Furthermore, the optimization objective functions are strictly convex, so their closed-form solutions can be obtained, resulting in higher computational efficiency. Finally, the performance of RLSSVC is further improved by introducing metric learning and the kernel trick. Theoretical and experimental results indicate that the proposed RLSSVC achieves better classification accuracy at lower computational cost.
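The closed-form solvability highlighted in the abstract is a hallmark of the least squares SVM family that RLSSVC builds on. As a rough illustration only, the sketch below solves the standard (unweighted) LS-SVM classifier's linear KKT system; it is not the paper's weighted RLSSVC formulation, and the function names and RBF kernel choice are illustrative assumptions:

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Pairwise squared Euclidean distances, mapped through a Gaussian (RBF) kernel.
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
    return np.exp(-gamma * d2)

def fit_lssvc(X, y, C=10.0, gamma=1.0):
    """Closed-form least squares SVM classifier (standard Suykens-style form).

    Solves the (n+1) x (n+1) linear KKT system
        [ 0    1^T        ] [b]     [0]
        [ 1    K + I/C    ] [alpha] = [y]
    where y is in {-1, +1}. RLSSVC replaces this with a class-wise
    weighted system, which is not reproduced here.
    """
    n = len(y)
    K = rbf_kernel(X, X, gamma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / C
    rhs = np.concatenate(([0.0], y.astype(float)))
    sol = np.linalg.solve(A, rhs)      # one linear solve: the "closed form"
    return sol[1:], sol[0]             # alpha, b

def predict_lssvc(X_train, alpha, b, X_test, gamma=1.0):
    # Decision function sign(K(x, X_train) @ alpha + b).
    return np.sign(rbf_kernel(X_test, X_train, gamma) @ alpha + b)
```

Because training reduces to a single dense linear solve rather than a quadratic program, this family of models is cheap to fit, which is the computational advantage the abstract refers to.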