An improved Twin-KSVC with its applications

2018 
Twin-KSVC (Xu et al. in Cognit Comput 5(4):580–588, 2013) is a multi-class classifier that extends K-SVCR (Angulo et al. in Neurocomputing 55(1–2):57–77, 2003). Compared with K-SVCR, Twin-KSVC trains faster. However, the classical Twin-KSVC has several drawbacks. (a) Each pair of sub-classifiers implements only empirical risk minimization, which degrades generalization performance. (b) Each pair of sub-classifiers must compute large-scale matrix inverses, which is intractable or even impossible in practical applications. (c) For large-scale datasets, the classical Twin-KSVC offers no suitable training algorithm. (d) In the nonlinear case, the classical Twin-KSVC must construct additional primal problems based on an approximately kernel-generated surface. To address these drawbacks, this paper proposes an improved version, called ITKSVC. First, regularization terms are introduced into each pair of sub-classifiers, so that each pair implements structural risk minimization. Second, the dual problems of each pair of sub-classifiers are derived theoretically, which allows ITKSVC to avoid computing large-scale matrix inverses. Third, to speed up the training of each pair of sub-classifiers on large-scale datasets, the successive overrelaxation (SOR) method is applied. Finally, the dual problems of each pair of sub-classifiers admit the kernel trick directly in the nonlinear case. Experimental results on several benchmark datasets indicate that, compared with Twin-KSVC, the proposed ITKSVC achieves better classification performance on large-scale datasets.
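
As a rough illustration of the first improvement, the sketch below shows how a regularization term turns a twin-style sub-problem from empirical into structural risk minimization. It follows the common twin-SVM convention (A collects the samples of one class, B the samples of the opposing class, e_1 and e_2 are all-ones vectors, and c_1, c_3 > 0 are trade-off parameters); the exact ITKSVC objective is not given in the abstract, so this is a representative form in the style of regularized twin-SVM variants such as TBSVM, not the paper's own formulation:

```latex
\min_{w_1,\, b_1,\, \xi}\ \;
\underbrace{\frac{c_3}{2}\left(\lVert w_1\rVert^2 + b_1^2\right)}_{\text{regularization term}}
\;+\; \frac{1}{2}\,\lVert A w_1 + e_1 b_1\rVert^2
\;+\; c_1\, e_2^{\top}\xi
\quad \text{s.t.}\quad
-(B w_1 + e_2 b_1) + \xi \ge e_2,\qquad \xi \ge 0.
```

Without the first term, the objective measures only empirical fit of the hyperplane to the data; adding it bounds the norm of the hyperplane parameters, which is what the abstract means by structural risk minimization.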
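The third improvement, successive overrelaxation, is a classical coordinate-wise solver for the box-constrained quadratic programs that arise as SVM-type duals, and it needs neither a large matrix inverse nor a general-purpose QP solver. A minimal Python sketch follows; the dual matrix Q, the function name sor_box_qp, and all parameter values are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def sor_box_qp(Q, e, C, omega=1.3, tol=1e-6, max_iter=1000):
    """Successive overrelaxation for: min 0.5*a'Qa - e'a  s.t.  0 <= a <= C.

    Q must be symmetric positive semidefinite with a positive diagonal;
    omega in (0, 2) is the relaxation factor.
    """
    n = len(e)
    a = np.zeros(n)
    for _ in range(max_iter):
        a_old = a.copy()
        for i in range(n):
            # Gauss-Seidel coordinate step with overrelaxation, followed by
            # projection onto the box [0, C] to keep the iterate feasible.
            grad_i = Q[i] @ a - e[i]
            a[i] = np.clip(a[i] - omega * grad_i / Q[i, i], 0.0, C)
        if np.linalg.norm(a - a_old) < tol:
            break
    return a

if __name__ == "__main__":
    # Toy usage on a random positive definite Q (purely illustrative).
    rng = np.random.default_rng(0)
    M = rng.standard_normal((40, 15))
    Q = M.T @ M + np.eye(15)
    alpha = sor_box_qp(Q, np.ones(15), C=1.0)
    print(alpha)
```

For omega in (0, 2), this projected Gauss-Seidel iteration converges on such problems, and because it updates one dual variable at a time it scales to large training sets without forming or inverting large matrices, which is the motivation the abstract gives for adopting SOR.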