Multiple Nonlinear Subspace Methods Using Subspace-based Support Vector Machines

2011 
In this paper, we propose multiple nonlinear subspace methods (MNSMs), in which each class consists of several subspaces with different kernel parameters. For each class and each candidate kernel parameter, we generate a subspace by kernel principal component analysis (KPCA) and compute the projection length of an input vector onto that subspace. Then, for each class, we define the discriminant function as the sum of the weighted projection lengths. The weights in the discriminant function are optimized by subspace-based support vector machines (SS-SVMs) so that the margin between classes is maximized while the classification error is minimized. Thus, we can weight the subspaces of each class from the standpoint of class separability. Moreover, the computational cost of model selection for MNSMs is lower than that for SS-SVMs, because SS-SVMs require two hyperparameters, the kernel parameter and the margin parameter, to be chosen before training, whereas MNSMs fold the choice of kernel parameter into the weight optimization. We demonstrate the advantages of the proposed method through computer experiments with benchmark data sets.
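The core computation described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes an RBF kernel, omits kernel centering, and replaces the SS-SVM weight optimization with fixed, hand-chosen weights so that only the KPCA subspace fitting and the weighted projection-length discriminant are shown.

```python
import numpy as np

def rbf_kernel(X, Y, gamma):
    # Gaussian (RBF) kernel matrix between rows of X and rows of Y.
    sq = np.sum(X**2, axis=1)[:, None] + np.sum(Y**2, axis=1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-gamma * sq)

def kpca_subspace(X_class, gamma, r):
    """Fit an r-dimensional KPCA subspace for one class and one kernel
    parameter (kernel centering omitted for brevity)."""
    K = rbf_kernel(X_class, X_class, gamma)
    eigvals, eigvecs = np.linalg.eigh(K)
    idx = np.argsort(eigvals)[::-1][:r]          # top-r eigenpairs
    lam = np.maximum(eigvals[idx], 1e-12)        # guard against round-off negatives
    A = eigvecs[:, idx] / np.sqrt(lam)           # unit-length feature-space components
    return X_class, gamma, A

def projection_length2(x, subspace):
    """Squared length of the projection of phi(x) onto the fitted subspace."""
    X_class, gamma, A = subspace
    k = rbf_kernel(x[None, :], X_class, gamma)   # (1, n) kernel vector
    p = k @ A                                    # coordinates along each component
    return float(np.sum(p**2))

def discriminant(x, subspaces, weights):
    """Class discriminant: weighted sum of projection lengths over the
    subspaces built for the candidate kernel parameters (the weights would
    be learned by an SS-SVM in the proposed method)."""
    return sum(w * projection_length2(x, s) for w, s in zip(weights, subspaces))
```

A test point is then assigned to the class whose discriminant value is largest; with RBF kernels, an input far from a class's training samples has near-zero kernel values and hence a near-zero projection length onto that class's subspaces.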