Empirical Evaluation of Kernel PCA Approximation Methods in Classification Tasks

2017 
Kernel Principal Component Analysis (KPCA) is a popular dimensionality reduction technique with a wide range of applications; however, it scales poorly to large datasets. Several approximation methods have been proposed to overcome this problem: the Nyström method, Randomized Nonlinear Component Analysis (RNCA), and Streaming Kernel Principal Component Analysis (SKPCA) all address the scalability of KPCA. Despite their theoretical guarantees, their performance on real-world learning tasks has not previously been explored. In this work, SKPCA, RNCA, and the Nyström method are evaluated on the task of classification across several real-world datasets. The results indicate that, for a very large dataset, SKPCA-based features gave much better classification accuracy than the other methods.
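To make the scalability trade-off concrete, the Nyström idea mentioned above can be sketched as follows: instead of eigendecomposing the full n×n kernel matrix, sample m landmark points, decompose only the small m×m kernel block, and map all points through it. This is a minimal illustration, not the paper's implementation; the RBF kernel, the number of landmarks, and the function names here are assumptions for the sketch.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Pairwise RBF (Gaussian) kernel between the rows of A and B.
    sq = (A ** 2).sum(1)[:, None] + (B ** 2).sum(1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * sq)

def nystroem_features(X, m=50, gamma=1.0, seed=0):
    # Nyström sketch: sample m landmarks, eigendecompose the small
    # m x m kernel block, and project all n points through it, so the
    # cost is O(n*m^2) instead of the O(n^3) of exact KPCA.
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(X), size=m, replace=False)
    landmarks = X[idx]
    K_mm = rbf_kernel(landmarks, landmarks, gamma)   # m x m
    K_nm = rbf_kernel(X, landmarks, gamma)           # n x m
    vals, vecs = np.linalg.eigh(K_mm)
    vals = np.clip(vals, 1e-12, None)                # guard tiny eigenvalues
    # Approximate feature map: Z = K_nm U diag(vals^{-1/2}),
    # so that Z @ Z.T ~= the full n x n kernel matrix.
    return K_nm @ vecs / np.sqrt(vals)

X = np.random.default_rng(1).normal(size=(200, 5))
Z = nystroem_features(X, m=20)
```

Standard PCA applied to the columns of `Z` then yields an approximate KPCA embedding; larger `m` improves the approximation at higher cost.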