Feature extraction for classification method using principal component based on conformal geometric algebra

2016 
This paper discusses feature extraction methods. Feature extraction methods such as principal component analysis and multiple discriminant analysis are important techniques in machine learning. The purpose of feature extraction is to transform data from a space in which classification is difficult to one in which it is easy. Many conventional machine learning methods, such as artificial neural networks and support vector machines, already include such a transformation; nevertheless, extracting good features before applying a machine learning method leads to better classification results. This paper focuses on principal component regression (PCR). PCR finds the approximating hyper-planes on which the data are distributed. The problem is that when the data do not lie on hyper-planes, for example when they lie on hyper-spheres such as rotated objects, PCR cannot extract good features for the classification problem. This paper proposes a new feature extraction method that computes conformal eigenvectors in conformal geometric algebra (CGA) space to find the approximating hyper-planes or hyper-spheres that fit the data set using a least-squares approach. In particular, the paper shows that the classification accuracy of the proposed method is better than that of the conventional PCR method.
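As a rough illustration of the idea described above, the sketch below (not the authors' implementation; function names and details are assumptions) fits a hyper-sphere or hyper-plane to a point set by embedding the points in conformal space and taking the singular vector of the embedded data matrix with the smallest singular value, which is the least-squares solution in the CGA inner product:

```python
import numpy as np

def conformal_embed(X):
    # Map n-D points x to conformal coordinates [x, 0.5*|x|^2, 1],
    # i.e. the coefficients of (e_1..e_n, e_inf, e_0) in the CGA embedding.
    sq = 0.5 * np.sum(X ** 2, axis=1, keepdims=True)
    ones = np.ones((X.shape[0], 1))
    return np.hstack([X, sq, ones])

def fit_sphere_or_plane(X):
    # Least-squares fit in CGA: a point P lies on the sphere/plane S exactly
    # when the conformal inner product P . S is zero, so the best-fitting S
    # minimizes sum_i (P_i . S)^2 subject to |S| = 1. That minimizer is the
    # right singular vector of (P @ M) with the smallest singular value,
    # where M is the metric of the conformal inner product.
    P = conformal_embed(X)
    n = X.shape[1]
    M = np.eye(n + 2)
    M[n, n] = M[n + 1, n + 1] = 0.0     # e_inf . e_inf = e_0 . e_0 = 0
    M[n, n + 1] = M[n + 1, n] = -1.0    # e_inf . e_0 = -1
    _, _, Vt = np.linalg.svd(P @ M)
    return Vt[-1]   # conformal coefficients of the fitted sphere/plane

# Usage: noisy points near a circle of radius 2 centred at (1, -1).
rng = np.random.default_rng(0)
t = rng.uniform(0, 2 * np.pi, 200)
X = np.c_[1 + 2 * np.cos(t), -1 + 2 * np.sin(t)] + 0.01 * rng.normal(size=(200, 2))
S = fit_sphere_or_plane(X)
S = S / S[-1]                                  # normalise the e_0 coefficient to 1
center = S[:2]                                 # Euclidean part gives the centre
radius = np.sqrt(center @ center - 2 * S[2])   # radius recovered from the e_inf part
```

A hyper-plane appears as the degenerate case in which the e_0 coefficient of the fitted vector is (close to) zero, so the same eigen/singular decomposition covers both shapes, which is the property the proposed feature extraction relies on.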