On the Asymptotic L1-PC of Elliptical Distributions

2022 
The dominant eigenvector of the covariance matrix of a zero-mean data distribution describes the line along which the variance of the projected data is maximized. In practical applications, the true covariance matrix is unknown and its dominant eigenvector is estimated by principal-component analysis (PCA) of a finite collection of coherent data points. As the size of the data collection increases, its principal component (PC) tends to the covariance eigenvector. A known downside of PCA is that it is highly sensitive to outliers in the data collection. L1-PCA is an increasingly popular robust alternative to standard PCA that has demonstrated strong resistance to outliers in a number of applications. However, to date, the asymptotic properties of L1-PCA as an eigenvector estimator are not well understood. In this work we show for the first time that, for centered elliptical distributions, as the number of samples increases, the L1-PC converges to the dominant eigenvector of the covariance matrix, just like the standard PC.
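The asymptotic claim can be probed numerically. The sketch below is not the paper's analysis; it is a minimal illustration assuming the well-known fixed-point sign-iteration heuristic of Kwak (2008) as a stand-in for an exact L1-PCA solver, a zero-mean Gaussian (one member of the elliptical family) as the data distribution, and illustrative names such as `l1_pc`. It draws increasingly many samples and reports the angle between the estimated L1-PC and the dominant eigenvector of the true covariance, which should shrink toward zero if the L1-PC is a consistent eigenvector estimator.

```python
import numpy as np

rng = np.random.default_rng(0)

def l1_pc(X, iters=200):
    """Approximate the L1 principal component of the columns of X (D x N),
    i.e. a maximizer of ||X^T w||_1 over unit-norm w, using the
    sign-iteration heuristic (Kwak, 2008). Illustrative, not exact."""
    w = X[:, 0] / np.linalg.norm(X[:, 0])      # initialize from a data point
    for _ in range(iters):
        w_new = X @ np.sign(X.T @ w)           # fixed-point update: w <- X sgn(X^T w)
        w_new /= np.linalg.norm(w_new)
        if np.allclose(w_new, w):
            break
        w = w_new
    return w

# Centered elliptical distribution (here Gaussian) with a known covariance.
D = 5
A = rng.standard_normal((D, D))
Sigma = A @ A.T                                # true covariance matrix
true_pc = np.linalg.eigh(Sigma)[1][:, -1]      # dominant eigenvector of Sigma

for N in (100, 1_000, 10_000, 100_000):
    X = rng.multivariate_normal(np.zeros(D), Sigma, size=N).T   # D x N samples
    w = l1_pc(X)
    # Angle between the L1-PC estimate and the true eigenvector
    # (sign-invariant, since eigenvectors are defined up to sign).
    angle = np.degrees(np.arccos(min(1.0, abs(true_pc @ w))))
    print(f"N = {N:7d}:  angle to true eigenvector = {angle:.3f} deg")
```

Under these assumptions, the reported angle should decrease as N grows, consistent with the abstract's statement that the L1-PC tends to the covariance eigenvector in the large-sample limit.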