R1-PCA: rotational invariant L1-norm principal component analysis for robust subspace factorization
C. Ding, D. Zhou, X. He, and H. Zha. Proceedings of the 23rd international conference on Machine learning - ICML '06, volume 148 of ACM International Conference Proceeding Series, pages 281-288. ACM, (2006)
DOI: 10.1145/1143844.1143880
Abstract
Principal component analysis (PCA) minimizes the sum of squared errors (L2-norm) and is sensitive to the presence of outliers. We propose a rotational invariant L1-norm PCA (R1-PCA). R1-PCA is similar to PCA in that (1) it has a unique global solution, (2) the solutions are the principal eigenvectors of a robust covariance matrix (re-weighted to soften the effects of outliers), and (3) the solution is rotationally invariant. These properties are not shared by L1-norm PCA. A new subspace iteration algorithm is given to compute R1-PCA efficiently. Experiments on several real-life datasets show that R1-PCA can effectively handle outliers. We extend the R1-norm to K-means clustering and show that L1-norm K-means leads to poor results, while R1-K-means outperforms standard K-means.
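The abstract's key idea (principal eigenvectors of a covariance matrix re-weighted to soften outliers, computed by subspace iteration) can be sketched as below. This is a minimal illustrative sketch, not the paper's exact algorithm: the simple inverse-residual weights and the fixed iteration count are assumptions standing in for the Huber-style weighting and convergence test used in the paper.

```python
import numpy as np

def r1_pca(X, k, n_iter=50, eps=1e-8):
    """Sketch of R1-PCA-style re-weighted covariance subspace iteration.

    X : (n_samples, n_features) data matrix, assumed centered.
    k : target subspace dimension.
    Returns an orthonormal (n_features, k) subspace basis U.
    """
    # Initialize with ordinary PCA: top-k right singular vectors of X.
    U = np.linalg.svd(X, full_matrices=False)[2][:k].T
    for _ in range(n_iter):
        # Euclidean residual of each sample outside the current subspace.
        residual = X - X @ U @ U.T
        r = np.linalg.norm(residual, axis=1)
        # Down-weight samples with large residuals (softens outliers);
        # the 1/r weighting here is a simplification of the paper's scheme.
        w = 1.0 / np.maximum(r, eps)
        # Robust (re-weighted) covariance matrix: X^T diag(w) X.
        C = (X * w[:, None]).T @ X
        # New basis: principal eigenvectors of the robust covariance.
        vals, vecs = np.linalg.eigh(C)
        U = vecs[:, np.argsort(vals)[::-1][:k]]
    return U
```

Because each iteration reduces to an eigen-decomposition of a symmetric matrix, the solution inherits PCA-like properties (eigenvector structure, rotational invariance of the residual norm) while the weights temper the influence of outlying samples.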
%0 Conference Paper
%1 ding2006r1pca
%A Ding, Chris
%A Zhou, Ding
%A He, Xiaofeng
%A Zha, Hongyuan
%B Proceedings of the 23rd international conference on Machine learning - ICML '06
%D 2006
%I ACM
%K 15a18-eigenvalues-singular-values-and-eigenvectors 15a23-factorization-of-matrices
%P 281-288
%R 10.1145/1143844.1143880
%T R1-PCA: rotational invariant L1-norm principal component analysis for robust subspace factorization
%U https://doi.org/10.1145/1143844.1143880
%V 148
%X Principal component analysis (PCA) minimizes the sum of squared errors (L2-norm) and is sensitive to the presence of outliers. We propose a rotational invariant L1-norm PCA (R1-PCA). R1-PCA is similar to PCA in that (1) it has a unique global solution, (2) the solutions are the principal eigenvectors of a robust covariance matrix (re-weighted to soften the effects of outliers), and (3) the solution is rotationally invariant. These properties are not shared by L1-norm PCA. A new subspace iteration algorithm is given to compute R1-PCA efficiently. Experiments on several real-life datasets show that R1-PCA can effectively handle outliers. We extend the R1-norm to K-means clustering and show that L1-norm K-means leads to poor results, while R1-K-means outperforms standard K-means.
@inproceedings{ding2006r1pca,
abstract = {Principal component analysis (PCA) minimizes the sum of squared errors (L2-norm) and is sensitive to the presence of outliers. We propose a rotational invariant L1-norm PCA (R1-PCA). R1-PCA is similar to PCA in that (1) it has a unique global solution, (2) the solutions are the principal eigenvectors of a robust covariance matrix (re-weighted to soften the effects of outliers), and (3) the solution is rotationally invariant. These properties are not shared by L1-norm PCA. A new subspace iteration algorithm is given to compute R1-PCA efficiently. Experiments on several real-life datasets show that R1-PCA can effectively handle outliers. We extend the R1-norm to K-means clustering and show that L1-norm K-means leads to poor results, while R1-K-means outperforms standard K-means.},
added-at = {2021-02-19T06:35:53.000+0100},
author = {Ding, Chris and Zhou, Ding and He, Xiaofeng and Zha, Hongyuan},
biburl = {https://www.bibsonomy.org/bibtex/2a4ba634086cec1feb3dcfc0335b0e611/gdmcbain},
booktitle = {Proceedings of the 23rd international conference on Machine learning - ICML '06},
doi = {10.1145/1143844.1143880},
interhash = {2cac4fdcb178e58d34cb303a8b7db1a7},
intrahash = {a4ba634086cec1feb3dcfc0335b0e611},
keywords = {15a18-eigenvalues-singular-values-and-eigenvectors 15a23-factorization-of-matrices},
pages = {281--288},
publisher = {ACM},
series = {ACM International Conference Proceeding Series},
timestamp = {2021-02-19T06:37:40.000+0100},
title = {R1-PCA: rotational invariant L1-norm principal component analysis for robust subspace factorization},
url = {https://doi.org/10.1145/1143844.1143880},
volume = 148,
year = 2006
}