Avoiding optimal mean ℓ2,1-norm maximization-based robust PCA for reconstruction

Publication Type: Journal Article
Citation: Neural Computation, 2017, 29 (4), pp. 1124–1150
Issue Date: 2017-04-01
© 2017 Massachusetts Institute of Technology. Robust principal component analysis (PCA) is one of the most important dimension-reduction techniques for handling high-dimensional data with outliers. However, most existing robust PCA methods presuppose that the mean of the data is zero and incorrectly use the sample average as the optimal mean of robust PCA. In fact, this assumption holds only for traditional PCA, which is based on the squared ℓ2-norm. In this letter, we equivalently reformulate the objective of conventional PCA and learn the optimal projection directions by maximizing the sum of the projected differences between each pair of instances under the ℓ2,1-norm. The proposed method is robust to outliers and invariant to rotation. More importantly, the reformulated objective not only automatically avoids computing the optimal mean and makes the assumption of centered data unnecessary, but is also theoretically connected to the minimization of reconstruction error. To solve the resulting nonsmooth problem, we develop an efficient optimization algorithm that softens the contributions of outliers by iteratively reweighting each data point. We theoretically analyze the convergence and computational complexity of the proposed algorithm. Extensive experimental results on several benchmark data sets illustrate the effectiveness and superiority of the proposed method.
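
To make the reweighting idea concrete, below is a minimal NumPy sketch of one plausible iteratively reweighted eigen-solver for the pairwise objective max over orthonormal W of the sum over pairs of ||W^T (x_i − x_j)||_2. It is not the authors' exact algorithm: the function name, the per-pair weights s_ij = 1/(2 ||W^T (x_i − x_j)||_2), and the SVD-based initialization are illustrative assumptions, chosen only to show how small weights on pairs with large projected differences soften the influence of outliers while no data centering is ever performed.

import numpy as np

def robust_pca_pairwise_l21(X, k, n_iter=50, eps=1e-8):
    """Illustrative sketch (not the paper's exact algorithm): seek an
    orthonormal W (d x k) making sum_{i<j} ||W^T (x_i - x_j)||_2 large
    via iterative reweighting. X is an (n, d) matrix of instances."""
    n, d = X.shape
    # All pairwise differences d_ij = x_i - x_j for i < j, shape (m, d).
    iu, ju = np.triu_indices(n, k=1)
    D = X[iu] - X[ju]

    # Assumed initialization: top-k right singular vectors of the
    # pairwise-difference matrix (ordinary PCA on the differences).
    _, _, Vt = np.linalg.svd(D, full_matrices=False)
    W = Vt[:k].T

    for _ in range(n_iter):
        # Projected pairwise distances ||W^T d_ij||_2.
        proj_norm = np.linalg.norm(D @ W, axis=1)
        # Reweighting: pairs with large projected norm (often those
        # involving outliers) get small weights, softening their pull.
        s = 1.0 / (2.0 * np.maximum(proj_norm, eps))
        # Weighted pairwise scatter matrix S = sum_ij s_ij d_ij d_ij^T.
        S = (D * s[:, None]).T @ D
        # With weights fixed, max_W tr(W^T S W) s.t. W^T W = I is solved
        # by the top-k eigenvectors of the symmetric matrix S.
        eigvals, eigvecs = np.linalg.eigh(S)
        W = eigvecs[:, -k:][:, ::-1]

    return W

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 10)) @ rng.normal(size=(10, 10))
    X[:5] += 20 * rng.normal(size=(5, 10))    # a few gross outliers
    W = robust_pca_pairwise_l21(X, k=3)
    Z = X @ W                                 # projection, no centering needed
    print(Z.shape, np.allclose(W.T @ W, np.eye(3), atol=1e-6))

Because the objective is built from pairwise differences, any shift of the data cancels out, which is why the sketch never estimates a mean; the convergence and complexity guarantees claimed in the abstract apply to the authors' algorithm, not to this simplified illustration.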