
Robust Principal Component Analysis (RPCA) is a modification of the widely used statistical procedure of principal component analysis (PCA) that works well with respect to grossly corrupted observations. A number of different approaches to Robust PCA exist, including an idealized version which aims to recover a low-rank matrix \( L_0 \) from highly corrupted measurements \( M = L_0 + S_0 \).[1] This decomposition into low-rank and sparse matrices can be achieved by techniques such as the Principal Component Pursuit (PCP) method,[1] Stable PCP,[2] Quantized PCP,[3] Block-based PCP,[4] and Local PCP.[5] Optimization methods are then applied, such as the Augmented Lagrange Multiplier method (ALM),[6] the Alternating Direction Method (ADM),[7] Fast Alternating Minimization (FAM),[8] Iteratively Reweighted Least Squares (IRLS),[9][10][11] or Alternating Projections (AP).[12][13]

Algorithms
Non-convex method

The state-of-the-art guaranteed algorithm for the robust PCA problem (with the input matrix being \( M = L + S \)) is an alternating minimization type algorithm.[12] The computational complexity is \( O\left(m n r^2 \log\frac{1}{\epsilon} \right) \), where the input is the superposition of a low-rank matrix (of rank \( r \)) and a sparse matrix, both of dimension \( m \times n \), and \( \epsilon \) is the desired accuracy of the recovered solution, i.e., \( \| \widehat{L} - L \|_F \leq \epsilon \), where \( L \) is the true low-rank component and \( \widehat{L} \) is the estimated or recovered low-rank component. Intuitively, this algorithm performs projections of the residual onto the set of low-rank matrices (via the SVD operation) and onto the set of sparse matrices (via entry-wise hard thresholding) in an alternating manner: a low-rank projection of the difference of the input matrix and the sparse matrix obtained at a given iteration, followed by a sparse projection of the difference of the input matrix and the low-rank matrix obtained in the previous step, iterating the two steps until convergence.
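
A minimal Python sketch of these two alternating steps is given below. The fixed rank r, threshold zeta, and iteration count are illustrative assumptions; the actual algorithm of [12] adjusts the rank and threshold across stages.

    import numpy as np

    def rpca_alternating(M, r, zeta, n_iter=50):
        """Alternate a rank-r SVD projection with entry-wise hard
        thresholding to split M into low-rank L and sparse S."""
        L = np.zeros_like(M)
        S = np.zeros_like(M)
        for _ in range(n_iter):
            # Low-rank projection of the residual M - S (truncated SVD).
            U, s, Vt = np.linalg.svd(M - S, full_matrices=False)
            L = (U[:, :r] * s[:r]) @ Vt[:r, :]
            # Sparse projection of the residual M - L (hard thresholding).
            R = M - L
            S = np.where(np.abs(R) > zeta, R, 0.0)
        return L, S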

This alternating minimization algorithm was later improved by an accelerated version.[13] The acceleration is achieved by applying a tangent-space projection before projecting the residual onto the set of low-rank matrices. This trick improves the computational complexity to \( O\left(mnr\log\frac{1}{\epsilon}\right) \) with a much smaller constant in front, while maintaining the theoretically guaranteed linear convergence.
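
The tangent-space projection admits a closed form: for a current iterate with left and right singular vectors \( U \) and \( V \), it is \( P_T(Z) = UU^\top Z + ZVV^\top - UU^\top ZVV^\top \). A short sketch (with illustrative names) follows.

    import numpy as np

    def project_tangent(Z, U, V):
        """Project Z onto the tangent space of the rank-r manifold at
        L = U diag(s) V^T:  P_T(Z) = U U^T Z + Z V V^T - U U^T Z V V^T."""
        UtZ = U.T @ Z                      # r x n
        ZV = Z @ V                         # m x r
        return U @ UtZ + ZV @ V.T - U @ (UtZ @ V) @ V.T

Because \( P_T(Z) \) has rank at most \( 2r \), the subsequent low-rank projection can be computed from the SVD of a small \( 2r \times 2r \) core rather than of a full \( m \times n \) matrix, which is where the reduction in complexity comes from.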
Convex relaxation

This method consists of relaxing the rank constraint \( \operatorname{rank}(L) \) in the optimization problem to the nuclear norm \( \|L\|_* \) and the sparsity constraint \( \|S\|_0 \) to the \( \ell_1 \)-norm \( \|S\|_1 \). The resulting convex program, \( \min_{L,S} \|L\|_* + \lambda\|S\|_1 \) subject to \( L + S = M \), can be solved using methods such as the method of Augmented Lagrange Multipliers.
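
A hedged sketch of this convex program solved with an inexact augmented Lagrange multiplier scheme is shown below; the default weight \( \lambda = 1/\sqrt{\max(m,n)} \) follows [1], while the step size mu and the stopping tolerance are common heuristic assumptions rather than prescribed values.

    import numpy as np

    def shrink(X, tau):
        """Entry-wise soft thresholding: the prox of the l1 norm."""
        return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

    def svt(X, tau):
        """Singular value thresholding: the prox of the nuclear norm."""
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        return (U * np.maximum(s - tau, 0.0)) @ Vt

    def pcp_alm(M, lam=None, mu=None, tol=1e-7, max_iter=500):
        """Solve min ||L||_* + lam * ||S||_1  subject to  L + S = M."""
        m, n = M.shape
        if lam is None:
            lam = 1.0 / np.sqrt(max(m, n))        # weight suggested in [1]
        if mu is None:
            mu = 0.25 * m * n / np.abs(M).sum()   # heuristic step size
        S = np.zeros_like(M)
        Y = np.zeros_like(M)
        for _ in range(max_iter):
            L = svt(M - S + Y / mu, 1.0 / mu)     # nuclear-norm prox step
            S = shrink(M - L + Y / mu, lam / mu)  # l1-norm prox step
            Y = Y + mu * (M - L - S)              # multiplier (dual) update
            if np.linalg.norm(M - L - S) <= tol * np.linalg.norm(M):
                break
        return L, S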
Applications

RPCA has many important real-life applications, particularly when the data under study can naturally be modeled as a low-rank component plus a sparse contribution. The following examples are inspired by contemporary challenges in computer science; depending on the application, either the low-rank component or the sparse component may be the object of interest:
Video surveillance

Given a sequence of surveillance video frames, it is often required to identify activities that stand out from the background. If we stack the video frames as columns of a matrix \( M \), then the low-rank component \( L_0 \) naturally corresponds to the stationary background and the sparse component \( S_0 \) captures the moving objects in the foreground.[1][14]
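
As an illustration, the frames-as-columns construction takes only a few lines; the frame size, frame count, and the call to the pcp_alm sketch above are illustrative stand-ins for a real video pipeline.

    import numpy as np

    h, w, n_frames = 48, 64, 200              # made-up video dimensions
    frames = np.random.rand(n_frames, h, w)   # stand-in for real frames

    M = frames.reshape(n_frames, h * w).T     # one column per frame
    L, S = pcp_alm(M)                         # e.g. the sketch above

    background = L[:, 0].reshape(h, w)        # stationary scene (low rank)
    foreground = S[:, 0].reshape(h, w)        # moving objects (sparse)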
Face recognition

Images of a convex, Lambertian surface under varying illuminations span a low-dimensional subspace.[15] This is one of the reasons for the effectiveness of low-dimensional models for imagery data. In particular, images of a human face are well approximated by a low-dimensional subspace. Correctly retrieving this subspace is crucial in many applications such as face recognition and alignment. It turns out that RPCA can be applied successfully to this problem to exactly recover the face.[1]
Surveys

Robust PCA [14]
Dynamic RPCA [16]
Decomposition into Low-rank plus Additive Matrices [17]
Low-rank models [18]

Books, journals and workshops
Books

T. Bouwmans, N. Aybat, and E. Zahzah. Handbook on Robust Low-Rank and Sparse Matrix Decomposition: Applications in Image and Video Processing, CRC Press, Taylor and Francis Group, May 2016. (more information: http://www.crcpress.com/product/isbn/9781498724623)
Z. Lin, H. Zhang, "Low-Rank Models in Visual Analysis: Theories, Algorithms, and Applications", Academic Press, Elsevier, June 2017. (more information: https://www.elsevier.com/books/low-rank-models-in-visual-analysis/lin/978-0-12-812731-5)

Journals

N. Vaswani, Y. Chi, T. Bouwmans, Special Issue on “Rethinking PCA for Modern Datasets: Theory, Algorithms, and Applications”, Proceedings of the IEEE, 2018.
T. Bouwmans, N. Vaswani, P. Rodriguez, R. Vidal, Z. Lin, Special Issue on “Robust Subspace Learning and Tracking: Theory, Algorithms, and Applications”, IEEE Journal of Selected Topics in Signal Processing, December 2018.

Workshops

RSL-CV 2015: Workshop on Robust Subspace Learning and Computer Vision in conjunction with ICCV 2015 (For more information: http://rsl-cv2015.univ-lr.fr/workshop/)
RSL-CV 2017: Workshop on Robust Subspace Learning and Computer Vision in conjunction with ICCV 2017 (For more information: http://rsl-cv.univ-lr.fr/2017/)

Sessions

Special Session on "Online Algorithms for Static and Dynamic Robust PCA and Compressive Sensing" in conjunction with SSP 2018. (More information: https://ssp2018.org/)

Resources and libraries
Websites

Background Subtraction Website
DLAM Website
Documentation from the University of Illinois - Archive Link

Libraries

The LRS Library (developed by Andrews Sobral) provides a collection of low-rank and sparse decomposition algorithms in MATLAB. The library was designed for moving object detection in videos, but it can also be used for other computer vision and machine learning tasks. Currently the LRSLibrary offers more than 100 algorithms based on matrix and tensor methods.
References

Emmanuel J. Candès; Xiaodong Li; Yi Ma; John Wright (2011). "Robust Principal Component Analysis?". Journal of the ACM. 58 (3): 1–37. doi:10.1145/1970392.1970395.
J. Wright; Y. Peng; Y. Ma; A. Ganesh; S. Rao (2009). "Robust Principal Component Analysis: Exact Recovery of Corrupted Low-Rank Matrices by Convex Optimization". Neural Information Processing Systems, NIPS 2009.
S. Becker; E. Candes, M. Grant (2011). "TFOCS: Flexible First-order Methods for Rank Minimization". Low-rank Matrix Optimization Symposium, SIAM Conference on Optimization.
G. Tang; A. Nehorai (2011). "Robust principal component analysis based on low-rank and block-sparse matrix decomposition". Annual Conference on Information Sciences and Systems, CISS 2011: 1–5. doi:10.1109/CISS.2011.5766144. ISBN 978-1-4244-9846-8.
B. Wohlberg; R. Chartrand; J. Theiler (2012). "Local Principal Component Pursuit for Nonlinear Datasets". International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2012: 3925–3928. doi:10.1109/ICASSP.2012.6288776. ISBN 978-1-4673-0046-9.
Z. Lin; M. Chen; L. Wu; Y. Ma (2010). "The Augmented Lagrange Multiplier Method for Exact Recovery of Corrupted Low-Rank Matrices". arXiv:1009.5055.
X. Yuan; J. Yang (2009). "Sparse and Low-Rank Matrix Decomposition via Alternating Direction Methods". Optimization Online.
P. Rodríguez; B. Wohlberg (2013). "Fast Principal Component Pursuit Via Alternating Minimization". IEEE International Conference on Image Processing, ICIP 2013: 69–73. doi:10.1109/ICIP.2013.6738015. ISBN 978-1-4799-2341-0.
C. Guyon; T. Bouwmans; E. Zahzah (2012). "Foreground Detection via Robust Low Rank Matrix Decomposition including Spatio-Temporal Constraint". International Workshop on Background Model Challenges, ACCV 2012.
C. Guyon; T. Bouwmans; E. Zahzah (2012). "Foreground Detection via Robust Low Rank Matrix Factorization including Spatial Constraint with Iterative Reweighted Regression". International Conference on Pattern Recognition, ICPR 2012.
C. Guyon; T. Bouwmans; E. Zahzah (2012). "Moving Object Detection via Robust Low Rank Matrix Decomposition with IRLS scheme". International Symposium on Visual Computing, ISVC 2012.
P. Netrapalli; U. Niranjan; S. Sanghavi; A. Anandkumar; P. Jain (2014). "Non-convex Robust PCA". Advances in Neural Information Processing Systems, NIPS 2014: 1107–1115. arXiv:1410.7660. Bibcode:2014arXiv1410.7660N.
H. Cai; J.-F. Cai; K. Wei (2019). "Accelerated Alternating Projections for Robust Principal Component Analysis". Journal of Machine Learning Research. 20 (1): 685–717. arXiv:1711.05519. Bibcode:2017arXiv171105519C.
T. Bouwmans; E. Zahzah (2014). "Robust PCA via Principal Component Pursuit: A Review for a Comparative Evaluation in Video Surveillance". Computer Vision and Image Understanding. 122: 22–34. doi:10.1016/j.cviu.2013.11.009.
R. Basri; D. Jacobs (2003). "Lambertian Reflectance and Linear Subspaces". IEEE Transactions on Pattern Analysis and Machine Intelligence. 25 (2): 218–233.
N. Vaswani; T. Bouwmans; S. Javed; P. Narayanamurthy (2018). "Robust PCA and Robust Subspace Tracking". IEEE Signal Processing Magazine. 35 (4): 32–55. arXiv:1711.09492. Bibcode:2017arXiv171109492V. doi:10.1109/MSP.2018.2826566.
T. Bouwmans; A. Sobral; S. Javed; S. Jung; E. Zahzah (2017). "Decomposition into Low-rank plus Additive Matrices for Background/Foreground Separation: A Review for a Comparative Evaluation with a Large-Scale Dataset". Computer Science Review. 23: 1–71. arXiv:1511.01245. Bibcode:2015arXiv151101245B. doi:10.1016/j.cosrev.2016.11.001.
Z. Lin (2016). "A Review on Low-Rank Models in Data Analysis". Big Data and Information Analytics. 1 (2): 139–161. doi:10.3934/bdia.2016001.

 
