Ran He , Institute of Automation, Chinese Academy of Sciences, Beijing
Tieniu Tan , Institute of Automation, Chinese Academy of Sciences, Beijing
Liang Wang , Institute of Automation, Chinese Academy of Sciences, Beijing
Low-rank matrix recovery algorithms aim to recover a corrupted low-rank matrix under the assumption of sparse errors. However, the errors corrupting real-world data may not be sparse, and the relationship between the L1 regularizer on noise and robust M-estimators has remained unclear. This paper proposes a general robust framework for low-rank matrix recovery via the implicit regularizers of robust M-estimators, which are derived from convex conjugacy and can model arbitrarily corrupted errors. Based on the additive form of half-quadratic optimization, proximity operators of implicit regularizers are developed so that the low-rank structure and the corrupted errors can be recovered alternately. In particular, the dual relationship between the absolute function in the L1 regularizer and the Huber M-estimator is studied, which connects robust low-rank matrix recovery methods with M-estimator-based robust principal component analysis methods. Extensive experiments on synthetic and real-world datasets corroborate our claims and verify the robustness of the proposed framework.
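The dual relationship mentioned above can be checked numerically: the Huber M-estimator is the Moreau envelope of the scaled absolute function delta*|.|, and the minimizer of that envelope is exactly the soft-thresholding (shrinkage) operator, i.e., the proximity operator of the L1 term. The sketch below is illustrative only (the function names and the grid-search construction are not from the paper) and verifies both identities on a dense grid:

```python
import numpy as np

def huber(x, delta):
    """Huber M-estimator: quadratic near zero, linear in the tails."""
    x = np.asarray(x, dtype=float)
    return np.where(np.abs(x) <= delta,
                    0.5 * x**2,
                    delta * np.abs(x) - 0.5 * delta**2)

def soft_threshold(x, delta):
    """Proximity operator of delta*|.| (soft thresholding / shrinkage)."""
    x = np.asarray(x, dtype=float)
    return np.sign(x) * np.maximum(np.abs(x) - delta, 0.0)

# Numerical check: huber(x) == min_y [ delta*|y| + 0.5*(x - y)^2 ],
# with the minimizing y given by soft_threshold(x, delta).
delta = 0.7
xs = np.linspace(-3.0, 3.0, 61)
grid = np.linspace(-5.0, 5.0, 20001)              # candidate y values
vals = delta * np.abs(grid)[None, :] + 0.5 * (xs[:, None] - grid[None, :])**2
env = vals.min(axis=1)                            # Moreau envelope at each x
argmin = grid[vals.argmin(axis=1)]                # minimizing y at each x

print(np.max(np.abs(env - huber(xs, delta))))               # ~0 (grid error)
print(np.max(np.abs(argmin - soft_threshold(xs, delta))))   # ~0 (grid error)
```

This is the scalar version of the observation that an L1 penalty on the error matrix corresponds, through convex conjugacy, to applying a Huber-type M-estimator to the residuals.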
Keywords: implicit regularizer, robust principal component analysis, low-rank matrix recovery, half-quadratic optimization, M-estimator, correntropy
R. He, T. Tan and L. Wang, "Robust Recovery of Corrupted Low-Rank Matrix by Implicit Regularizers," in IEEE Transactions on Pattern Analysis and Machine Intelligence.