2017 IEEE International Conference on Data Mining (ICDM) (2017)
New Orleans, Louisiana, USA
Nov. 18, 2017 to Nov. 21, 2017
ISSN: 2374-8486
ISBN: 978-1-5386-3835-4
pp: 625-634
ABSTRACT
In today's era of big data, robust least-squares regression becomes a more challenging problem when adversarial corruption is considered alongside the explosive growth of datasets. Traditional robust methods can handle the noise but face several challenges when applied to huge datasets, including 1) the computational infeasibility of handling an entire dataset at once, 2) the existence of heterogeneously distributed corruption, and 3) the difficulty of estimating corruption when the data cannot be entirely loaded. This paper proposes online and distributed robust regression approaches, both of which can concurrently address all the above challenges. Specifically, the distributed algorithm optimizes the regression coefficients of each data block via heuristic hard thresholding and combines all the estimates in a distributed robust consolidation. Furthermore, an online version of the distributed algorithm is proposed to incrementally update the existing estimates with new incoming data. We also prove that our algorithms benefit from strong robustness guarantees in terms of regression coefficient recovery, with a constant upper bound on the error of state-of-the-art batch methods. Extensive experiments on synthetic and real datasets demonstrate that our approaches are superior to existing methods in effectiveness, with competitive efficiency.
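To make the per-block step concrete, the following is a minimal Python sketch of a generic hard-thresholding loop for robust least squares on a single data block, followed by a simple median-based consolidation across blocks. It is an illustration under assumptions, not the paper's exact procedure: the function names (robust_block_regression, consolidate), the assumption that the number of corrupted rows n_corrupt is known, and the use of a coordinate-wise median as the consolidation step are all hypothetical choices made here for clarity; the paper's heuristic hard thresholding and distributed robust consolidation may differ.

```python
import numpy as np

def robust_block_regression(X, y, n_corrupt, n_iters=50):
    """Illustrative hard-thresholding loop for one data block.

    Repeatedly fits ordinary least squares on the rows currently
    believed to be clean, then relabels the n_corrupt rows with the
    largest residuals as corrupted. Generic sketch only; n_corrupt is
    assumed known here, which the paper does not require.
    """
    n, p = X.shape
    clean = np.arange(n)          # start by trusting every row
    beta = np.zeros(p)
    for _ in range(n_iters):
        # least-squares fit restricted to the rows marked as clean
        beta, *_ = np.linalg.lstsq(X[clean], y[clean], rcond=None)
        # hard thresholding: keep the n - n_corrupt rows with the
        # smallest absolute residuals as the new clean set
        residuals = np.abs(y - X @ beta)
        clean = np.argsort(residuals)[: n - n_corrupt]
    return beta, clean

def consolidate(block_betas):
    """Hypothetical robust consolidation: coordinate-wise median of
    the per-block coefficient estimates."""
    return np.median(np.vstack(block_betas), axis=0)
```

In a distributed run, each worker would apply the block routine to its own partition and only the per-block coefficient vectors would be combined; an online variant would instead update the running estimate as new blocks arrive rather than refitting from scratch.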
INDEX TERMS
Big Data, distributed algorithms, estimation theory, least squares approximations, optimisation, regression analysis
CITATION

X. Zhang, L. Zhao, A. P. Boedihardjo and C. Lu, "Online and Distributed Robust Regressions Under Adversarial Data Corruption," 2017 IEEE International Conference on Data Mining (ICDM), New Orleans, Louisiana, USA, 2017, pp. 625-634.
doi:10.1109/ICDM.2017.72