Issue No. 10 - Oct. 2012 (vol. 24)
DOI Bookmark: http://doi.ieeecomputersociety.org/10.1109/TKDE.2011.208
Indranil Palit , University of Notre Dame, Notre Dame
Chandan K. Reddy , Wayne State University, Detroit
In this era of data abundance, it has become critical to process large volumes of data at much faster rates than ever before. Boosting is a powerful predictive modeling technique that has been successfully used in many real-world applications. However, due to the inherently sequential nature of boosting, achieving scalability is nontrivial and demands new parallelized versions that can efficiently handle large-scale data. In this paper, we propose two parallel boosting algorithms, AdaBoost.PL and LogitBoost.PL, which facilitate the simultaneous participation of multiple computing nodes in constructing a boosted ensemble classifier. The proposed algorithms are competitive with the corresponding serial versions in terms of generalization performance. We achieve a significant speedup because our approach does not require individual computing nodes to communicate with each other to share their data. In addition, the proposed approach preserves the privacy of computations in distributed environments. We implemented our algorithms using the MapReduce framework and demonstrated their performance in terms of classification accuracy, speedup, and scaleup on a wide variety of synthetic and real-world data sets.
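The abstract's core idea can be illustrated with a minimal sketch: each node runs ordinary AdaBoost on its local data partition with no inter-node communication, and the per-node ensembles are then pooled into a single weighted vote. This is only an illustration of that general scheme, not the paper's AdaBoost.PL merging procedure (which is not described in this abstract); the function names (`train_local_adaboost`, `predict_merged`) and the use of 1-D decision stumps are assumptions made here for brevity.

```python
import math
import random

def train_local_adaboost(xs, ys, rounds=10):
    """Plain AdaBoost with 1-D threshold stumps on one node's local data.

    Illustrative stand-in for the per-node training step; returns a list
    of weak classifiers as (threshold, polarity, alpha) tuples.
    """
    n = len(xs)
    w = [1.0 / n] * n  # uniform initial sample weights
    ensemble = []
    for _ in range(rounds):
        best = None
        # Exhaustive stump search: lowest weighted error over all
        # candidate thresholds (the data points) and both polarities.
        for thr in xs:
            for pol in (1, -1):
                err = sum(wi for wi, x, y in zip(w, xs, ys)
                          if (pol if x >= thr else -pol) != y)
                if best is None or err < best[0]:
                    best = (err, thr, pol)
        err, thr, pol = best
        err = min(max(err, 1e-10), 1.0 - 1e-10)  # clip to avoid log(0)
        alpha = 0.5 * math.log((1.0 - err) / err)
        ensemble.append((thr, pol, alpha))
        # Upweight misclassified points, then renormalize.
        w = [wi * math.exp(-alpha * y * (pol if x >= thr else -pol))
             for wi, x, y in zip(w, xs, ys)]
        total = sum(w)
        w = [wi / total for wi in w]
    return ensemble

def predict_merged(ensembles, x):
    """Pool every node's weak classifiers into one weighted vote."""
    score = sum(alpha * (pol if x >= thr else -pol)
                for ens in ensembles for thr, pol, alpha in ens)
    return 1 if score >= 0 else -1

if __name__ == "__main__":
    random.seed(0)
    xs = [random.uniform(-1.0, 1.0) for _ in range(60)]
    ys = [1 if x >= 0 else -1 for x in xs]
    # Simulate three nodes, each holding a disjoint partition of the data;
    # training happens independently, mirroring the no-communication claim.
    shards = [(xs[i::3], ys[i::3]) for i in range(3)]
    ensembles = [train_local_adaboost(px, py, rounds=5) for px, py in shards]
    acc = sum(predict_merged(ensembles, x) == t for x, t in zip(xs, ys)) / len(xs)
    print(f"combined training accuracy: {acc:.3f}")
```

Because the nodes never exchange raw samples, only their trained weak classifiers are pooled, which is also why this style of scheme can preserve data privacy across sites. In a MapReduce realization, each map task would play the role of one node and the reduce step would perform the merge.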
Boosting, prediction algorithms, algorithm design and analysis, convergence, distributed databases, training, computational modeling, MapReduce, parallel algorithms, classification, distributed computing
I. Palit and C. K. Reddy, "Scalable and Parallel Boosting with MapReduce," IEEE Transactions on Knowledge and Data Engineering, vol. 24, no. 10, pp. 1904-1916, Oct. 2012.