IEEE Transactions on Pattern Analysis & Machine Intelligence, vol. 33, no. 1, January 2011
Ke Chen, The University of Manchester, Manchester
Shihai Wang, The University of Manchester, Manchester
DOI Bookmark: http://doi.ieeecomputersociety.org/10.1109/TPAMI.2010.92
Semi-supervised learning concerns the problem of learning in the presence of both labeled and unlabeled data. Several boosting algorithms have been extended to semi-supervised learning with various strategies. To our knowledge, however, none of them takes all three semi-supervised assumptions, i.e., the smoothness, cluster, and manifold assumptions, into account together during boosting learning. In this paper, we propose a novel cost functional consisting of the margin cost on labeled data and a regularization penalty on unlabeled data based on these three fundamental semi-supervised assumptions. Minimizing our proposed cost functional with a greedy yet stagewise functional optimization procedure leads to a generic boosting framework for semi-supervised learning. Extensive experiments demonstrate that our algorithm yields favorable results on benchmark and real-world classification tasks in comparison to state-of-the-art semi-supervised learning algorithms, including recently developed boosting algorithms. Finally, we discuss relevant issues and relate our algorithm to previous work.
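The abstract describes a cost functional that combines a margin cost on labeled data with a regularization penalty on unlabeled data. A minimal sketch of such an objective is below; the exponential margin loss, the squared-disagreement penalty, the similarity matrix `W`, and the weight `lam` are illustrative assumptions, not the paper's exact functional.

```python
import numpy as np

def combined_cost(H, X, y_labeled, labeled_idx, W, lam=1.0):
    """Illustrative semi-supervised boosting objective (a sketch, not
    the paper's formulation): exponential margin cost on labeled points
    plus a graph-smoothness penalty tying similar points together,
    reflecting the smoothness/cluster/manifold assumptions.

    H           -- ensemble classifier mapping X to real-valued scores
    X           -- (n, d) array of all points, labeled and unlabeled
    y_labeled   -- (+1/-1) labels for the labeled subset
    labeled_idx -- indices of the labeled points within X
    W           -- (n, n) nonnegative pairwise similarity matrix
    lam         -- trade-off between margin cost and the penalty
    """
    scores = H(X)  # real-valued ensemble output on all points
    # Margin cost on labeled data (exponential loss, as in AdaBoost)
    margin_cost = np.sum(np.exp(-y_labeled * scores[labeled_idx]))
    # Regularization penalty: penalize disagreement between similar
    # points so the decision function varies smoothly over the data
    diff = scores[:, None] - scores[None, :]
    penalty = np.sum(W * diff**2)
    return margin_cost + lam * penalty
```

A stagewise boosting procedure would, at each round, pick the weak learner and step size that most decrease this combined cost, rather than the labeled margin cost alone.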
Semi-supervised learning, boosting framework, smoothness assumption, cluster assumption, manifold assumption, regularization.
Ke Chen, Shihai Wang, "Semi-Supervised Learning via Regularized Boosting Working on Multiple Semi-Supervised Assumptions", IEEE Transactions on Pattern Analysis & Machine Intelligence, vol. 33, no. 1, pp. 129-143, January 2011, doi:10.1109/TPAMI.2010.92