Issue No. 12 - December 2010 (vol. 32)
ISSN: 0162-8828
pp: 2216-2231
Chunhua Shen , NICTA, Canberra Research Laboratory and Australian National University, Canberra
Hanxi Li , NICTA, Canberra Research Laboratory and Australian National University, Canberra
ABSTRACT
We study boosting algorithms from a new perspective. We show that the Lagrange dual problems of \ell_1-norm-regularized AdaBoost, LogitBoost, and soft-margin LPBoost with generalized hinge loss are all entropy maximization problems. By examining the dual problems of these boosting algorithms, we show that their success can be understood in terms of maintaining a better margin distribution: maximizing margins while at the same time controlling the margin variance. We also theoretically prove that, approximately, \ell_1-norm-regularized AdaBoost maximizes the average margin rather than the minimum margin. The duality formulation also enables us to develop column-generation-based optimization algorithms, which are totally corrective. We show that they produce classification results almost identical to those of standard stagewise additive boosting algorithms, but with much faster convergence rates. Therefore, fewer weak classifiers are needed to build the ensemble using our proposed optimization technique.
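The column-generation, totally corrective procedure summarized above can be pictured with a short sketch. The Python code below is an illustrative assumption only, not the authors' implementation: the decision-stump pool, the scipy-based re-optimization of an \ell_1-regularized exponential loss, and all names (stump_outputs, column_generation_boost, n_rounds, reg) are hypothetical.

# Minimal sketch of a column-generation, totally corrective boosting loop.
# All helper names, the stump pool, and the primal objective are assumptions
# made for illustration; they are not the paper's exact formulation.
import numpy as np
from scipy.optimize import minimize

def stump_outputs(X, feature, threshold, sign):
    """Decision stump h(x) in {-1, +1}."""
    return sign * np.where(X[:, feature] > threshold, 1.0, -1.0)

def column_generation_boost(X, y, n_rounds=20, reg=1e-2):
    n, d = X.shape
    u = np.ones(n) / n                  # dual variables (sample weights)
    chosen = []                         # selected weak learners (columns)
    H = np.empty((n, 0))                # outputs of the chosen weak learners

    for _ in range(n_rounds):
        # 1. Base-learning oracle: pick the stump with maximum edge
        #    sum_i u_i * y_i * h(x_i) under the current sample weights.
        best_edge, best = -np.inf, None
        for j in range(d):
            for t in np.unique(X[:, j]):
                for s in (+1.0, -1.0):
                    h = stump_outputs(X, j, t, s)
                    edge = np.dot(u, y * h)
                    if edge > best_edge:
                        best_edge, best = edge, (j, t, s)
        chosen.append(best)
        H = np.column_stack([H, stump_outputs(X, *best)])

        # 2. Totally corrective step: re-optimize ALL coefficients w >= 0
        #    of the chosen columns (here an l1-regularized exponential loss
        #    used purely as an illustrative surrogate primal).
        def primal(w):
            margins = y * (H @ w)
            return np.log(np.mean(np.exp(-margins))) + reg * np.sum(w)

        w0 = np.ones(H.shape[1]) / H.shape[1]
        w = minimize(primal, w0, bounds=[(0, None)] * H.shape[1]).x

        # 3. Refresh the dual variables from the new margins.
        u = np.exp(-y * (H @ w))
        u /= u.sum()

    return chosen, w

Each round adds a single column (the weak learner with maximum edge under the current dual variables) and then re-optimizes all coefficients jointly; this full re-optimization is what makes the method totally corrective and is the source of the faster convergence noted in the abstract.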
INDEX TERMS
AdaBoost, LogitBoost, LPBoost, Lagrange duality, linear programming, entropy maximization.
CITATION
Chunhua Shen and Hanxi Li, "On the Dual Formulation of Boosting Algorithms," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 32, no. 12, pp. 2216-2231, December 2010, doi:10.1109/TPAMI.2010.47