IEEE Transactions on Pattern Analysis & Machine Intelligence, vol. 32, no. 12, December 2010
Chunhua Shen , NICTA, Canberra Research Laboratory and Australian National University, Canberra
Hanxi Li , NICTA, Canberra Research Laboratory and Australian National University, Canberra
DOI Bookmark: http://doi.ieeecomputersociety.org/10.1109/TPAMI.2010.47
We study boosting algorithms from a new perspective. We show that the Lagrange dual problems of ℓ₁-norm-regularized AdaBoost, LogitBoost, and soft-margin LPBoost with generalized hinge loss are all entropy maximization problems. Examining these dual problems shows that the success of boosting algorithms can be understood in terms of maintaining a better margin distribution: the margins are maximized while, at the same time, the margin variance is controlled. We also theoretically prove that, approximately, ℓ₁-norm-regularized AdaBoost maximizes the average margin rather than the minimum margin. The duality formulation further enables us to develop column-generation-based optimization algorithms, which are totally corrective. These algorithms produce classification results almost identical to those of standard stagewise additive boosting algorithms, but with much faster convergence rates; therefore, fewer weak classifiers are needed to build the ensemble using our proposed optimization technique.
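To make the column-generation idea concrete, the following is a minimal, self-contained sketch (not the authors' implementation). It uses decision stumps as the weak-learner pool, picks at each round the stump with the largest edge under the current dual sample weights, and then re-solves the restricted primal (an ℓ₁-regularized exponential loss, an assumed stand-in for the paper's regularized losses) over all selected stumps, which is what makes the procedure totally corrective. The dataset, the regularization strength `nu`, and the simple edge-based stopping test are all illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
# Toy 2-D data, roughly linearly separable (hypothetical example).
X = rng.normal(size=(80, 2))
y = np.where(X[:, 0] + 0.5 * X[:, 1] > 0, 1, -1)

def stump_outputs(X):
    """Outputs of all threshold stumps on each feature, both polarities."""
    cols = []
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            cols.append(np.where(X[:, j] >= t, 1, -1))
    H = np.array(cols).T          # shape (n_samples, n_stumps)
    return np.hstack([H, -H])     # include sign-flipped stumps

H = stump_outputs(X)
n = H.shape[0]
nu = 0.05                         # l1 regularization strength (assumed)

u = np.full(n, 1.0 / n)           # dual variables = sample weights
active = []                       # indices of generated columns
w = np.zeros(0)                   # primal weights over active columns

for it in range(10):
    # Column generation: weak learner with the largest edge under u.
    edges = (u * y) @ H
    j = int(np.argmax(edges))
    if edges[j] <= nu + 1e-9:     # heuristic dual-feasibility stopping test
        break
    if j not in active:
        active.append(j)
    A = y[:, None] * H[:, active]  # margin matrix of the active columns

    def primal(w):                 # restricted master: l1-reg. exp-loss
        return np.exp(-A @ w).sum() / n + nu * w.sum()

    # Totally corrective step: re-optimize ALL active weights (w >= 0).
    res = minimize(primal, np.zeros(len(active)),
                   bounds=[(0, None)] * len(active), method="L-BFGS-B")
    w = res.x
    u = np.exp(-A @ w)             # dual weights from current margins
    u /= u.sum()

f = H[:, active] @ w               # ensemble output on the training set
acc = np.mean(np.sign(f) == y)
```

Because every round re-fits all coefficients rather than freezing past ones (as stagewise AdaBoost does), the ensemble typically needs far fewer weak classifiers to reach a given training accuracy, which is the practical point made in the abstract.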
Index Terms: AdaBoost, LogitBoost, LPBoost, Lagrange duality, linear programming, entropy maximization.
Chunhua Shen, Hanxi Li, "On the Dual Formulation of Boosting Algorithms", IEEE Transactions on Pattern Analysis & Machine Intelligence, vol.32, no. 12, pp. 2216-2231, December 2010, doi:10.1109/TPAMI.2010.47