<p>A useful notion of weak dependence between many classifiers constructed with the same training data is introduced. It is shown that if this weak dependence is low and the expected margins are large, then decision rules based on linear combinations of these classifiers can achieve error rates that decrease exponentially fast. Empirical results with randomized trees and trees constructed via boosting and bagging show that weak dependence is present in these types of trees. Furthermore, these results also suggest that there is a trade-off between weak dependence and expected margins, in the sense that to compensate for low expected margins, there should be low mutual dependence between the classifiers involved in the linear combination.</p>
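The two quantities the abstract relates can be illustrated numerically. The following is a minimal sketch, not the paper's method: it assumes a toy ensemble of votes in {-1, +1}, takes the margin of an example as the label times the average vote (the uniform linear combination), and uses mean pairwise correlation between classifier outputs as a simple stand-in for mutual dependence. All names and parameters here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: T classifiers voting -1/+1 on n examples.
n, T = 200, 25
y = rng.choice([-1, 1], size=n)  # true labels

# Each classifier agrees with the truth with probability 0.6,
# independently of the others (a stand-in for low mutual dependence).
votes = np.where(rng.random((n, T)) < 0.6, y[:, None], -y[:, None])

# Margin of example i under the uniform linear combination:
# y_i * (1/T) * sum_t f_t(x_i); positive means a correct majority vote.
margins = y * votes.mean(axis=1)
expected_margin = margins.mean()

# A crude dependence proxy: mean pairwise correlation between
# classifier outputs (near zero here by construction).
corr = np.corrcoef(votes.T)
mean_dep = (corr.sum() - T) / (T * (T - 1))

print(f"expected margin ~ {expected_margin:.3f}")
print(f"mean pairwise dependence ~ {mean_dep:.3f}")
```

With independent votes the expected margin concentrates near 2p - 1 = 0.2; raising the correlation between votes while holding p fixed widens the spread of the margins, which is the flavor of the trade-off the abstract describes.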
Exponential bounds, weakly dependent classifiers, classification trees, machine learning

A. Murua, "Upper Bounds for Error Rates of Linear Combinations of Classifiers," in IEEE Transactions on Pattern Analysis & Machine Intelligence, vol. 24, no. , pp. 591-602, 2002.