Issue No. 08 - August (2004 vol. 26)
DOI Bookmark: http://doi.ieeecomputersociety.org/10.1109/TPAMI.2004.48
<p><b>Abstract</b>—Classifier combination holds the potential of improving performance by combining the results of multiple classifiers. For domains with very large numbers of classes, such as biometrics, we present an axiomatic framework of desirable mathematical properties for combination functions of rank-based classifiers. This framework represents a continuum of combination rules that includes the Borda Count, Logistic Regression, and Highest Rank combination methods as extreme cases. Intuitively, the framework captures how two complementary concepts, a general preference for specific classifiers and the confidence placed in any specific result (as indicated by ranks), can be balanced while maintaining consistent rank interpretation. Mixed Group Ranks (MGR) is a new combination function that balances preference and confidence by generalizing these other functions. We demonstrate that MGR is an effective combination approach through multiple experiments on data sets with large numbers of classes and classifiers from the FERET face recognition study.</p>
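As a rough illustration of the extreme cases the abstract names, the sketch below implements two standard rank-combination rules, Borda Count (sum each class's ranks across classifiers) and Highest Rank (take each class's best rank across classifiers). This is a minimal sketch under the usual conventions (1-based ranks, lower is better), not the paper's MGR method itself; the function and variable names are illustrative.

```python
# Sketch of two extreme rank-combination rules from the classifier-combination
# literature. Each classifier's output is a dict mapping class label -> rank
# (1 = top choice). MGR itself interpolates between such rules; it is not
# implemented here.

def borda_count(rank_lists):
    """Combine by summing each class's ranks across all classifiers.

    A class that is ranked consistently well by every classifier wins;
    this rule weighs all classifiers' opinions (confidence) equally.
    """
    classes = rank_lists[0].keys()
    scores = {c: sum(r[c] for r in rank_lists) for c in classes}
    return sorted(classes, key=lambda c: scores[c])  # ascending: lower sum wins


def highest_rank(rank_lists):
    """Combine by taking each class's best (lowest) rank across classifiers.

    A single classifier's strong vote is enough to promote a class;
    this rule trusts each classifier's most confident assertions.
    """
    classes = rank_lists[0].keys()
    scores = {c: min(r[c] for r in rank_lists) for c in classes}
    return sorted(classes, key=lambda c: scores[c])


# Example: three classifiers ranking three identities A, B, C.
ranks = [
    {"A": 2, "B": 1, "C": 3},
    {"A": 2, "B": 3, "C": 1},
    {"A": 2, "B": 3, "C": 1},
]
print(borda_count(ranks))   # ['C', 'A', 'B'] -- C has the lowest rank sum (5)
print(highest_rank(ranks))  # ['B', 'C', 'A'] -- B and C each achieve rank 1
```

Note how the two rules can disagree on the same inputs: Borda Count prefers the class with the best overall rank profile, while Highest Rank rewards any single confident vote, which is why a function balancing the two can be useful.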
Classification, classifier combination, ensemble methods, sensor fusion, biometrics, face recognition, mixed group ranks, logistic regression, Borda count, highest rank, voting methods.
C. Zhang, Y. Vardi and O. Melnik, "Mixed Group Ranks: Preference and Confidence in Classifier Combination," in IEEE Transactions on Pattern Analysis & Machine Intelligence, vol. 26, no. 8, pp. 973-981, Aug. 2004.