Issue No. 03 - March 2006 (vol. 28)
pp. 392-402
ABSTRACT
Given a data set and a number of supervised learning algorithms, we would like to find the algorithm with the smallest expected error. Existing pairwise tests allow a comparison of two algorithms only; range tests and ANOVA check whether multiple algorithms have the same expected error and cannot be used for finding the smallest. We propose a methodology, the MultiTest algorithm, whereby we order supervised learning algorithms taking into account 1) the result of pairwise statistical tests on expected error (what the data tells us), and 2) our prior preferences, e.g., due to complexity. We define the problem in graph-theoretic terms and propose an algorithm to find the "best" learning algorithm in terms of these two criteria, or in the more general case, order learning algorithms in terms of their "goodness." Simulation results using five classification algorithms on 30 data sets indicate the utility of the method. Our proposed method can be generalized to regression and other loss functions by using a suitable pairwise test.
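To give a feel for the idea of combining pairwise test outcomes with a prior preference, the sketch below shows a simplified, hypothetical Python stand-in: each pair of algorithms is decided by a one-sided pairwise test when the test is significant and by the prior ordering (e.g., by complexity) otherwise, and the full ordering is obtained by counting pairwise wins with ties broken by the prior. This is only an illustration under those assumptions; it does not reproduce the paper's graph-theoretic MultiTest construction, and the names, the significantly_worse callback, and the toy data are made up for the example.

```python
from itertools import combinations

def order_algorithms(names, prior_rank, significantly_worse):
    """Order algorithms from most to least preferred (illustrative, not MultiTest).

    names               -- list of algorithm names
    prior_rank          -- dict: name -> prior rank (lower = preferred a priori,
                           e.g., the simpler model)
    significantly_worse -- callable (a, b) -> True if a pairwise test on the data
                           finds a's expected error significantly higher than b's
                           (e.g., a one-sided paired test over cross-validation folds)
    """
    wins = {name: 0 for name in names}
    for a, b in combinations(names, 2):
        if significantly_worse(a, b):        # the data prefers b over a
            wins[b] += 1
        elif significantly_worse(b, a):      # the data prefers a over b
            wins[a] += 1
        else:                                # no significant difference:
            winner = min(a, b, key=lambda n: prior_rank[n])
            wins[winner] += 1                # fall back on the prior preference
    # More pairwise wins first; remaining ties broken by the prior preference.
    return sorted(names, key=lambda n: (-wins[n], prior_rank[n]))


if __name__ == "__main__":
    # Toy example with hard-coded "test" outcomes; in practice significantly_worse
    # would wrap an actual pairwise statistical test on the two error samples.
    names = ["nearest-mean", "naive-Bayes", "decision-tree", "linear-disc", "k-NN"]
    prior = {n: i for i, n in enumerate(names)}           # simpler models preferred
    beats = {("nearest-mean", "decision-tree"), ("naive-Bayes", "k-NN")}
    worse = lambda a, b: (a, b) in beats
    print(order_algorithms(names, prior, worse))
```

The key design point the sketch shares with the paper's setting is that the prior preference is used only when the data cannot distinguish two algorithms; whenever a pairwise test is significant, the data overrides the prior.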
INDEX TERMS
Machine learning, classifier design and evaluation, experimental design.
CITATION
Olcay Taner Yildiz, "Ordering and Finding the Best of K > 2 Supervised Learning Algorithms," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 28, no. 3, pp. 392-402, March 2006, doi: 10.1109/TPAMI.2006.61.