Issue No. 6, June 1995 (vol. 17)
pp. 599-608
ABSTRACT
In this paper we propose a theoretical model for analysis of classification methods, in which the teacher knows the classification algorithm and chooses examples in the best way possible. We apply this model using the nearest-neighbor learning algorithm, and develop upper and lower bounds on sample complexity for several different concept classes. For some concept classes, the sample complexity turns out to be exponential even using this best-case model, which implies that the concept class is inherently difficult for the NN algorithm. We identify several geometric properties that make learning certain concepts relatively easy. Finally we discuss the relation of our work to helpful teacher models, its application to decision tree learning algorithms, and some of its implications for current experimental work.
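For context, the nearest-neighbor (NN) rule studied in the paper classifies a query point with the label of its closest stored example. The sketch below is illustrative only (it is not code from the paper): a hypothetical two-point sample for a one-dimensional threshold concept at x = 0.5 shows how a knowledgeable teacher can choose very few examples so that the NN rule induces the correct decision boundary everywhere.

```python
import math

def nearest_neighbor_classify(sample, query):
    """Return the label of the stored example closest to `query`
    under Euclidean distance (ties broken by storage order)."""
    def dist(p, q):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))
    _, nearest_label = min(sample, key=lambda ex: dist(ex[0], query))
    return nearest_label

# Hypothetical teacher-chosen sample for a threshold concept at x = 0.5:
# two examples straddling the boundary suffice, since the induced NN
# decision boundary is the midpoint between them.
sample = [((0.4,), 0), ((0.6,), 1)]
print(nearest_neighbor_classify(sample, (0.1,)))  # -> 0
print(nearest_neighbor_classify(sample, (0.9,)))  # -> 1
```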
INDEX TERMS
Machine learning, nearest-neighbor, geometric concepts.
CITATION
Steven Salzberg, Arthur L. Delcher, David Heath, Simon Kasif, "Best-Case Results for Nearest-Neighbor Learning", IEEE Transactions on Pattern Analysis & Machine Intelligence, vol. 17, no. 6, pp. 599-608, June 1995, doi:10.1109/34.387506
