Issue No. 01 - January (1997 vol. 19)
DOI Bookmark: http://doi.ieeecomputersociety.org/10.1109/34.566814
<p><b>Abstract</b>—A bootstrap technique for nearest neighbor classifier design is proposed. Our primary interest is in designing classifiers for small training sample size situations. Conventional bootstrapping techniques sample the training set with replacement. In contrast, our technique generates bootstrap samples by locally combining original training samples. The nearest neighbor classifier is designed on the bootstrap samples and tested on test samples independent of the training samples. The performance of the proposed classifier is demonstrated on three artificial data sets and one real data set. Experimental results show that the nearest neighbor classifier designed on the bootstrap samples outperforms conventional <it>k</it>-NN classifiers as well as edited 1-NN classifiers, particularly in high dimensions.</p>
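The abstract does not spell out how the bootstrap samples are "locally combined," so the sketch below is only one plausible reading, not the paper's exact procedure: each training point is replaced by a random convex combination of itself and its r nearest neighbors from the same class, and a plain 1-NN rule is then designed on these smoothed samples. The function names, the choice of r, and the random-weight scheme are all illustrative assumptions.

```python
import numpy as np

def bootstrap_samples(X, y, r=3, rng=None):
    """Generate one bootstrap sample per training point by locally combining
    it with its r nearest same-class neighbors (illustrative reading of
    'locally combining original training samples'; not the paper's exact rule)."""
    rng = np.random.default_rng(rng)
    X = np.asarray(X, dtype=float)
    y = np.asarray(y)
    Xb = np.empty_like(X)
    for i, x in enumerate(X):
        same = np.flatnonzero(y == y[i])              # same-class indices
        d = np.linalg.norm(X[same] - x, axis=1)       # distances within class
        neighbors = same[np.argsort(d)][: r + 1]      # x itself plus r neighbors
        w = rng.random(len(neighbors))
        w /= w.sum()                                  # random convex weights
        Xb[i] = w @ X[neighbors]                      # local weighted average
    return Xb, y.copy()

def nn_classify(Xtrain, ytrain, Xtest):
    """Plain 1-NN decision rule on the (bootstrap) design set."""
    Xtrain = np.asarray(Xtrain, dtype=float)
    preds = [ytrain[np.argmin(np.linalg.norm(Xtrain - x, axis=1))]
             for x in np.asarray(Xtest, dtype=float)]
    return np.array(preds)
```

A typical use would be to build `Xb, yb = bootstrap_samples(X, y)` from a small training set and then call `nn_classify(Xb, yb, Xtest)`; because each bootstrap point is a convex combination of same-class neighbors, outliers are smoothed toward their local class region.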
<b>Index Terms</b>—Bootstrap, nearest neighbor classifier, error rate, peaking phenomenon, small training sample size, high dimensions, outlier.
Y. Hamamoto, S. Uchimura, and S. Tomita, "A Bootstrap Technique for Nearest Neighbor Classifier Design," in IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 19, no. 1, pp. 73-79, 1997.