Issue No. 04 - April 1999 (Vol. 21)
pp. 380-384
ABSTRACT
<p><b>Abstract</b>—A local distance measure for the nearest neighbor classification rule is shown to achieve high compression rates and high accuracy on real data sets. In the approach proposed here, a set of prototypes is first extracted during training, and a feedback learning algorithm is then used to optimize the metric. Even if the prototypes are randomly selected, the proposed metric outperforms, in both compression rate and accuracy, common editing procedures like ICA, RNN, and PNN. Finally, when accuracy is the major concern, we show how compression can be traded for accuracy by exploiting voting techniques. This indicates that voting can be successfully integrated with instance-based approaches, overcoming previous negative results.</p>
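<p>To make the idea concrete, the following is a minimal illustrative sketch (not the authors' exact algorithm): 1-NN classification over a small prototype set, where each prototype carries its own per-feature weights, giving a simple form of local metric. All names (<code>local_distance</code>, <code>classify</code>) and the toy data are hypothetical.</p>

```python
import numpy as np

def local_distance(x, proto, weights):
    # Weighted Euclidean distance between sample x and one prototype;
    # each prototype has its own weight vector, i.e. a "local" metric.
    return np.sqrt(np.sum(weights * (x - proto) ** 2))

def classify(x, prototypes, proto_labels, proto_weights):
    # Assign x the label of the prototype nearest under that
    # prototype's own local metric.
    dists = [local_distance(x, p, w)
             for p, w in zip(prototypes, proto_weights)]
    return proto_labels[int(np.argmin(dists))]

# Toy prototype set: two prototypes, each with its own feature weighting.
prototypes = np.array([[0.0, 0.0], [1.0, 1.0]])
proto_labels = ["a", "b"]
proto_weights = np.array([[1.0, 1.0], [2.0, 0.5]])

print(classify(np.array([0.2, 0.1]), prototypes, proto_labels, proto_weights))
```

<p>In the paper's setting, the weight vectors would be tuned by the feedback learning algorithm rather than fixed by hand; the sketch only shows how a per-prototype metric changes the nearest-neighbor decision.</p>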
INDEX TERMS
Nearest neighbor, data compression, machine learning, local metric, multiple models, case-based reasoning.
CITATION
Francesco Ricci, Paolo Avesani, "Data Compression and Local Metrics for Nearest Neighbor Classification", IEEE Transactions on Pattern Analysis & Machine Intelligence, vol.21, no. 4, pp. 380-384, April 1999, doi:10.1109/34.761268
