ABSTRACT
Nearest neighbor classification expects the class conditional probabilities to be locally constant, and suffers from bias in high dimensions. We propose a locally adaptive form of nearest neighbor classification to ameliorate this curse of dimensionality. We use a local linear discriminant analysis to estimate an effective metric for computing neighborhoods. We determine the local decision boundaries from centroid information, then shrink neighborhoods in directions orthogonal to these local decision boundaries and elongate them parallel to the boundaries. Thereafter, any neighborhood-based classifier can be employed using the modified neighborhoods, in which the posterior probabilities tend to be more homogeneous. We also propose a method for global dimension reduction that combines local dimension information. In a number of examples, the methods demonstrate the potential for substantial improvements over nearest neighbor classification.
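The local metric construction summarized above can be sketched in a few lines of NumPy. The following is a minimal illustration under stated assumptions, not the authors' implementation: at a query point, a Euclidean neighborhood supplies local between-class (B) and within-class (W) covariances, the metric is formed as W^{-1/2}(W^{-1/2} B W^{-1/2} + eps*I)W^{-1/2} following the paper's construction, and a k-NN vote is taken under that metric. The function name dann_predict, the parameter defaults (n_local, k, eps), and the small ridge added before inverting W are all illustrative choices, not taken from the paper.

```python
import numpy as np

def dann_predict(X, y, x0, n_local=50, k=5, eps=1.0):
    """Classify query x0 with a discriminant adaptive nearest neighbor rule.

    A sketch of the paper's idea: estimate a local metric from the
    between/within-class covariances of the n_local Euclidean neighbors,
    then run k-NN under that metric.
    """
    p = X.shape[1]

    # 1. Initial Euclidean neighborhood around the query point.
    d = np.linalg.norm(X - x0, axis=1)
    idx = np.argsort(d)[:n_local]
    Xl, yl = X[idx], y[idx]

    # 2. Between-class (B) and within-class (W) covariances, weighted by
    #    local class proportions.
    mean = Xl.mean(axis=0)
    B = np.zeros((p, p))
    W = np.zeros((p, p))
    for c in np.unique(yl):
        Xc = Xl[yl == c]
        mc = Xc.mean(axis=0)
        pi_c = len(Xc) / len(Xl)
        B += pi_c * np.outer(mc - mean, mc - mean)
        W += pi_c * np.cov(Xc, rowvar=False, bias=True)

    # 3. DANN metric: Sigma = W^{-1/2} (W^{-1/2} B W^{-1/2} + eps*I) W^{-1/2}.
    #    The eps*I term keeps the neighborhood from collapsing entirely onto
    #    the local decision boundary; the small ridge on W is an assumed
    #    numerical safeguard for near-singular local covariances.
    evals, evecs = np.linalg.eigh(W + 1e-8 * np.eye(p))
    W_inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    B_star = W_inv_sqrt @ B @ W_inv_sqrt
    Sigma = W_inv_sqrt @ (B_star + eps * np.eye(p)) @ W_inv_sqrt

    # 4. k-NN majority vote under the adapted metric:
    #    D(x, x0) = (x - x0)^T Sigma (x - x0).
    diff = X - x0
    d2 = np.einsum('ij,jk,ik->i', diff, Sigma, diff)
    nearest = np.argsort(d2)[:k]
    vals, counts = np.unique(y[nearest], return_counts=True)
    return vals[np.argmax(counts)]
```

The effect of eps here is to round the neighborhood rather than let it degenerate into an infinitely thin strip along the boundary: directions with high between-class variation are shrunk, while directions parallel to the boundary are left relatively elongated.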
INDEX TERMS
Classification, nearest neighbors, linear discriminant analysis, curse of dimensionality.
CITATION
Trevor Hastie and Robert Tibshirani, "Discriminant Adaptive Nearest Neighbor Classification," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 18, no. 6, pp. 607-616, June 1996, doi: 10.1109/34.506411.