Issue No. 9 - September 2002 (vol. 24)
pp. 1281-1285
ABSTRACT
<p><b>Abstract</b>—Nearest-neighbor classification assumes locally constant class conditional probabilities. This assumption becomes invalid in high dimensions with finite samples due to the curse of dimensionality, and severe bias can be introduced when using the nearest-neighbor rule under these conditions. We propose a locally adaptive nearest-neighbor classification method that aims to minimize this bias. We use a <it>Chi-squared</it> distance analysis to compute a flexible metric for producing neighborhoods that are highly adaptive to query locations. Neighborhoods are elongated along less relevant feature dimensions and constricted along the most influential ones. As a result, the class conditional probabilities are smoother in the modified neighborhoods, whereby better classification performance can be achieved. The efficacy of our method is validated and compared against other techniques using both simulated and real-world data.</p>
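The adaptive-metric idea described in the abstract (estimate local feature relevance at the query, then weight distances so that neighborhoods stretch along irrelevant dimensions and shrink along relevant ones) can be sketched in Python. The relevance score below is a simple Fisher-style variance ratio used as a hypothetical stand-in for the paper's Chi-squared distance analysis, and the pilot neighborhood size `k0` is likewise an assumption, not a parameter from the paper.

```python
import numpy as np

def relevance_weights(X_train, y_train, query, k0=10):
    # Hypothetical stand-in for the paper's Chi-squared analysis:
    # within a pilot neighborhood of the query, score each feature by a
    # Fisher-style ratio of between-class variance to total variance,
    # then normalize so the weights sum to the number of features.
    idx = np.argsort(np.abs(X_train - query).sum(axis=1))[:k0]
    Xn, yn = X_train[idx], y_train[idx]
    var = Xn.var(axis=0) + 1e-12  # guard against zero-variance features
    scores = np.zeros(X_train.shape[1])
    for c in np.unique(yn):
        p = np.mean(yn == c)
        scores += p * (Xn[yn == c].mean(axis=0) - Xn.mean(axis=0)) ** 2
    scores = scores / var + 1e-12
    return scores * len(scores) / scores.sum()

def weighted_knn_predict(X_train, y_train, query, weights, k=3):
    # k-NN under a per-query diagonal metric: high-weight (relevant)
    # features constrict the neighborhood, low-weight features elongate it.
    d2 = ((X_train - query) ** 2 * weights).sum(axis=1)
    nn = np.argsort(d2)[:k]
    labels, counts = np.unique(y_train[nn], return_counts=True)
    return labels[np.argmax(counts)]
```

On data where one feature separates the classes and another is noise, the relevance weights concentrate on the informative feature, so the weighted neighborhood effectively ignores the noisy dimension.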
INDEX TERMS
Chi-squared distance, classification, feature relevance, nearest neighbors.
CITATION
Carlotta Domeniconi, Jing Peng, Dimitrios Gunopulos, "Locally Adaptive Metric Nearest-Neighbor Classification", IEEE Transactions on Pattern Analysis & Machine Intelligence, vol. 24, no. 9, pp. 1281-1285, September 2002, doi:10.1109/TPAMI.2002.1033219
