Issue No. 05, May 2004 (vol. 26)
pp. 656-661
ABSTRACT
Nearest neighbor classification assumes locally constant class conditional probabilities. This assumption becomes invalid in high dimensions due to the curse of dimensionality, and severe bias can be introduced when the nearest neighbor rule is applied under these conditions. We propose an adaptive nearest neighbor classification method that aims to minimize this bias. We use quasiconformal transformed kernels to compute neighborhoods over which the class probabilities tend to be more homogeneous, so that better classification performance can be expected. The efficacy of our method is validated and compared against other competing techniques on a variety of data sets.
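The abstract only outlines the approach at a high level. As a minimal sketch (not the authors' implementation), the following assumes an RBF base kernel K, a user-supplied positive weighting function c(x) for the quasiconformal transform K~(x, y) = c(x) c(y) K(x, y), and majority voting over the distance induced by the transformed kernel; the names `c`, `gamma`, and `aqk_knn_predict` are illustrative, and the paper's actual construction of c(x) may differ.

```python
import numpy as np
from collections import Counter

def rbf_kernel(x, y, gamma=1.0):
    """Base RBF kernel K(x, y) (an assumed choice, not mandated by the paper)."""
    return np.exp(-gamma * np.sum((x - y) ** 2))

def quasiconformal_kernel(x, y, c, gamma=1.0):
    """Quasiconformal transform K~(x, y) = c(x) * c(y) * K(x, y),
    where c is a positive weighting function supplied by the caller."""
    return c(x) * c(y) * rbf_kernel(x, y, gamma)

def kernel_distance_sq(x, y, c, gamma=1.0):
    """Squared feature-space distance induced by the transformed kernel:
    d^2(x, y) = K~(x, x) - 2 K~(x, y) + K~(y, y)."""
    return (quasiconformal_kernel(x, x, c, gamma)
            - 2.0 * quasiconformal_kernel(x, y, c, gamma)
            + quasiconformal_kernel(y, y, c, gamma))

def aqk_knn_predict(query, X_train, y_train, c, k=5, gamma=1.0):
    """Classify `query` by majority vote among the k training points
    nearest under the quasiconformal kernel distance."""
    dists = [kernel_distance_sq(query, x, c, gamma) for x in X_train]
    nearest = np.argsort(dists)[:k]
    votes = Counter(y_train[i] for i in nearest)
    return votes.most_common(1)[0][0]
```

In this sketch, choosing c(x) to down-weight regions where the class posterior changes rapidly stretches distances there, which is the intuition behind making class probabilities more homogeneous within the resulting neighborhoods.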
INDEX TERMS
Classification, nearest neighbors, quasiconformal mapping, kernel methods, feature space.
CITATION
Jing Peng, Douglas R. Heisterkamp, and H. K. Dai, "Adaptive Quasiconformal Kernel Nearest Neighbor Classification," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 26, no. 5, pp. 656-661, May 2004, doi:10.1109/TPAMI.2004.1273978