<p><b>Abstract</b>—Nearest neighbor classification assumes locally constant class conditional probabilities. This assumption becomes invalid in high dimensions because of the curse of dimensionality, and severe bias can result when the nearest neighbor rule is applied under these conditions. We propose an adaptive nearest neighbor classification method that aims to minimize this bias. Quasiconformally transformed kernels are used to compute neighborhoods over which the class conditional probabilities tend to be more homogeneous; as a result, better classification performance can be expected. The efficacy of our method is validated and compared against competing techniques on a variety of data sets.</p>
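The abstract describes computing nearest neighbors under a quasiconformally transformed kernel. As a rough illustration only, the sketch below uses the standard quasiconformal scaling of a kernel, K̃(x, y) = c(x)·c(y)·K(x, y), and the induced feature-space distance d²(x, y) = K̃(x, x) − 2K̃(x, y) + K̃(y, y) to rank neighbors. The RBF base kernel, the function names, and the choice of scaling factors c(·) are illustrative assumptions; the paper's specific adaptive construction of c(·) is not reproduced here.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Gaussian RBF kernel matrix: K[i, j] = exp(-gamma * ||x_i - y_j||^2)."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def quasiconformal_knn_predict(X_train, y_train, X_test,
                               c_train, c_test, k=3, gamma=1.0):
    """k-NN using the distance induced by a quasiconformally scaled kernel
    K~(x, y) = c(x) * c(y) * K(x, y).  Here c_train / c_test are given
    scaling factors; the paper's adaptive choice of c(.) is not shown."""
    K_cross = rbf_kernel(X_test, X_train, gamma)
    # For the RBF kernel, K(x, x) = 1, so K~(x, x) = c(x)^2.
    k_xx = c_test ** 2
    k_yy = c_train ** 2
    # Induced squared distance: K~(x,x) - 2 K~(x,y) + K~(y,y).
    d2 = (k_xx[:, None]
          - 2.0 * c_test[:, None] * c_train[None, :] * K_cross
          + k_yy[None, :])
    # Majority vote over the k nearest training points in kernel distance.
    nn = np.argsort(d2, axis=1)[:, :k]
    preds = []
    for row in nn:
        labels, counts = np.unique(y_train[row], return_counts=True)
        preds.append(labels[np.argmax(counts)])
    return np.array(preds)
```

With all scaling factors set to 1, the transformed kernel reduces to the base kernel and the rule reduces to ordinary kernel nearest neighbor classification; nonuniform factors stretch or shrink distances locally, which is the mechanism the abstract relies on to make neighborhoods more class-homogeneous.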
<p><b>Index Terms</b>—Classification, nearest neighbors, quasiconformal mapping, kernel methods, feature space.</p>
Jing Peng, H.K. Dai, Douglas R. Heisterkamp, "Adaptive Quasiconformal Kernel Nearest Neighbor Classification", IEEE Transactions on Pattern Analysis & Machine Intelligence, vol. 26, no. 5, pp. 656-661, May 2004, doi:10.1109/TPAMI.2004.1273978