Adaptive Quasiconformal Kernel Nearest Neighbor Classification
May 2004 (vol. 26 no. 5)
pp. 656-661

Abstract—Nearest neighbor classification assumes locally constant class conditional probabilities. This assumption becomes invalid in high dimensions because of the curse of dimensionality, and severe bias can be introduced when the nearest neighbor rule is applied under these conditions. We propose an adaptive nearest neighbor classification method that aims to minimize this bias. The method uses quasiconformal transformed kernels to compute neighborhoods over which the class conditional probabilities tend to be more homogeneous; as a result, better classification performance can be expected. We validate the efficacy of our method and compare it against competing techniques on a variety of data sets.
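The idea sketched in the abstract can be illustrated with a small example. A base kernel K is quasiconformally transformed as K̃(x, y) = c(x)·c(y)·K(x, y) for a positive weighting function c(·), and nearest neighbors are then found using the distance that K̃ induces in feature space. The sketch below is an assumption-laden illustration, not the paper's exact formulation: the RBF base kernel, the particular choice of c(·), and the helper names (`quasiconformal_kernel`, `knn_predict`) are all hypothetical.

```python
import numpy as np

def rbf_kernel(x, y, gamma=1.0):
    # Base kernel K(x, y); an RBF kernel is used here for illustration.
    return np.exp(-gamma * np.sum((x - y) ** 2))

def quasiconformal_kernel(x, y, c, gamma=1.0):
    # Quasiconformal transformation of the base kernel:
    #   K~(x, y) = c(x) * c(y) * K(x, y)
    # where c(.) is a positive weighting function (illustrative choice below).
    return c(x) * c(y) * rbf_kernel(x, y, gamma)

def kernel_distance(x, y, c, gamma=1.0):
    # Squared distance induced by K~ in the implied feature space:
    #   d^2(x, y) = K~(x, x) - 2 K~(x, y) + K~(y, y)
    return (quasiconformal_kernel(x, x, c, gamma)
            - 2.0 * quasiconformal_kernel(x, y, c, gamma)
            + quasiconformal_kernel(y, y, c, gamma))

def knn_predict(query, X, labels, c, k=3, gamma=1.0):
    # Majority vote among the k nearest training points under the
    # quasiconformal kernel distance.
    d = np.array([kernel_distance(query, xi, c, gamma) for xi in X])
    nearest = np.argsort(d)[:k]
    vals, counts = np.unique(labels[nearest], return_counts=True)
    return vals[np.argmax(counts)]

# Toy data: two well-separated classes. The weighting c(.) here is a
# hypothetical choice that grows near a presumed boundary point x0.
x0 = np.array([0.0, 0.0])
c = lambda x: 1.0 + np.exp(-np.sum((x - x0) ** 2))

X = np.array([[1.0, 1.0], [1.2, 0.9], [-1.0, -1.0], [-0.9, -1.1]])
labels = np.array([0, 0, 1, 1])
print(knn_predict(np.array([1.1, 1.0]), X, labels, c, k=3))  # -> 0
```

In the paper's setting, c(·) is chosen adaptively so that neighborhoods computed under K̃ are more class-homogeneous; the fixed c(·) above only demonstrates the mechanics of the transformation and the induced distance.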

References
[1] S. Amari and S. Wu, "Improving Support Vector Machine Classifiers by Modifying Kernel Functions," Neural Networks, vol. 12, no. 6, pp. 783-789, 1999.
[2] G.D. Anderson, M.K. Vamanamurthy, and M.K. Vuorinen, Conformal Invariants, Inequalities, and Quasiconformal Maps, Canadian Math. Soc. Series of Monographs and Advanced Texts. New York: John Wiley and Sons, Inc., 1997.
[3] R.E. Bellman, Adaptive Control Processes. Princeton Univ. Press, 1961.
[4] D.E. Blair, Inversion Theory and Conformal Mapping. Am. Math. Soc., 2000.
[5] N. Cristianini and J. Shawe-Taylor, An Introduction to Support Vector Machines and Other Kernel-Based Learning Methods. Cambridge, UK: Cambridge Univ. Press, 2000.
[6] C. Domeniconi, J. Peng, and D. Gunopulos, "Locally Adaptive Metric Nearest Neighbor Classification," IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 24, no. 9, pp. 1281-1285, Sept. 2002.
[7] R.O. Duda and P.E. Hart, Pattern Classification and Scene Analysis. John Wiley and Sons, Inc., 1973.
[8] J.H. Friedman, "Flexible Metric Nearest Neighbor Classification," technical report, Dept. of Statistics, Stanford Univ., 1994.
[9] T. Hastie and R. Tibshirani, "Discriminant Adaptive Nearest Neighbor Classification," IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 18, no. 6, pp. 607-615, June 1996.
[10] D. Heisterkamp, J. Peng, and H.K. Dai, "An Adaptive Quasiconformal Kernel Metric for Image Retrieval," Proc. IEEE Computer Soc. Conf. Computer Vision and Pattern Recognition, pp. 388-393, 2001.
[11] D.G. Lowe, "Similarity Metric Learning for a Variable-Kernel Classifier," Neural Computation, vol. 7, no. 1, pp. 72-85, 1995.
[12] J.P. Myles and D.J. Hand, "The Multi-Class Metric Problem in Nearest-Neighbor Discrimination Rules," Pattern Recognition, vol. 23, pp. 1291-1297, 1990.
[13] J. Peng, D. Heisterkamp, and H.K. Dai, "LDA/SVM Driven Nearest Neighbor Classification," Proc. IEEE Computer Soc. Conf. Computer Vision and Pattern Recognition, pp. 58-63, 2001.
[14] B. Schölkopf et al., "Input Space versus Feature Space in Kernel-Based Methods," IEEE Trans. Neural Networks, vol. 10, no. 5, pp. 1000-1017, Sept. 1999.
[15] B. Schölkopf, "The Kernel Trick for Distances," Advances in Neural Information Processing Systems, T.K. Leen, T.G. Dietterich, and V. Tresp, eds., vol. 13, pp. 301-307, MIT Press, 2001.
[16] Advances in Kernel Methods: Support Vector Learning, B. Schölkopf, C.J.C. Burges, and A.J. Smola, eds. Cambridge, Mass.: MIT Press, 1999.
[17] R. Short and K. Fukunaga, "The Optimal Distance Measure for Nearest Neighbor Classification," IEEE Trans. Information Theory, vol. 27, pp. 622-627, 1981.
[18] S. Tong and D. Koller, "Restricted Bayes Optimal Classifiers," Proc. AAAI, 2000.
[19] V.N. Vapnik, Statistical Learning Theory, Adaptive and Learning Systems for Signal Processing, Communications, and Control. New York: Wiley, 1998.

Index Terms:
Classification, nearest neighbors, quasiconformal mapping, kernel methods, feature space.
Jing Peng, Douglas R. Heisterkamp, H.K. Dai, "Adaptive Quasiconformal Kernel Nearest Neighbor Classification," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 26, no. 5, pp. 656-661, May 2004, doi:10.1109/TPAMI.2004.1273978