The Community for Technology Leaders
2009 Ninth IEEE International Conference on Data Mining (ICDM 2009)
Miami, Florida
Dec. 6, 2009 to Dec. 9, 2009
ISSN: 1550-4786
ISBN: 978-0-7695-3895-2
pp: 357-366
ABSTRACT
kNN is one of the most popular data mining methods for classification, but it often performs poorly when the distance metric is chosen inappropriately or when many class-irrelevant features are present. Linear feature transformation methods have been widely applied to extract class-relevant information and improve kNN classification, but their expressive power is limited in many applications. Kernels have also been used to learn powerful non-linear feature transformations, but kernel methods fail to scale to large datasets. In this paper, we present a scalable non-linear feature mapping method based on a deep neural network pretrained with Restricted Boltzmann Machines for improving kNN classification in a large-margin framework, which we call DNet-kNN. DNet-kNN can be used both for classification and for supervised dimensionality reduction. Experimental results on two benchmark handwritten digit datasets and one newsgroup text dataset show that DNet-kNN performs much better than large-margin kNN with a linear mapping and than kNN based on a deep autoencoder pretrained with Restricted Boltzmann Machines.
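To illustrate the core idea, the following is a minimal sketch (not the paper's method) of kNN classification applied in a feature space produced by a non-linear mapping `f`. In DNet-kNN, `f` would be a deep network pretrained with RBMs and fine-tuned with a large-margin objective; here `f` is a hypothetical hand-picked non-linear map, chosen only to show that kNN accuracy depends on the feature space it operates in.

```python
import math
from collections import Counter

def f(x):
    # Hypothetical toy non-linear feature mapping (stand-in for the deep net):
    # maps a 2-D point to its squared radius, which separates concentric rings.
    return [x[0] * x[0] + x[1] * x[1]]

def knn_predict(train, query, k=3):
    # train: list of (feature_vector, label); classify query by majority vote
    # among the k nearest neighbors under squared Euclidean distance.
    nearest = sorted(train,
                     key=lambda p: sum((a - b) ** 2 for a, b in zip(p[0], query)))
    votes = Counter(label for _, label in nearest[:k])
    return votes.most_common(1)[0][0]

# Two concentric "rings": inner ring is class 0, outer ring is class 1.
# In the raw 2-D space these classes are not linearly separable.
raw = [((math.cos(t), math.sin(t)), 0) for t in (0.0, 1.0, 2.0, 3.0, 4.0, 5.0)] \
    + [((3 * math.cos(t), 3 * math.sin(t)), 1) for t in (0.5, 1.5, 2.5, 3.5, 4.5, 5.5)]

# Apply the non-linear mapping, then run kNN in the mapped space.
mapped = [(f(x), y) for x, y in raw]
pred_inner = knn_predict(mapped, f((0.2, -0.1)))  # near the origin
pred_outer = knn_predict(mapped, f((2.9, 0.3)))   # near the outer ring
```

In the mapped one-dimensional space the two rings collapse to well-separated clusters, so plain kNN classifies both queries correctly; a linear mapping of the raw coordinates could not achieve this, which is the motivation for learning non-linear mappings.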
INDEX TERMS
kNN Classification, Large Margin, Non-linear Feature Mapping, Non-linear Dimensionality Reduction, Deep Neural Networks, RBM, Deep Learning
CITATION
Zhaolei Zhang, David A. Stanley, Anthony Bonner, Renqiang Min, Zineng Yuan, "A Deep Non-linear Feature Mapping for Large-Margin kNN Classification", 2009 Ninth IEEE International Conference on Data Mining, pp. 357-366, 2009, doi:10.1109/ICDM.2009.27