2012 IEEE 12th International Conference on Data Mining Workshops
Nonlinear Unsupervised Feature Learning: How Local Similarities Lead to Global Coding
Brussels, Belgium
December 10, 2012
ISBN: 978-1-4673-5164-5
This paper introduces a novel coding scheme based on the diffusion map framework. The idea is to run a t-step random walk on the data graph to capture the similarity of a data point to the codebook atoms. In this way, local similarities extracted from the data structure are combined into a global similarity that accounts for the nonlinear structure of the data. Unlike locality-based and sparse coding methods, the proposed coding varies smoothly with respect to the underlying manifold. We extend this transductive approach to an inductive variant, which is of great interest for large-scale datasets. We also present a method for codebook generation by coarse-graining the data graph so as to preserve random walks. Experiments on synthetic and real datasets demonstrate the superiority of the proposed coding scheme over state-of-the-art coding techniques, especially in a semi-supervised setting where the number of labeled data points is small.
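The core mechanism described above, propagating local Gaussian affinities through a t-step random walk to obtain a global, manifold-aware similarity to codebook atoms, can be sketched as follows. This is a minimal illustrative implementation assuming a Gaussian kernel, a simple row-normalized transition matrix, and codebook atoms chosen as a subset of the data points; the paper's exact construction may differ.

```python
import numpy as np

def diffusion_coding(X, atom_idx, t=8, sigma=1.0):
    """Illustrative t-step random-walk (diffusion) coding sketch.

    X:        (n, d) array of data points.
    atom_idx: indices of the data points used as codebook atoms (assumption:
              atoms are drawn from the data itself).
    Returns an (n, k) code: the probability that a t-step random walk started
    at each point lands on each atom.
    """
    # Local similarity: Gaussian affinity between all pairs of points.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    W = np.exp(-d2 / (2.0 * sigma ** 2))
    # Row-normalize to get a Markov transition matrix over the data graph.
    P = W / W.sum(axis=1, keepdims=True)
    # t-step transition probabilities: local steps compose into a global,
    # manifold-aware similarity.
    Pt = np.linalg.matrix_power(P, t)
    # Code each point by its t-step reachability of the codebook atoms.
    return Pt[:, atom_idx]
```

Because the code is built from powers of a smooth transition kernel, nearby points on the same manifold branch receive similar codes, which is the smoothness property contrasted with locality-based and sparse coding in the abstract.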
Index Terms:
Encoding, Kernel, Manifolds, Spirals, Dictionaries, Clustering algorithms, Equations, manifold, coding, diffusion map, coarse graining
Citation:
Amirreza Shaban, Hamid R. Rabiee, Marzieh S. Tahaei, Erfan Salavati, "Nonlinear Unsupervised Feature Learning: How Local Similarities Lead to Global Coding," icdmw, pp.506-513, 2012 IEEE 12th International Conference on Data Mining Workshops, 2012