Minimum Cross-Entropy Pattern Classification and Cluster Analysis
January 1982 (vol. 4 no. 1)
pp. 11-17
John E. Shore, SENIOR MEMBER, IEEE, Information Technology Division, Naval Research Laboratory, Washington, DC 20375.
Robert M. Gray, FELLOW, IEEE, Department of Electrical Engineering, Stanford University, Stanford, CA 94305.
Abstract:
This paper considers the problem of classifying an input vector of measurements by a nearest neighbor rule applied to a fixed set of vectors. The fixed vectors are sometimes called characteristic feature vectors, codewords, cluster centers, models, reproductions, etc. The nearest neighbor rule considered uses a non-Euclidean, information-theoretic distortion measure that is not a metric, but that nevertheless leads to a classification method that is optimal in a well-defined sense and is also computationally attractive. Furthermore, the distortion measure yields a simple method of computing cluster centroids. Our approach is based on the minimization of cross-entropy (also called discrimination information, directed divergence, or K-L number), and can be viewed as a refinement of a general classification method due to Kullback. The refinement exploits special properties of cross-entropy that hold when the probability densities involved happen to be minimum cross-entropy densities. The approach is a generalization of a recently developed technique for speech coding by vector quantization.
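To make the distortion measure concrete, here is a minimal Python sketch of the nearest neighbor rule under the directed divergence D(q||p) for discrete probability vectors, together with the centroid computation the abstract mentions. This is an illustration under my own assumptions, not the authors' implementation; the function names (kl_divergence, classify, centroid) are hypothetical. For this distortion, the arithmetic mean of the cluster members minimizes the average D(q_j||p) over p, which is what makes centroid computation simple.

```python
import numpy as np

def kl_divergence(q, p, eps=1e-12):
    """Directed divergence D(q || p) = sum_k q_k log(q_k / p_k)
    for discrete distributions; eps guards against log(0)."""
    q = np.asarray(q, dtype=float)
    p = np.asarray(p, dtype=float)
    return float(np.sum(q * np.log((q + eps) / (p + eps))))

def classify(q, codewords):
    """Nearest neighbor rule: assign q to the codeword p_i
    that minimizes the distortion D(q || p_i)."""
    return int(np.argmin([kl_divergence(q, p) for p in codewords]))

def centroid(cluster):
    """Cluster centroid under the directed-divergence distortion:
    the arithmetic mean minimizes the average D(q_j || p)."""
    return np.mean(np.asarray(cluster, dtype=float), axis=0)

# Hypothetical example: two codewords (models) and an input distribution.
codewords = [np.array([0.7, 0.2, 0.1]), np.array([0.1, 0.3, 0.6])]
q = np.array([0.6, 0.3, 0.1])
print(classify(q, codewords))        # -> 0 (q is closer to the first codeword)
print(centroid([q, codewords[0]]))   # mean of the two distributions
```

Note that although D(q||p) is not symmetric and violates the triangle inequality, the argmin over codewords is still well defined, which is all the nearest neighbor rule requires.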
Citation:
John E. Shore, Robert M. Gray, "Minimum Cross-Entropy Pattern Classification and Cluster Analysis," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 4, no. 1, pp. 11-17, Jan. 1982, doi:10.1109/TPAMI.1982.4767189