Estimation of Mutual Information in Two-Class Pattern Recognition
April 1974 (vol. 23 no. 4)
pp. 410-420
G.A. Butler, Judson B. Branch Research Center, Allstate Insurance Company
Although mutual information (MI) has long been proposed as a measure of the dependence between the class variable and pattern recognition features, the practical problems of designing computer programs to use MI have been raised only recently. Within the two-class context, this paper compares two traditional approaches to the requisite entropy estimation (using the maximum likelihood and expected value estimators of the class probabilities) with a new estimator: the expected value of binomial entropy (E). The latter is shown to be superior where one class has a priori dominance. E is also related to the expected probability of error and, in a surprising result, E is shown to be a better estimator of the class probabilities than the maximum likelihood and expected value estimators over a wide range.
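The three estimators compared in the abstract can be made concrete. The Python sketch below is not the paper's implementation: it assumes a uniform prior on the class probability p, takes the maximum likelihood estimate as k/n and the expected value (Laplace) estimate as (k+1)/(n+2), and reads the expected binomial entropy E as the posterior mean of the binary entropy function, evaluated by midpoint quadrature. The function names and the step count are illustrative, and the paper's exact construction of E may differ.

import math

def h(p):
    """Binary entropy in bits; defined as 0 at the endpoints."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)

def p_ml(k, n):
    """Maximum likelihood estimate of the class probability: k/n."""
    return k / n

def p_ev(k, n):
    """Expected value (Laplace) estimate under a uniform prior: (k+1)/(n+2)."""
    return (k + 1) / (n + 2)

def entropy_expected_binomial(k, n, steps=10_000):
    """One plausible reading of E: the posterior mean of h(p) under a
    uniform prior, i.e. the integral of h(p) * Beta(p; k+1, n-k+1) dp,
    evaluated here by simple midpoint quadrature."""
    a, b = k + 1, n - k + 1
    log_norm = math.lgamma(a + b) - math.lgamma(a) - math.lgamma(b)
    total = 0.0
    for i in range(steps):
        p = (i + 0.5) / steps
        density = math.exp(log_norm + (a - 1) * math.log(p)
                           + (b - 1) * math.log(1.0 - p))
        total += h(p) * density
    return total / steps

if __name__ == "__main__":
    k, n = 2, 20  # a dominant class: 18 of 20 samples on one side
    print("ML plug-in entropy:", h(p_ml(k, n)))
    print("EV plug-in entropy:", h(p_ev(k, n)))
    print("Expected binomial entropy E:", entropy_expected_binomial(k, n))

With a dominant class (e.g. k = 2 of n = 20), the three estimates differ visibly, which is the regime in which the abstract reports E to be superior.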
Index Terms:
Binomial distribution, entropy, feature selection, information, mutual information, nonparametric classifier design, pattern recognition, two-class sampling.
Citation:
G.A. Butler, H.B. Ritea, "Estimation of Mutual Information in Two-Class Pattern Recognition," IEEE Transactions on Computers, vol. 23, no. 4, pp. 410-420, April 1974, doi:10.1109/T-C.1974.223956