Issue No. 4, April 1974 (vol. 23), pp. 410-420
G.A. Butler, Judson B. Branch Research Center, Allstate Insurance Company
ABSTRACT
Although mutual information (MI) has long been proposed as a measure of the dependence between the class variable and pattern recognition features, the practical problems of designing computer programs that use MI have been raised only recently. Within the two-class context, this paper compares two traditional approaches to the requisite entropy estimation (plugging in the maximum likelihood and expected value estimators of the class probabilities) with a new estimator: the expected value of the binomial entropy (E). The latter is shown to be superior where one class is dominant a priori. E is also related to the expected probability of error, and, in a surprising result, E is shown to be a better estimator of the class probabilities themselves than the maximum likelihood and expected value estimators over a wide range.
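
As a rough sketch (not the paper's code): assuming a uniform Beta(1, 1) prior on the class probability p, so that after observing k samples of one class out of n the posterior is Beta(k + 1, n - k + 1), the three entropy estimates the abstract compares can be computed as below. The closed form for E uses the standard identity E[-X ln X] = a/(a+b) * (psi(a+b+1) - psi(a+1)) for X ~ Beta(a, b), which for integer parameters reduces to harmonic-number differences. The function names, the uniform prior, and the choice of bits as the unit are assumptions for illustration, not taken from the paper.

import math

def binary_entropy(p: float) -> float:
    """Shannon entropy (in bits) of a two-class distribution (p, 1 - p)."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)

def ml_entropy(k: int, n: int) -> float:
    """Plug-in entropy using the maximum likelihood estimate p = k / n."""
    return binary_entropy(k / n)

def ev_entropy(k: int, n: int) -> float:
    """Plug-in entropy using the posterior-mean (expected value) estimate
    p = (k + 1) / (n + 2) under an assumed uniform prior."""
    return binary_entropy((k + 1) / (n + 2))

def expected_binomial_entropy(k: int, n: int) -> float:
    """Posterior expectation of the binary entropy, E[H(p) | k, n], under
    an assumed uniform Beta(1, 1) prior, so p | k, n ~ Beta(a, b) with
    a = k + 1 and b = n - k + 1.  For integer a, b the digamma identity
    reduces to sums of reciprocals:
        E[H(p)] = a/(a+b) * sum_{j=a+1}^{a+b} 1/j
                + b/(a+b) * sum_{j=b+1}^{a+b} 1/j    (in nats).
    """
    a, b = k + 1, n - k + 1
    nats = (a / (a + b)) * sum(1.0 / j for j in range(a + 1, a + b + 1)) \
         + (b / (a + b)) * sum(1.0 / j for j in range(b + 1, a + b + 1))
    return nats / math.log(2)  # convert nats to bits

if __name__ == "__main__":
    # Skewed small sample: 9 of 10 observations fall in the dominant class.
    k, n = 9, 10
    print(f"ML plug-in entropy:        {ml_entropy(k, n):.4f} bits")
    print(f"Posterior-mean plug-in:    {ev_entropy(k, n):.4f} bits")
    print(f"Expected binomial entropy: {expected_binomial_entropy(k, n):.4f} bits")

On this skewed sample the sketch gives roughly 0.47 bits for the maximum likelihood plug-in, 0.65 bits for the posterior-mean plug-in, and 0.59 bits for E, illustrating how the three estimators diverge precisely in the a priori dominance regime the abstract highlights.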
INDEX TERMS
Binomial distribution, entropy, feature selection, information, mutual information, nonparametric classifier design, pattern recognition, two-class sampling.
CITATION
G.A. Butler, H.B. Ritea, "Estimation of Mutual Information in Two-Class Pattern Recognition," IEEE Transactions on Computers, vol. 23, no. 4, pp. 410-420, April 1974, doi: 10.1109/T-C.1974.223956