On the Relationship Between Dependence Tree Classification Error and Bayes Error Rate
October 2007 (vol. 29, no. 10), pp. 1866-1868
Wong and Poon [1] showed that Chow and Liu's tree dependence approximation can be derived by minimizing an upper bound on the Bayes error rate. Wong and Poon's result was obtained by expanding the conditional entropy H(w|X). We derive the correct expansion of H(w|X) and present its implication.
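The two objects the abstract connects can be made concrete with a short sketch: Chow and Liu's approximation [2] is the maximum-weight spanning tree over the features, with edges weighted by pairwise mutual information and selected by Kruskal's algorithm [3], while Hellman and Raviv [4] bound the Bayes error rate by half the conditional entropy H(w|X). The following is a minimal illustrative Python sketch of both ideas over empirical discrete distributions; the function names and the toy data are ours, not from the paper.

```python
import itertools
import math
from collections import Counter


def mutual_information(xs, ys):
    """Empirical mutual information I(X;Y) in nats between two
    discrete sequences of equal length."""
    n = len(xs)
    pxy, px, py = Counter(zip(xs, ys)), Counter(xs), Counter(ys)
    return sum((c / n) * math.log(c * n / (px[x] * py[y]))
               for (x, y), c in pxy.items())


def chow_liu_tree(data):
    """Chow-Liu dependence tree [2]: a maximum-weight spanning tree
    over the feature indices, weighted by pairwise empirical mutual
    information and built with Kruskal's algorithm [3].
    `data` is a list of samples, each a tuple of discrete values."""
    d = len(data[0])
    cols = list(zip(*data))
    # All candidate edges, heaviest (most informative) first.
    edges = sorted(((mutual_information(cols[i], cols[j]), i, j)
                    for i, j in itertools.combinations(range(d), 2)),
                   reverse=True)
    parent = list(range(d))  # union-find forest for cycle detection

    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]  # path halving
            a = parent[a]
        return a

    tree = []
    for _, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:          # edge joins two components: keep it
            parent[ri] = rj
            tree.append((i, j))
    return tree


def hellman_raviv_bound(labels, features):
    """Hellman-Raviv upper bound on the Bayes error rate [4]:
    P_e <= H(w|X) / 2, with H(w|X) measured in bits."""
    n = len(labels)
    pwx, px = Counter(zip(labels, features)), Counter(features)
    h = -sum((c / n) * math.log2(c / px[x]) for (w, x), c in pwx.items())
    return h / 2


# Toy data: features 0 and 1 are perfectly dependent, feature 2 is
# independent, so the tree must contain the edge (0, 1).
samples = [(0, 0, 0), (0, 0, 1), (1, 1, 0), (1, 1, 1)]
print(chow_liu_tree(samples))
```

When the class label is a deterministic function of X, H(w|X) is zero and the bound collapses to zero; a completely uninformative binary feature gives H(w|X) = 1 bit and hence a bound of 0.5, matching the error of guessing.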

[1] S.K.M. Wong and F.C.S. Poon, “Comments on Approximating Discrete Probability Distributions with Dependence Trees,” IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 11, no. 3, pp. 333-335, Mar. 1989.
[2] C.K. Chow and C.N. Liu, “Approximating Discrete Probability Distributions with Dependence Trees,” IEEE Trans. Information Theory, vol. 14, pp. 462-467, May 1968.
[3] J.B. Kruskal Jr., “On the Shortest Spanning Subtree of a Graph and the Traveling Salesman Problem,” Proc. Am. Math. Soc., vol. 7, pp. 48-50, 1956.
[4] M.E. Hellman and J. Raviv, “Probability of Error, Equivocation, and the Chernoff Bound,” IEEE Trans. Information Theory, vol. 16, pp. 368-372, May 1970.
[5] T.M. Cover and J.A. Thomas, Elements of Information Theory. Wiley Interscience, 1991.
[6] H. Avi-Itzhak and T. Diep, “Arbitrarily Tight Upper and Lower Bounds on the Bayesian Probability of Error,” IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 18, no. 1, pp. 89-91, Jan. 1996.

Index Terms:
Bayes error rate, entropy, mutual information, classification, dependence tree approximation
Citation:
Kiran S. Balagani, Vir V. Phoha, "On the Relationship Between Dependence Tree Classification Error and Bayes Error Rate," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 29, no. 10, pp. 1866-1868, Oct. 2007, doi:10.1109/TPAMI.2007.1184