Issue No. 10 - October (2009 vol. 21)
ISSN: 1041-4347
pp: 1361-1371
Liangxiao Jiang , China University of Geosciences, Wuhan
Harry Zhang , University of New Brunswick, Fredericton
Zhihua Cai , China University of Geosciences, Wuhan
Because learning an optimal Bayesian network classifier is an NP-hard problem, learning improved naive Bayes models has attracted much attention from researchers. In this paper, we summarize the existing improved algorithms and propose a novel Bayes model: hidden naive Bayes (HNB). In HNB, a hidden parent is created for each attribute, combining the influences from all other attributes. We experimentally test HNB in terms of classification accuracy, using the 36 UCI data sets selected by Weka, and compare it to naive Bayes (NB), selective Bayesian classifiers (SBC), naive Bayes tree (NBTree), tree-augmented naive Bayes (TAN), and averaged one-dependence estimators (AODE). The experimental results show that HNB significantly outperforms NB, SBC, NBTree, TAN, and AODE. In many data mining applications, accurate class probability estimation and ranking are also desirable. We study the class probability estimation and ranking performance of naive Bayes and its improved models, such as SBC, NBTree, TAN, and AODE, measured by conditional log likelihood (CLL) and the area under the ROC curve (AUC), respectively, and then compare HNB to them in terms of CLL and AUC. Our experiments show that HNB also significantly outperforms all of them.
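The hidden-parent idea described above can be illustrated with a minimal sketch. The class name, smoothing constants, and exact weighting scheme below are assumptions for illustration, not the paper's reference implementation: each attribute's hidden parent is modeled as a weighted mixture of one-dependence estimators P(a_i | a_j, c), with mixture weights taken proportional to the conditional mutual information I(A_i; A_j | C) estimated from the training data, and Laplace smoothing throughout.

```python
import numpy as np

class HNB:
    """Illustrative hidden naive Bayes for integer-coded categorical data.

    Assumption-laden sketch: weights W[i, j] are normalized conditional
    mutual information I(A_i; A_j | C); probabilities use Laplace smoothing.
    """

    def fit(self, X, y):
        X, y = np.asarray(X), np.asarray(y)
        self.classes = np.unique(y)
        n, d = X.shape
        self.d = d
        self.vals = [np.unique(X[:, i]) for i in range(d)]
        # Laplace-smoothed class priors
        self.prior = {c: (np.sum(y == c) + 1.0) / (n + len(self.classes))
                      for c in self.classes}

        def p(i, vi, j, vj, c):
            # Laplace-smoothed one-dependence estimate P(A_i = vi | A_j = vj, C = c)
            mask = (y == c) & (X[:, j] == vj)
            num = np.sum(X[mask, i] == vi) + 1.0
            return num / (mask.sum() + len(self.vals[i]))

        self.p = p

        # Hidden-parent weights: W[i, j] proportional to I(A_i; A_j | C)
        W = np.zeros((d, d))
        for i in range(d):
            for j in range(d):
                if i == j:
                    continue
                cmi = 0.0
                for c in self.classes:
                    mask = y == c
                    pc = np.mean(mask)
                    for vi in self.vals[i]:
                        for vj in self.vals[j]:
                            pij = np.mean((X[mask, i] == vi) & (X[mask, j] == vj))
                            pi = np.mean(X[mask, i] == vi)
                            pj = np.mean(X[mask, j] == vj)
                            if pij > 0 and pi > 0 and pj > 0:
                                cmi += pc * pij * np.log(pij / (pi * pj))
                W[i, j] = cmi
        row = W.sum(axis=1, keepdims=True)
        # Fall back to uniform weights when all CMIs are zero
        self.W = np.where(row > 0, W / np.maximum(row, 1e-12),
                          1.0 / max(d - 1, 1))
        return self

    def predict(self, X):
        X = np.asarray(X)
        out = []
        for x in X:
            best, best_lp = None, -np.inf
            for c in self.classes:
                lp = np.log(self.prior[c])
                for i in range(self.d):
                    # Hidden parent of A_i: weighted mixture over all other attributes
                    mix = sum(self.W[i, j] * self.p(i, x[i], j, x[j], c)
                              for j in range(self.d) if j != i)
                    lp += np.log(max(mix, 1e-12))
                if lp > best_lp:
                    best, best_lp = c, lp
            out.append(best)
        return np.array(out)
```

Compared with TAN, which fixes a single tree-structured parent per attribute, the hidden parent here aggregates evidence from every other attribute at once, which is what the abstract credits for HNB's accuracy gains.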
Keywords: Naive Bayes, Bayesian network classifiers, learning algorithms, classification, class probability estimation, ranking.

L. Jiang, H. Zhang and Z. Cai, "A Novel Bayes Model: Hidden Naive Bayes," in IEEE Transactions on Knowledge & Data Engineering, vol. 21, no. 10, pp. 1361-1371, 2009.