Volume 6 - Issue 1
Class dependent feature scaling method via restrictive Bayesian network classifier combination
Abstract
In classifier combination, the relative values of the a posteriori probabilities assigned to different hypotheses matter more than the accuracy of their estimates. For this reason, the independence requirement in Naive Bayesian fusion should be examined from the combined-accuracy point of view. In this study, ANB is proposed to relax the independence assumptions of Naive Bayes while still permitting efficient inference. The classification results of ANB are compared with those of Naive Bayes classifiers and tree-augmented Naive Bayes classifiers. Experiments on UCI data sets show that ANB achieves better classification accuracy in some domains, while in the remaining domains its performance is similar.
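The independence assumption the abstract refers to can be illustrated with a minimal sketch of standard Naive Bayes fusion (this is the baseline the paper compares against, not the proposed ANB method): the class posterior is taken to be proportional to the prior times the product of per-feature likelihoods, so only the relative values of the scores decide the predicted class. All names and the toy data below are illustrative.

```python
# Minimal sketch of Naive Bayes fusion over categorical features.
# posterior(c) is scored as P(c) * prod_j P(x_j | c), i.e. features are
# assumed conditionally independent given the class -- the assumption
# that augmented variants relax.
from collections import Counter

def train_nb(X, y, alpha=1.0):
    """Estimate class priors and per-feature conditional probabilities
    with Laplace smoothing (alpha)."""
    classes = sorted(set(y))
    class_counts = Counter(y)
    priors = {c: class_counts[c] / len(y) for c in classes}
    n_features = len(X[0])
    # counts[c][j][v] = number of class-c samples with feature j equal to v
    counts = {c: [Counter() for _ in range(n_features)] for c in classes}
    values = [set() for _ in range(n_features)]
    for xi, yi in zip(X, y):
        for j, v in enumerate(xi):
            counts[yi][j][v] += 1
            values[j].add(v)

    def likelihood(c, j, v):
        total = sum(counts[c][j].values())
        return (counts[c][j][v] + alpha) / (total + alpha * len(values[j]))

    return priors, likelihood, classes

def predict(x, priors, likelihood, classes):
    # Only the relative values of these scores matter for the decision.
    scores = {c: priors[c] for c in classes}
    for c in classes:
        for j, v in enumerate(x):
            scores[c] *= likelihood(c, j, v)
    return max(scores, key=scores.get)
```

A tree-augmented variant would additionally let each feature depend on one other feature (a tree over the features), replacing some of the factors `P(x_j | c)` with `P(x_j | parent(x_j), c)`.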
Paper Details
PaperID: 77954249094
Authors: Wang, L., Li, X., Xu, P.
Volume: Volume 6
Issue: Issue 1
Keywords: Independence assumption, Naive Bayes, Posteriori probabilities
Year: 2010
Month: January
Pages: 33-38