TY - GEN
A1 - Taheri, Sona
A1 - Mammadov, Musa
A2 - Korbicz, Józef - ed.
A2 - Uciński, Dariusz - ed.
CY - Zielona Góra
PB - Uniwersytet Zielonogórski
N2 - Naive Bayes is among the simplest probabilistic classifiers. Despite its strong assumption that all features are conditionally independent given the class, it often performs surprisingly well in many real-world applications. Since the structure of this classifier is known, learning consists of estimating class probabilities and conditional probabilities from training data; the values of these probabilities are then used to classify new observations.
N2 - In this paper, we introduce three novel optimization models for the naive Bayes classifier in which both class probabilities and conditional probabilities are treated as variables. The values of these variables are found by solving the corresponding optimization problems. Numerical experiments are conducted on several real-world binary classification data sets, where continuous features are discretized using three different methods.
N2 - The performance of these models is compared with that of the naive Bayes classifier, tree-augmented naive Bayes, support vector machines, C4.5 and the nearest-neighbor classifier. The results demonstrate that the proposed models can significantly improve the performance of the naive Bayes classifier while maintaining its simple structure.
L1 - http://www.zbc.uz.zgora.pl/Content/78886/AMCS_2013_23_4_8.pdf
L2 - http://www.zbc.uz.zgora.pl/Content/78886
KW - Bayesian networks
KW - naive Bayes classifier
KW - optimization
KW - discretization
T1 - Learning the naive Bayes classifier with optimization models
UR - http://www.zbc.uz.zgora.pl/dlibra/docmetadata?id=78886
ER -