We introduce a neural network model that generalizes the principles of the Naïve Bayes classification method. The network is trained with a backpropagation-like algorithm in order to obtain an optimal combination of several classifiers. Experimental results are presented.
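The paper itself defines the normalizing-network architecture; purely as an illustration of the general idea (not the authors' exact method), note that the Naïve Bayes posterior in log space, log P(c | x) ∝ log P(c) + Σᵢ log P(xᵢ | c), is a fixed linear combination of log-probability terms followed by softmax normalization. A hypothetical sketch of the generalization: give each base classifier's log-output a learnable mixing weight and fit the weights by gradient descent on cross-entropy, a backpropagation-like update. All function names and shapes below are assumptions made for this sketch.

```python
import numpy as np

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)  # subtract row max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def combined_posterior(log_scores, w):
    """Normalized posterior from weighted log-outputs.

    log_scores: (n_samples, n_classifiers, n_classes) array of the
    base classifiers' log-scores; w: (n_classifiers,) mixing weights.
    """
    z = np.einsum('k,nkc->nc', w, log_scores)  # weighted sum over classifiers
    return softmax(z)                          # normalization step

def train_weights(log_scores, y_onehot, lr=0.5, steps=200):
    """Fit the mixing weights by gradient descent on cross-entropy
    (a backpropagation-like update for this one-layer sketch)."""
    n, k, _ = log_scores.shape
    w = np.ones(k) / k  # start from a uniform combination
    for _ in range(steps):
        p = combined_posterior(log_scores, w)
        # dL/dw_k = (1/n) * sum_{n,c} (p - y)_{nc} * log_scores_{nkc}
        # (the softmax/cross-entropy derivative folds into p - y)
        grad = np.einsum('nc,nkc->k', p - y_onehot, log_scores) / n
        w -= lr * grad
    return w
```

Under these assumptions, a base classifier whose log-scores align with the labels receives a larger weight than an uninformative one, which is the "optimal combination of several classifiers" the abstract refers to.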
Ślęzak, D., Wróblewski, J., & Szczuka, M. (2003). Constructing extensions of Bayesian classifiers with use of normalizing neural networks. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 2871, pp. 408–416). Springer Verlag. https://doi.org/10.1007/978-3-540-39592-8_57