28 July 2025 to 1 August 2025
Time zone: Europe/Paris

Generalized Naive Bayes with continuous explanatory variables

28 Jul 2025, 18:00
30m
F207

Contributed talk | Machine learning (ML)

Speaker

Edith Alice Kovács (Budapest University of Technology and Economics)

Description

Naive Bayes is one of the most widely used machine learning algorithms, appreciated for its simplicity, efficiency, and ease of interpretation, qualities that make it appealing across many fields. However, Naive Bayes assumes that the explanatory variables are conditionally independent given the class label, an assumption that often does not hold in practice.
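
For concreteness, the conditional independence assumption can be written as follows (the notation, with class label $C$ and explanatory variables $X_1, \dots, X_d$, is chosen here only for illustration):

$$P(X_1, \dots, X_d \mid C) = \prod_{i=1}^{d} P(X_i \mid C).$$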

To address this limitation, we propose a method that relaxes the independence assumption by allowing certain dependencies between the explanatory variables. The central idea of our approach is to find a dependence structure whose induced distribution approximates the joint probability distribution of the data, in the sense of minimizing the Kullback-Leibler divergence. We introduce a greedy algorithm that builds this structure step by step.
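
Schematically, writing $P$ for the joint distribution of the data and $P_{\mathcal{G}}$ for the distribution induced by a candidate dependence structure $\mathcal{G}$ (notation introduced here only as a sketch), the objective is

$$\mathcal{G}^{*} = \arg\min_{\mathcal{G}} D_{\mathrm{KL}}\!\left(P \,\middle\|\, P_{\mathcal{G}}\right) = \arg\min_{\mathcal{G}} \sum_{x} P(x) \log \frac{P(x)}{P_{\mathcal{G}}(x)},$$

with the greedy, step-by-step construction aimed at decreasing this divergence.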

Initially, the algorithm was designed for discrete variables. We are now adapting it to the case where the explanatory variables follow a multivariate Gaussian distribution. Furthermore, we extend our approach to accommodate more general continuous variables.
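
As background for the Gaussian setting (the abstract does not spell out how it is used), the Kullback-Leibler divergence between two $d$-dimensional Gaussians $\mathcal{N}(\mu_0, \Sigma_0)$ and $\mathcal{N}(\mu_1, \Sigma_1)$ has the standard closed form

$$D_{\mathrm{KL}}\!\left(\mathcal{N}_0 \,\middle\|\, \mathcal{N}_1\right) = \frac{1}{2}\left(\operatorname{tr}\!\left(\Sigma_1^{-1}\Sigma_0\right) + (\mu_1 - \mu_0)^{\top}\Sigma_1^{-1}(\mu_1 - \mu_0) - d + \ln\frac{\det \Sigma_1}{\det \Sigma_0}\right).$$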

Finally, we evaluate the performance of our method against other classification techniques using real-world datasets. Our method is also capable of identifying the most informative set of explanatory variables.

Author

Mr Ábrahám Papp (Budapest University of Technology and Economics)

Co-authors

Dr Botond Szilágyi (Budapest University of Technology and Economics)
Edith Alice Kovács (Budapest University of Technology and Economics)

Presentation materials

No materials.