Inference and Learning with Model Uncertainty in Probabilistic Logic Programs
Victor Verreet, Vincent Derkinderen, Pedro Zuidberg Dos Martires, Luc De Raedt
[AAAI-22] Main Track
Abstract:
An issue that has so far received only limited attention in probabilistic logic programming (PLP) is the modelling of so-called epistemic uncertainty, the uncertainty about the model itself. Accurately quantifying this model uncertainty is paramount to robust inference, learning and ultimately decision making. We introduce BetaProbLog, a PLP language that can model epistemic uncertainty. BetaProbLog has sound semantics, an effective inference algorithm that combines Monte Carlo techniques with knowledge compilation, and a parameter learning algorithm. We empirically outperform state-of-the-art methods on probabilistic inference tasks in second-order Bayesian networks, digit classification and discriminative learning in the presence of epistemic uncertainty.
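To make the idea of epistemic uncertainty in a probabilistic logic program concrete, the following is a minimal sketch, not the BetaProbLog implementation: it places Beta distributions over the probabilities of two probabilistic facts in a toy two-coin program and uses plain Monte Carlo sampling to estimate both the query probability and the uncertainty about it. The toy program, parameter values and function names are illustrative assumptions; the actual inference algorithm in the paper evaluates sampled parameters on a circuit obtained via knowledge compilation rather than a hand-written formula.

```python
# Illustrative sketch only: Monte Carlo estimation of a query probability
# under Beta-distributed parameter (epistemic) uncertainty.
import numpy as np

rng = np.random.default_rng(0)

# Beta priors over the probabilities of two independent probabilistic facts,
# e.g. "heads(coin1)" and "heads(coin2)" in a ProbLog-style program.
beta_params = [(8.0, 2.0), (3.0, 3.0)]  # (alpha, beta) per fact

def query_prob(p):
    # Query: someHeads :- heads(coin1) ; heads(coin2).
    # In BetaProbLog this evaluation would run on a compiled circuit;
    # here we use the closed form for this tiny example.
    return 1.0 - (1.0 - p[0]) * (1.0 - p[1])

n_samples = 10_000
samples = np.array([
    query_prob([rng.beta(a, b) for a, b in beta_params])
    for _ in range(n_samples)
])

# The mean is a point estimate of the query probability; the spread
# quantifies the epistemic (model) uncertainty about that probability.
print(f"E[P(someHeads)]   ~ {samples.mean():.3f}")
print(f"Std[P(someHeads)] ~ {samples.std():.3f}")
```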
Sessions where this paper appears
- Poster Session 6: Sat, February 26, 8:45 AM - 10:30 AM (+00:00), Red 2
- Poster Session 7: Sat, February 26, 4:45 PM - 6:30 PM (+00:00), Red 2
- Oral Session 7: Sat, February 26, 6:30 PM - 7:45 PM (+00:00), Red 2