Pushing the Boundaries of Probabilistic Inference through Message Contraction Optimization

Jul 24, 2024 ·
Martin Roa-Villescas, Jin-Guo Liu, Patrick Wijnings, Sander Stuijk, Henk Corporaal
Abstract
A key aspect of intelligent systems is their capacity to reason under uncertainty. This task involves calculating probabilities of relevant variables while accounting for any available evidence, a process commonly referred to as probabilistic inference. When working with discrete variables, the primary operations in probabilistic inference algorithms are the addition and multiplication of multidimensional arrays with labeled dimensions, known as factors. The algorithmic complexity is dictated by the highest-dimensional factor involved in any calculation, a quantity referred to as the induced tree width. Despite advances in state-of-the-art techniques focused on reducing this metric, many real-world problems remain too complex to solve with existing probabilistic inference algorithms. In this work, we introduce a new method for adding and multiplying factors that leads to marked improvements in inference performance, particularly for more complex models. Furthermore, this method serves as the core of a novel optimization framework, also introduced in this work, which employs metaprogramming to further enhance the runtime performance of probabilistic inference algorithms. Our method complements current leading-edge techniques aimed at reducing the induced tree width, thereby extending the range of models that can be solved effectively using exact inference. To validate our approach, we compare it against two other open-source libraries designed for probabilistic inference. Our method demonstrates an average speedup of 23 times on the UAI 2014 benchmark set. For the 10 most complex problems of this set, the average speedup increases to 64 times, highlighting the scalability of our method.
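To make the core operation concrete, here is a minimal sketch (not the paper's implementation) of how multiplying two factors and marginalizing a shared variable reduces to a tensor contraction over labeled dimensions, using NumPy's `einsum` as a stand-in; the factor values and variable names are illustrative assumptions.

```python
import numpy as np

# A factor is a multidimensional array whose axes carry variable labels.
# f has axes labeled (A, B); g has axes labeled (B, C).
f = np.array([[0.9, 0.1],
              [0.4, 0.6]])
g = np.array([[0.7, 0.3],
              [0.2, 0.8]])

# Multiplying f and g and summing out the shared variable B is a single
# tensor contraction: the labels act as einsum subscripts.
h = np.einsum("ab,bc->ac", f, g)  # result has axes labeled (A, C)
print(h)
```

The cost of such a contraction is governed by the number of distinct labels involved, which is why the induced tree width of the model dictates the overall complexity of the inference algorithm.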
Type: Publication
In: Artificial Intelligence - Machine Learning, Convolutional Neural Networks and Large Language Models