Statistique - Probabilités - Optimisation et Contrôle

Multiparametric Boltzmann tuning: at the crossroads of probability and convex optimisation

by Sergey Dovgal (IMB)

René Baire (IMB)

Suppose that you want to randomly sample a complicated object
from a probability space defined by a context-free grammar with given
fixed proportions of terminal letters. This problem is tractable when the
number of letters is small, but becomes #P-complete when the
number of letters is unbounded (a class believed to be even harder
than NP in the current complexity hierarchy). In fact, both the
enumeration and the sampling problems are computationally intractable
in this setting. However, by slightly deforming the target probability
distribution into the so-called multiparametric Boltzmann
distribution, one can obtain much faster sampling algorithms, provided
that a tuning oracle is available for finding the right target
weights. In this talk, I will guide you through the concept of
multiparametric Boltzmann samplers and show how we constructed a
tuning oracle with polynomial time and space complexity using a
convex optimisation procedure. If time permits, we shall also discuss
* the complexity of the exact-sampling algorithm, via a reduction to #2-SAT
* the notion of self-concordant barriers leading to a fine complexity
estimate of the tuning procedure
* numerous applications of Boltzmann samplers in combinatorics and beyond
* the link between Boltzmann tuning and maximum likelihood estimation
* the software package "Paganini" implementing our tuning concept.
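The tuning idea can be illustrated on a one-parameter toy case. The sketch below is a hypothetical illustration (not the Paganini implementation): it samples the size of a plane binary tree under a Boltzmann distribution and tunes the weight z so that the expected number of internal nodes hits a target. In this single-parameter setting the tuning equation can be solved by simple bisection; the multiparametric setting is where the convex programme discussed in the talk becomes necessary.

```python
import math
import random

def tree_gf(z):
    # Generating function of plane binary trees, T(z) = 1 + z*T(z)^2,
    # solved in closed form for 0 < z < 1/4.
    return (1 - math.sqrt(1 - 4 * z)) / (2 * z)

def expected_size(z):
    # Expected number of internal nodes under the Boltzmann
    # distribution: E[N] = z*T'(z)/T(z) = (1 - s)/(2s), s = sqrt(1-4z).
    s = math.sqrt(1 - 4 * z)
    return (1 - s) / (2 * s)

def tune(target, tol=1e-12):
    # Toy tuning oracle: E[N] is increasing in z on (0, 1/4),
    # so a bisection finds the weight hitting the target size.
    lo, hi = 0.0, 0.25
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if expected_size(mid) < target:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def boltzmann_size(z, rng):
    # Boltzmann sampler for the tree size, run as a branching process:
    # an internal node has weight z*T(z)^2 and a leaf weight 1, so
    # P(internal) = z*T(z)^2 / T(z) = z*T(z) < 1/2 (subcritical,
    # hence the process terminates almost surely).
    p_node = z * tree_gf(z)
    pending, size = 1, 0
    while pending:
        pending -= 1
        if rng.random() < p_node:
            size += 1
            pending += 2  # two recursive subtrees
    return size

z = tune(10.0)                      # weight targeting mean size 10
rng = random.Random(1)
sample = [boltzmann_size(z, rng) for _ in range(20000)]
```

Note that the size fluctuates widely around the target (the Boltzmann distribution concentrates only in expectation); trading exact-size sampling for this tolerance is what buys the speed-up mentioned above.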

This is joint work with Maciej Bendkowski and Olivier Bodini,
accepted for publication in CPC in April 2021.

The preprint is available at:
Our software is on github: