Statistics seminar

Luigi Acerbi, University of Helsinki: Practical sample-efficient Bayesian inference for models with and without likelihoods

Zoom (password: 293587)

Bayesian inference in applied fields of science and engineering can be challenging: in the best-case scenario the likelihood is a black box (e.g., mildly to very expensive to evaluate, with no gradients), and more often than not it is not available at all, with the researcher only able to simulate data from the model. In this talk, I review a recent sample-efficient framework for approximate Bayesian inference, Variational Bayesian Monte Carlo (VBMC), which uses only a limited number of potentially noisy log-likelihood evaluations. VBMC produces both a nonparametric approximation of the posterior distribution and an approximate lower bound on the model evidence, useful for model selection. VBMC combines well with a technique we (re)introduced, inverse binomial sampling (IBS), which obtains unbiased, normally distributed estimates of the log-likelihood via simulation. VBMC has been tested on many real problems (up to 10 dimensions) from computational and cognitive neuroscience, with and without likelihoods. Our method performed consistently well in reconstructing the ground-truth posterior and model evidence with a limited budget of evaluations, showing promise as a general tool for black-box, sample-efficient approximate inference, with exciting potential extensions to more complex cases.
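To give a flavour of the IBS idea mentioned in the abstract, here is a minimal, hypothetical Python sketch (not the authors' implementation). For a single observation whose unknown simulator probability is p, IBS repeatedly draws simulated "hits" until the first match at trial K; the estimator -(1/1 + 1/2 + … + 1/(K-1)) is then an unbiased estimate of log p. The function name and the Bernoulli stand-in simulator are illustrative assumptions:

```python
import math
import random

def ibs_logprob_estimate(simulate_hit):
    """One IBS estimate of log p.

    simulate_hit() should return True with (unknown) probability p,
    mimicking a simulator that reproduces the observed data point.
    If the first hit occurs on trial K, the estimate is
    -sum_{k=1}^{K-1} 1/k, which is unbiased for log p.
    """
    k = 1
    while not simulate_hit():
        k += 1
    return -sum(1.0 / j for j in range(1, k))

# Sanity check on a toy simulator with known p: the average of many
# IBS estimates should approach log(p).
rng = random.Random(0)
p = 0.3
estimates = [ibs_logprob_estimate(lambda: rng.random() < p)
             for _ in range(20000)]
mean_est = sum(estimates) / len(estimates)
print(mean_est, math.log(p))  # the two values should be close
```

Note that each estimate is cheap when p is large but needs on average 1/p simulations, which is one reason IBS pairs naturally with a sample-efficient outer loop such as VBMC.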

1. MATLAB toolbox:
2. VBMC paper:
3. VBMC with noisy likelihoods paper:
4. Inverse binomial sampling preprint:  

Organiser: Umberto Picchini. Please contact me if you wish to be informed of future seminars.
Category: Seminar
Location: Online
Starts: 20 October, 2020, 14:15
Ends: 20 October, 2020, 15:15

Published: Tue 20 Oct 2020.