Umberto Picchini, Chalmers University of Technology & University of Gothenburg: Fast, lightweight and semi-amortised simulation-based inference
Overview
- Date: 15 May 2024, 13:15–14:00
- Location: MV:L14, Chalmers tvärgata 3
- Language: English
Abstract: Bayesian inference for complex models with an intractable likelihood can be tackled using algorithms that perform many calls to a computer simulator. These approaches are collectively known as "simulation-based inference" (SBI). Recent SBI methods use neural conditional estimation: neural networks are employed to approximate the likelihood function or the posterior distribution of model parameters. While neural posterior and likelihood estimation methods have produced exceptionally flexible inference strategies, they can be computationally intensive to run and have a non-negligible impact on energy expenditure and memory requirements. In this work, we propose more "frugal" strategies that can run with limited resources and exhibit a much smaller computational footprint. We investigate structured mixtures of probability distributions and design a new SBI method named Sequential Mixture Posterior and Likelihood Estimation (SeMPLE). SeMPLE learns closed-form approximations to both the posterior p(θ∣y) and the likelihood p(y∣θ) from the same training data, using a Gaussian mixture model that can be fitted efficiently. For a variety of stochastic models (including SDEs and Markov jump processes), some exhibiting multimodal posteriors, we show that SeMPLE returns inference of state-of-the-art quality compared with neural conditional estimation, while being much faster to train and lightweight on computational resources.
Joint work with Henrik Häggström, Pedro L. C. Rodrigues, Geoffroy Oudoumanessah and Florence Forbes.
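The key idea the abstract relies on (closed-form conditionals from one joint fit) can be illustrated with a standard Gaussian-mixture conditioning step. The sketch below is not the authors' SeMPLE implementation; it is a minimal toy, assuming a hypothetical one-dimensional simulator and using scikit-learn's `GaussianMixture`: fit a mixture to joint samples (θ, y), then condition each Gaussian component on an observed y to get a closed-form mixture approximation of the posterior p(θ∣y).

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Hypothetical toy simulator for illustration: theta ~ N(0, 2^2), y = theta + N(0, 0.5^2)
theta = rng.normal(0.0, 2.0, size=(5000, 1))
y = theta + rng.normal(0.0, 0.5, size=(5000, 1))
joint = np.hstack([theta, y])  # samples from the joint p(theta, y)

# Fit a Gaussian mixture to the joint samples; both conditionals p(theta|y)
# and p(y|theta) then have closed forms via per-component Gaussian conditioning.
gmm = GaussianMixture(n_components=3, covariance_type="full",
                      random_state=0).fit(joint)

def condition_on_y(gmm, y_obs, d_theta=1):
    """Closed-form mixture approximation of p(theta | y = y_obs)."""
    cond_means, cond_covs, log_w = [], [], []
    for k in range(gmm.n_components):
        mu = gmm.means_[k]
        S = gmm.covariances_[k]
        mu_t, mu_y = mu[:d_theta], mu[d_theta:]
        S_tt = S[:d_theta, :d_theta]
        S_ty = S[:d_theta, d_theta:]
        S_yy = S[d_theta:, d_theta:]
        S_yy_inv = np.linalg.inv(S_yy)
        K = S_ty @ S_yy_inv
        diff = y_obs - mu_y
        cond_means.append(mu_t + K @ diff)            # conditional mean
        cond_covs.append(S_tt - K @ S_ty.T)           # conditional covariance
        # Re-weight each component by how well it explains y_obs
        log_w.append(np.log(gmm.weights_[k])
                     - 0.5 * diff @ S_yy_inv @ diff
                     - 0.5 * np.linalg.slogdet(2 * np.pi * S_yy)[1])
    log_w = np.array(log_w)
    w = np.exp(log_w - log_w.max())
    return np.array(cond_means), np.array(cond_covs), w / w.sum()

means_c, covs_c, w_c = condition_on_y(gmm, np.array([1.0]))
post_mean = (w_c[:, None] * means_c).sum(axis=0)  # posterior-mean estimate
```

In this linear-Gaussian toy the exact posterior mean for y = 1 is 4/(4 + 0.25) ≈ 0.94, so the mixture estimate should land near that value; the same machinery, applied sequentially and with the symmetric conditioning for p(y∣θ), is the kind of lightweight closed-form computation the abstract contrasts with neural density estimators.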
- Senior Lecturer, Applied Mathematics and Statistics, Mathematical Sciences
