The event has passed

Statistics seminar

Adrien Corenflos, University of Warwick: Particle-MALA and Particle-mGrad: Gradient-based MCMC methods for high-dimensional state-space models

Overview

  • Date: 29 May 2024, 13:15–14:00
  • Location: MV:L14, Chalmers tvärgata 3
  • Language: English

Abstract: State-of-the-art methods for Bayesian inference in state-space models are (a) conditional sequential Monte Carlo (CSMC) algorithms; (b) sophisticated 'classical' MCMC algorithms like MALA, or mGRAD from Titsias and Papaspiliopoulos (2018). The former propose N particles at each time step to exploit the model's 'decorrelation-over-time' property and thus scale favourably with the time horizon, T, but break down if the dimension of the latent states, D, is large. The latter leverage gradient- and prior-informed local proposals to scale favourably with D but exhibit sub-optimal scalability with T due to a lack of model-structure exploitation. We introduce methods that combine the strengths of both approaches. The first, Particle-MALA, spreads N particles locally around the current state using gradient information, thus extending MALA to T > 1 time steps and N > 1 proposals. The second, Particle-mGRAD, additionally incorporates (conditionally) Gaussian prior dynamics into the proposal, thus extending the mGRAD algorithm. We prove that Particle-mGRAD interpolates between CSMC and Particle-MALA, resolving the 'tuning problem' of choosing between CSMC (superior for highly informative prior dynamics) and Particle-MALA (superior for weakly informative prior dynamics). We similarly extend other 'classical' MCMC approaches like auxiliary MALA, aGRAD, and preconditioned Crank-Nicolson-Langevin (PCNL). In experiments, our methods substantially improve upon both CSMC and sophisticated 'classical' MCMC approaches for both highly and weakly informative prior dynamics.
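To illustrate the classical ingredient these methods build on, here is a minimal sketch of a single MALA step for a generic differentiable target. The function names and the Gaussian toy target are illustrative assumptions, not code from the talk; Particle-MALA extends this kind of gradient-informed proposal to N particles per time step.

```python
import numpy as np

def mala_step(x, log_density, grad_log_density, step_size, rng):
    """One Metropolis-adjusted Langevin (MALA) step.

    Proposes y ~ N(x + (step_size/2) * grad log pi(x), step_size * I)
    and accepts it with the Metropolis-Hastings probability.
    """
    mean_fwd = x + 0.5 * step_size * grad_log_density(x)
    y = mean_fwd + np.sqrt(step_size) * rng.standard_normal(x.shape)
    mean_bwd = y + 0.5 * step_size * grad_log_density(y)

    # log q(x | y) - log q(y | x): ratio of Gaussian proposal densities
    log_q_fwd = -np.sum((y - mean_fwd) ** 2) / (2.0 * step_size)
    log_q_bwd = -np.sum((x - mean_bwd) ** 2) / (2.0 * step_size)

    log_alpha = log_density(y) - log_density(x) + log_q_bwd - log_q_fwd
    if np.log(rng.uniform()) < log_alpha:
        return y, True
    return x, False

# Toy example: standard Gaussian target in D = 10 dimensions
rng = np.random.default_rng(0)
log_pi = lambda x: -0.5 * np.sum(x ** 2)
grad_log_pi = lambda x: -x

x = np.ones(10)
accepts = 0
n_steps = 2000
for _ in range(n_steps):
    x, accepted = mala_step(x, log_pi, grad_log_pi, step_size=0.5, rng=rng)
    accepts += accepted
print(accepts / n_steps)  # empirical acceptance rate
```

In the CSMC setting of the talk, the single proposal above is replaced by N such gradient-informed candidates at each of the T time steps, which is what gives the method its favourable scaling in both T and D.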

TL;DR: We aim to overcome the curse of dimensionality in state-space model inference by combining the favourable scaling in time of conditional particle filtering methods with the favourable scaling in dimension of MALA and other gradient-based algorithms.

Moritz Schauer
  • Senior Lecturer, Applied Mathematics and Statistics, Mathematical Sciences