Student seminar
The event has passed

Master's Thesis presentation, Ruben Seyer

Differentiable Monte Carlo Samplers with Piecewise Deterministic Markov Processes

Overview

  • Date: 7 June 2023, 10:00–11:00
  • Location: MV:L14, Chalmers tvärgata 3
  • Language: English

Monte Carlo gradient estimation, i.e. estimating the gradient of an expectation with respect to distribution parameters by sampling, is a common problem in statistics and machine learning.

Analysing the sensitivity of a stochastic system in this way has applications in e.g. Bayesian inference, reinforcement learning, and probabilistic programming.
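As a concrete illustration of the problem (not from the thesis; the Gaussian family and the test function f(x) = x² are assumptions chosen so the answer has a closed form), one can estimate d/dμ E[f(X)] for X ~ N(μ, 1) with the classic score-function (likelihood-ratio) estimator, which needs no differentiation through the sampler:

```python
import numpy as np

rng = np.random.default_rng(0)
mu, n = 0.5, 200_000

# Goal: estimate d/dmu E[f(X)] for X ~ N(mu, 1), here with f(x) = x^2.
# Closed form for comparison: E[X^2] = mu^2 + 1, so the true gradient is 2*mu.
x = rng.normal(mu, 1.0, size=n)

# Score-function estimator: d/dmu E[f(X)] = E[f(X) * d/dmu log p(X; mu)],
# where the score is (x - mu) for unit variance.
grad_sf = np.mean(x**2 * (x - mu))
print(grad_sf)  # approx 2*mu = 1.0
```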

One class of gradient estimators differentiates through the sampling scheme itself using pathwise derivatives, also known as the reparameterization trick.
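In its simplest form, for a location-scale Gaussian (a minimal sketch, again with the assumed test function f(x) = x² for comparison against the closed form), the trick writes the sample as a deterministic, differentiable map of parameter-free noise:

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma, n = 0.5, 1.0, 200_000

# Reparameterization: X = mu + sigma * eps with eps ~ N(0, 1), so the parameter
# enters a differentiable map and the gradient moves inside the expectation:
# d/dmu E[f(X)] = E[f'(mu + sigma * eps)].  Here f(x) = x^2, so f'(x) = 2x.
eps = rng.normal(size=n)
grad_pathwise = np.mean(2.0 * (mu + sigma * eps))
print(grad_pathwise)  # approx 2*mu = 1.0
```

For smooth integrands the pathwise estimator typically has much lower variance than the score-function estimator above.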

However, many common Monte Carlo samplers in use today, such as Hamiltonian Monte Carlo, contain a nondifferentiable rejection step.
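The obstacle can be seen in a toy Metropolis-style accept/reject step (a hypothetical illustration, not HMC itself): holding the underlying randomness fixed, the output is a piecewise constant function of the parameter, so its pathwise derivative is zero almost everywhere and carries no gradient information:

```python
import numpy as np

# Toy accept/reject step for a N(theta, 1) target.  The proposal noise eps and
# the uniform u are held fixed, so the accepted state is a step function of theta.
def one_mh_step(theta, x=0.0, eps=0.9, u=0.5):
    x_prop = x + eps
    log_ratio = -0.5 * ((x_prop - theta) ** 2 - (x - theta) ** 2)
    # Discontinuous in theta: the output jumps as theta crosses a threshold.
    return x_prop if np.log(u) < log_ratio else x

thetas = np.linspace(-1.0, 2.0, 7)
print([one_mh_step(th) for th in thetas])  # piecewise constant: 0.0, then 0.9
```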

In recent years, a new class of Monte Carlo samplers based on piecewise deterministic Markov processes (PDMPs) has been studied, which is well suited to efficient large-scale Bayesian inference.

Here we construct two new PDMP-based differentiable Monte Carlo methods: in one dimension using the Zig-Zag sampler, and in higher dimensions using the Bouncy Particle Sampler.
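For readers unfamiliar with the process, a minimal one-dimensional Zig-Zag sampler targeting a standard normal might look as follows (an illustrative sketch of the base sampler only, not the differentiable estimator from the thesis). The state moves at velocity v ∈ {−1, +1}, and the velocity flips at random events with rate λ(x, v) = max(0, v U′(x)), where U(x) = x²/2 is the negative log-density; for this target the event times can be drawn by exact inversion:

```python
import numpy as np

rng = np.random.default_rng(0)

# 1D Zig-Zag sampler for N(0, 1): switching rate lambda(x, v) = max(0, v * x).
x, v, T = 0.0, 1.0, 5_000.0
t, int_x, int_x2 = 0.0, 0.0, 0.0
while t < T:
    e = rng.exponential()
    vx = v * x
    # Exact inversion of the integrated rate gives the next flip time.
    tau = -vx + np.sqrt(max(vx, 0.0) ** 2 + 2.0 * e)
    tau = min(tau, T - t)
    # Accumulate time integrals along the deterministic segment x(s) = x + v*s.
    int_x += x * tau + v * tau**2 / 2.0
    int_x2 += ((x + v * tau) ** 3 - x**3) / (3.0 * v)
    x += v * tau
    v = -v          # deterministic velocity flip at the event
    t += tau

# Ergodic (time) averages approximate the moments of N(0, 1): about 0 and 1.
print(int_x / T, int_x2 / T)
```

Expectations are estimated by time averages along the continuous trajectory rather than from discrete samples, which is what makes these samplers attractive for the pathwise analysis in the thesis.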

We prove that the one-dimensional estimator is unbiased and, in simple cases, strongly consistent, and conjecture its consistency in general.

Our simulations suggest that the multi-dimensional estimator is unbiased and strongly consistent in some common settings.

This forms a promising starting point for the development of a new class of efficient stochastic gradient estimators.