Overview
Date: 22 May 2026, 09:00–13:00
Location: Euler, Skeppsgränd 3, Göteborg
Opponent: Professor Donald Estep, Department of Statistics and Actuarial Science, Simon Fraser University, Canada

Thesis

Read thesis
Bayesian filtering concerns the sequential estimation of the hidden state of a dynamical system from partial and noisy observations. Its central object is the conditional distribution of the hidden state given the available data, which provides both point estimates and a quantitative description of uncertainty. In nonlinear and non-Gaussian settings, this distribution is typically not available in closed form, and the construction of accurate and computationally feasible approximation methods becomes a central challenge, especially for high-dimensional systems.
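One closed-form exception mentioned implicitly above is the linear-Gaussian case, where the conditional distribution stays Gaussian and the classical Kalman filter tracks it exactly. The following minimal sketch (not from the thesis; the scalar model, parameters `a`, `q`, `r`, and function `kalman_step` are illustrative assumptions) shows the predict/update recursion that nonlinear filters must approximate:

```python
import numpy as np

# Scalar linear-Gaussian state-space model (illustrative, not the thesis model):
#   x_k = a * x_{k-1} + process noise,   y_k = x_k + observation noise.
# Here the filtering distribution stays Gaussian, so mean and variance
# characterize it completely and evolve in closed form.
a, q, r = 0.9, 0.1, 0.5          # dynamics, process variance, observation variance

def kalman_step(m, p, y):
    """One predict/update cycle: returns the posterior mean and variance."""
    m_pred, p_pred = a * m, a * a * p + q      # predict through the dynamics
    k = p_pred / (p_pred + r)                  # Kalman gain
    return m_pred + k * (y - m_pred), (1 - k) * p_pred

rng = np.random.default_rng(0)
x, m, p = 0.0, 0.0, 1.0
for _ in range(50):
    x = a * x + rng.normal(scale=np.sqrt(q))   # simulate the hidden state
    y = x + rng.normal(scale=np.sqrt(r))       # noisy partial observation
    m, p = kalman_step(m, p, y)                # sequential Bayesian update
```

The posterior variance `p` follows a deterministic Riccati recursion and converges to a fixed point; outside the linear-Gaussian setting no such finite-dimensional recursion exists, which is what motivates the density-based approximations studied in the thesis.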
This thesis studies Bayesian filtering with particular emphasis on density-based formulations when the underlying state is governed by a stochastic differential equation. We formulate the filtering problem through the evolution of conditional probability densities, described by stochastic and deterministic partial differential equations. Across the four appended papers, we develop methodologies for approximating these equations in high-dimensional settings. The proposed approaches draw on stochastic analysis, numerical analysis, and deep learning. They combine operator splitting, probabilistic backward representations, logarithmic transformations, and neural networks to approximate the conditional probability density. Theoretical convergence orders are established and verified numerically. The approaches are successfully demonstrated on nonlinear, high-dimensional, and partially observed stochastic differential equations.
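The operator-splitting idea mentioned above can be illustrated in one dimension, where the conditional density fits on a grid: between observations the density is propagated through the Fokker-Planck equation, and at each observation it is corrected by a pointwise Bayes update. This is a low-dimensional sketch of the splitting principle only (the SDE, grid, and observation model below are assumptions for illustration; the thesis's high-dimensional methods replace the grid with learned representations):

```python
import numpy as np

# Split-step density filter for the 1-D mean-reverting SDE
#   dX_t = -X_t dt + sigma dW_t,   observed as Y_k = X_{t_k} + noise.
# Prediction: explicit finite-difference step of the Fokker-Planck equation.
# Correction: multiply by the observation likelihood and renormalize.
sigma, r = 0.5, 0.1                      # diffusion and observation variance
x = np.linspace(-4, 4, 401)
dx = x[1] - x[0]
dt = 1e-4                                # small step for explicit stability

def predict(p, n_steps):
    """Evolve the density through the Fokker-Planck equation."""
    for _ in range(n_steps):
        drift = np.gradient(x * p, dx)                    # d/dx (x * p)
        diff = 0.5 * sigma**2 * np.gradient(np.gradient(p, dx), dx)
        p = np.maximum(p + dt * (drift + diff), 0.0)
    return p / (p.sum() * dx)

def correct(p, y):
    """Bayes update: weight by the Gaussian likelihood, renormalize."""
    p = p * np.exp(-0.5 * (y - x) ** 2 / r)
    return p / (p.sum() * dx)

p = np.exp(-0.5 * x**2)                  # broad Gaussian prior
p /= p.sum() * dx
for y in [0.8, 0.6, 0.7]:                # synthetic observations
    p = predict(p, n_steps=500)          # 0.05 time units between data
    p = correct(p, y)

mean = (x * p).sum() * dx                # posterior point estimate
```

The alternation of a deterministic evolution step and a multiplicative update step is the splitting structure; grid-based versions like this one scale exponentially in the state dimension, which is the bottleneck the learning-based methods in the appended papers address.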
Taken together, the papers in this thesis develop a framework for Bayesian filtering that combines probabilistic density representations with modern learning-based computational methods. The results indicate that these approaches can provide accurate and scalable alternatives to classical filtering methods for nonlinear and high-dimensional systems.
Kasper Bågmark
- Doctoral Student, Applied Mathematics and Statistics, Mathematical Sciences
