In this project we advance the understanding of deep neural networks (DNNs) through the investigation of stochastic continuous-depth neural networks. These can be thought of as DNNs composed of infinitely many stochastic layers, where each single layer brings about only a gradual change to the output of the preceding layers. We will analyse such stochastic continuous-depth neural networks using tools from stochastic calculus and Bayesian statistics. From that, we will derive novel and practically relevant training algorithms for stochastic DNNs, with the aim of capturing the uncertainty associated with the predictions of the network.

Recruitment
This project is supported by Chalmers AI Research Centre (CHAIR). We are currently recruiting a PhD student for this project; the application deadline is 6 March 2020. This is part of a larger call for six or more doctoral students in mathematics for artificial intelligence. The announcement and information on how to apply are available here. Please contact principal investigator Moritz Schauer (email@example.com) for further information or questions about this specific project.