This WASP AI/Math project aims to shed light on the mathematical structure of deep neural networks using techniques and insights from a variety of different fields in mathematics and physics, including quantum mechanics, information theory, differential geometry, group theory and gauge theory.

Despite the overwhelming success of deep neural networks, we are still at a loss to explain exactly how deep learning works, and why it works so well. What is the mathematical framework underlying deep learning? One promising direction is to consider *symmetries* as an underlying design principle for network architectures. This can be implemented by constructing deep neural networks on a group G that acts transitively on the input data. This is directly relevant, for instance, in the case of spherical signals, where G is a rotation group.
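The idea can be illustrated in the simplest possible setting: for signals on the cyclic group Z_n, the group convolution is an ordinary circular correlation, and it is equivariant by construction, meaning that shifting the input and then convolving gives the same result as convolving and then shifting. A minimal sketch (the function name `group_conv_zn` is illustrative, not from the project):

```python
import numpy as np

def group_conv_zn(signal, kernel):
    """Group convolution over the cyclic group Z_n: correlate the
    signal against every cyclic shift of the kernel."""
    n = len(signal)
    return np.array([np.dot(signal, np.roll(kernel, g)) for g in range(n)])

rng = np.random.default_rng(0)
x = rng.normal(size=8)    # signal on Z_8
psi = rng.normal(size=8)  # filter on Z_8

shift = 3
lhs = group_conv_zn(np.roll(x, shift), psi)   # transform input, then convolve
rhs = np.roll(group_conv_zn(x, psi), shift)   # convolve, then transform output
assert np.allclose(lhs, rhs)                   # equivariance holds exactly
```

The same structure, with Z_n replaced by a rotation group and the sum by an integral, underlies spherical CNNs.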

Even more generally, it is natural to ask how to train neural networks on “non-flat” data. Relevant applications include omnidirectional computer vision, biomedicine, and climate observations, to mention just a few situations where data is naturally “curved”. Mathematically, this calls for developing a theory of deep learning on manifolds, or on even more exotic structures such as graphs or algebraic varieties. A special class consists of homogeneous spaces G/H, where H is a subgroup of G. The project aims to develop the mathematical framework for convolutional neural networks that are globally equivariant with respect to G and locally equivariant with respect to H. The formalism is closely analogous to gauge theory in physics, and exploring this connection is a potential side project.
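In this setting the central object is the group convolution. As a sketch of the standard formulation (the notation below is illustrative, not taken from the project's papers), a layer Φ is called G-equivariant if it intertwines the G-actions on input and output feature maps, and convolution against a filter ψ over G satisfies this by construction:

```latex
% Equivariance condition for a layer \Phi, with \pi, \pi' the G-actions
% on input and output feature maps:
\Phi \circ \pi(g) = \pi'(g) \circ \Phi \qquad \text{for all } g \in G.

% Group convolution on G (with Haar measure \mathrm{d}g), which is
% equivariant by construction; for feature maps on G/H the filters are
% additionally constrained by the subgroup H:
(\psi \star f)(x) = \int_G \psi(g^{-1}x)\, f(g)\, \mathrm{d}g.
```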

The PI of the project is Daniel Persson at Mathematical Sciences. For interviews with Daniel concerning the project, see *Exploring the underlying symmetries of AI* and *AI, fast growing research area for mathematicians.*

Christoffer Petersson, Deep Learning Research Engineer at Zenseact and Adjunct Associate Professor at Mathematical Sciences, is the industrial supervisor of the project. Fredrik Ohlsson, Associate Professor at the Department of Mathematics and Mathematical Statistics at Umeå University, is an external researcher in the project.

**Papers:**

*Homogeneous vector bundles and G-equivariant convolutional neural networks*

Jimmy Aronsson

Preprint: https://arxiv.org/abs/2105.05400

*Geometric Deep Learning and Equivariant Neural Networks*

Jan E. Gerken, Jimmy Aronsson, Oscar Carlsson, Hampus Linander, Fredrik Ohlsson, Christoffer Petersson, Daniel Persson

Preprint: https://arxiv.org/abs/2105.13926