##### Colloquia Autumn 2017

**17-09-11**

Mark Pollicott (University of Warwick)

*Dynamical Zeta functions*

The Riemann zeta function is an important tool in the theory of prime numbers. A more geometric analogue of this is the Selberg zeta function, where the prime numbers are replaced by closed geodesics on a compact surface of constant negative curvature. However, to extend this to more general settings one needs to take a more dynamical viewpoint. We will discuss old and new results in this direction.
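For orientation, the analogy in the abstract can be written out explicitly: the Riemann zeta function has an Euler product over the primes, while the Selberg zeta function is a product over the primitive closed geodesics $\gamma$, with the length $\ell(\gamma)$ playing the role of $\log p$:

```latex
\zeta(s) \;=\; \prod_{p \text{ prime}} \bigl(1 - p^{-s}\bigr)^{-1},
\qquad
Z(s) \;=\; \prod_{\gamma} \prod_{k=0}^{\infty}
          \bigl(1 - e^{-(s+k)\,\ell(\gamma)}\bigr),
```

where the outer product in $Z(s)$ runs over the primitive closed geodesics of the surface.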

Slides_Pollicott_2017.pdf

**17-10-09**

David J.T. Sumpter (Uppsala Universitet)

*The mathematics of collective behaviour, artificial life and simulated intelligence*

One of the most challenging questions in mathematical modelling is building models that explain how interactions on one scale produce patterns on another much larger scale. How do we explain the ant trail networks that cover several km and are produced by individual ants that are less than a cm long? How do we explain the motion of a bird flock or fish school when each individual follows only a few nearest neighbours? These are questions about how we integrate the behaviour of lots of little particles to explain the pattern as a whole. They are also examples where many of our favourite mathematical tools for adding up the behaviour of lots of individuals, such as the Central Limit Theorem, don’t work. The patterns generated are more than the sum of their parts.

I start by talking about these examples of collective behaviour and describe how combinations of models and experiments have started to answer this question. I then move on to a more challenging question: can we describe how self-reproducing organisms arise from simple chemical interactions? This problem is far from solved, but I present a mathematical model that tries to address it. Finally, given the difficulty of the 'artificial life' problem, I make a few comments on the even more ambitious goal of simulating human-like intelligence.

Slides_Sumpter_2017.pdf

**17-11-13 (N.B. in the room Pascal)**

Bo Berndtsson (Chalmers/GU)

*Real and Complex Brunn-Minkowski Theory*

The classical Brunn-Minkowski inequality is the cornerstone of convex geometry. It has many applications; the isoperimetric inequality, for instance, is a simple consequence. Many proofs have been given over the years -- for example, one given by Hilbert in 1912 to illustrate the power of his new theory of linear integral equations.
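For reference, the inequality in question states that for nonempty compact sets $A, B \subset \mathbb{R}^n$,

```latex
\operatorname{vol}(A + B)^{1/n}
\;\ge\;
\operatorname{vol}(A)^{1/n} + \operatorname{vol}(B)^{1/n},
```

where $A + B = \{\,a + b : a \in A,\ b \in B\,\}$ is the Minkowski sum; the isoperimetric inequality follows by taking $B$ to be a small ball.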

I will start with a soft introduction to the Brunn-Minkowski inequality for the volumes of convex bodies, and focus on the proof of Brascamp and Lieb from the 1970s. The Brascamp-Lieb inequality can be seen as a real-variable counterpart of Hörmander's famous $L^2$-estimate for the $\bar\partial$-operator in complex analysis. This leads us to a natural question: What is the complex analysis version of the Brunn-Minkowski inequality? I will argue that there is a natural counterpart which involves $L^2$-norms of holomorphic functions (or sections of vector bundles) instead of volumes. It reduces to the real version in the presence of enough symmetry and is of interest in complex analysis, algebraic geometry and Kähler geometry.

**17-12-04**

Rebecka Jörnsten (Chalmers/GU)

*On the concept of Data Depth*

The notion of depth of an observation can be used to provide non-parametric analyses of multivariate data (i.e., without an explicit model assumption). In essence, data depth generalizes the concept of the median to multivariate data. In addition, the depth of observations can be used to order observations with respect to a sample or a distribution, i.e. a generalization of quantiles and percentiles to multivariate data: how do we identify "extreme" observations in a higher-dimensional setting?
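To make the notion concrete, here is a minimal sketch (not from the talk, and the function name and parameters are illustrative) that approximates one common depth function, the halfspace (Tukey) depth, by random projections: the depth of a point is the smallest fraction of the sample lying in any closed halfspace that contains the point.

```python
import numpy as np

def tukey_depth(x, data, n_dirs=1000, seed=0):
    """Approximate halfspace (Tukey) depth of point `x` w.r.t. sample `data`:
    the minimum, over directions, of the fraction of sample points lying in a
    closed halfspace through `x`. Uses random unit directions (Monte Carlo)."""
    rng = np.random.default_rng(seed)
    dirs = rng.normal(size=(n_dirs, data.shape[0 if data.ndim == 1 else 1]))
    dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
    # Project each centred sample point onto each direction.
    proj = (data - x) @ dirs.T                       # shape (n_points, n_dirs)
    # For each direction, count points on either side and keep the smaller count.
    counts = np.minimum((proj >= 0).sum(axis=0), (proj <= 0).sum(axis=0))
    return counts.min() / len(data)
```

A central point of a symmetric cloud gets depth near 1/2 (the multivariate "median"), while a far-away outlier gets depth near 0, which is exactly the ordering from centre to extremes described above.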

In this talk, I present a review of recent developments in data depth; how it can be used for testing, classification and clustering. I conclude with an illustration where I use the idea of data depth to generate robust network estimates from genomic data. The "deepest network" achieves superior false-discovery-rate performance as well as better preservation of the true network topology compared to the commonly used threshold-based estimates (bootstrap edge-presence frequency).