Abstracts, see below. 
26/1, Sebastian Engelke, Ecole Polytechnique Fédérale de Lausanne, Robust bounds for multivariate extreme value distributions
2/2, Igor Rychlik, Spatio-temporal model for wind speed variability in Atlantic Ocean
9/2, Christophe A. N. Biscio, Aalborg University, The accumulated persistence function, a new functional summary statistic for topological data analysis, with a view to brain artery trees and spatial point process applications.
16/2, Måns Magnusson, Linköping University, Sparse Partially Collapsed MCMC for Parallel Inference in Topic Models
2/3, Henrike Häbel, Final seminar: Pairwise interaction in 3D – the interplay between repulsive and attractive forces
9/3, Ottmar Cronie, The second-order analysis of marked spatio-temporal point processes: applications to earthquake data
16/3, Krzysztof Bartoszek, Uppsala University, A punctuated stochastic model of adaptation
23/3, Artem Kaznatcheev, University of Oxford, The evolutionary games that cancers play and how to measure them
30/3, Robert Noble, ETH Basel, Evolution, ecology, and cancer risk: from naked mole rats to modern humans
6/4, John Wiedenhoeft, Rutgers University, Fast Bayesian Inference of Copy Number Variants using Hidden Markov Models with Wavelet Compression
18/5, Lloyd Demetrius, Harvard University, Max Planck Institut, An entropic selection principle in evolutionary processes
1/6, Daniel Nichol, The Institute of Cancer Research, Collateral Sensitivity is Contingent on the Repeatability of Evolution
8/6, Jie Yen Fan, Monash University, Limit theorems for size, age, type, and type structure dependent populations
12/10, Aila Särkkä, Anisotropy analysis of spatial point patterns
19/10, Özgür Asar, Acibadem University, Istanbul, Linear Mixed Effects Modelling for Non-Gaussian Repeated Measurement Data.
30/11, Philip Gerlee, Fourier series of stochastic processes: an application to spatial game theory
7/12, Julia Mortera, Università Roma Tre & University of Bristol, Inference about relationships from DNA mixtures.
14/12, David Bolin, The SPDE approach for Gaussian random fields with general smoothness
26/1, Sebastian Engelke, Ecole Polytechnique Fédérale de Lausanne: Robust bounds for multivariate extreme value distributions
Abstract: Univariate extreme value theory is used to estimate the value at risk of an asset in regions where few or no observations are available. It is based on the asymptotic result that the maximum of the data follows approximately a generalized extreme value distribution. Blanchet and Murthy (2016, http://arxiv.org/abs/1601.06858) recently studied worst case bounds for high exceedance probabilities that are robust against incorrect model assumptions of the extremal types theorem. For two or more dependent assets, multivariate extreme value theory provides an asymptotically justified framework for the estimation of joint exceedances. Here, the strength of dependence is crucial and it is typically modelled by a parametric family of distributions. In this work we analyse bounds that are robust against misspecification of the true dependence between assets. They arise as the explicit solution to a convex optimization problem and take a surprisingly simple form. In a financial context, these robust bounds can be interpreted as the worst-case scenarios of a systematic stress test. We show the importance of this approach in simulations and apply it to real data from finance. This is joint work with Jevgenijs Ivanovs (Aarhus University).
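As a concrete illustration of the univariate starting point described above (not of the robust bounds themselves), block maxima can be fitted with a generalized extreme value distribution. The data, block length and quantile below are invented for the sketch:

```python
# Block-maxima GEV fitting -- an illustrative sketch only, not the
# robust-bound method of the talk. Data are simulated heavy-tailed "returns".
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(0)
returns = rng.standard_t(df=4, size=250 * 20)    # 20 "years" of daily data
block_maxima = returns.reshape(20, 250).max(axis=1)

# Maximum-likelihood GEV fit (scipy's shape c equals minus the usual xi)
c, loc, scale = genextreme.fit(block_maxima)

# Estimated 100-year return level: the 0.99 quantile of the annual maximum
return_level = genextreme.ppf(0.99, c, loc=loc, scale=scale)
print(return_level)
```

The point of the abstract is precisely that such parametric extrapolation can be misspecified, which motivates the worst-case bounds.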
2/2, Igor Rychlik:  ​Spatio-temporal model for wind speed variability in Atlantic Ocean
Abstract: Investments in wind energy harvesting facilities are often high, while the uncertainties in the corresponding energy gains are large. A reliable model for the variability of wind speed is therefore needed to estimate the expected available wind power and return values, e.g. the 100-year wind speed, the expected length of wind conditions favourable for wind-energy harvesting, and other statistics of interest.
In the Northern Atlantic, wind speeds can be successfully modelled by means of a spatio-temporal transformed Gaussian field. Its dependence structure is localized by introducing time- and space-dependent parameters into the field. However, rarely occurring hurricanes in the Caribbean region are missed by this model. A new model is presented that covers both the Northern Atlantic and the Caribbean region where hurricanes occur.
The model has the advantage of a relatively small number of parameters. These parameters have a natural physical interpretation and are statistically fitted to represent the variability of observed wind speeds in the ERA-Interim reanalysis data set.
Some validations and applications of the model will be presented. Rice's method is employed to estimate the 100-year wind speed in a spatial region. This talk presents ongoing research.
16/2, Måns Magnusson, Linköping University: Sparse Partially Collapsed MCMC for Parallel Inference in Topic Models
Abstract: Topic models are widely used for probabilistic modelling of text. MCMC sampling from the posterior distribution is typically performed using a collapsed Gibbs sampler. We propose a parallel sparse partially collapsed Gibbs sampler and compare its speed and efficiency to state-of-the-art samplers for topic models. The experiments, which are performed on well-known small and large corpora, show that the expected increase in statistical inefficiency from only partial collapsing is smaller than commonly assumed. This minor inefficiency can be more than compensated for by the speed-up from parallelization on larger corpora. The proposed algorithm is fast, efficient, exact, and can be used in more modelling situations than the ordinary collapsed sampler. Work to speed up the computations further using the Pólya urn distribution will also be presented.
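For readers unfamiliar with the baseline the talk improves upon, here is a minimal (non-sparse, non-parallel) collapsed Gibbs sampler for LDA; the toy corpus, number of topics and hyperparameters are invented for illustration:

```python
# Ordinary collapsed Gibbs sampling for LDA -- the baseline sampler, not the
# parallel partially collapsed one proposed in the talk. Toy data only.
import numpy as np

docs = [[0, 1, 2, 1], [2, 3, 3, 0], [4, 4, 5, 4]]   # word ids per document
V, K = 6, 2                       # vocabulary size, number of topics
alpha, beta = 0.5, 0.1            # symmetric Dirichlet hyperparameters
rng = np.random.default_rng(1)

# Count matrices: document-topic, topic-word, topic totals
ndk = np.zeros((len(docs), K))
nkw = np.zeros((K, V))
nk = np.zeros(K)
z = [[int(rng.integers(K)) for _ in d] for d in docs]
for d, doc in enumerate(docs):
    for i, w in enumerate(doc):
        k = z[d][i]
        ndk[d, k] += 1; nkw[k, w] += 1; nk[k] += 1

for _ in range(200):              # Gibbs sweeps
    for d, doc in enumerate(docs):
        for i, w in enumerate(doc):
            k = z[d][i]           # remove current assignment from the counts
            ndk[d, k] -= 1; nkw[k, w] -= 1; nk[k] -= 1
            # full conditional for z_{d,i} with theta and phi integrated out
            p = (ndk[d] + alpha) * (nkw[:, w] + beta) / (nk + V * beta)
            k = int(rng.choice(K, p=p / p.sum()))
            z[d][i] = k
            ndk[d, k] += 1; nkw[k, w] += 1; nk[k] += 1

phi = (nkw + beta) / (nk[:, None] + V * beta)  # posterior-mean topic-word probs
```

The partially collapsed variant of the talk instead keeps the topic-word matrix as an explicit parameter, which is what makes document-level parallelism exact.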
2/3, Henrike Häbel, Final seminar: Pairwise interaction in 3D – the interplay between repulsive and attractive forces
The spatial statistical analysis of the structure in materials is crucial for understanding and controlling their properties and function. For example, the transport of body fluids in hygiene products or the release of a drug from an oral formulation depends strongly on the structure of the respective material. The material structure can be characterized in terms of its homogeneity and isotropy. In turn, characterization and modelling of the structure may provide valuable knowledge for ensuring high-quality products and treatments. We study the pairwise interaction between points in three dimensions, which may be particle positions in a gel or locations of pore branching points in a porous pharmaceutical coating. The basis for our approach is a pairwise interaction Gibbs point process, which can be defined in terms of a pair-potential function describing the pairwise interaction. In this talk, I will present how different pair-potential functions relate to different material properties. I will present and contrast estimation methods while comparing spatial pairwise interactions to physico-chemical attractive and repulsive forces. With this work, we want to contribute to cross-disciplinary research on the characterization and modelling of materials with a rather simple and efficient methodology that is applicable to a variety of materials.
This seminar will be given as a final seminar of my Ph.D. studies including a discussion led by Mats Rudemo.
9/3, Ottmar Cronie, The second-order analysis of marked spatio-temporal point processes: applications to earthquake data
Abstract: A marked spatio-temporal point process (MSTPP) is a random element used to describe the spatial evolution of a set of incidents over time, when there is also some event-specific piece of information present. Typical applications involve earthquakes, where the marks are the associated magnitudes, and disease occurrences, where the marks may be some classification such as sex or age.
This talk introduces so-called marked second-order reduced moment measures, and thereby also K-functions, for (inhomogeneous) MSTPPs, in order to analyse and explicitly quantify the dependence between points of different mark-categories in an MSTPP. These summary statistics are defined in a general form, so their definitions depend on the specific choices made for the mark-space and the mark-reference measure. We propose unbiased estimators as well as a new way of testing whether the marks are independent of their space-time locations.
By employing Voronoi intensity estimators to estimate the underlying intensity function, these new statistics are finally employed to analyse the well-known Andaman sea earthquake dataset. We confirm that clustering takes place between main and fore-/after-shocks at virtually all space and time scales. In addition, we find support for the idea that, conditionally on the space-time locations of the earthquakes, the magnitudes do not behave like an iid sequence.
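The marked spatio-temporal K-functions above generalize Ripley's classical K-function for stationary planar patterns. A minimal, non-edge-corrected estimator of the classical version might look as follows (an illustrative sketch, not the estimators proposed in the talk):

```python
# Naive Ripley K-function estimate for a planar point pattern, without edge
# correction -- illustration only. The simulated pattern stands in for data.
import numpy as np

rng = np.random.default_rng(2)
n, area = 200, 1.0
pts = rng.random((n, 2))          # CSR-like pattern on the unit square
lam = n / area                    # intensity estimate

d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
np.fill_diagonal(d, np.inf)       # exclude self-pairs

def K_hat(r):
    # Mean number of further points within distance r, scaled by intensity
    return (d < r).sum() / (n * lam)

# Under complete spatial randomness K(r) is about pi * r^2 (edge effects aside)
print(K_hat(0.05), np.pi * 0.05**2)
```

Edge correction (and, in the talk's setting, inhomogeneity, time and marks) is what makes the practically useful estimators more involved than this.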
16/3, Krzysztof Bartoszek, Uppsala University, A punctuated stochastic model of adaptation
Abstract: Contemporary stochastic evolution models commonly assume gradual change for a phenotype. However, the fossil record and biological theory suggest that phenotypic change is rather punctuated. In this setup one assumes that there are very short time intervals during which dramatic change occurs, and that between these the species are at stasis. Motivated by this, I will present a branching Ornstein-Uhlenbeck model with jumps at speciation points. I will in particular discuss a very recent result concerning weak convergence: for a classical central limit theorem to hold, dramatic change has to be a rare event.
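Along a single lineage, such a punctuated model might be sketched as an Ornstein-Uhlenbeck diffusion with rare jumps at Poisson event times; the parameters below are invented, and the branching structure of the actual model is omitted:

```python
# Single-lineage sketch: OU dynamics with rare, large jumps ("punctuation").
# Illustrative parameters only; the talk's model adds branching at speciation.
import numpy as np

rng = np.random.default_rng(3)
theta, mu, sigma = 1.0, 0.0, 0.5     # OU: mean reversion, optimum, diffusion
jump_rate, jump_sd = 0.2, 2.0        # rare but dramatic jumps
T, dt = 50.0, 0.01

x, path = 0.0, []
for _ in range(int(T / dt)):
    # Euler-Maruyama step of dX = theta * (mu - X) dt + sigma dW
    x += theta * (mu - x) * dt + sigma * np.sqrt(dt) * rng.standard_normal()
    if rng.random() < jump_rate * dt:    # punctuated change at a Poisson time
        x += jump_sd * rng.standard_normal()
    path.append(x)
path = np.array(path)
```

Between jumps the trajectory hovers near the optimum (stasis); the jumps supply the rare dramatic changes whose rarity the quoted central limit theorem requires.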
23/3, Artem Kaznatcheev, University of Oxford: The evolutionary games that cancers play and how to measure them
Abstract: Tumours are complex ecosystems with various cancerous and non-cancerous sub-populations of cells adopting a heterogeneous set of survival and propagation strategies. Evolutionary game theory (EGT) is a tool to make sense of such systems in both a theoretical and experimental setting. I will start with a minimal theoretical model that spatializes the 2-strategy pairwise Go-vs-Grow game and show that the edge effect of a static boundary (such as a blood vessel, organ capsule or basement membrane) allows a tumour without invasive phenotypes in the bulk to have a polyclonal boundary with invasive cells. Then I will consider a slightly more complicated theoretical study of a 3-strategy game that couples a public and a club good to model the social dilemmas of tumour acidity and vasculature, and allows us to explore treatment order and timing. But this raises the question: how do we know if the interactions are a Go-vs-Grow game or a double-goods game? In general, we guess these games from intuitions acquired by looking at population-level experiments. This looking can be formalized into an operational specification of an effective game. I will show how we measure such an effective game in ALK+ non-small cell lung cancer, and that, as in the theoretical study of the double-goods game, the presence or absence of treatment changes the qualitative type of game (and not just its quantitative details). However, this effective game collapses the local interactions (the reductive game) and the spatial structure into a single measurement. This is in contrast to a typical study of space in EGT, like the spatialized Go-vs-Grow game, which starts with the reductive game and then simulates that interaction over a graph (or other model of space) to show a surprising difference in dynamics between the spatial model and the inviscid mean-field. Hence, I advocate that we turn the typical space-in-EGT study on its head.
Let’s start from the measurement of an effective game and an operationalization of spatial structure to then provide a method for inferring the reductive game. I will sketch some ideas on how to approach this through time-lapse microscopy of the Petri dish.
30/3, Robert Noble, ETH Basel: Evolution, ecology, and cancer risk: from naked mole rats to modern humans
Abstract: Mathematical models of cancer initiation conventionally examine how rapidly a cell population acquires a certain number of mutational “hits” or “drivers”. However, to understand variation in cancer risk, we must consider two additional factors. First, ecology: just as the fate of a normal stem cell depends on its niche, the relative fitness of a transformed cell depends on its microenvironment. Second, organismal evolution: tissues that generate more mutations are at greater risk of invasive neoplasms and thus are expected to have evolved more effective cancer prevention mechanisms, such as tissue compartmentalization. Extrinsic factors (such as tobacco smoke, UV radiation, diet, and infections) can alter cancer risk by changing the microenvironment, as well as by influencing stem cell division rates or mutation rates. I will present a reanalysis of recently published data that reveals a major effect of anatomical location on cancer risk, compelling us to reframe the question of how much cancer is due to unavoidable “bad luck”. I will argue that most modern-day cancer is in fact due to environments deviating from central tendencies of distributions that have prevailed during cancer resistance evolution. Otherwise, cancer risk across the tree of life almost never exceeds ~5%. I will support this claim with case-study demographic models for humans and naked mole rats (until recently thought to be virtually immune to cancer). The overarching framework provides a basis for understanding how cancer risk is shaped by evolution and environmental change.
6/4, John Wiedenhoeft, Rutgers University: Fast Bayesian Inference of Copy Number Variants using Hidden Markov Models with Wavelet Compression
Hidden Markov Models (HMMs) play a central role in the statistical analysis of sequential data. For reasons of computational efficiency, classic frequentist methods such as Baum-Welch and Viterbi decoding are still widely used, despite drawbacks including only locally maximal likelihood estimates. While recent advances in computing power have increasingly enabled the use of Bayesian methods, these are not able to cope with sequences comprising multiple millions or billions of observations, as arise for example in whole-genome sequencing. We show that wavelet theory can be used to derive a simple dynamic data compression scheme that drastically improves the speed and convergence behaviour of Forward-Backward Gibbs sampling in HMMs, allowing for Bayesian inference of the marginal distributions of several million latent state variables in a matter of seconds to a few minutes.
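The Forward-Backward Gibbs sampler mentioned above builds on the classical forward-backward recursions for HMMs. A minimal, uncompressed version computing posterior state marginals (rather than drawing state-sequence samples, as the Gibbs variant does) is sketched below with toy parameters:

```python
# Forward-backward smoothing for a 2-state discrete HMM -- the uncompressed
# core recursion; all parameters and observations are toy values.
import numpy as np

A = np.array([[0.95, 0.05], [0.10, 0.90]])   # transition matrix
pi = np.array([0.5, 0.5])                     # initial distribution
B = np.array([[0.9, 0.1], [0.2, 0.8]])        # emission probs per state
obs = [0, 0, 1, 1, 1, 0]

n, S = len(obs), len(pi)
alpha = np.zeros((n, S)); beta = np.ones((n, S))
alpha[0] = pi * B[:, obs[0]]
alpha[0] /= alpha[0].sum()
for t in range(1, n):                         # forward recursion (normalised)
    alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
    alpha[t] /= alpha[t].sum()
for t in range(n - 2, -1, -1):                # backward recursion (normalised)
    beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])
    beta[t] /= beta[t].sum()

gamma = alpha * beta                          # combine the two passes
gamma /= gamma.sum(axis=1, keepdims=True)     # P(state_t | all observations)
```

The cost is linear in the sequence length, which is exactly what becomes prohibitive for billions of observations and motivates the wavelet compression of the talk.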
18/5, Lloyd Demetrius, Harvard University, Max Planck Institut: An entropic selection principle in evolutionary processes
The Entropic Selection Principle states that the outcome of competition between related populations of metabolic entities – macromolecules, cells, higher organisms – is contingent on the resource constraints, and determined by evolutionary entropy, a statistical measure of the diversity of pathways of energy flow between these interacting elements.
I will describe the mathematical basis for the selection principle, and then discuss its application to
(i)  the evolution of life history
(ii)  the origins and evolution of cooperation
(iii)  the origin and propagation of age-related diseases.
Ref: Boltzmann, Darwin and Directionality Theory, Physics Reports, vol. 530 (2013)
1/6, Daniel Nichol, The Institute of Cancer Research: Collateral Sensitivity is Contingent on the Repeatability of Evolution
Abstract: The emergence of drug resistance is governed by Darwinian evolution. Therapeutic strategies that account for these evolutionary principles are needed to maintain the efficacy of antibiotics or to overcome resistance to cancer therapies. One such strategy is the identification of drug sequences wherein the evolution of resistance to each drug sensitises the population to the next, a phenomenon known as collateral sensitivity. Recently, experimental evolution has been used to find such sequences. I present mathematical modelling, coupled with experimental validation, indicating that drug strategies designed through experimental evolution often suggest collateral sensitivity where cross-resistance ultimately occurs. This divergence in clinical response occurs because evolutionary trajectories are not necessarily repeatable in rugged fitness landscapes, and low-replicate-number experiments can miss divergent trajectories. Using empirically derived estimates for these fitness landscapes, I demonstrate the potential for model-driven design of drug sequences that minimise the likelihood of highly resistant disease arising. Finally, I explore the challenges of designing experimental studies to predict evolution and suggest an alternative approach that combines mathematical modelling, experimental evolution and distributed collection of clinical data.
8/6, Jie Yen Fan, Monash University: Limit theorems for size, age, type, and type structure dependent populations
We consider a multi-type population model in which individual lifespan and reproduction depend on the population size and composition, over ages and types of individuals. Under the assumption that the initial population is large and the dependence upon demographic conditions smooth, we give the Law of Large Numbers and the Central Limit Theorem for the population density, i.e. the population size and composition over ages and types normalised by the habitat carrying capacity, as the latter tends to infinity. The Law of Large Numbers, in its turn, yields a multi-dimensional generalised McKendrick-von Foerster differential equation, and the Central Limit Theorem shows that the limit process satisfies a new stochastic partial differential equation. The key to the analysis is Sobolev embedding. An important case is populations with two types, females and males, where only females reproduce, but with fertility influenced by the availability of males. Thus, explicit modelling of mating (which is often not evident, and even irrelevant, as in fish and algae) can be bypassed in the study of populations with sexual reproduction.
12/10, Aila Särkkä: Anisotropy analysis of spatial point patterns
Abstract: In the early spatial point process literature, observed point patterns were typically small and no repetitions were available. It was natural to assume that the patterns were realizations of stationary and isotropic point processes. Nowadays, large data sets with repetitions have become more and more common and it is important to think about the validity of these assumptions. Non-stationarity has received quite a lot of attention in recent years and it is straightforward to include it in many point process models. Isotropy, on the other hand, is often still assumed without further checking, and even though several tools have been suggested for detecting and testing for anisotropy, they have not been widely used.
This seminar will give an overview of nonparametric methods for anisotropy analysis of (stationary) point processes. Methods based on nearest neighbour and second order summary statistics as well as spectral and wavelet analysis will be discussed. The techniques will be illustrated on both a clustered and a regular example. Finally, one of the methods will be used to estimate the deformation history in polar ice using the measured anisotropy of air inclusions from deep ice cores.

14/12, David Bolin: The SPDE approach for Gaussian random fields with general smoothness
Abstract: A popular approach for modeling and inference in spatial statistics is to represent Gaussian random fields as solutions to stochastic partial differential equations (SPDEs) of the form L^β u = W, where W is Gaussian white noise, L is a second-order differential operator, and β > 0 is a parameter that determines the smoothness of u. However, this approach has been limited to the case 2β ∈ N, which excludes several important covariance models such as the exponential covariance on R^2.
We demonstrate how this restriction can be avoided by combining a finite element discretization in space with a rational approximation of the function x^(-β) to approximate the solution u. For the resulting approximation, an explicit rate of strong convergence is derived and we show that the method has the same computational benefits as in the restricted case 2β ∈ N when used for statistical inference and prediction. Several numerical experiments are performed to illustrate the accuracy of the method, and to show how it can be used for likelihood-based inference for all model parameters including β.
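For context, the covariance family behind this construction is the Matérn family: with L = κ² − Δ on R^d, the smoothness parameter is ν = 2β − d/2, and ν = 1/2 recovers the exponential covariance mentioned above. A small sketch verifying that special case:

```python
# Matern covariance function; nu = 1/2 reduces to the exponential covariance.
# Illustration of the covariance family only, not of the rational SPDE method.
import numpy as np
from scipy.special import gamma as G, kv

def matern(h, nu, kappa=1.0, sigma2=1.0):
    h = np.asarray(h, dtype=float)
    kh = kappa * np.where(h == 0, 1.0, h)      # avoid the Bessel singularity at 0
    body = sigma2 * 2 ** (1 - nu) / G(nu) * kh ** nu * kv(nu, kh)
    return np.where(h == 0, sigma2, body)      # C(0) = sigma2 by definition

h = np.linspace(0.0, 3.0, 7)
exp_cov = np.exp(-h)                           # exponential covariance, kappa = 1
print(np.max(np.abs(matern(h, nu=0.5) - exp_cov)))
```

On R^2, ν = 1/2 corresponds to β = 3/4, so 2β = 3/2 is not an integer; this is exactly the case the rational approximation makes tractable.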

Published: Tue 19 Dec 2017.