EVENTS

MAC-MIGS Deep Dive in High Dimensional Sampling and Applications by Prof. Gabriel Stoltz

1st November 2022

We are pleased to invite members of staff and PhD students to attend two tutorial lectures and a seminar given by Prof. Gabriel Stoltz as part of his visit to Edinburgh, on the topic of “Algorithms for sampling high dimensional probability measures, with applications in molecular modelling and Bayesian statistics”.

Gabriel Stoltz is a senior researcher at CERMICS and a professor at Ecole des Ponts. His work focuses on the mathematical and numerical analysis of models from molecular simulation, with a current emphasis on computational statistical physics. From a mathematical viewpoint, this involves techniques ranging from probability theory and the study of stochastic processes, to functional analysis and the theory of partial differential equations, as well as numerical analysis and scientific computing.

This is a hybrid event. The Zoom link is https://ed-ac-uk.zoom.us/j/81301145884 (passcode: 5ampling).

The times and location for the tutorials/seminar are as follows:

Timetable

1st of November: 13:00-14:15, Bayes 5.02, Tutorial Talk 1: An introduction to high-dimensional sampling
2nd of November: 13:00-14:15, Bayes 5.46, Tutorial Talk 2: Hypocoercivity and the long-time convergence of degenerate stochastic dynamics
3rd of November: 11:00-12:15, Bayes 5.02, Seminar: Reducing the mini-batching error in Bayesian inference using Adaptive Langevin dynamics

 

Registration is mandatory for the tutorials and recommended for the seminar. The registration link has already been sent out by email to all students and staff.

Note: All in-person slots have already been filled.

Lunch will be provided for registered attendees on all three days: before the talks on the 1st and 2nd, and after the talk on the 3rd.

The event is funded by MAC-MIGS [https://www.mac-migs.ac.uk].
MAC-MIGS is an EPSRC Centre for Doctoral Training based in Edinburgh, UK.

The abstracts for the talks can be found below.

You will receive an email once your participation has been confirmed.

Abstracts

 

Tutorial Talk 1
Title: An introduction to high-dimensional sampling

I will give a brief introduction to molecular dynamics (the computational implementation of the theory of statistical physics) and to Bayesian inference, two settings in which sampling a high-dimensional probability measure is required. Average properties for these two applications are typically obtained through ergodic averages of discretizations of certain stochastic differential equations. I will present the most popular stochastic dynamics used to this end, together with their numerical analysis.
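As an illustration of the idea of ergodic averages of a discretized stochastic differential equation (a sketch for orientation only, not material from the talk), the following assumes a standard Gaussian target, for which the unadjusted Langevin algorithm reads x_{k+1} = x_k - dt ∇U(x_k) + sqrt(2 dt) G_k:

```python
import numpy as np

# Sketch: Euler-Maruyama discretization of overdamped Langevin dynamics
# sampling exp(-U(x)) with U(x) = |x|^2 / 2 (a standard Gaussian in d dims),
# so grad_U(x) = x. The quantity of interest is the ergodic average of |x|^2.
rng = np.random.default_rng(0)
d, dt, n_steps = 10, 0.01, 200_000

def grad_U(x):
    return x  # gradient of U(x) = |x|^2 / 2

x = np.zeros(d)
running_sum = 0.0
for _ in range(n_steps):
    # one Euler-Maruyama step: drift -grad U, noise amplitude sqrt(2 dt)
    x = x - dt * grad_U(x) + np.sqrt(2 * dt) * rng.standard_normal(d)
    running_sum += np.dot(x, x)

# The time average of |x|^2 approximates E[|x|^2] = d = 10,
# up to a discretization bias of order dt and statistical error.
print(running_sum / n_steps)
```

The discretization bias (here of order dt) and the statistical error of the time average are exactly the quantities whose analysis the tutorial covers.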

Tutorial Talk 2
Title: Hypocoercivity and the long-time convergence of degenerate stochastic dynamics

The discussion will focus on the properties of Langevin dynamics, a degenerate stochastic differential equation which can be seen as a perturbation of Hamiltonian dynamics. From an analytical point of view, the generator of Langevin dynamics is a degenerate elliptic operator. The evolution of the law of the stochastic process is governed by the Fokker-Planck equation, and its long-time convergence can be obtained via hypocoercive techniques, which I will review. I will also present the implications of these analytical results in terms of error estimates for the computation of average properties of molecular systems, by estimating the asymptotic variance of time averages in a central limit theorem.
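For orientation, Langevin dynamics is commonly written as follows (a standard form with positions q, momenta p, potential V, mass matrix M, friction γ and inverse temperature β; the talk's notation may differ):

```latex
\begin{aligned}
dq_t &= M^{-1} p_t \, dt, \\
dp_t &= -\nabla V(q_t)\, dt - \gamma M^{-1} p_t \, dt
        + \sqrt{2\gamma\beta^{-1}}\, dW_t .
\end{aligned}
```

Since the noise acts only on the momenta, the generator is not elliptic but only hypoelliptic, which is why standard coercivity arguments fail and hypocoercive techniques are needed.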

Seminar
Title: Reducing the mini-batching error in Bayesian inference using Adaptive Langevin dynamics

The computational cost of standard Monte Carlo methods in Bayesian inference scales linearly with the number of data points. One way to reduce this cost is to resort to mini-batching to estimate the gradient. However, this introduces additional noise and hence a bias in the invariant measure. We advocate the so-called Adaptive Langevin dynamics, a modification of standard inertial Langevin dynamics with a dynamical friction that performs bias correction. Using techniques from hypocoercivity, we show that the law of Adaptive Langevin dynamics converges exponentially fast to equilibrium, with a rate that can be quantified in terms of the key parameters of the dynamics. Applications and extensions to Bayesian neural networks will also be discussed.
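The mechanism can be sketched in a toy setting (an illustrative sketch, not the speaker's code or parameters): the friction variable xi obeys a Nose-Hoover-type feedback and adapts to absorb noise of unknown amplitude, here added by hand to the gradient of U(q) = q^2/2 in one dimension with kT = 1:

```python
import numpy as np

# Sketch: Adaptive Langevin dynamics with a noisy ("mini-batched") gradient.
# The dynamical friction xi is driven by (p^2 - kT), so it settles wherever
# it must to balance the total (known + unknown) noise injected into p.
rng = np.random.default_rng(1)
dt, mu, sigma_A, noise_std = 0.01, 1.0, 1.0, 2.0
n_steps = 500_000

q, p, xi = 0.0, 0.0, 1.0
sum_q2 = 0.0
for _ in range(n_steps):
    noisy_grad = q + noise_std * rng.standard_normal()  # gradient noise
    p += dt * (-noisy_grad - xi * p) + sigma_A * np.sqrt(dt) * rng.standard_normal()
    q += dt * p
    xi += dt * (p * p - 1.0) / mu  # feedback drives the average of p^2 to kT = 1
    sum_q2 += q * q

# The time average of q^2 approximates E[q^2] = 1 despite the extra
# gradient noise, illustrating the bias correction.
print(sum_q2 / n_steps)
```

In a real Bayesian application the noisy gradient would come from subsampling the data; the feedback equation for xi is the same.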

Please note that the talks will not be recorded.

Organisers: Katerina Karoni, Stefan Klus, Benedict Leimkuhler