Professor Mark Girolami to present Medallion Lecture

Every year, the Committee on Special Lectures invites eight researchers to give a Medallion Lecture.

Professor Mark Girolami from Imperial’s Mathematics department will present a Medallion Lecture at the 2017 Joint Statistical Meetings in Baltimore, which take place July 29–August 4, 2017.

Mark Girolami is an EPSRC Established Career Research Fellow (2012–2018) and the Director of the Lloyd's Register Foundation Turing Programme on Data-Centric Engineering. An extended biography can be found on the Institute of Mathematical Statistics website.

A brief ceremony, at which each speaker receives a medallion, precedes the lecture.

Find out more about Professor Girolami's talk, entitled 'Probabilistic Numerical Computation: a Role for Statisticians in Numerical Analysis?', in the abstract below:

Consider the consequences of an alternative history. What if Leonhard Euler had happened to read the posthumous publication of Thomas Bayes's paper "An Essay towards solving a Problem in the Doctrine of Chances"? The paper was published in 1763 in the Philosophical Transactions of the Royal Society, so if Euler had read it, we can wonder whether the section on the numerical solution of differential equations in his three-volume book Institutionum calculi integralis, published in 1768, might have been quite different.

Would Euler's awareness of the "Bayesian" proposition of characterising uncertainty due to unknown quantities using the probability calculus have changed the development of numerical methods and their analysis into one that is more inherently statistical?

Fast forward the clock two centuries to the late 1960s in America, when the mathematician F.M. Larkin published a series of papers on the definition of Gaussian Measures in infinite dimensional Hilbert spaces, culminating in the 1972 work on "Gaussian Measure on Hilbert Space and Applications in Numerical Analysis". In that work the formal definition of the mathematical tools required to consider average-case errors in Hilbert spaces for numerical analysis was laid down, and methods such as Bayesian Quadrature or Bayesian Monte Carlo were developed in full, long before their independent reinvention in the 1990s and 2000s brought them to a wider audience.

Now, in 2017, viewing numerical analysis as a problem of statistical inference seems natural in many ways, and is being demanded by applied mathematicians, engineers and physicists who need to account carefully and fully for all sources of uncertainty in mathematical modelling and numerical simulation.

A research frontier has now emerged in scientific computation, founded on the principle that the error in numerical methods, which for example solve differential equations, entails uncertainty that ought to be subjected to statistical analysis. This viewpoint raises exciting challenges for contemporary statistical and numerical analysis, including the design of statistical methods that enable the coherent propagation of probability measures through a computational and inferential pipeline.
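To make the principle concrete, the short sketch below (an illustration for this article, not code from the lecture) implements the Bayesian Quadrature idea mentioned above: a Gaussian-process prior is placed on an integrand, and the resulting posterior over the integral has a mean that serves as the numerical estimate and a variance that quantifies the numerical error. The squared-exponential kernel, the length scale and the test integrand are all assumed for illustration.

```python
import numpy as np
from scipy.special import erf

# Bayesian Quadrature sketch: infer Z = \int_0^1 f(x) dx from a few
# evaluations of f, under a zero-mean Gaussian-process prior on f with a
# squared-exponential kernel. All specific choices here are illustrative.

ell = 0.2                                  # kernel length scale (assumed)
f = lambda x: np.sin(3 * x) + x**2         # test integrand (assumed)

def k(a, b):
    # Squared-exponential kernel with unit amplitude.
    return np.exp(-(a[:, None] - b[None, :])**2 / (2 * ell**2))

def kernel_mean(x):
    # z_i = \int_0^1 k(x', x_i) dx', in closed form for this kernel.
    c = ell * np.sqrt(np.pi / 2)
    s = ell * np.sqrt(2)
    return c * (erf((1 - x) / s) + erf(x / s))

# \int_0^1 \int_0^1 k(x, x') dx dx', also in closed form for this kernel.
kk = 2 * (ell * np.sqrt(np.pi / 2) * erf(1 / (ell * np.sqrt(2)))
          - ell**2 * (1 - np.exp(-1 / (2 * ell**2))))

x = np.linspace(0, 1, 8)                   # evaluation nodes
y = f(x)                                   # noiseless integrand evaluations
K = k(x, x) + 1e-10 * np.eye(len(x))       # jitter for numerical stability
z = kernel_mean(x)

w = np.linalg.solve(K, z)                  # Bayesian quadrature weights
mean = w @ y                               # posterior mean: estimate of Z
var = kk - z @ w                           # posterior variance: quantified error

# True value is (1 - cos 3)/3 + 1/3 ≈ 0.99666.
print(f"Z ≈ {mean:.5f} ± {np.sqrt(var):.1e}")
```

The posterior standard deviation is exactly the kind of statistically meaningful error estimate the abstract argues for, and it is what a downstream analysis would propagate coherently through a computational and inferential pipeline.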

Reporter

Claudia Cannon
Faculty of Natural Sciences

Contact details

Email: c.cannon@imperial.ac.uk
