Timetable (UNQW01)

Key UQ methodologies and motivating applications

Monday 8th January 2018 to Friday 12th January 2018

Monday 8th January 2018
09:00 to 09:50 Registration
09:50 to 10:00 Welcome from David Abrahams (INI Director)
10:00 to 11:00 Jim Berger (Duke University; University of Chicago)
Statistical perspectives on UQ, past and present
INI 1
11:00 to 11:30 Morning Coffee
11:30 to 12:30 Ralph Smith (North Carolina State University)
Uncertainty Quantification from a Mathematical Perspective
From both mathematical and statistical perspectives, the fundamental goal of Uncertainty Quantification (UQ) is to ascertain uncertainties inherent to parameters, initial and boundary conditions, experimental data, and models themselves to make predictions with improved and quantified accuracy. Some factors that motivate recent developments in mathematical UQ analysis include the following. The first is the goal of quantifying uncertainties for models and applications whose complexity precludes sole reliance on sampling-based methods. This includes simulation codes for discretized partial differential equation (PDE) models, which can require hours to days to run. Secondly, models are typically nonlinearly parameterized, thus requiring nonlinear statistical analysis. Finally, there is often emphasis on extrapolatory or out-of-data predictions; e.g., using time-dependent models to predict future events. This requires embedding statistical models within physical laws, such as conservation relations, to provide the structure required for extrapolatory predictions. Within this context, the discussion will focus on techniques to isolate subsets and subspaces of inputs that are uniquely determined by data. We will also discuss the use of stochastic collocation and Gaussian process techniques to construct and verify surrogate models, which can be used for Bayesian inference and subsequent uncertainty propagation to construct prediction intervals for statistical quantities of interest. The presentation will conclude with discussion pertaining to the quantification of model discrepancies in a manner that preserves physical structures.
INI 1
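The abstract above mentions stochastic collocation and surrogate models used to construct prediction intervals. As a hedged illustration only (not the speaker's code: the toy simulator, the Chebyshev nodes, the node count, and the 95% level are all assumptions of this sketch), a surrogate can be fitted from a handful of simulator runs and then sampled cheaply to propagate input uncertainty:

```python
import math
import random

# Toy "simulator": expensive in practice, cheap here.
def simulator(x):
    return math.exp(-x) * math.sin(2.0 * x)

# Collocation: run the simulator at Chebyshev nodes on [-1, 1] and build a
# Lagrange-interpolating surrogate through those runs.
n = 9
nodes = [math.cos((2 * k + 1) * math.pi / (2 * n)) for k in range(n)]
values = [simulator(x) for x in nodes]

def surrogate(x):
    total = 0.0
    for j, xj in enumerate(nodes):
        w = 1.0
        for m, xm in enumerate(nodes):
            if m != j:
                w *= (x - xm) / (xj - xm)
        total += values[j] * w
    return total

# Propagate a uniform input uncertainty through the cheap surrogate and read
# off an empirical 95% prediction interval for the output.
random.seed(0)
samples = sorted(surrogate(random.uniform(-1.0, 1.0)) for _ in range(2000))
mean = sum(samples) / len(samples)
lo = samples[int(0.025 * len(samples))]
hi = samples[int(0.975 * len(samples))]
```

In practice the surrogate would be a polynomial-chaos or Gaussian-process model over many inputs, and, as the abstract stresses, it would be verified against held-out simulator runs before being trusted for Bayesian inference.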
12:30 to 13:00 Poster Blitz I INI 1
13:00 to 14:00 Lunch @ Churchill College
14:00 to 15:00 Bertrand Iooss (Electricité de France; Université Paul Sabatier Toulouse III)
Uncertainty quantification of numerical experiments: Several issues in industrial applications
INI 1
15:00 to 16:00 Ahmed ElSheikh (Heriot-Watt University)
Machine learning techniques for uncertainty quantification of subsurface reservoir models
INI 1
16:00 to 16:30 Afternoon Tea
16:30 to 17:30 Richard Bradley (London School of Economics)
tba
INI 1
17:30 to 18:30 Welcome Wine Reception & Poster Session I at INI
Tuesday 9th January 2018
09:00 to 10:00 Michael Goldstein (Durham University)
Uncertainty analysis for complex systems modelled by computer simulators
This talk will present an overview of aspects of a general Bayesian methodology for performing detailed uncertainty analyses for complex physical systems which are modelled by computer simulators. Key features of this methodology are (i) simulator emulation, to allow us to explore the full range of outputs of the simulator; (ii) history matching, to identify all input choices consistent with historical data, and thus all future outcomes consistent with these choices; and (iii) structural discrepancy modelling, to make reliable uncertainty statements about the real world.
INI 1
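The history-matching step (ii) can be sketched in a few lines, assuming the standard implausibility measure with the conventional cutoff of 3. Everything concrete here is an assumption of the sketch (the toy model f standing in for an emulator mean, the datum z, and the variance values), not material from the talk:

```python
import math

# z: historical observation; f: stand-in for the emulator mean of the simulator.
def f(x):
    return x ** 2 + 1.0

z = 1.25
var_obs = 0.01 ** 2    # observation error variance
var_disc = 0.02 ** 2   # structural discrepancy variance
var_emul = 0.0         # emulator variance (zero here: we call f directly)

def implausibility(x):
    # How many standard deviations the emulator prediction sits from the datum.
    return abs(z - f(x)) / math.sqrt(var_obs + var_disc + var_emul)

# Wave 1: rule out all inputs whose implausibility exceeds the cutoff 3.
grid = [i / 100.0 for i in range(-100, 101)]
not_ruled_out = [x for x in grid if implausibility(x) <= 3.0]
```

Later waves would refit the emulator on the surviving region and repeat, shrinking the not-ruled-out space; the uncertainty statements about the real world then rest on the discrepancy term, not on the simulator alone.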
10:00 to 11:00 Clayton Webster (University of Tennessee; Oak Ridge National Laboratory)
tba
INI 1
11:00 to 11:30 Morning Coffee
11:30 to 12:30 Henry Wynn (London School of Economics)
Experimental design in computer experiments: review and recent research
Computer experiments have led to a growth in the development of certain types of experimental design which fill out the input space of a simulator in a comprehensive way: Latin Hypercube Sampling, Sobol sequences and many others. They differ from more traditional factorial experimental designs, which have typically been used to fit polynomial response surfaces. Despite this structural difference, the principles of good and indeed optimal design still apply, as do the tensions between general-purpose designs and designs tuned to particular models and utility functions.

The talk will be split between the fundamental principles of experimental design as applied to computer experiments and a review of notable methods from the research community. Some attention will be given to designs based on information-theoretic principles and the connection to more general theories of learning, where Bayesian principles are particularly useful.
INI 1
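Of the space-filling designs named above, Latin Hypercube Sampling is the simplest to sketch: each dimension is cut into n equal strata, each stratum receives exactly one point, and the strata are matched across dimensions by random permutation. A minimal sketch (the sample count, dimension, and seed are arbitrary choices):

```python
import random

def latin_hypercube(n, d, rng):
    # One point per stratum in every dimension; strata are paired across
    # dimensions via independent random permutations.
    cols = []
    for _ in range(d):
        perm = list(range(n))
        rng.shuffle(perm)
        cols.append([(p + rng.random()) / n for p in perm])
    return [tuple(col[i] for col in cols) for i in range(n)]

rng = random.Random(1)
design = latin_hypercube(10, 2, rng)
```

Unlike a factorial design, this fills the unit square with only 10 runs while guaranteeing stratified one-dimensional margins, which is why such designs suit expensive simulators.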
12:30 to 13:30 Lunch @ Churchill College
13:30 to 14:30 Lindsay Lee (University of Leeds)
UQ in earth sciences: applications and challenges
Co-authors: Ken Carslaw (University of Leeds), Carly Reddington (University of Leeds), Kirsty Pringle (University of Leeds), Graham Mann (University of Leeds), Oliver Wild (University of Lancaster), Edmund Ryan (University of Lancaster), Philip Stier (University of Oxford), Duncan Watson-Parris (University of Oxford)

I will introduce some of the applications of UQ in earth sciences and the challenges remaining that could be addressed during the programme. Earth science models are 3-d dynamic models whose CPU demands and data storage often limit the sample size for UQ. We often choose to use averages of the data and dimension reduction to carry out UQ, but it is not always clear that the uncertainty quantified is the most useful for uncertainty reduction or for increasing confidence in prediction. I will ask whether we should be applying the same techniques to understand and improve the model as those used to reduce uncertainty in predictions, showing some examples where the end goal is different. I will look at UQ when constraint or calibration is the goal, and at how we incorporate uncertainty and use ‘real’ data. This will also raise the question of identifiability in our uncertainty quantification, and of how to deal with, and accurately quantify, irreducible uncertainty. Finally, I would like to discuss how we validate our methods in a meaningful way.
INI 1
14:30 to 15:30 Peter Jan van Leeuwen (University of Reading)
tba
INI 1
15:30 to 16:00 Afternoon Tea
16:00 to 17:00 Jakub Bijak (University of Southampton)
Uncertainty quantification in demography: Challenges and possible solutions
Demography, with its over 350 years of history, is renowned for its empiricism, firm links with statistics, and the lack of a strong theoretical background. The uncertainty of demographic processes is well acknowledged, and over the past three decades, methods have been designed for quantifying the uncertainty in population estimates and forecasts. In parallel, developments in model-based demographic simulation methods, such as agent-based modelling, have recently offered a promise of shedding some light on the complexity of population processes. However, the existing approaches are still far from fulfilling this promise, and are themselves fraught with epistemological pitfalls. Crucially, complex problems are not analytically tractable with the use of traditional methods. In this talk, I will discuss the potential of uncertainty quantification in bringing together the empirical data, statistical inference and computer simulations, with insights into behavioural and social theory and knowledge of social mechanisms. The discussion will be illustrated by an example of a new research programme on model-based migration studies.
INI 1
Wednesday 10th January 2018
09:00 to 10:00 Julia Charrier (Aix Marseille Université)
A few elements of numerical analysis for PDEs with random coefficients of lognormal type
INI 1
10:00 to 11:00 David Ginsbourger (Universität Bern)
Variations on the Expected Improvement
INI 1
11:00 to 11:30 Morning Coffee
11:30 to 12:30 Howard Elman (University of Maryland)
Linear Algebra Methods for Parameter-Dependent Partial Differential Equations
We discuss some recent developments in solution algorithms for the linear algebra problems that arise from parameter-dependent partial differential equations (PDEs). In this setting, there is a need to solve large coupled algebraic systems (which come from stochastic Galerkin methods) or large numbers of standard spatially discrete systems (from Monte Carlo or stochastic collocation methods). The ultimate goal is surrogate approximations of the solution that can be evaluated cheaply for multiple values of the parameters and used effectively for simulation or uncertainty quantification.

Our focus is on representing parameterized solutions in reduced-basis or low-rank matrix formats. We show that efficient solution algorithms can be built from multigrid methods designed for the underlying discrete PDE, in combination with methods for truncating the ranks of iterates, which reduce both cost and storage requirements of solution algorithms. These ideas can be applied to the systems arising from many ways of treating the parameter spaces, including stochastic Galerkin and collocation. In addition, we present new approaches for solving the dense systems that arise from reduced-order models by preconditioned iterative methods and we show that such approaches can also be combined with empirical interpolation methods to solve the algebraic systems that arise from nonlinear PDEs. 
INI 1
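The reduced-basis idea in the abstract can be made concrete with a deliberately small sketch. Under strong simplifying assumptions (a diagonal parameter-dependent system so full solves are trivial, two snapshots, rank-2 Galerkin projection; none of this is taken from the talk), the offline/online split looks like this:

```python
import math

# Parameterized "full" system A(theta) u = b with A(theta) = diag(a + theta*c),
# standing in for a spatially discretized PDE.
n = 5
a = [2.0, 3.0, 4.0, 5.0, 6.0]
c = [1.0, 0.5, 0.25, 0.125, 0.0625]
b = [1.0] * n

def full_solve(theta):
    return [b[i] / (a[i] + theta * c[i]) for i in range(n)]

def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

# Offline stage: snapshot solutions at two parameter values, orthonormalized
# by Gram-Schmidt to form a reduced basis.
basis = []
for s in (full_solve(0.1), full_solve(0.9)):
    v = list(s)
    for q in basis:
        p = dot(v, q)
        v = [vi - p * qi for vi, qi in zip(v, q)]
    nrm = math.sqrt(dot(v, v))
    basis.append([vi / nrm for vi in v])

# Online stage: Galerkin projection gives a cheap 2x2 system for any theta.
def reduced_solve(theta):
    q1, q2 = basis
    d = [a[i] + theta * c[i] for i in range(n)]
    Aq1 = [d[i] * q1[i] for i in range(n)]
    Aq2 = [d[i] * q2[i] for i in range(n)]
    m11, m12 = dot(q1, Aq1), dot(q1, Aq2)
    m21, m22 = dot(q2, Aq1), dot(q2, Aq2)
    r1, r2 = dot(q1, b), dot(q2, b)
    det = m11 * m22 - m12 * m21
    c1 = (r1 * m22 - r2 * m12) / det
    c2 = (m11 * r2 - m21 * r1) / det
    return [c1 * q1[i] + c2 * q2[i] for i in range(n)]
```

The abstract's low-rank and preconditioning machinery addresses the serious version of this problem, where the full systems are large and the reduced systems dense; the point of the sketch is only the cheap online evaluation across parameter values.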
12:30 to 13:30 Lunch @ Churchill College
13:30 to 17:00 Free Afternoon
19:30 to 22:00 Formal Dinner at Christ's College
Thursday 11th January 2018
09:00 to 10:00 Markus Bachmayr (Rheinische Friedrich-Wilhelms-Universität Bonn)
Low-rank approximations for parametric and random PDEs
INI 1
10:00 to 11:00 Tony O'Hagan (University of Sheffield)
Gaussian process emulation
Gaussian process (GP) emulators are a widely used tool in uncertainty quantification (UQ). This talk will define an emulator as a particular kind of surrogate model, explain why GPs provide a tractable and flexible basis for constructing emulators, and set out some basic GP theory. There are a number of choices to be made in building a GP emulator, and these are discussed before going on to describe other uses of GPs in UQ and elsewhere.
INI 1
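A minimal GP emulator can be written down directly from the standard posterior-mean and posterior-variance formulas. This is a generic sketch, not the speaker's construction: the kernel, length-scale, design points, and toy function standing in for the simulator are all illustrative assumptions:

```python
import math

def kern(x1, x2, ell=0.3):
    # Squared-exponential covariance; the length-scale is an arbitrary choice.
    return math.exp(-0.5 * ((x1 - x2) / ell) ** 2)

def solve(A, rhs):
    # Gaussian elimination with partial pivoting (fine for tiny systems).
    n = len(A)
    M = [row[:] + [r] for row, r in zip(A, rhs)]
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        for r in range(i + 1, n):
            f = M[r][i] / M[i][i]
            for col in range(i, n + 1):
                M[r][col] -= f * M[i][col]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

# Design points and "simulator" runs (a toy function stands in for the code).
X = [0.0, 0.25, 0.5, 0.75, 1.0]
y = [math.sin(2.0 * math.pi * x) for x in X]

K = [[kern(xi, xj) + (1e-9 if i == j else 0.0)
      for j, xj in enumerate(X)] for i, xi in enumerate(X)]
alpha = solve(K, y)

def emulate(x):
    # Posterior mean and variance of the GP at an untried input x.
    ks = [kern(x, xi) for xi in X]
    mean = sum(k * a for k, a in zip(ks, alpha))
    w = solve(K, ks)
    var = kern(x, x) - sum(k * wi for k, wi in zip(ks, w))
    return mean, max(var, 0.0)
```

The emulator interpolates the runs (near-zero variance at design points) and reports growing uncertainty away from them; the choices the talk discusses (mean function, kernel family, hyperparameter estimation) all enter through what is hard-coded here.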
11:00 to 11:30 Morning Coffee
11:30 to 12:30 Masoumeh Dashti (University of Sussex)
The Bayesian approach to inverse problems
INI 1
12:30 to 13:30 Lunch @ Churchill College
13:30 to 14:30 Ron Bates
tba
INI 1
14:30 to 15:30 Ronni Bowman (Defence Science and Technology Laboratory)
UQ and Communication for Chemical and Biological Hazard Response
INI 1
15:30 to 16:00 Afternoon Tea
16:00 to 17:00 Richard Clayton (University of Sheffield)
Computational models of the heart: Why they are useful, and how they would benefit from UQ
INI 1
17:00 to 17:30 Poster Blitz II INI 1
17:30 to 18:30 Drinks Reception & Poster Session II at INI
Friday 12th January 2018
09:00 to 10:00 Robert Scheichl (University of Bath)
tba
INI 1
10:00 to 11:00 Mark Girolami (Imperial College London; The Alan Turing Institute)
Probabilistic Numerical Computation: a Role for Statisticians in Numerical Analysis?
Consider the consequences of an alternative history. What if Euler had read the posthumous publication of the paper by Thomas Bayes on “An Essay towards solving a Problem in the Doctrine of Chances”? This was published in 1763 in the Philosophical Transactions of the Royal Society, so if Euler had read this article, we can wonder whether the section on the numerical solution of differential equations in his three-volume book Institutionum calculi integralis, published in 1768, might have been quite different.

Would the awareness by Euler of the “Bayesian” proposition of characterising uncertainty due to unknown quantities using the probability calculus have changed the development of numerical methods and their analysis to one that is more inherently statistical?

Fast forward the clock two centuries. F. M. Larkin published a paper in 1972, “Gaussian Measure on Hilbert Space and Applications in Numerical Analysis”. Therein, the mathematical tools required to carry out probabilistic numerical analysis on Hilbert spaces were formally laid down, and methods such as Bayesian Quadrature or Bayesian Monte Carlo were developed in full.

Now the question of viewing numerical analysis as a problem of Statistical Inference in many ways seems natural and is being demanded by applied mathematicians, engineers and physicists who need to carefully and fully account for all sources of uncertainty in mathematical modelling and numerical simulation.

At present we have a research frontier that has emerged in scientific computation, founded on the principle that the error in numerical methods, for example those that solve differential equations, entails uncertainty that ought to be subjected to formal statistical analysis. This viewpoint raises exciting challenges for contemporary statistical and numerical analysis, including the design of statistical methods that enable the coherent propagation of probability measures through a computational and inferential pipeline.
INI 1
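The Bayesian Quadrature mentioned in the abstract treats an integral as an inference problem: place a GP prior on the integrand, condition on a few evaluations, and integrate the posterior mean in closed form. A sketch under illustrative assumptions (Gaussian kernel, fixed length-scale, equally spaced nodes, uniform measure on [0, 1], and a test integrand, none of which come from the talk):

```python
import math

ELL = 0.25  # kernel length-scale (illustrative)

def kern(x1, x2):
    return math.exp(-0.5 * ((x1 - x2) / ELL) ** 2)

def kern_int(xi):
    # Closed form for the kernel mean z_i = integral over [0,1] of k(x, x_i) dx.
    s = math.sqrt(2.0) * ELL
    return ELL * math.sqrt(math.pi / 2.0) * (
        math.erf((1.0 - xi) / s) + math.erf(xi / s))

def solve(A, rhs):
    # Gaussian elimination with partial pivoting (fine for tiny systems).
    n = len(A)
    M = [row[:] + [r] for row, r in zip(A, rhs)]
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        for r in range(i + 1, n):
            f = M[r][i] / M[i][i]
            for col in range(i, n + 1):
                M[r][col] -= f * M[i][col]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

# Condition the GP on 7 evaluations of the integrand, then integrate its mean:
# the BQ estimate is z^T K^{-1} f.
nodes = [i / 6.0 for i in range(7)]
fvals = [math.sin(math.pi * x) for x in nodes]
K = [[kern(xi, xj) + (1e-9 if i == j else 0.0)
      for j, xj in enumerate(nodes)] for i, xi in enumerate(nodes)]
alpha = solve(K, fvals)
bq_estimate = sum(kern_int(x) * a for x, a in zip(nodes, alpha))
```

Unlike a classical quadrature rule, the same GP also yields a posterior variance for the integral, which is exactly the kind of quantified numerical uncertainty the probabilistic-numerics programme argues should be propagated through a pipeline.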
11:00 to 11:30 Morning Coffee
11:30 to 12:30 Claudia Schillings (University of Warwick)
Analysis of Ensemble Kalman Inversion
The Ensemble Kalman filter (EnKF) has had enormous impact on the applied sciences since its introduction in the 1990s by Evensen and coworkers. It is used for both data assimilation problems, where the objective is to estimate a partially observed time-evolving system, and inverse problems, where the objective is to estimate a (typically distributed) parameter appearing in a differential equation. In this talk we will focus on the identification of parameters through observations of the response of the system - the inverse problem. The EnKF can be adapted to this setting by introducing artificial dynamics. Despite documented success as a solver for such inverse problems, there is very little analysis of the algorithm. We will discuss well-posedness and convergence results for the EnKF based on continuous-time scaling limits, which allow us to derive estimates on the long-time behavior of the EnKF and hence provide insights into the convergence properties of the algorithm. In particular, we are interested in the properties of the EnKF for a fixed ensemble size. Results from various numerical experiments supporting the theoretical findings will be presented. This is joint work with Dirk Bloemker (U Augsburg), Mike Christie (Heriot-Watt University), Andrew M. Stuart (Caltech) and Philipp Wacker (FAU Erlangen-Nuernberg).
INI 1
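The ensemble Kalman inversion loop analysed in the talk is short enough to sketch. This toy version makes strong assumptions not taken from the talk (a scalar parameter, a cheap monotone forward map standing in for a PDE solve, perturbed observations, a fixed number of iterations):

```python
import math
import random

rng = random.Random(42)

def forward(u):
    # Stand-in forward map G(u); in applications this is an expensive simulator.
    return u + 0.3 * math.sin(u)

u_true = 0.5
gamma = 1e-4              # observation noise variance
y = forward(u_true)       # noiseless datum, for simplicity

# Initial ensemble drawn from the prior N(0, 1).
J = 100
ens = [rng.gauss(0.0, 1.0) for _ in range(J)]

for _ in range(30):
    g = [forward(u) for u in ens]
    u_bar = sum(ens) / J
    g_bar = sum(g) / J
    # Empirical parameter-output covariance and output variance.
    c_ug = sum((u - u_bar) * (gi - g_bar) for u, gi in zip(ens, g)) / J
    c_gg = sum((gi - g_bar) ** 2 for gi in g) / J
    gain = c_ug / (c_gg + gamma)
    # Kalman-type update of each member towards a perturbed observation.
    ens = [u + gain * (y + rng.gauss(0.0, math.sqrt(gamma)) - gi)
           for u, gi in zip(ens, g)]

u_mean = sum(ens) / J
```

The derivative-free character is visible here: only forward evaluations enter, with the empirical covariances playing the role of sensitivities. The talk's continuous-time limits study what happens to exactly this iteration as the artificial time step is refined, at fixed ensemble size J.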
12:30 to 13:30 Lunch @ Churchill College
13:30 to 14:30 Richard Wilkinson (University of Nottingham)
tba
INI 1
14:30 to 15:30 Catherine Powell (University of Manchester); Peter Challenor (University of Exeter)
Discussion session
INI 1
15:30 to 16:00 Afternoon Tea