Timetable (CLPW04)

Uncertainty in Climate Prediction: Models, Methods and Decision Support

Monday 6th December 2010 to Friday 10th December 2010

Monday 6th December 2010
11:30 to 12:30 Registration INI 1
12:30 to 13:30 Lunch at Wolfson Court
13:30 to 13:55 Registration INI 1
13:55 to 14:00 Welcome from Sir David Wallace (INI Director) INI 1
14:00 to 15:00 K Horsburgh & J Huthnance ([National Oceanography Centre])
Overview of Newton Institute "climate" programme
This talk will give an overview of the activities, emerging issues and findings from the programme. These include:

  • the statistical combination of scenarios, of model ensembles and of parametrisation values, including constraints from observations, all with their related uncertainties
  • the approach to inferences from paleo-data
  • some developments in identifying approaches to "tipping points"
  • contributions that reduced models may make to our understanding
    INI 1
    15:00 to 15:30 Afternoon tea
    15:30 to 16:30 M Collins ([Exeter])
    Synthesising Model Projections of Future Climate Change
    It has never been a more exciting time to be a climate modeler. International coordination of model experiments by the Coupled Model Intercomparison Project (CMIP5) will see the estimated production of over 2 petabytes of model output from 20 modeling centers and 40 different climate models (downloading the data over a home broadband connection would take 25 years). There will be new model functionality in terms of the processes represented in the models including chemistry and biology, new forcing scenarios including palaeoclimate and idealized cases and new experiments initialized with observations to look at near-term climate variability and change. Moreover there is an unprecedented interest in, and scrutiny of, climate model projections from fellow scientists, from the public and from governments.

    How on earth are we to make sense of this information overload? This talk will review some of the approaches that we expect will be used to analyse the new CMIP5 multi-model database. Some approaches rely on physical understanding of the climate system to make sense of the data. Some use simple statistical approaches to rationalize the output. In some specific cases, more complex statistical approaches may be applied carefully. Finally, all the approaches will have to be synthesized to provide a summary of the state of climate modeling science. This challenge will be discussed.
    INI 1
    16:30 to 17:00 Break with tea
    17:00 to 18:00 T Palmer ([Oxford])
    After Climategate and Cancun: What Next for Climate Science?
    The last year has been a difficult time for climate science, with leaked emails undermining public confidence and perhaps contributing to the failure of Copenhagen to reach an agreement on emissions cuts. On top of this, mid-term elections in the US suggest it will be difficult for President Obama to carry into legislation any substantial agreements on emissions cuts that may be made in Cancun, making such agreements less likely in the first place.

    How does climate science move forward in the light of these events? The evidence above suggests, whether we like it or not, that the arguments of so-called climate "sceptics", namely that the cost of major emissions cuts is not justified given existing uncertainties in climate predictions, have substantial political traction. Hence I believe that we are unlikely to move from the current stalemate without further advancing the science of climate change, in particular without reducing these uncertainties substantially. But this is not an easy task. In this talk I will review why these uncertainties exist in the first place. Ultimately, as we shall see, the issue is mathematical: we know the equations of climate at the level of partial differential equations, but we do not know how to solve these equations without at the same time producing biases and errors which are as large as the climate signal we are trying to predict. I will outline two new areas of work, which have been a focus of activity at the Isaac Newton Institute over the last four months, designed to reduce uncertainty in climate prediction. One is in the area of stochastic closure schemes for climate models; the other is in the area of data assimilation.
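
    (As a flavour of the first area, and not anything presented in the talk itself: in a stochastic closure the deterministic sub-grid parametrisation is augmented by a noise process. The sketch below applies this idea to the standard two-scale Lorenz '96 testbed; the linear closure coefficients and the AR(1) noise parameters are invented for illustration.)

```python
import numpy as np

def l96_resolved_tendency(X, F=10.0):
    """Resolved Lorenz '96 tendency dX_k/dt, without the sub-grid term."""
    return (np.roll(X, -1) - np.roll(X, 2)) * np.roll(X, 1) - X + F

def step_stochastic_closure(X, e, rng, dt=0.005, a=-0.5, b=0.0, phi=0.95, sigma=0.3):
    """One Euler-Maruyama step: deterministic closure a*X + b plus AR(1) noise e."""
    e = phi * e + sigma * np.sqrt(1.0 - phi**2) * rng.standard_normal(X.size)
    dX = l96_resolved_tendency(X) + a * X + b + e
    return X + dt * dX, e

rng = np.random.default_rng(0)
X = 8.0 * np.ones(40) + 0.01 * rng.standard_normal(40)  # 40 resolved variables
e = np.zeros(40)                                        # sub-grid noise state
for _ in range(2000):
    X, e = step_stochastic_closure(X, e, rng)
```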

    Putting this new science into practice, however, is not straightforward, and will require new computing infrastructure hitherto unavailable to climate science. Hence, I will conclude with a plea to the governments of the world. Let's take the current stalemate of opinion as justifying a renewed effort to do all we humanly can to reduce existing uncertainties in predictions of climate change, globally and regionally, so we can move the argument forward, one way or the other, for the good of humanity. This will require a new sense of dedication both by scientists and by politicians around the world: by scientists to focus their efforts on the science needed to reduce uncertainties, and by politicians to fund the technological infrastructure needed to enable this science to be done as effectively and speedily as possible.
    INI 1
    18:00 to 18:45 Welcome wine reception
    18:45 to 19:30 Dinner at Wolfson Court (residents only)
    Tuesday 7th December 2010
    09:30 to 10:30 M Crucifix (Université Catholique de Louvain)
    Climate modelling at Quaternary time scales
    Climate changes at time scales of several (tens of) thousands of years, like glacial-interglacial cycles, may be viewed as an 'emergent feature' of the climate system. They can be understood at different levels: from a general circulation perspective (how changes in astronomical elements affect the hydrological, nutrient and carbon cycles); from a dynamical systems perspective (locate and characterize bifurcation points, detect synchronisation phenomena, identify couplings between different time scales...); or from a statistical perspective (estimate the probability of events and assess the predictability of the climate system at these time scales). The ambition of a general theory of Pleistocene climate is to merge these approaches.

    The recent mathematical developments reviewed during the present Newton Institute programme constitute promising avenues to this end. For example, statistical emulators allow us to explore in depth the input and parameter spaces of general circulation simulators, including their sensitivity to the astronomical forcing. Monte Carlo statistical methods allow us to calibrate low-order stochastic dynamical systems, and to guide the process of criticism and selection of models. The purpose of this talk is to summarise the advances gained during the Newton Institute along these lines.
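
    (A toy illustration of the dynamical-systems perspective, not a model from the talk: a self-sustained oscillator can phase-lock to a weak quasi-periodic 'astronomical' forcing, a crude analogue of glacial cycles synchronising to insolation. All parameters are invented.)

```python
import numpy as np

def run(forcing_amp, T=2000.0, dt=0.01, mu=1.0):
    """Euler-integrate a van der Pol oscillator x'' - mu(1-x^2)x' + x = f(t)."""
    x, v = 1.0, 0.0
    xs = np.empty(int(T / dt))
    for i in range(xs.size):
        t = i * dt
        f = forcing_amp * (np.sin(0.9 * t) + 0.5 * np.sin(0.9 * t / 2.3))
        x, v = x + dt * v, v + dt * (mu * (1.0 - x**2) * v - x + f)
        xs[i] = x
    return xs

free, forced = run(0.0), run(0.5)
# Comparing dominant peaks of np.fft.rfft(free) and np.fft.rfft(forced) shows
# the forced run adopting the forcing frequencies: a simple synchronisation test.
```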
    INI 1
    10:30 to 11:00 Morning coffee
    11:00 to 11:40 J Rougier ([Bristol])
    A statistical emulator for HadCM3
    As part of the PalaeoQUMP project, we would like to emulate HadCM3 North American Mid-Holocene summer temperature (MTWA) anomalies, at the resolution of the simulator (3.75 by 2.5 degrees). This will allow us to calibrate the parameterisation of HadCM3 to palaeoclimate measurements at the gridcell level, and to do continental-scale reconstructions based on these measurements.

    Emulation has been a major theme during this programme. In the broader field of Computer Experiments its utility is clear, and its application to simulators with scalar outputs has now become standard. However, jointly emulating a large and structured collection of simulator outputs, such as a spatial field of climate quantities, is much more challenging. During this programme we have made progress on the principles of such an emulation (notably the roles of dimensional reduction and of separability), on the computational aspects, and on building an actual emulator for HadCM3, as sketched below.
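
    (A generic sketch of the dimension-reduction strategy, not the PalaeoQUMP emulator itself: project an ensemble of spatial fields onto leading principal components, emulate each coefficient as a function of the simulator inputs, and reconstruct. Ensemble size, grid, kernel and the random stand-in data are all hypothetical.)

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)
params = rng.uniform(size=(60, 5))        # 60 runs, 5 input parameters
fields = rng.normal(size=(60, 96 * 72))   # stand-in for flattened model fields

pca = PCA(n_components=10).fit(fields)    # dimension reduction of the output
scores = pca.transform(fields)            # (60, 10) principal-component scores

# One independent GP emulator per retained component.
gps = [GaussianProcessRegressor(kernel=RBF(length_scale=0.3)).fit(params, scores[:, j])
       for j in range(scores.shape[1])]

def emulate(x):
    """Predict the full spatial field at a new parameter setting x, shape (5,)."""
    z = np.array([gp.predict(x[None, :])[0] for gp in gps])
    return pca.inverse_transform(z[None, :])[0]

field_hat = emulate(rng.uniform(size=5))
```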

    This talk will focus on the practical aspects of building an emulator, and show how our emulator can be used to learn about HadCM3, and to learn about the value of palaeoclimate measurements for parameter calibration.
    INI 1
    11:40 to 12:30 M Goldstein ([Durham])
    Assessing climate uncertainty: models, meaning and methods
    INI 1
    12:30 to 13:30 Lunch at Wolfson Court
    14:00 to 15:00 D Stephenson ([Exeter])
    Grand Challenges in Probabilistic Climate Prediction
    This talk will summarise the unsolved mathematical and statistical challenges in probabilistic climate prediction that have emerged from theme B of this programme and the successful satellite workshop organised by the University of Exeter in September 2010.

    Related Links
    • http://www.newton.ac.uk/programmes/CLP/clpw02p.html - Exeter workshop talks and videos
    • http://empslocal.ex.ac.uk/iniw - Exeter workshop wiki pages
    INI 1
    15:00 to 15:30 Afternoon tea
    15:30 to 16:30 Discussion INI 1
    16:30 to 17:00 R Chandler ([UCL])
    Rewarding strength, discounting weakness: combining information from multiple climate simulators
    Although modern climate simulators represent our best available understanding of the climate system, projections can vary appreciably between them. Increasingly therefore, users of climate projections are advised to consider information from an "ensemble" of different simulators or "multimodel ensemble" (MME).

    When analysing an MME the simplest approach is to average each quantity of interest over all simulators, possibly weighting each simulator according to some measure of "quality". This approach has two drawbacks. Firstly, it is heuristic: results can differ between weighting schemes, leaving users little better off than before. Secondly, no simulator is uniformly better than all others: if projections of several different quantities are required, the rankings of the simulators (and hence the implied weights) may differ considerably between quantities of interest.

    A more sophisticated approach is to use modern statistical techniques to derive probability density functions (pdfs) for the quantities of interest. However, no systematic attempt has yet been made to sample the range of possible modelling decisions in building an MME; it is therefore not clear to what extent the resulting "probabilities" are in any way relevant to the downstream user.

    This talk presents a statistical framework that addresses all of these issues, building on Leith and Chandler (2010). The emphasis is on conceptual aspects, although the framework has been applied in practice elsewhere. A mathematical analysis of the framework shows that:

    (a) Information from individual simulators is automatically weighted, alongside that from historical observations and from prior knowledge.
    (b) The weights reflect the relative value of different information sources for each quantity of interest. Thus each simulator is rewarded for its strengths, whereas its weaknesses are discounted.
    (c) The weights for an individual simulator depend on its internal variability, its expected consensus with other simulators, the internal variability of the real climate, and the propensity of simulators collectively to deviate from the real climate.
    (d) Some subjective judgements are inevitable.

    Reference: Leith, N.A. and Chandler, R.E. (2010). A framework for interpreting climate model outputs. J. R. Statist. Soc. C 59(2): 279-296.
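
    (A cartoon of the precision weighting in (c), under a deliberately simplified exchangeable-normal model. This illustrates the behaviour only; it is not the Leith and Chandler framework, and every number is invented.)

```python
import numpy as np

sims    = np.array([2.1, 3.0, 2.6, 3.4])   # invented simulator projections (K)
sim_var = np.array([0.4, 0.9, 0.3, 1.2])   # each simulator's internal variability
tau2    = 0.5        # variance of simulators' collective deviation from reality
mu0, v0 = 2.5, 4.0   # prior mean and variance for the true quantity

# Conjugate normal update: each simulator enters with precision 1/(sim_var + tau2),
# so precise simulators are "rewarded" and noisy ones automatically discounted.
w = 1.0 / (sim_var + tau2)
post_var  = 1.0 / (1.0 / v0 + w.sum())
post_mean = post_var * (mu0 / v0 + (w * sims).sum())
print(post_mean, post_var)
```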
    INI 1
    17:00 to 17:30 P Challenor ([National Oceanography Centre])
    Adventures in Emulation
    Emulators are widely used in climate modelling and in the analysis of computer experiments in general. In this talk I explore some ideas on how we might extend the way we build and exploit emulators to investigate climate problems. In particular, I will consider how extremal emulators can be used to look at extremes and how they change across the model input space, and how emulators can be used to improve the parameterisation of sub-grid-scale processes.
    INI 1
    18:45 to 19:30 Dinner at Wolfson Court (residents only)
    Wednesday 8th December 2010
    09:30 to 10:30 P Cox ([Exeter])
    New Newtonian Alchemy: Turning Noise into Signal
    This talk will summarise progress on the application of three strands of the Newton Institute Mathematical and Statistical Approaches to Climate Modelling and Prediction Programme, which are all connected through the general notion of turning noise into signal:

    (a) Maximum Entropy Production (MEP) principles of the climate system – application to surface-to-atmosphere turbulent fluxes.
    (b) Time-series techniques to give forewarning of climate tipping points – application to possible climate-driven Amazon forest dieback.
    (c) Fluctuation theories to determine climate system sensitivities – using interannual variability of CO2 as a constraint on the sensitivity of tropical land carbon to climate change.

    In the cases of (b) and (c), short-timescale variability is used to determine the vulnerability of the system to an abrupt change and to a warming-induced release of land carbon, respectively. In (a), the application of MEP to vertical turbulent fluxes puts these “noisy” fluxes at centre stage in the climate system, rather than treating them as a residual term in the surface energy balance.
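
    (Strand (b) typically rests on indicators of 'critical slowing down', such as rising lag-1 autocorrelation in a sliding window; a minimal sketch on synthetic data, with invented parameters:)

```python
import numpy as np

def lag1_autocorr(w):
    """Lag-1 autocorrelation of a mean-removed window."""
    w = w - w.mean()
    return (w[:-1] * w[1:]).sum() / (w * w).sum()

rng = np.random.default_rng(0)
n = 4000
phi = np.linspace(0.3, 0.97, n)        # slowly increasing persistence
x = np.zeros(n)
for t in range(1, n):                  # AR(1) series approaching a 'tipping point'
    x[t] = phi[t] * x[t - 1] + rng.standard_normal()

window = 400
ews = [lag1_autocorr(x[t - window:t]) for t in range(window, n, 50)]
# 'ews' trends upward well before phi reaches 1: the early-warning signal.
```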

    The Newton Institute programme has stimulated progress in all three of these areas. This talk will summarize the current state-of-play and suggest priorities for follow-on research.
    INI 1
    10:30 to 11:00 NS Namachchivaya ([Illinois at Urbana-Champaign])
    Homogenized Hybrid Particle Filters in Multi-scale Environments
    Multi-scale problems are commonly recognized for their complexity, yet the main challenge in multi-scale modeling is to recognize their simplicity, and to make use of it to see how information interacts with these complex structures and scales. This paper outlines efficient methods that combine information from observations with the dynamics of coupled ocean-atmosphere models, seeking to improve decadal climate predictions by developing, testing, and improving the initialization of different components of the Earth system, particularly the ocean.

    We present a reduced-order particle filtering algorithm, the Homogenized Hybrid Particle Filter (HHPF), for state estimation in nonlinear multi-scale dynamical systems. This method combines stochastic homogenization with nonlinear filtering theory to construct an efficient particle filtering algorithm for the approximation of a reduced-order nonlinear filter for multi-scale systems. In this work, we show that the HHPF gives a good approximation of the conditional law of the coarse-grained dynamics in the limit as the scaling parameter goes to zero and the number of particles is sufficiently large.
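
    (For orientation only, the sketch below is a bare-bones bootstrap particle filter on a scalar coarse-grained state; the HHPF adds the homogenization step, which is beyond a few lines. The dynamics, noise levels and observations are invented.)

```python
import numpy as np

rng = np.random.default_rng(0)

def propagate(x, dt=0.1):
    """Coarse-grained (homogenized) dynamics plus noise; an invented stand-in."""
    return x + dt * (x - x**3) + np.sqrt(dt) * 0.5 * rng.standard_normal(x.shape)

def particle_filter(ys, n_particles=1000, obs_var=0.25):
    x = rng.standard_normal(n_particles)            # initial ensemble
    means = []
    for y in ys:
        x = propagate(x)                            # forecast step
        logw = -0.5 * (y - x) ** 2 / obs_var        # log-likelihood of observation
        w = np.exp(logw - logw.max())
        w /= w.sum()
        means.append((w * x).sum())                 # filtered posterior mean
        x = rng.choice(x, size=n_particles, p=w)    # multinomial resampling
    return np.array(means)

est = particle_filter(ys=0.8 + 0.5 * rng.standard_normal(50))  # toy observations
```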
    INI 1
    11:00 to 11:30 Morning coffee
    11:30 to 12:00 R Warren ([East Anglia])
    The Community Integrated Assessment System, CIAS, and its user interface CLIMASCOPE
    Successful communication of knowledge to climate change policy makers requires the careful integration of scientific knowledge in an integrated assessment that can be clearly communicated to stakeholders, and which encapsulates the uncertainties in the analysis and conveys the need for a risk assessment approach. It is important that: (i) the system is co-designed with the users; (ii) relevant disciplines are included; (iii) assumptions made are clear; (iv) the robustness of outputs to uncertainties is demonstrated; (v) the system is flexible so that it can keep up with changing stakeholder needs; and (vi) the results are communicated clearly and are readily accessible.

    The "Community Integrated Assessment System" (CIAS) is a unique multi-institutional, modular, and flexible integrated assessment system for modelling climate change which fulfils the above criteria. It differs from other integrated models in being flexible, allowing various combinations of component modules to be connected together into alternative integrated assessment models. Modules may be written at different institutions in different computer languages and/or based on different operating systems. Scientists are able to determine which particular CIAS coupled model they wish to use through a web portal. This includes the facility to implement Latin hypercube experimental designs, facilitating formal uncertainty analysis. Further exploration of robustness is possible through the ability to select, for example, alternative climate models to address the same questions.

    CIAS has been applied to study climate change mitigation, for example through the AVOIDing dangerous climate change project for the UK Department of Energy and Climate Change (DECC), in which the avoided impacts (benefits) of alternative climate policies were compared to no-policy baselines. AVOID is a DECC-funded initiative which allows fine tuning of research tools and outputs for the use of policy makers. A second web portal, CLIMASCOPE, is being developed to allow a wide range of users free access to regional climate change projections and to encourage risk assessment through encapsulation of uncertainties. We have begun discussions on appropriate design and use of this tool with stakeholders located in Mexico and Madagascar. Both the AVOID and CLIMASCOPE discussions provide fora in which we can come to better understand stakeholder needs, and use this knowledge to guide the evolving design of CIAS.
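
    (The Latin hypercube facility mentioned above amounts to stratified sampling of each input dimension; a generic hand-rolled version, not CIAS code:)

```python
import numpy as np

def latin_hypercube(n_runs, n_params, rng=None):
    """One point per equal-probability stratum in each dimension, with the
    strata randomly paired across dimensions."""
    rng = rng or np.random.default_rng(0)
    strata = np.tile(np.arange(n_runs), (n_params, 1))  # (n_params, n_runs)
    strata = rng.permuted(strata, axis=1).T             # shuffle each dimension
    return (strata + rng.uniform(size=(n_runs, n_params))) / n_runs

design = latin_hypercube(50, 4)  # 50 runs in [0,1]^4; rescale to real ranges
```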
    INI 1
    12:00 to 12:30 M Aldrin (Norwegian Computing Centre)
    Bayesian estimation of the climate sensitivity based on a simple climate model fitted to global temperature observations

    Magne Aldrin (Norwegian Computing Center and University of Oslo); Marit Holden (Norwegian Computing Center); Peter Guttorp (Norwegian Computing Center and University of Washington, Seattle)

    The climate sensitivity is a central parameter in understanding climate change. It is defined as the increase in global temperature due to a doubling of CO2 compared to pre-industrial time. Our aim is to estimate the climate sensitivity by modelling the relationship between (estimates of) radiative forcing and observations of global temperature and ocean heat content in post-industrial time. Complex general circulation models are too computationally expensive for this purpose, so we instead use a simple climate model of reduced complexity. This climate model is deterministic, and we combine it with a stochastic model to do proper inference.

    Our combined model is

    y_t = m_t(x_{t-}, S, θ) + n_t

    Here, y_t is the observed vector of global temperature and ocean heat content in year t, and m_t is the corresponding output from the simple climate model. Furthermore, the model input x_{t-} is the unknown radiative forcing in year t and previous years, S is the climate sensitivity, which is the parameter of interest, and θ is a vector of other model parameters. Finally, n_t is an autoregressive error term accounting for model errors and measurement errors. We use a flat prior for the climate sensitivity and informative priors for most other parameters.
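
    (A minimal sketch of how such a model can be fitted by random-walk Metropolis, with a zero-dimensional energy-balance recursion standing in for m_t and an AR(1) innovation likelihood for n_t. All numbers, the forcing series, and the synthetic observations are invented; the real analysis is considerably richer.)

```python
import numpy as np

rng = np.random.default_rng(0)

def m(S, forcing, C=8.0, F2x=3.7):
    """Toy stand-in for m_t: energy balance with feedback lambda = F2x / S."""
    lam = F2x / S
    T = np.zeros(len(forcing))
    for t in range(1, len(forcing)):
        T[t] = T[t - 1] + (forcing[t] - lam * T[t - 1]) / C
    return T

def log_post(S, y, forcing, rho=0.6, sig=0.1):
    """Flat prior on S > 0; AR(1) error n_t with known rho, sig for simplicity."""
    if S <= 0:
        return -np.inf
    r = y - m(S, forcing)
    innov = r[1:] - rho * r[:-1]
    return -0.5 * np.sum(innov**2) / sig**2

forcing = np.linspace(0.0, 2.5, 158)                  # invented 1850-2007 forcing
y = m(3.0, forcing) + 0.1 * rng.standard_normal(158)  # synthetic observations

S, lp, chain = 2.0, -np.inf, []                       # random-walk Metropolis over S
for _ in range(5000):
    S_new = S + 0.3 * rng.standard_normal()
    lp_new = log_post(S_new, y, forcing)
    if np.log(rng.uniform()) < lp_new - lp:
        S, lp = S_new, lp_new
    chain.append(S)
```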

    The model was fitted to observations of global temperatures from 1850 to 2007 and of ocean heat content from 1955 to 2007. The work is still in progress, so the estimate of the climate sensitivity is preliminary. However, this preliminary estimate is a few degrees Celsius above zero, which is comparable with other estimates.

    We believe that this approach, in which physical knowledge and observed data are linked together by statistical modelling and estimation methods, is a valuable addition to other methods for estimating the climate sensitivity.

    From a statistical point of view, this is an example of the calibration of computer models, but with more emphasis on modelling the discrepancy between the observations and the computer model than on using an emulator or surrogate model for the computer model, which has been central in much of the recent work in this area.
    INI 1
    12:30 to 13:30 Lunch at Wolfson Court
    13:30 to 15:00 Posters with coffee INI 1
    15:00 to 15:30 Afternoon tea
    15:30 to 16:10 P Williams ([Reading])
    The importance of numerical time-stepping errors
    Comprehensive assessments of uncertainty in climate prediction models should in principle consider contributions from the discretised numerical schemes, as well as from the parameterised physics and other sources. The numerical contributions are often assumed to be negligible, however. This talk reviews the evidence for uncertainty arising from time-stepping schemes, and suggests a possible avenue for progress to reduce it.

    The context for the progress is that many climate models use the simple leapfrog scheme in concert with the Robert-Asselin filter. Unfortunately, this filter introduces artificial damping and degrades the formal accuracy, because of a conservation problem. A simple modification to the filter is proposed, which fixes the conservation problem, thereby reducing the artificial damping and increasing the formal accuracy.
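
    (For concreteness, a schematic implementation of filtered leapfrog time-stepping: alpha = 1 recovers the standard Robert-Asselin filter, while alpha around 0.53 is the kind of value discussed in the RAW-filter literature. The test problem and parameter values are invented.)

```python
import numpy as np

def leapfrog_raw(f, x0, dt, n_steps, nu=0.2, alpha=0.53):
    """Leapfrog integration of dx/dt = f(x) with the Robert-Asselin-Williams filter."""
    x_prev = x0.astype(float)
    x_curr = x_prev + dt * f(x_prev)                # Euler start-up step
    for _ in range(n_steps):
        x_next = x_prev + 2.0 * dt * f(x_curr)      # leapfrog step
        d = 0.5 * nu * (x_prev - 2.0 * x_curr + x_next)
        x_curr = x_curr + alpha * d                 # Robert-Asselin part
        x_next = x_next + (alpha - 1.0) * d         # Williams correction
        x_prev, x_curr = x_curr, x_next
    return x_curr

# Test problem: uniform rotation, dx/dt = (-omega*y, omega*x).
omega = 1.0
f = lambda x: np.array([-omega * x[1], omega * x[0]])
x_end = leapfrog_raw(f, np.array([1.0, 0.0]), dt=0.05, n_steps=2000)
# With alpha = 1 the amplitude decays artificially; alpha < 1 reduces the damping.
```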

    The Robert-Asselin-Williams (RAW) filter may easily be incorporated into existing climate models, via the addition of only a few lines of code that are computationally very inexpensive. Results will be shown from recent implementations of the RAW filter in various models. The modification will be shown to reduce model biases and to significantly improve the predictive skill.
    INI 1
    16:10 to 18:45 Discussions INI 1
    19:30 to 22:00 Conference Dinner at Christ's College
    Thursday 9th December 2010
    09:30 to 10:00 N Hritonenko ([Prairie View A&M])
    Engaging with policymakers: modelling of the optimal investments into environmental maintenance, abatement, and adaptation for long-term climate policies
    In order to design an efficient and sustainable climate policy, a crucial decision issue is to find the rational policy mix between environmental abatement and adaptation. An aggregated economic-environmental model is constructed that allows us to explore this topic analytically. The model is formulated in the classic Solow-Swan macroeconomic framework as an infinite-horizon social planner problem, with the adaptation and abatement investments as separate decision variables. The utility function depends on the environmental quality and adaptation efforts. A qualitative analysis of the model demonstrates the existence of a unique steady state and leads to a determination of the optimal balance between investing in adaptation measures and in emission abatement. The outcomes of the model investigation can be implemented in associated long-term environmental policies and regulations. The model is calibrated on available economic and climate data and predictions. Some other economic-environmental models, with endogenous technological change, energy restrictions, and environmental CO2 quotas, will also be briefly discussed. A schematic form of the planner problem is given below.
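
    (Schematically, and in hypothetical notation since the abstract does not fix one, such a planner problem takes the form:)

```latex
\max_{a(\cdot),\, b(\cdot)} \int_0^{\infty} e^{-\rho t}\, U\bigl(c(t),\, E(t),\, a(t)\bigr)\, dt
\quad \text{subject to} \quad
\dot K = sY(K) - \delta K - a - b, \qquad
\dot E = -\beta E + \varepsilon Y(K) - g(b),
```

    where K is capital, Y(K) output, E the environmental (pollution) stock, a and b the adaptation and abatement investments, and g(b) the abatement effect; the exact state equations used in the talk may differ.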
    INI 1
    10:00 to 11:00 R Street ([Oxford])
    Lessons learned from striving to support decision and policy making: the challenges when providing climate information
    The necessity to provide climate information (historical and future climates) to support decision and policy making, particularly in the context of adaptation, raises many challenges. These challenges arise because there is an increasing need for information that can inform decisions and policy development and that at the same time reflects the evolving state of climate science and of adaptation theory and practice. They include understanding uncertainties and the different means of reflecting these in adaptation assessments. Addressing these challenges often means understanding the nature of, and processes involved in, making the associated decisions and policies. It also means looking at the information from the perspective of extracting what is necessary to inform these processes, rather than just providing a description of the climate. There is also a need for decisions and policies to ‘embrace’ the uncertainties. This presentation explores some of the lessons learned and the challenges still to be addressed in achieving the desired balance and enhancing the value of climate information in supporting adaptation.
    INI 1
    11:00 to 11:30 Morning coffee
    11:30 to 12:00 M Edwards (CAFOD)
    Is it going to get wetter or drier in Uganda?
    The humanitarian and development sector is wrestling with the issue of climate change. Many of our beneficiaries in Africa, Asia and Latin America are reporting changes in weather and climate; yet are these changes a result of natural perturbations in climate, or a manifestation of anthropogenic climate change? Unless we know what we are dealing with, we may respond to the threat of a changing climate inappropriately and thereby create vulnerability rather than reduce it – the aim of our work. We therefore need answers from the scientists.

    Understandably, it is impossible for climate scientists to say whether any single weather event can be attributed to climate change, but we do need to know whether, for example, the current drought in East Africa is a result of climate change or a natural part of the cyclical climate of that region. If we cannot answer this ‘simple’ question, then one has to ask whether climate modelling can offer anything of value to those in the humanitarian and development sector who are trying to support adaptation strategies to climate change. We may as well look at past and current trends and adapt to what we see, rather than base our response strategies on abstract representations of the future which are fraught with uncertainty – climate models.

    Scientists are expert at saying what they cannot say about climate change, but they are notoriously bad at saying what can be said about its impacts. As such, those dealing with the impacts of climate change are starting to question the role of climate science in their work – does it really have a role to play in helping people adapt to climate change? If so, what is this role? This paper will ask the simple question: ‘Is it going to get wetter or drier in Uganda?’ It is hoped that the answer will help climate scientists appreciate more fully the challenges faced by the humanitarian and development sector as it responds to what appears to be an increasing incidence of ‘strange’ weather.
    INI 1
    12:00 to 12:30 P Thornton ([Copenhagen])
    What does the agricultural research-for-development community need from climate and weather data?
    Agricultural development in many parts of the tropics, particularly sub-Saharan Africa, faces daunting challenges, which climate change and increasing climate variability will compound in vulnerable areas. A plus-two degree world appears inevitable, and planning for and implementing successful adaptation strategies is critical if agricultural growth is to occur, food security be achieved, and household livelihoods be enhanced. Reliable climate and weather information, at various spatial and temporal scales, is essential in the design of effective strategies for adaptation to climate change and to better our understanding of the critical thresholds in global and regional food systems. Land-based observation and data collection systems in parts of the tropics have been in decline for decades, and the uncertainty in basic information adds considerable difficulty to the quantification and evaluation of impacts and adaptation options. While new technology is being brought to bear on the data issue, much remains to be done. Serious attention also needs to be given to the evaluation of uncertainty and its presentation to decision-makers, in a way that can facilitate the design and implementation of policy-relevant action.
    INI 1
    12:30 to 13:30 Lunch at Wolfson Court
    13:30 to 15:00 Posters with coffee INI 1
    15:00 to 15:30 Afternoon tea
    15:30 to 16:10 D Whitaker (Knowledge Transfer Network)
    Engagement with business: what are the barriers to the use of climate data, and where should future research be taken?
    This talk will give details of how financial services firms use climate models and how uncertainty matters to them, and will also discuss how future research might be directed.
    INI 1
    16:10 to 17:00 H Held ([PIK])
    Climate investments optimized under uncertainty
    While the EU has been promoting the 2° target over the last decade, the Copenhagen Accord (2009) now also recognizes that 'the increase in global temperature should be below 2 degrees Celsius' (compared to pre-industrial levels). In recent years, energy-economics analyses have derived welfare-optimal investment streams into low-emission energy mixes and their associated costs. According to our analyses, auxiliary targets that are in line with the 2° target could be achieved at relatively low cost if energy investments were triggered rather swiftly.

    While such analyses assume 'perfect foresight' on the part of a benevolent 'social planner', an accompanying suite of experiments explicitly acknowledges the rather uncertain nature of key responses to human decisions within the climate system as well as the technology system. We outline first results in that direction and indicate an intrinsic need for generalisation within target approaches under uncertainty.
    INI 1
    17:00 to 18:00 Discussions INI 1
    18:45 to 19:30 Dinner at Wolfson Court (residents only)
    Friday 10th December 2010
    09:00 to 09:15 Introduction to break-out sessions INI 1
    09:15 to 10:30 Break-out sessions 1 and 2 (e.g. emerging findings; effective dissemination) INI 1
    09:15 to 12:00 Break-out discussions INI 2
    10:30 to 11:00 Report-back (Plenary) INI 1
    11:00 to 11:30 Morning coffee
    11:30 to 12:10 Break-out session 3 (e.g. future directions) INI 1
    12:10 to 12:30 Report-back and wrap-up (Plenary) INI 1
    12:30 to 13:30 Lunch at Wolfson Court
    18:45 to 19:30 Dinner at Wolfson Court (residents only)