Beating the Curse of Dimensionality: A Theoretical Analysis of Deep Neural Networks and Parametric PDEs

Presented by: Gitta Kutyniok (Technische Universität Berlin)
Date: Thursday 20th June 2019, 14:20 to 15:10
Venue: INI Seminar Room 1
Abstract: 
High-dimensional parametric partial differential equations (PDEs) appear in various contexts, including control and optimization problems, inverse problems, risk assessment, and uncertainty quantification. In most such scenarios, the set of all admissible solutions associated with the parameter space is inherently low-dimensional. This fact forms the foundation of the so-called reduced basis method.
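For orientation, the underlying reduced basis property can be stated as follows; this is the standard formulation in the literature, in our own notation rather than taken verbatim from the talk. For a compact parameter set \(\mathcal{Y} \subset \mathbb{R}^p\), each parameter \(y\) determines a solution \(u(y)\) in a Hilbert space \(H\), and the solution manifold \(\{u(y) : y \in \mathcal{Y}\}\) admits a small basis \(\psi_1, \dots, \psi_d \in H\) with

\[
\sup_{y \in \mathcal{Y}} \; \min_{c \in \mathbb{R}^d} \Bigl\| u(y) - \sum_{i=1}^{d} c_i\,\psi_i \Bigr\|_H \le \varepsilon,
\]

where the basis size \(d = d(\varepsilon)\) grows only slowly as \(\varepsilon \to 0\) and, crucially, does not scale with the parameter dimension \(p\).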

Recently, numerical experiments have demonstrated the remarkable efficiency of deep neural networks in solving parametric problems. In this talk, we present a theoretical justification for this class of approaches. More precisely, we derive upper bounds on the complexity of ReLU neural networks approximating the solution maps of parametric PDEs. Exploiting the inherent low-dimensionality of the solution manifold, without requiring any knowledge of its concrete shape, we obtain approximation rates significantly superior to those provided by classical approximation results. Concretely, the low-dimensionality guarantees the existence of a reduced basis; for a large variety of parametric PDEs we then construct neural networks whose approximations of the parametric maps do not suffer from a curse of dimensionality and depend essentially only on the size of the reduced basis.
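To make the object of study concrete, the following toy sketch trains a ReLU network to approximate a parametric solution map for a simple one-dimensional problem. This is purely illustrative: the problem, network size, and training setup are our own choices, not the construction analyzed in the talk, whose results concern the existence and size of approximating networks rather than how to train them.

```python
# Illustrative sketch (not from the talk): approximate the parametric map
# y -> u(y) for the toy problem -u''(x) = y on (0,1), u(0) = u(1) = 0,
# whose exact solution is u(x; y) = y * x * (1 - x) / 2. The ReLU network
# maps the scalar parameter y to the solution sampled on a fixed grid.
import torch
import torch.nn as nn

torch.manual_seed(0)

n_grid = 32                       # number of spatial sample points
x = torch.linspace(0.0, 1.0, n_grid)

def solution(y):                  # exact solution map of the toy PDE
    return y * x * (1.0 - x) / 2.0

# Training data: parameters y in [0, 10] and corresponding sampled solutions.
ys = torch.rand(512, 1) * 10.0
us = torch.stack([solution(y) for y in ys.squeeze(1)])

model = nn.Sequential(            # small ReLU network: R -> R^{n_grid}
    nn.Linear(1, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, n_grid),
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

for step in range(2000):
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(ys), us)
    loss.backward()
    opt.step()

# Evaluate on an unseen parameter value.
y_test = torch.tensor([[7.3]])
err = (model(y_test).squeeze(0) - solution(7.3)).abs().max().item()
print(f"max pointwise error at y = 7.3: {err:.4f}")
```

In the setting of the talk, the output dimension would correspond to coefficients with respect to the reduced basis rather than raw grid values, which is what lets the network size depend essentially only on the reduced basis dimension.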

This is joint work with Philipp Petersen (Oxford), Mones Raslan, and Reinhold Schneider.