
Certified dimension reduction of the input parameter space of vector-valued functions

Presented by: 
Olivier Zahm (Massachusetts Institute of Technology)
Thursday 8th March 2018 - 14:45 to 15:30
INI Seminar Room 1
Co-authors: Paul Constantine (University of Colorado), Clémentine Prieur (University Joseph Fourier), Youssef Marzouk (MIT)

Approximation of multivariate functions is a difficult task when the number of input parameters is large. Identifying the directions in which the function does not vary significantly is a key preprocessing step for reducing the complexity of approximation algorithms.

Among other dimensionality reduction tools, the active subspace is defined by means of the gradient of a scalar-valued function. It can be interpreted as the subspace of the parameter space along which the function varies the most, on average. In this talk, we propose a natural extension of the active subspace to vector-valued functions, e.g., functions with multiple scalar-valued outputs or functions taking values in function spaces. Our methodology consists of minimizing an upper bound on the approximation error obtained using Poincaré-type inequalities.
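To make the construction concrete, here is a minimal NumPy sketch of the gradient-based recipe for a vector-valued map: estimate the matrix H = E[J(X)ᵀJ(X)] from Jacobian samples and take its dominant eigenvectors as the reduced subspace. The test function `f` and its `jacobian` below are hypothetical toy examples (not from the talk), chosen so that `f` is constant in its third input and the trailing eigenvalue of H vanishes.

```python
import numpy as np

# Hypothetical test function f: R^3 -> R^2, constant in x[2] by construction.
def f(x):
    return np.array([np.sin(x[0] + 0.1 * x[1]), np.cos(x[0])])

def jacobian(x):
    return np.array([
        [np.cos(x[0] + 0.1 * x[1]), 0.1 * np.cos(x[0] + 0.1 * x[1]), 0.0],
        [-np.sin(x[0]),             0.0,                              0.0],
    ])

d, n_samples = 3, 2000
rng = np.random.default_rng(0)
X = rng.standard_normal((n_samples, d))  # standard Gaussian input measure

# Monte Carlo estimate of H = E[J(X)^T J(X)]; its dominant eigenvectors
# span the (generalized) active subspace.
H = np.zeros((d, d))
for x in X:
    J = jacobian(x)
    H += J.T @ J
H /= n_samples

eigvals, eigvecs = np.linalg.eigh(H)        # ascending eigenvalues
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Since f does not depend on x[2], the smallest eigenvalue is numerically
# zero and the active subspace is (at most) 2-dimensional.
print(eigvals)
```

For a scalar-valued f this reduces to the classical active-subspace matrix E[∇f ∇fᵀ].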

We also compare the proposed gradient-based approach with the popular and widely used truncated Karhunen-Loève (KL) decomposition. We show that, from a theoretical perspective, truncated KL can be interpreted as minimizing a looser upper bound on the error than the one we derive. Numerical comparisons also show that better dimension reduction can be obtained when gradients of the function are available.
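The gap between the two approaches can be illustrated with a deliberately simple toy case (not the talk's exact construction): truncated KL ranks directions by the variance of the input alone, so it can discard a low-variance direction on which the function actually depends, whereas the gradient-based criterion detects that dependence.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy setup: 2-D Gaussian input whose first coordinate carries almost all
# of the variance, but a function that depends only on the second,
# low-variance coordinate.
C = np.diag([4.0, 0.25])             # input covariance
L = np.linalg.cholesky(C)
X = (L @ rng.standard_normal((2, 5000))).T

def grad_f(x):                       # f(x) = sin(x[1]) => gradient (0, cos(x[1]))
    return np.array([0.0, np.cos(x[1])])

# Truncated KL keeps the dominant eigenvector of C: the x[0] axis
# (up to sign), even though f is constant along it.
kl_dir = np.linalg.eigh(C)[1][:, -1]

# The gradient-based criterion instead uses H = E[grad f grad f^T],
# whose dominant eigenvector is the x[1] axis (up to sign).
H = np.mean([np.outer(grad_f(x), grad_f(x)) for x in X], axis=0)
as_dir = np.linalg.eigh(H)[1][:, -1]

print("KL direction:      ", kl_dir)
print("gradient direction:", as_dir)
```

In this example KL truncation to one dimension loses all information about f, while the gradient-based subspace captures it exactly.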