
Explicit error bounds for randomized Smolyak algorithms and an application to infinite-dimensional integration

Presented by: 
Michael Gnewuch
Thursday 21st February 2019 - 11:00 to 11:35
INI Seminar Room 1
Smolyak's method, also known as hyperbolic cross approximation or the sparse grid method, is a powerful black-box tool for tackling multivariate tensor product problems with the help of efficient algorithms for the corresponding univariate problem. We provide upper and lower error bounds for randomized Smolyak algorithms with fully explicit dependence on the number of variables and the number of information evaluations used. The error criteria we consider are the worst-case root mean square error (the typical error criterion for randomized algorithms, often referred to as the "randomized error") and the root mean square worst-case error (often referred to as the "worst-case error").

Randomized Smolyak algorithms can be used as building blocks for efficient methods, such as multilevel algorithms, multivariate decomposition methods, or dimension-wise quadrature methods, to successfully tackle high-dimensional or even infinite-dimensional problems. As an example, we provide a very general and sharp result on infinite-dimensional integration on weighted reproducing kernel Hilbert spaces and illustrate it for the special case of weighted Korobov spaces. We explain how this result can be extended, e.g., to spaces of functions whose smooth dependence on successive variables increases ("spaces of increasing smoothness") and to the problem of L_2-approximation (function recovery).
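To illustrate the construction the abstract refers to, here is a minimal sketch of the (deterministic) Smolyak combination technique for quadrature on the unit cube, built from univariate trapezoidal rules. All names (`smolyak_quadrature`, `univariate_rule`) and the choice of trapezoidal rules are illustrative assumptions, not taken from the talk; the randomization discussed in the talk (e.g., randomly shifted univariate rules) is omitted for brevity.

```python
# Sketch: Smolyak combination technique for quadrature on [0,1]^d.
# Assumption: univariate building blocks are trapezoidal rules; the talk's
# randomized variants would replace these with randomized univariate rules.
from itertools import product
from math import comb

import numpy as np

def univariate_rule(level):
    """Trapezoidal rule with 2**level + 1 equispaced points on [0, 1]."""
    n = 2**level + 1
    x = np.linspace(0.0, 1.0, n)
    w = np.full(n, 1.0 / (n - 1))
    w[0] *= 0.5   # endpoint weights are halved
    w[-1] *= 0.5
    return x, w

def smolyak_quadrature(f, d, q):
    """Approximate the integral of f over [0,1]^d at sparse-grid level q >= d.

    Uses the combination technique: a signed sum of tensor-product rules
    over multi-indices i with q - d + 1 <= |i| <= q.
    """
    total = 0.0
    for i in product(range(1, q + 1), repeat=d):
        s = sum(i)
        if not (q - d + 1 <= s <= q):
            continue
        coeff = (-1) ** (q - s) * comb(d - 1, q - s)
        xs, ws = zip(*(univariate_rule(level) for level in i))
        # Evaluate the tensor-product rule for this multi-index.
        for point, weight in zip(product(*xs), product(*ws)):
            total += coeff * np.prod(weight) * f(np.array(point))
    return total
```

For example, `smolyak_quadrature(lambda p: p[0] * p[1], 2, 3)` recovers 0.25 exactly, since the trapezoidal rule is exact for linear factors; for smoother nonlinear integrands, increasing `q` refines the sparse grid while using far fewer points than the full tensor product.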