
Principal component analysis for learning tree tensor networks

Presented by: Anthony Nouy (Université de Nantes)
Friday 9th March 2018 - 11:45 to 12:30
INI Seminar Room 1
We present an extension of principal component analysis to functions of multiple random variables, together with an associated algorithm for approximating such functions in tree-based low-rank formats (tree tensor networks). A multivariate function is here considered as an element of a Hilbert tensor space of functions defined on a product set equipped with a probability measure.

The algorithm requires only evaluations of the function on a structured set of points, which is constructed adaptively. It builds a hierarchy of subspaces associated with the nodes of a dimension partition tree, together with a corresponding hierarchy of projection operators based on interpolation or least-squares projection. Optimal subspaces are estimated by empirical principal component analysis of interpolations of partial random evaluations of the function. The algorithm can provide an approximation in any tree-based format with either a prescribed rank or a prescribed relative error, using a number of evaluations of the order of the storage complexity of the approximation format.
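As a minimal sketch of the empirical-PCA building block in the simplest (bivariate) case: the function f, the point set, and the sample size below are hypothetical stand-ins, and the actual algorithm works on a general dimension partition tree and constructs its evaluation points adaptively. The sketch estimates principal subspaces from a matrix of pointwise evaluations via an SVD, truncating either at a prescribed rank or at a prescribed relative error.

```python
import numpy as np

# Hypothetical smooth bivariate function standing in for the target;
# the method in the talk handles functions of many variables.
def f(x, y):
    return np.exp(-x * y) + np.sin(x + y)

rng = np.random.default_rng(0)

# Evaluate f on a structured (tensorized) set of points. Here the points
# are drawn at random; the algorithm in the talk builds them adaptively.
n = 50
xs = rng.uniform(0.0, 1.0, n)
ys = rng.uniform(0.0, 1.0, n)
F = f(xs[:, None], ys[None, :])          # n x n matrix of evaluations

# Empirical principal component analysis of the evaluation matrix via SVD:
# the leading left singular vectors span the estimated principal subspace.
U, s, Vt = np.linalg.svd(F, full_matrices=False)

# Option 1: prescribed rank r.
r = 3
F_rank = U[:, :r] * s[:r] @ Vt[:r, :]

# Option 2: prescribed relative error eps (Frobenius norm). Keep the
# smallest rank whose discarded singular values fall below the tolerance.
eps = 1e-8
tail = np.sqrt(np.cumsum(s[::-1] ** 2))[::-1]   # tail[k] = ||s[k:]||_2
r_eps = int(np.sum(tail > eps * tail[0]))       # smallest admissible rank
F_eps = U[:, :r_eps] * s[:r_eps] @ Vt[:r_eps, :]
```

By the Eckart–Young theorem the truncation error equals the norm of the discarded singular values, so the prescribed-error variant satisfies its tolerance by construction; the tree-based algorithm applies this subspace estimation recursively at each node of the partition tree.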