The Nyström extension and spectral methods in learning: low-rank approximation of quadratic forms and products
Seminar Room 1, Newton Institute
Spectral methods are of fundamental importance in statistics and machine learning, as they underlie algorithms from classical principal components analysis to more recent approaches that exploit manifold structure. In most cases, the core technical problem can be reduced to computing a low-rank approximation to a positive-definite kernel. Motivated by such applications, we present here two new algorithms for the approximation of positive semi-definite kernels, together with error bounds that improve upon known results. The first of these, based on sampling, leads to a randomized algorithm whereby the kernel induces a probability distribution on its set of partitions, whereas the latter approach, based on sorting, provides for the selection of a partition in a deterministic way. After detailing their numerical implementation and verifying performance via simulation results for representative problems in statistical data analysis, we conclude with an extension of these results to the sparse representation of linear operators and the efficient approximation of matrix products.
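For readers unfamiliar with the technique the title refers to, the following is a minimal sketch of the classical Nyström low-rank approximation of a positive semi-definite kernel matrix, using NumPy. It is an illustration of the standard landmark-sampling construction, not of the specific algorithms or error bounds presented in the talk; the kernel, data, and number of landmarks are arbitrary choices for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Example data and an RBF (Gaussian) kernel matrix K, which is
# positive semi-definite by construction.
X = rng.standard_normal((200, 5))
sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
K = np.exp(-0.5 * sq_dists)

# Classical Nystrom approximation: sample m landmark indices,
# form C = K[:, idx] and W = K[idx, idx], and approximate
#   K  ~=  C @ pinv(W) @ C.T
m = 40
idx = rng.choice(K.shape[0], size=m, replace=False)
C = K[:, idx]
W = K[np.ix_(idx, idx)]
K_approx = C @ np.linalg.pinv(W) @ C.T

# Relative Frobenius-norm error of the rank-m approximation.
rel_err = np.linalg.norm(K - K_approx) / np.linalg.norm(K)
print(rel_err)
```

The approximation costs O(nm^2) rather than the O(n^3) of a full eigendecomposition, which is the efficiency gain that motivates sampling-based spectral methods in the first place.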