
Applications of approximate inference and experimental design for sparse (generalised) linear models

Date: 
Friday 27th June 2008 - 11:30 to 12:30
Venue: 
INI Seminar Room 1
Session Chair: 
Mike Titterington
Abstract: 

Sparsity, or more general sub-Gaussianity, is a fundamental regularisation principle for high-dimensional statistics. A recent surge of activity has clarified the behaviour of efficient sparse estimators in the worst case, but much less is known about practically efficient approximations to Bayesian inference, which is required for higher-level tasks such as experimental design.

We present an efficient framework for Bayesian inference on generalised linear models with sparsity priors, based on the expectation propagation algorithm, a deterministic variational approximation. We highlight some applications where this framework produces promising results. We hope to convey the practical relevance of approximate inference methods, which go substantially beyond point estimation, yet whose theoretical properties and algorithmic scalability remain insufficiently understood.
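
To make the setup concrete, here is a minimal sketch (not taken verbatim from the talk) of a sparse linear model with a Laplace prior and a Gaussian expectation-propagation approximation; the symbols X, y, w, tau and the site terms are notation introduced here for illustration only.

\[
  y = Xw + \varepsilon, \quad \varepsilon \sim \mathcal{N}(0, \sigma^2 I), \qquad
  p(w) = \prod_{j=1}^{d} \tfrac{\tau}{2}\, e^{-\tau |w_j|},
\]
\[
  p(w \mid y) \;\propto\; \mathcal{N}(y \mid Xw, \sigma^2 I)\, \prod_{j=1}^{d} p(w_j)
  \;\approx\; Q(w) = \mathcal{N}(w \mid m, \Sigma).
\]

In such a scheme, expectation propagation replaces each non-Gaussian Laplace site p(w_j) by a Gaussian term \(\tilde{t}_j(w_j) = \mathcal{N}(w_j \mid \mu_j, \sigma_j^2)\), refined iteratively so that Q matches the moments of each tilted distribution. For sequential experimental design, a candidate measurement x_* could then be scored by its approximate information gain under Q, for example
\[
  \Delta(x_*) = \tfrac{1}{2} \log\!\bigl(1 + x_*^{\top} \Sigma\, x_* / \sigma^2\bigr).
\]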
