Expectation Propagation -- Experimental Design for the Sparse Linear Model
Seminar Room 2, Newton Institute Gatehouse
Expectation propagation (EP) is a novel variational method for approximate Bayesian inference, which has given promising results in terms of computational efficiency and accuracy in several machine learning applications. It can readily be applied to inference in linear models with non-Gaussian priors, generalised linear models, or nonparametric Gaussian process models, among others, yet, to our knowledge, it has not so far been used in Statistics. I will give an introduction to this framework. I will then show how to address sequential experimental design for a linear model with non-Gaussian sparsity priors, presenting results from two different machine learning applications. These results indicate that experimental design for such models may have significantly different properties from that for linear-Gaussian models, where Bayesian inference is analytically tractable and experimental design seems best understood. EP as a statistical approximation technique, and especially experimental design for models beyond the linear-Gaussian case, are not well understood theoretically. To advance this understanding, it seems promising to relate them to work in Statistics on multivariate continuous-variable distributions, and I very much hope for feedback from the audience in that respect.