Improvements to variational Bayesian inference

Presented by: WH Teh [UCL]
Wednesday 26th March 2008 - 11:00 to 12:00
INI Seminar Room 2

Variational Bayesian (VB) inference is an approximate inference framework that has been successfully applied to a wide variety of graphical models. It is generally accepted that, compared with Markov chain Monte Carlo (MCMC) inference, VB trades higher bias for lower variance in posterior estimation. In this talk we explore improvements to the VB framework that reduce this bias, in the context of a specific Bayesian network: latent Dirichlet allocation. Specifically, we consider two ideas: collapsing, i.e. integrating out variables before any approximations are made, and hybrid methods that combine VB and MCMC techniques.
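To make the collapsing idea concrete, the following is a minimal sketch of a collapsed variational update for latent Dirichlet allocation, in the style of the zeroth-order (CVB0) fixed-point iteration: the topic and document-topic distributions are integrated out, and each token keeps a variational distribution over topics that is updated from expected counts with that token's own contribution removed. The toy corpus, hyperparameters, and variable names here are illustrative assumptions, not material from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy corpus: each document is a list of word ids.
docs = [[0, 1, 1, 2], [2, 3, 3, 0], [1, 2, 0, 3]]
V, K = 4, 2            # vocabulary size, number of topics (assumed)
alpha, beta = 0.1, 0.1  # symmetric Dirichlet hyperparameters (assumed)

# gamma[d][i] is the variational topic distribution for token i of doc d.
gamma = [rng.dirichlet(np.ones(K), size=len(d)) for d in docs]

def expected_counts(gamma, docs):
    """Expected doc-topic and word-topic counts under the variational posterior."""
    n_dk = np.array([g.sum(axis=0) for g in gamma])  # shape (D, K)
    n_wk = np.zeros((V, K))                          # shape (V, K)
    for doc, g in zip(docs, gamma):
        for w, gi in zip(doc, g):
            n_wk[w] += gi
    return n_dk, n_wk

for _ in range(50):  # CVB0-style fixed-point sweeps
    n_dk, n_wk = expected_counts(gamma, docs)
    n_k = n_wk.sum(axis=0)
    for d, (doc, g) in enumerate(zip(docs, gamma)):
        for i, w in enumerate(doc):
            # Remove token i's own contribution before updating it.
            n_dk[d] -= g[i]; n_wk[w] -= g[i]; n_k -= g[i]
            u = (n_dk[d] + alpha) * (n_wk[w] + beta) / (n_k + V * beta)
            g[i] = u / u.sum()
            n_dk[d] += g[i]; n_wk[w] += g[i]; n_k += g[i]
```

After the sweeps, each `gamma[d][i]` is a normalized distribution over the K topics for that token; averaging them per document gives an estimate of the document's topic mixture.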
