Dirichlet process, related priors and posterior asymptotics
Seminar Room 1, Newton Institute
We begin with the problem of constructing priors on the space of probability measures and motivate the Dirichlet process as a natural candidate. Naively, the process may be constructed on a product space through the Kolmogorov consistency theorem, but measure-theoretic difficulties arise. To avoid these problems, we describe a construction of the Dirichlet process through countable co-ordinate projections. We discuss several properties of the Dirichlet process, such as its expectation, variance, posterior conjugacy, self-similarity, support, discreteness, marginal distribution, ties, convergence and the Sethuraman stick-breaking construction. To construct a prior for density estimation, the Dirichlet process may be convolved with a kernel.

Next we turn our attention to the consistency and convergence rates of posterior distributions for density estimation. We argue that positive prior probability of Kullback-Leibler neighborhoods of the true density plays a key role in consistency studies. We sketch arguments showing that commonly used priors, such as Dirichlet mixtures, Polya trees and Gaussian processes, satisfy the Kullback-Leibler property under mild restrictions. For topologies other than the weak topology, additional conditions involving the existence of certain tests or bounds on metric entropy are needed.

Finally, we study convergence rates of posterior distributions with respect to the Hellinger or the total variation distance. We argue that the rate of concentration of the prior in Kullback-Leibler-type neighborhoods of the true density, together with the growth rate of metric entropies, determines the convergence rate. We discuss the example of Dirichlet mixtures of normals.
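The Sethuraman stick-breaking construction and the kernel convolution mentioned above can be sketched in a few lines of code. The following is a minimal illustration, not part of the talk: the truncation level, the normal base measure, the kernel bandwidth `h`, and the helper names `stick_breaking_dp` and `dp_mixture_density` are all illustrative choices.

```python
import numpy as np

def stick_breaking_dp(alpha, base_sampler, n_atoms=500, seed=None):
    """Approximate draw from DP(alpha, G0) via Sethuraman's stick-breaking
    construction, truncated at n_atoms atoms.

    alpha        : concentration parameter (> 0)
    base_sampler : function rng -> one draw from the base measure G0
    Returns (weights, atoms) representing sum_k w_k * delta_{theta_k}.
    """
    rng = np.random.default_rng(seed)
    # Stick-breaking proportions V_k ~ Beta(1, alpha), i.i.d.
    v = rng.beta(1.0, alpha, size=n_atoms)
    # w_k = V_k * prod_{j<k} (1 - V_j): length of the k-th broken piece
    remaining = np.concatenate(([1.0], np.cumprod(1.0 - v)[:-1]))
    weights = v * remaining
    # Atom locations theta_k drawn i.i.d. from the base measure G0
    atoms = np.array([base_sampler(rng) for _ in range(n_atoms)])
    return weights, atoms

def dp_mixture_density(x, weights, atoms, h=0.5):
    """Convolve the discrete DP draw with a normal kernel of bandwidth h:
    f(x) = sum_k w_k * N(x; theta_k, h^2), giving a random density."""
    z = (x[:, None] - atoms[None, :]) / h
    return (weights * np.exp(-0.5 * z**2) / (h * np.sqrt(2 * np.pi))).sum(axis=1)

# Example: DP with standard normal base measure, then a normal-kernel mixture
weights, atoms = stick_breaking_dp(alpha=2.0,
                                   base_sampler=lambda r: r.normal(),
                                   seed=0)
xs = np.linspace(-4.0, 4.0, 201)
fx = dp_mixture_density(xs, weights, atoms)
```

Because the truncated weights sum to nearly one, `fx` is (up to truncation error) a probability density; the discreteness of the underlying DP draw is visible as a few dominant mixture components.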