
Dirichlet Process Gaussian Mixture Models: Choice of the Base Distribution




In the Bayesian mixture modeling framework it is possible to infer the number of components needed to model the data, so it is unnecessary to restrict the number of components explicitly. Nonparametric mixture models sidestep the problem of finding the "correct" number of mixture components by assuming infinitely many components. In this paper Dirichlet process mixture (DPM) models are cast as infinite mixture models and inference using Markov chain Monte Carlo is described. The specification of the priors on the model parameters is often guided by mathematical and practical convenience. The primary goal of this paper is to compare the choice of conjugate and non-conjugate base distributions on a particular class of DPM models which is widely used in applications, the Dirichlet process Gaussian mixture model (DPGMM). We compare the computational efficiency and modeling performance of DPGMMs defined using a conjugate versus a conditionally conjugate base distribution. We show that better density models can result from using a wider class of priors with no or only a modest increase in computational effort.
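As a rough illustration of the idea that a DP mixture infers the number of components rather than fixing it, the sketch below fits a truncated Dirichlet process Gaussian mixture to synthetic two-cluster data using scikit-learn's `BayesianGaussianMixture`. Note this is not the paper's method: scikit-learn uses variational inference rather than the MCMC sampling described here, and the data, truncation level, and weight threshold are arbitrary choices for the example.

```python
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

# Synthetic data: two well-separated Gaussian clusters in 2-D.
rng = np.random.default_rng(0)
X = np.vstack([
    rng.normal(loc=[-3.0, 0.0], scale=0.5, size=(200, 2)),
    rng.normal(loc=[3.0, 0.0], scale=0.5, size=(200, 2)),
])

# Truncated DP mixture: n_components is only an upper bound on the
# number of clusters; superfluous components receive near-zero weight.
dpgmm = BayesianGaussianMixture(
    n_components=10,
    weight_concentration_prior_type="dirichlet_process",
    covariance_type="full",
    max_iter=500,
    random_state=0,
).fit(X)

# Count components that carry appreciable posterior weight
# (threshold of 1% is an arbitrary choice for display).
effective = int(np.sum(dpgmm.weights_ > 0.01))
print("effective components:", effective)
```

With well-separated clusters, most of the posterior weight typically concentrates on a small number of components, even though ten were allowed.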

Author(s): Görür, D. and Rasmussen, C. E.
Journal: Journal of Computer Science and Technology
Volume: 25
Number (issue): 4
Pages: 653-664
Year: 2010
Month: July

Department(s): Empirical Inference
Bibtex Type: Article (article)

DOI: 10.1007/s11390-010-9355-8
Language: en
Organization: Max-Planck-Gesellschaft
School: Biologische Kybernetik



@article{goerur2010dpgmm,
  title = {Dirichlet Process Gaussian Mixture Models: Choice of the Base Distribution},
  author = {G{\"o}r{\"u}r, D. and Rasmussen, C. E.},
  journal = {Journal of Computer Science and Technology},
  volume = {25},
  number = {4},
  pages = {653--664},
  organization = {Max-Planck-Gesellschaft},
  school = {Biologische Kybernetik},
  month = jul,
  year = {2010},
  doi = {10.1007/s11390-010-9355-8}
}