UCL Discovery

Collapsed Variational Dirichlet Process Mixture Models

Kurihara, K; Welling, M; Teh, YW; (2007) Collapsed Variational Dirichlet Process Mixture Models. In: Veloso, MM (ed.) Proceedings of the 20th International Joint Conference on Artificial Intelligence (IJCAI-07). (pp. 2796-2801). IJCAI.

Full text not available from this repository.

Abstract

Nonparametric Bayesian mixture models, in particular Dirichlet process (DP) mixture models, have shown great promise for density estimation and data clustering. Given the size of today's datasets, computational efficiency becomes an essential ingredient in the applicability of these techniques to real-world data. We study and experimentally compare a number of variational Bayesian (VB) approximations to the DP mixture model. In particular, we consider the standard VB approximation, where parameters are assumed to be independent of cluster assignment variables, and a novel collapsed VB approximation, where mixture weights are marginalized out. For both VB approximations we consider two different ways to approximate the DP: by truncating the stick-breaking construction, and by using a finite mixture model with a symmetric Dirichlet prior.
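For context, the truncated stick-breaking construction mentioned in the abstract can be sketched as follows. This is a minimal NumPy illustration, not code from the paper; the concentration `alpha` and truncation level `T` are illustrative parameter names.

```python
import numpy as np

def stick_breaking_weights(alpha, T, rng=None):
    """Draw mixture weights from a stick-breaking construction truncated at T.

    In a DP(alpha, G0) mixture, v_k ~ Beta(1, alpha) and the k-th weight is
    pi_k = v_k * prod_{j<k} (1 - v_j). Truncation fixes v_T = 1 so that the
    T weights sum exactly to one, giving a finite approximation to the DP.
    """
    rng = np.random.default_rng(rng)
    v = rng.beta(1.0, alpha, size=T)
    v[-1] = 1.0  # truncation: the last stick absorbs all remaining mass
    # Mass left over before breaking each stick: 1, (1-v_1), (1-v_1)(1-v_2), ...
    remaining = np.concatenate(([1.0], np.cumprod(1.0 - v[:-1])))
    return v * remaining

weights = stick_breaking_weights(alpha=2.0, T=20)
```

Smaller `alpha` concentrates mass on the first few sticks (fewer effective clusters), while larger `alpha` spreads it out, which is why the truncation level must grow with `alpha` for the approximation to stay accurate.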

Type: Proceedings paper
Title: Collapsed Variational Dirichlet Process Mixture Models
Event: Workshop on Analytics for Noisy Unstructured Text Data, held in conjunction with the 20th International Joint Conference on Artificial Intelligence
Location: Hyderabad, India
Dates: 2007-01-06 - 2007-01-12
UCL classification: UCL > School of Life and Medical Sciences > Faculty of Life Sciences > Gatsby Computational Neuroscience Unit
