# Marginalized Gaussian Mixture Model

:::{post} Sept 18, 2021
:tags: mixture model
:category: intermediate
:::

Gaussian mixtures are a flexible class of models for data that exhibits subpopulation heterogeneity. A toy example of such a data set is shown below.
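The original data-generating cell is not reproduced here, so the following is a minimal sketch that produces a comparable data set; the sample size and the weights `W`, means `MU`, and standard deviations `SIGMA` are illustrative assumptions, not the values behind the original figure.

```python
import matplotlib.pyplot as plt
import numpy as np

rng = np.random.default_rng(42)

N = 1000
W = np.array([0.35, 0.4, 0.25])    # mixture weights (assumed)
MU = np.array([0.0, 2.0, 5.0])     # component means (assumed)
SIGMA = np.array([0.5, 0.5, 1.0])  # component standard deviations (assumed)

# Draw a latent component label for each observation, then sample
# from the corresponding normal component.
component = rng.choice(MU.size, size=N, p=W)
x = rng.normal(MU[component], SIGMA[component])

plt.hist(x, bins=30, density=True)
plt.xlabel("x")
plt.ylabel("Density");
```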

A natural parameterization of the Gaussian mixture model is as the latent variable model

$$
\begin{align*}
\mu_1, \ldots, \mu_K & \sim N(0, \sigma^2) \\
\tau_1, \ldots, \tau_K & \sim \textrm{Gamma}(a, b) \\
\boldsymbol{w} & \sim \textrm{Dir}(\boldsymbol{\alpha}) \\
z\ |\ \boldsymbol{w} & \sim \textrm{Cat}(\boldsymbol{w}) \\
x\ |\ z & \sim N(\mu_z, \tau^{-1}_z).
\end{align*}
$$

An implementation of this parameterization in PyMC3 is available at {doc}`gaussian_mixture_model`. A drawback of this parameterization is that its posterior relies on sampling the discrete latent variable $z$. This reliance can cause slow mixing and ineffective exploration of the tails of the distribution.

An alternative, equivalent parameterization that addresses these problems is to marginalize over $z$. Summing the joint density over the $K$ possible values of $z$ gives the marginalized model

$$
\begin{align*}
\mu_1, \ldots, \mu_K & \sim N(0, \sigma^2) \\
\tau_1, \ldots, \tau_K & \sim \textrm{Gamma}(a, b) \\
\boldsymbol{w} & \sim \textrm{Dir}(\boldsymbol{\alpha}) \\
f(x\ |\ \boldsymbol{w}) & = \sum_{i = 1}^K w_i\ N(x\ |\ \mu_i, \tau^{-1}_i),
\end{align*}
$$

where

$$
N(x\ |\ \mu, \sigma^2) = \frac{1}{\sqrt{2 \pi} \sigma} \exp\left(-\frac{1}{2 \sigma^2} (x - \mu)^2\right)
$$

is the probability density function of the normal distribution.

Marginalizing $z$ out of the model generally leads to faster mixing and better exploration of the tails of the posterior distribution. Marginalization over discrete parameters is a common trick in the Stan community, since Stan does not support sampling from discrete distributions. For further details on marginalization and several worked examples, see the Stan User's Guide and Reference Manual.

PyMC3 supports marginalized Gaussian mixture models through its `NormalMixture` class. (It also supports marginalized general mixture models through its `Mixture` class.) Below we specify and fit a marginalized Gaussian mixture model to this data in PyMC3.
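A minimal sketch of such a model, fit to the synthetic `x` generated above; the prior hyperparameters are illustrative, and the `ordered` transform on the means is one common way to break the label-switching symmetry between components.

```python
import numpy as np
import pymc3 as pm

K = 3  # number of mixture components

with pm.Model() as model:
    # Symmetric Dirichlet prior on the mixture weights.
    w = pm.Dirichlet("w", a=np.ones(K))
    # Component means, constrained to be increasing to avoid label switching.
    mu = pm.Normal(
        "mu",
        mu=0.0,
        sigma=10.0,
        shape=K,
        transform=pm.transforms.ordered,
        testval=[0.0, 1.0, 2.0],
    )
    # Component precisions.
    tau = pm.Gamma("tau", alpha=1.0, beta=1.0, shape=K)
    # Marginalized mixture likelihood: the latent label z never appears.
    x_obs = pm.NormalMixture("x_obs", w=w, mu=mu, tau=tau, observed=x)

    trace = pm.sample(1000, tune=1000, return_inferencedata=True)
```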

We see in the following plot that the posterior distributions on the weights and the component means have captured the true values quite well.
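One way to draw such a plot with ArviZ, assuming the `trace`, `W`, and `MU` from the sketches above (the original figure may have been produced differently):

```python
import arviz as az

# Posterior marginals for the weights and means, with the assumed true
# values overlaid as reference lines (w first, then mu, matching var_names).
az.plot_posterior(trace, var_names=["w", "mu"], ref_val=list(W) + list(MU));
```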

We see that the posterior predictive samples have a distribution quite close to that of the observed data.
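A sketch of how these samples can be drawn with `pm.sample_posterior_predictive` and compared to the data; the plotting details are illustrative.

```python
import matplotlib.pyplot as plt

with model:
    ppc = pm.sample_posterior_predictive(trace, var_names=["x_obs"])

# Pool all posterior predictive draws and overlay them on the observed data.
plt.hist(x, bins=30, density=True, alpha=0.5, label="observed")
plt.hist(ppc["x_obs"].ravel(), bins=30, density=True, alpha=0.5,
         label="posterior predictive")
plt.xlabel("x")
plt.ylabel("Density")
plt.legend();
```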

## Watermark
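PyMC example notebooks conventionally end with a watermark cell recording the package versions used; a representative cell is shown here, though the original output is not reproduced.

```python
%load_ext watermark
%watermark -n -u -v -iv -w
```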