Hardcore Bayesian James Scott's paper (with Nicholas Polson and Jesse Windle), *Bayesian inference for logistic models using Pólya-Gamma latent variables*, presents an effective way to perform posterior inference using the Pólya-Gamma data-augmentation trick. So, what is that trick? The goal here is, given a (e.g.) binomial likelihood on $y_i$ with a $p$-dimensional input $x_i$ and a $p$-dimensional weight vector $\beta$, where $\beta$ has a Gaussian prior $\beta \sim \mathcal{N}(b, B)$,

$$p(y_i \mid \beta) = \binom{n_i}{y_i} \frac{\left(e^{x_i^\top \beta}\right)^{y_i}}{\left(1 + e^{x_i^\top \beta}\right)^{n_i}},$$
how to sample from the posterior over $\beta$. It turns out one can sample from it simply by iterating the following two steps, introducing a Pólya-Gamma latent variable $\omega_i$ for each observation:

$$\omega_i \mid \beta \sim \mathrm{PG}\left(n_i, x_i^\top \beta\right), \qquad \beta \mid y, \omega \sim \mathcal{N}\left(m_\omega, V_\omega\right),$$
where the mean and covariance are defined by

$$V_\omega = \left(X^\top \Omega X + B^{-1}\right)^{-1}, \qquad m_\omega = V_\omega \left(X^\top \kappa + B^{-1} b\right),$$
where $\kappa = (y_1 - n_1/2, \ldots, y_N - n_N/2)^\top$ and $\Omega = \mathrm{diag}(\omega_1, \ldots, \omega_N)$. For anyone who knows their Gaussian fun facts, this basically coincides with writing down the likelihood term, conditional on $\omega_i$, as

$$p(y_i \mid \beta, \omega_i) \propto \exp\left(\kappa_i\, x_i^\top \beta - \frac{\omega_i \left(x_i^\top \beta\right)^2}{2}\right) \propto \exp\left(-\frac{\omega_i}{2}\left(\frac{\kappa_i}{\omega_i} - x_i^\top \beta\right)^2\right),$$

that is, each pseudo-observation $z_i = \kappa_i / \omega_i$ behaves like a Gaussian observation with mean $x_i^\top \beta$ and variance $1/\omega_i$.
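The two-step sampler above can be sketched in a few lines of NumPy. Two caveats: the Pólya-Gamma draw below uses a naive truncated sum-of-Gammas representation of $\mathrm{PG}(b, c)$ rather than the exact rejection sampler the paper derives, and the function names (`sample_pg`, `gibbs_logistic`) are mine, not the paper's.

```python
import numpy as np

def sample_pg(rng, b, c, K=200):
    # Approximate draws from PG(b_i, c_i), one per observation, by truncating
    # the sum-of-Gammas representation
    #   omega = (1 / (2 pi^2)) * sum_k g_k / ((k - 1/2)^2 + c^2 / (4 pi^2)),
    # with g_k ~ Gamma(b, 1). (A crude stand-in for the paper's exact sampler.)
    b = np.asarray(b, dtype=float)
    c = np.asarray(c, dtype=float)
    k = np.arange(1, K + 1)
    g = rng.gamma(b[:, None], 1.0, size=(c.shape[0], K))
    denom = (k - 0.5) ** 2 + (c[:, None] / (2.0 * np.pi)) ** 2
    return (g / denom).sum(axis=1) / (2.0 * np.pi ** 2)

def gibbs_logistic(X, y, n, b0, B, iters=500, seed=0):
    # Polya-Gamma Gibbs sampler for binomial logistic regression with
    # prior beta ~ N(b0, B). Returns the full chain of beta draws.
    rng = np.random.default_rng(seed)
    N, p = X.shape
    kappa = y - n / 2.0                       # kappa_i = y_i - n_i / 2
    Binv = np.linalg.inv(B)
    beta = np.zeros(p)
    draws = np.empty((iters, p))
    for t in range(iters):
        psi = X @ beta
        omega = sample_pg(rng, n, psi)        # step 1: omega_i | beta ~ PG(n_i, x_i^T beta)
        V = np.linalg.inv(X.T @ (omega[:, None] * X) + Binv)
        m = V @ (X.T @ kappa + Binv @ b0)
        beta = rng.multivariate_normal(m, V)  # step 2: beta | y, omega ~ N(m_omega, V_omega)
        draws[t] = beta
    return draws

# Quick demo on synthetic Bernoulli data (n_i = 1):
rng = np.random.default_rng(1)
N, p = 300, 2
X = rng.normal(size=(N, p))
beta_true = np.array([1.5, -1.0])
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-X @ beta_true))).astype(float)
draws = gibbs_logistic(X, y, np.ones(N), np.zeros(p), 10.0 * np.eye(p), iters=400)
post_mean = draws[200:].mean(axis=0)          # discard burn-in
print(post_mean)                              # should land near beta_true
```

Note there is no Metropolis correction anywhere: both conditionals are sampled exactly (up to the PG approximation above), which is what makes this augmentation so convenient.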
Now, let’s figure out why one can write the likelihood this way.

**Theorem 1.** Let $p(\omega)$ denote the density of the r.v. $\omega \sim \mathrm{PG}(b, 0)$, $b > 0$. Then, the following holds for all $a \in \mathbb{R}$:

$$\frac{\left(e^\psi\right)^a}{\left(1 + e^\psi\right)^b} = 2^{-b}\, e^{\kappa \psi} \int_0^\infty e^{-\omega \psi^2 / 2}\, p(\omega)\, \mathrm{d}\omega, \qquad \kappa = a - b/2,$$
where the integrand can be viewed as (proportional to) the joint distribution over $y_i$ and $\omega_i$ given $\beta$. Using Theorem 1, we can write down the likelihood term for the $i$th observation as

$$p(y_i \mid \beta) \propto \frac{\left(e^{\psi_i}\right)^{y_i}}{\left(1 + e^{\psi_i}\right)^{n_i}} = 2^{-n_i}\, e^{\kappa_i \psi_i} \int_0^\infty e^{-\omega_i \psi_i^2 / 2}\, p(\omega_i)\, \mathrm{d}\omega_i,$$
where $\psi_i = x_i^\top \beta$ and $\kappa_i = y_i - n_i/2$. So, the posterior is given by

$$p(\beta \mid y) \propto p(\beta) \prod_{i=1}^N p(y_i \mid \beta) \propto \int p(\beta, \omega \mid y)\, \mathrm{d}\omega,$$
where

$$p(\beta, \omega \mid y) \propto p(\beta) \prod_{i=1}^N e^{\kappa_i \psi_i}\, e^{-\omega_i \psi_i^2 / 2}\, p(\omega_i),$$

and the two Gibbs steps at the top of the post are exactly the conditionals of this augmented posterior.
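The integral identity of Theorem 1, which the whole derivation rests on, is easy to sanity-check by Monte Carlo. The sketch below draws approximate $\mathrm{PG}(b, 0)$ samples by truncating the sum-of-Gammas representation of the Pólya-Gamma distribution (an approximation of my own choosing; the paper gives an exact sampler) and compares the two sides of the identity:

```python
import numpy as np

def sample_pg0(rng, b, K=200, size=20000):
    # Approximate PG(b, 0) draws via the truncated representation
    #   omega = (1 / (2 pi^2)) * sum_k g_k / (k - 1/2)^2,  g_k ~ Gamma(b, 1).
    k = np.arange(1, K + 1)
    g = rng.gamma(b, 1.0, size=(size, K))
    return (g / (k - 0.5) ** 2).sum(axis=1) / (2.0 * np.pi ** 2)

rng = np.random.default_rng(0)
a, b, psi = 1.0, 1.0, 0.7          # e.g. y_i = 1, n_i = 1, psi_i = x_i^T beta
kappa = a - b / 2.0                # kappa = a - b/2, as in Theorem 1
lhs = np.exp(psi) ** a / (1.0 + np.exp(psi)) ** b
omega = sample_pg0(rng, b)
rhs = 2.0 ** (-b) * np.exp(kappa * psi) * np.mean(np.exp(-omega * psi ** 2 / 2.0))
print(lhs, rhs)                    # agree up to Monte Carlo / truncation error
```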