Accurate Frequentist Generalised Linear Mixed Model Analysis via Expectation Propagation
- Publication Type: Thesis
- Issue Date: 2021
This item is open access.
Generalised linear mixed models are a particularly powerful and well-established statistical tool. Unlike linear mixed models, where the integrals arising in the likelihood function can be evaluated in closed form, the likelihood functions of generalised linear mixed models involve intractable integrals. Gauss-Hermite quadrature and the Laplace approximation are the standard methods for handling these integrals. Although Gauss-Hermite quadrature is accurate, it is also slow, rendering it unsuitable for analyses with more than two or three random effects. The Laplace approximation is the most computationally feasible option, but the approximate inference it provides for binary-response models is well known to be inaccurate. A less common approach is to borrow Bayesian ideas such as data cloning, but these involve a number of technicalities and are therefore difficult to implement. Although expectation propagation is generally used in Bayesian settings, in this thesis we introduce a novel approach in which we use it in a frequentist framework to obtain highly accurate inference for generalised linear mixed models at minimal computational cost. We show that our methodology can fit one-level probit models without the need for quadrature, providing consistent and accurate results. We explain how, using quadrature, our method also extends to logistic, Poisson and negative-binomial models. Additionally, we show how these models can be extended to two-level models and, in the probit case, to models with crossed random effects. Finally, we present applications of our methodology to two real datasets, each posing different technical challenges.
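To make the intractability mentioned above concrete, the marginal likelihood of a generalised linear mixed model takes the following generic form; the notation here is a standard textbook convention and is not taken from the thesis itself:

```latex
% Marginal likelihood of a GLMM with group-specific random effects u_i ~ N(0, Sigma).
% The exponential-family response density p(y_{ij} | .) makes the integral
% intractable except in the Gaussian (linear mixed model) case.
\[
  L(\boldsymbol{\beta}, \boldsymbol{\Sigma})
    = \prod_{i=1}^{m} \int_{\mathbb{R}^{q}}
      \Bigl\{ \prod_{j=1}^{n_i}
        p\bigl(y_{ij} \mid \mathbf{x}_{ij}^{\top}\boldsymbol{\beta}
          + \mathbf{z}_{ij}^{\top}\mathbf{u}_i\bigr) \Bigr\}\,
      \phi(\mathbf{u}_i; \mathbf{0}, \boldsymbol{\Sigma})\,
      d\mathbf{u}_i .
\]
```

As a rough illustration of the quadrature route discussed in the abstract (a sketch only, not the code developed in the thesis; the function name, arguments and data layout are assumptions), a one-dimensional random-intercept probit log-likelihood can be approximated with Gauss-Hermite quadrature as follows:

```python
# Minimal sketch: Gauss-Hermite approximation of the marginal log-likelihood
# of a random-intercept probit model. Hypothetical inputs: binary responses y,
# design matrix X, integer group labels, scalar random-intercept SD sigma_u.
import numpy as np
from scipy.stats import norm

def probit_glmm_loglik(beta, sigma_u, y, X, group, n_quad=20):
    # Nodes/weights for the physicists' Hermite weight exp(-t^2)
    nodes, weights = np.polynomial.hermite.hermgauss(n_quad)
    eta = X @ beta                              # fixed-effect linear predictor
    loglik = 0.0
    for g in np.unique(group):
        idx = group == g
        # Change of variables u = sqrt(2) * sigma_u * t maps the N(0, sigma_u^2)
        # integral onto the Gauss-Hermite nodes.
        u = np.sqrt(2.0) * sigma_u * nodes
        p = norm.cdf(eta[idx, None] + u[None, :])   # n_g x n_quad success probabilities
        lik_g = np.prod(np.where(y[idx, None] == 1, p, 1.0 - p), axis=0)
        loglik += np.log(np.dot(weights, lik_g) / np.sqrt(np.pi))
    return loglik
```

Because the number of quadrature nodes grows exponentially with the dimension of the random effect, this route becomes impractical beyond two or three random effects, which is precisely the limitation the expectation-propagation approach of the thesis is designed to avoid.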