
Publication

A comparison of variational approximations for fast inference in mixed logit models

Journal Contribution - Journal Article

Variational Bayesian methods aim to address some of the weaknesses (computation time, storage costs, and convergence monitoring) of mainstream Markov chain Monte Carlo based inference, at the cost of a biased but more tractable approximation to the posterior distribution. We investigate the performance of variational approximations in the context of the mixed logit model, one of the most widely used models for discrete choice data. A typical treatment using the variational Bayesian methodology is hindered by the fact that the expectation of the so-called log-sum-exponential function has no explicit expression, so additional approximations are required to maintain tractability. In this paper we compare seven different possible bounds or approximations. We found that quadratic bounds are not sufficiently accurate, whereas a recently proposed non-quadratic bound did perform well. We also found that the Taylor series approximation used in a previous study of variational Bayes for mixed logit models is accurate only for specific settings. Our proposed approximation based on quasi-Monte Carlo sampling performed consistently well across all simulation settings while remaining computationally tractable.
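The quantity at the heart of the abstract, the expectation of the log-sum-exponential function under a Gaussian, has no closed form, but it can be approximated with quasi-Monte Carlo sampling as the abstract describes. The sketch below is an illustration of that general idea only (the function name, the choice of a scrambled Sobol sequence, and the example inputs are assumptions, not the paper's actual implementation):

```python
import numpy as np
from scipy.stats import norm, qmc
from scipy.special import logsumexp

def qmc_expect_lse(mu, chol, n=1024, seed=0):
    """Approximate E[log(sum_j exp(z_j))] for z ~ N(mu, Sigma),
    with Sigma = chol @ chol.T, via quasi-Monte Carlo sampling.

    Illustrative sketch only; not the paper's implementation.
    """
    d = len(mu)
    sobol = qmc.Sobol(d=d, scramble=True, seed=seed)
    u = sobol.random(n)                   # low-discrepancy points in (0, 1)^d
    z = norm.ppf(u) @ chol.T + mu         # map to draws from N(mu, Sigma)
    return logsumexp(z, axis=1).mean()    # average log-sum-exp over the draws

# Example: two alternatives with independent standard-normal utilities
mu = np.array([0.0, 1.0])
L = np.eye(2)
estimate = qmc_expect_lse(mu, L)
```

By Jensen's inequality the estimate must exceed `logsumexp(mu)` (about 1.31 here), which gives a quick sanity check on the output; using a low-discrepancy sequence rather than pseudo-random draws is what makes the approximation accurate at modest sample sizes.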
Journal: Computational Statistics
ISSN: 0943-4062
Issue: 1
Volume: 32
Pages: 93 - 125
Publication year: 2017
BOF-keylabel: yes
IOF-keylabel: yes
BOF-publication weight: 0.1
CSS-citation score: 1
Authors from: Higher Education
Accessibility: Closed