Boosting black box variational inference

Locatello F, Dresdner G, Khanna R, Valera I, Rätsch G. 2018. Boosting black box variational inference. Advances in Neural Information Processing Systems. NeurIPS: Neural Information Processing Systems vol. 31.

Conference Paper | Published | English

Scopus indexed
Author
Locatello, Francesco (ISTA); Dresdner, Gideon; Khanna, Rajiv; Valera, Isabel; Rätsch, Gunnar
Abstract
Approximating a probability density in a tractable manner is a central task in Bayesian statistics. Variational Inference (VI) is a popular technique that achieves tractability by choosing a relatively simple variational family. Borrowing ideas from the classic boosting framework, recent approaches attempt to "boost" VI by replacing the selection of a single density with a greedily constructed mixture of densities. In order to guarantee convergence, previous works impose stringent assumptions that require significant effort for practitioners. Specifically, they require a custom implementation of the greedy step (called the LMO) for every probabilistic model with respect to an unnatural variational family of truncated distributions. Our work fixes these issues with novel theoretical and algorithmic insights. On the theoretical side, we show that boosting VI satisfies a relaxed smoothness assumption which is sufficient for the convergence of the functional Frank-Wolfe (FW) algorithm. Furthermore, we rephrase the LMO problem and propose to maximize the Residual ELBO (RELBO) which replaces the standard ELBO optimization in VI. These theoretical enhancements allow for black box implementation of the boosting subroutine. Finally, we present a stopping criterion drawn from the duality gap in the classic FW analyses and exhaustive experiments to illustrate the usefulness of our theoretical and algorithmic contributions.
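
For orientation, the functional Frank-Wolfe iteration that the abstract refers to can be sketched as follows. This is a schematic only: the step size $\gamma_t$, the base variational family $\mathcal{A}$, and the inner-product notation are generic placeholders rather than the paper's exact definitions, and the LMO shown here is the step the paper proposes to solve approximately by maximizing the RELBO.

\[
q_t = (1 - \gamma_t)\, q_{t-1} + \gamma_t\, s_t,
\qquad
s_t \approx \operatorname*{arg\,min}_{s \in \mathcal{A}} \big\langle s,\ \nabla_q \mathrm{KL}\!\left(q_{t-1}\,\|\,p\right) \big\rangle \quad \text{(the LMO)},
\]
\[
g_t = \big\langle q_{t-1} - s_t,\ \nabla_q \mathrm{KL}\!\left(q_{t-1}\,\|\,p\right) \big\rangle \;\ge\; \mathrm{KL}\!\left(q_{t-1}\,\|\,p\right) - \min_{q \in \mathrm{conv}(\mathcal{A})} \mathrm{KL}\!\left(q\,\|\,p\right),
\]

so the duality gap $g_t$ from the classic FW analysis upper-bounds the remaining suboptimality (by convexity of the KL divergence in $q$) and can serve as the stopping criterion mentioned in the abstract.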
Publishing Year
2018
Date Published
2018-06-06
Proceedings Title
Advances in Neural Information Processing Systems
Volume
31
Conference
NeurIPS: Neural Information Processing Systems
Conference Location
Montreal, Canada
Conference Date
2018-12-03 – 2018-12-08

Cite this

Locatello F, Dresdner G, Khanna R, Valera I, Rätsch G. Boosting black box variational inference. In: Advances in Neural Information Processing Systems. Vol 31. Neural Information Processing Systems Foundation; 2018.
Locatello, F., Dresdner, G., Khanna, R., Valera, I., & Rätsch, G. (2018). Boosting black box variational inference. In Advances in Neural Information Processing Systems (Vol. 31). Montreal, Canada: Neural Information Processing Systems Foundation.
Locatello, Francesco, Gideon Dresdner, Rajiv Khanna, Isabel Valera, and Gunnar Rätsch. “Boosting Black Box Variational Inference.” In Advances in Neural Information Processing Systems, Vol. 31. Neural Information Processing Systems Foundation, 2018.
F. Locatello, G. Dresdner, R. Khanna, I. Valera, and G. Rätsch, “Boosting black box variational inference,” in Advances in Neural Information Processing Systems, Montreal, Canada, 2018, vol. 31.
Locatello F, Dresdner G, Khanna R, Valera I, Rätsch G. 2018. Boosting black box variational inference. Advances in Neural Information Processing Systems. NeurIPS: Neural Information Processing Systems vol. 31.
Locatello, Francesco, et al. “Boosting Black Box Variational Inference.” Advances in Neural Information Processing Systems, vol. 31, Neural Information Processing Systems Foundation, 2018.
All files available under the following license(s):
Copyright Statement:
This Item is protected by copyright and/or related rights. [...]

Link(s) to Main File(s)
Access Level
Open Access

Sources

arXiv 1806.02185
