# Gibbs sampler and coordinate ascent variational inference: A set-theoretical review

```bibtex
@article{Lee2020GibbsSA,
  title   = {Gibbs sampler and coordinate ascent variational inference: A set-theoretical review},
  author  = {Se Yoon Lee},
  journal = {arXiv: Statistics Theory},
  year    = {2020}
}
```

A central task in Bayesian machine learning is the approximation of the posterior distribution. The Gibbs sampler and coordinate ascent variational inference (CAVI) are two widely used approximation techniques, relying on stochastic and deterministic approximation, respectively. This article clarifies that the two schemes can be explained more generally from a set-theoretical point of view. The alternative views are consequences of a duality formula for variational inference.
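As a concrete illustration of the stochastic scheme mentioned above, here is a minimal sketch (an illustrative example, not code from the paper) of a Gibbs sampler for a bivariate standard normal target with correlation `rho`, a toy case where both full conditionals are available in closed form:

```python
import math
import random

def gibbs_bivariate_normal(rho, n_iter=5000, seed=0):
    """Gibbs sampler for a bivariate standard normal with correlation rho.

    Toy example: each full conditional is itself normal,
        x | y ~ N(rho * y, 1 - rho^2),   y | x ~ N(rho * x, 1 - rho^2),
    so the sampler alternates exact draws from the two conditionals.
    """
    rng = random.Random(seed)
    sd = math.sqrt(1.0 - rho * rho)  # conditional standard deviation
    x, y = 0.0, 0.0                  # arbitrary starting point
    samples = []
    for _ in range(n_iter):
        x = rng.gauss(rho * y, sd)   # draw x from p(x | y)
        y = rng.gauss(rho * x, sd)   # draw y from p(y | x)
        samples.append((x, y))
    return samples
```

Because each coordinate is resampled exactly from its full conditional, the chain's stationary distribution is the joint target, and empirical moments of the draws approximate the target's moments.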

#### One Citation

Improving MC-Dropout Uncertainty Estimates with Calibration Error-based Optimization

- Computer Science
- 2021

This study proposes two new loss functions that combine cross entropy with Expected Calibration Error (ECE) and Predictive Entropy (PE), and shows that the proposed loss functions yield a calibrated MC-Dropout method.

#### References

Showing 1–10 of 71 references

Concentration inequalities and model selection

- 2007

Probability and Measure

- Mathematics
- 1979

Probability. Measure. Integration. Random Variables and Expected Values. Convergence of Distributions. Derivatives and Conditional Probability. Stochastic Processes. Appendix. Notes on the Problems.…

Advances in Variational Inference

- Mathematics, Computer Science
- IEEE Transactions on Pattern Analysis and Machine Intelligence
- 2019

An overview of recent trends in variational inference is given and a summary of promising future research directions is provided.

Yes, but Did It Work?: Evaluating Variational Inference

- Computer Science, Mathematics
- ICML
- 2018

Two diagnostic algorithms are proposed that give a goodness of fit measurement for joint distributions, while simultaneously improving the error in the estimate.

Frequentist Consistency of Variational Bayes

- Computer Science, Mathematics
- ArXiv
- 2017

It is proved that the VB posterior converges to the Kullback–Leibler (KL) minimizer of a normal distribution, centered at the truth, and that the corresponding variational expectation of the parameter is consistent and asymptotically normal.

Theoretical and Computational Guarantees of Mean Field Variational Inference for Community Detection

- Mathematics, Computer Science
- ArXiv
- 2017

The mean field method for community detection under the Stochastic Block Model has a linear convergence rate and converges to the minimax rate within $\log n$ iterations; similar optimality results are obtained for Gibbs sampling and for an iterative procedure to calculate the maximum likelihood estimate, which can be of independent interest.

An overview of gradient descent optimization algorithms

- Computer Science
- ArXiv
- 2016

This article looks at different variants of gradient descent, summarizes challenges, introduces the most common optimization algorithms, reviews architectures in a parallel and distributed setting, and investigates additional strategies for optimizing gradient descent.

Uncertainty in Deep Learning

- Computer Science
- 2016

This work develops tools to obtain practical uncertainty estimates in deep learning, casting recent deep learning tools as Bayesian models without changing either the models or the optimisation, and develops the theory for such tools.

Variational Inference: A Review for Statisticians

- Computer Science, Mathematics
- ArXiv
- 2016

Variational inference (VI), a method from machine learning that approximates probability densities through optimization, is reviewed and a variant that uses stochastic optimization to scale up to massive data is derived.
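For contrast with the stochastic Gibbs scheme, the deterministic counterpart reviewed in the paper, CAVI, admits an equally small sketch on the same toy target (an illustrative example under the standard mean-field derivation, not code from any of the referenced works). With the factorization q(x, y) = q(x) q(y) for a bivariate standard normal with correlation `rho`, each optimal factor is normal with fixed variance 1 − rho², so only the means are updated, one coordinate at a time:

```python
def cavi_bivariate_normal(rho, n_iter=50):
    """Mean-field CAVI for a bivariate standard normal with correlation rho.

    The optimal variational factors are normal with fixed variance
    1 - rho^2; the coordinate ascent updates touch only the means:
        m_x <- rho * m_y,   m_y <- rho * m_x,
    and converge deterministically to the factorized optimum (0, 0).
    """
    m_x, m_y = 1.0, -1.0           # arbitrary initialization
    for _ in range(n_iter):
        m_x = rho * m_y            # update q(x) holding q(y) fixed
        m_y = rho * m_x            # update q(y) holding q(x) fixed
    return m_x, m_y
```

Unlike the Gibbs sampler, each coordinate update here is an exact deterministic optimization step rather than a random draw, which is precisely the stochastic-versus-deterministic contrast the abstract draws between the two schemes.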