Advances in Variational Inference
- PMID: 30596568
- DOI: 10.1109/TPAMI.2018.2889774
Abstract
Many modern unsupervised or semi-supervised machine learning algorithms rely on Bayesian probabilistic models. These models are usually intractable and thus require approximate inference. Variational inference (VI) lets us approximate a high-dimensional Bayesian posterior with a simpler variational distribution by solving an optimization problem. This approach has been successfully applied to various models and large-scale applications. In this review, we give an overview of recent trends in variational inference. We first introduce standard mean field variational inference, then review recent advances focusing on the following aspects: (a) scalable VI, which includes stochastic approximations, (b) generic VI, which extends the applicability of VI to a large class of otherwise intractable models, such as non-conjugate models, (c) accurate VI, which includes variational models beyond the mean field approximation or with atypical divergences, and (d) amortized VI, which implements the inference over local latent variables with inference networks. Finally, we provide a summary of promising future research directions.
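To make the optimization view of VI concrete, here is a minimal toy sketch (not from the paper; the model and all names are hypothetical) of black-box VI with the reparameterization trick, one of the stochastic-gradient techniques the review surveys. A Gaussian variational distribution q(mu) = N(m, s^2) is fit to the posterior of a conjugate Gaussian mean model, where the exact posterior is known in closed form and can be used as a check.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy conjugate model (hypothetical illustration):
#   prior      mu ~ N(0, 1)
#   likelihood x_i | mu ~ N(mu, 1)
# Exact posterior: N(sum(x) / (n + 1), 1 / (n + 1)).
x = rng.normal(loc=2.0, scale=1.0, size=50)
n = x.size
post_mean = x.sum() / (n + 1)
post_std = np.sqrt(1.0 / (n + 1))

# Variational family q(mu) = N(m, exp(rho)^2); maximize the ELBO
# by stochastic gradient ascent, reparameterizing
# mu = m + exp(rho) * eps with eps ~ N(0, 1).
m, rho = 0.0, 0.0
lr, n_samples = 0.01, 16
for step in range(3000):
    s = np.exp(rho)
    eps = rng.normal(size=n_samples)
    mu = m + s * eps
    # d/dmu log p(x, mu) = sum_i (x_i - mu) - mu  (likelihood + prior)
    dlogp = (x.sum() - n * mu) - mu
    grad_m = dlogp.mean()
    # Entropy term of the ELBO contributes d/drho H[q] = 1.
    grad_rho = (dlogp * s * eps).mean() + 1.0
    m += lr * grad_m
    rho += lr * grad_rho

print(m, np.exp(rho))  # should approach post_mean and post_std
```

Because the model is conjugate, the fitted (m, exp(rho)) can be compared directly against the closed-form posterior; for non-conjugate models the same Monte Carlo gradient estimator applies unchanged, which is the point of the "generic VI" methods discussed above.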
Similar articles
- Evaluating probabilistic programming and fast variational Bayesian inference in phylogenetics. PeerJ. 2019 Dec 18;7:e8272. doi: 10.7717/peerj.8272. PMID: 31976168.
- Probabilistic Models with Deep Neural Networks. Entropy (Basel). 2021 Jan 18;23(1):117. doi: 10.3390/e23010117. PMID: 33477544.
- Differentiable samplers for deep latent variable models. Philos Trans A Math Phys Eng Sci. 2023 May 15;381(2247):20220147. doi: 10.1098/rsta.2022.0147. PMID: 36970826.
- Gradient Regularization as Approximate Variational Inference. Entropy (Basel). 2021 Dec 3;23(12):1629. doi: 10.3390/e23121629. PMID: 34945935.
- Sampling the Variational Posterior with Local Refinement. Entropy (Basel). 2021 Nov 8;23(11):1475. doi: 10.3390/e23111475. PMID: 34828173.
Cited by
- Scalable Bayesian Approach for the DINA Q-Matrix Estimation Combining Stochastic Optimization and Variational Inference. Psychometrika. 2023 Mar;88(1):302-331. doi: 10.1007/s11336-022-09884-4. PMID: 36097246.
- Deep Individual Active Learning: Safeguarding against Out-of-Distribution Challenges in Neural Networks. Entropy (Basel). 2024 Jan 31;26(2):129. doi: 10.3390/e26020129. PMID: 38392384.
- Bayesian statistical learning for big data biology. Biophys Rev. 2019 Feb;11(1):95-102. doi: 10.1007/s12551-019-00499-1. PMID: 30729409.
- Handling missing data in variational autoencoder based item response theory. Br J Math Stat Psychol. 2025 Feb;78(1):378-397. doi: 10.1111/bmsp.12363. PMID: 39460706.
- MOFA+: a statistical framework for comprehensive integration of multi-modal single-cell data. Genome Biol. 2020 May 11;21(1):111. doi: 10.1186/s13059-020-02015-1. PMID: 32393329.