A New Bayesian Lasso
- PMID: 27570577
- PMCID: PMC4996624
- DOI: 10.4310/SII.2014.v7.n4.a12
Abstract
Park and Casella (2008) introduced the Bayesian lasso for linear models by assigning scale mixture of normal (SMN) priors to the parameters and independent exponential priors to their variances. In this paper, we propose an alternative Bayesian analysis of the lasso problem. A different hierarchical formulation of the Bayesian lasso is introduced by exploiting the scale mixture of uniform (SMU) representation of the Laplace density. We consider a fully Bayesian treatment that leads to a new Gibbs sampler with tractable full conditional posterior distributions. Empirical results and real data analyses show that the new algorithm has good mixing properties and performs comparably to the existing Bayesian method in terms of both prediction accuracy and variable selection. An expectation-conditional maximization (ECM) algorithm is provided to compute the maximum a posteriori (MAP) estimates of the parameters. Easy extension to general models is also briefly discussed.
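The SMU representation of the Laplace density referred to above rests on a standard identity, sketched here for context (writing the lasso penalty as λ is a notational assumption, not taken from the abstract):

```latex
\frac{\lambda}{2}\, e^{-\lambda |\beta_j|}
  \;=\; \int_{|\beta_j|}^{\infty} \frac{1}{2u_j}\, \lambda^{2} u_j\, e^{-\lambda u_j}\, du_j .
```

In other words, the two-level hierarchy beta_j | u_j ~ Uniform(-u_j, u_j) with u_j ~ Gamma(2, rate λ) has the lasso's Laplace prior as its marginal; a hierarchy of this form is what makes the full conditionals mentioned in the abstract tractable.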
Keywords: Bayesian Lasso; Gibbs Sampler; Lasso; MCMC; Scale Mixture of Uniform.
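The abstract does not spell out the sampler, so the following is only an illustrative sketch of how a Gibbs sampler can exploit the SMU hierarchy above: the Uniform/Gamma(2, λ) prior on each coefficient, an improper 1/σ² prior on the error variance, and a fixed penalty λ. These modelling choices, the function name, and its arguments are assumptions for illustration, not the authors' exact algorithm.

```python
import numpy as np
from scipy.stats import truncnorm, invgamma

def smu_lasso_gibbs(X, y, lam=1.0, n_iter=5000, burn=1000, seed=0):
    """Sketch of a Gibbs sampler for linear regression with the SMU lasso prior:
        beta_j | u_j ~ Uniform(-u_j, u_j),  u_j ~ Gamma(2, rate=lam),
    whose marginal on beta_j is Laplace with scale 1/lam. Illustrative only."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    xtx_diag = np.sum(X ** 2, axis=0)          # x_j' x_j for each column

    beta = np.zeros(p)
    sigma2 = np.var(y)
    draws = np.empty((n_iter - burn, p))

    for it in range(n_iter):
        # 1. u_j | beta_j, lam: density proportional to exp(-lam * u_j) on u_j > |beta_j|,
        #    i.e. a shifted exponential.
        u = np.abs(beta) + rng.exponential(scale=1.0 / lam, size=p)

        # 2. beta_j | rest: univariate normal truncated to [-u_j, u_j], updated one at a time.
        for j in range(p):
            resid_j = y - X @ beta + X[:, j] * beta[j]   # residual with x_j's contribution removed
            m = X[:, j] @ resid_j / xtx_diag[j]
            s = np.sqrt(sigma2 / xtx_diag[j])
            a, b = (-u[j] - m) / s, (u[j] - m) / s       # standardized truncation bounds
            beta[j] = truncnorm.rvs(a, b, loc=m, scale=s, random_state=rng)

        # 3. sigma^2 | rest: inverse-gamma under the improper prior 1/sigma^2.
        rss = np.sum((y - X @ beta) ** 2)
        sigma2 = invgamma.rvs(a=n / 2.0, scale=rss / 2.0, random_state=rng)

        if it >= burn:
            draws[it - burn] = beta
    return draws
```

In this sketch the latent bound u_j given beta_j is a shifted exponential and each beta_j given the rest is a normal truncated to [-u_j, u_j], which is what keeps every conditional standard; a gamma hyperprior on λ would add one more conjugate update.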