# Session 15: Advances in Bayesian methods for high-dimensional data

Organizer: Howard Bondell (U. of Melbourne)
Chair: Xuan Bi (Yale)
Time: June 4th, 1:45pm – 3:15pm
Location: VEC 1202/1203

Speech 1: The Graphical Horseshoe Estimator for Inverse Covariance Matrices
Abstract: Gaussian scale mixture priors are common in high-dimensional Bayesian analysis. While optimization algorithms for the extremely popular lasso and elastic net scale to dimensions in the hundreds of thousands, Bayesian computation by Markov chain Monte Carlo (MCMC) is limited to problems an order of magnitude smaller. This is due to the high computational cost per step and the growth of the variance of time-averages as a function of dimension. We propose an MCMC algorithm for computation in these models that combines block updating and approximations of the Markov kernel to combat both of these factors. Our algorithm gives orders-of-magnitude speedups over the best existing alternatives in high-dimensional applications. We give theoretical guarantees for the accuracy of the approximation. Scalability of the algorithm is illustrated in an application to a genome-wide association study with $N=2,267$ observations and $p=98,385$ predictors. The empirical results show that the new algorithm yields estimates with lower mean squared error and intervals with better coverage, and elucidates features of the posterior often missed by previous algorithms, including bimodality of marginals indicating uncertainty about which covariates belong in the model. This latter feature is an important motivation for a Bayesian approach to testing and selection in high dimensions. (Joint work with James Johndrow & Paulo Orenstein.)
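
To make the setting concrete, here is a minimal sketch of a blocked Gibbs sampler for linear regression under a horseshoe (Gaussian scale mixture) prior, using the inverse-gamma auxiliary-variable conditionals of Makalic & Schmidt (2015). This is an illustration of the standard exact sampler whose per-step cost motivates the talk's approximate algorithm, not the speakers' method itself; all names and defaults below are my own choices.

```python
import numpy as np

def horseshoe_gibbs(X, y, n_iter=500, seed=0):
    """Blocked Gibbs sampler for y = X beta + eps under a horseshoe prior.

    Model: beta_j | lam_j, tau, sigma ~ N(0, sigma^2 tau^2 lam_j^2),
    with half-Cauchy priors on lam_j and tau expressed through
    inverse-gamma auxiliary variables nu_j and xi.
    Illustrative sketch only; the O(p^3) beta update is the bottleneck
    that scalable approximate MCMC schemes aim to avoid.
    """
    rng = np.random.default_rng(seed)
    n, p = X.shape
    lam2 = np.ones(p)          # local shrinkage scales lam_j^2
    nu = np.ones(p)            # auxiliaries for lam2
    tau2, xi, sigma2 = 1.0, 1.0, 1.0
    XtX, Xty = X.T @ X, X.T @ y
    draws = np.empty((n_iter, p))
    for t in range(n_iter):
        # Block update: draw beta jointly from its Gaussian full conditional
        # N(A^{-1} X'y, sigma^2 A^{-1}) with A = X'X + diag(1/(tau^2 lam_j^2)).
        A = XtX + np.diag(1.0 / (tau2 * lam2))
        L = np.linalg.cholesky(A)
        mu = np.linalg.solve(A, Xty)
        beta = mu + np.sqrt(sigma2) * np.linalg.solve(L.T, rng.standard_normal(p))
        # Inverse-gamma conditionals for local scales and their auxiliaries.
        lam2 = 1.0 / rng.gamma(1.0, 1.0 / (1.0 / nu + beta**2 / (2.0 * tau2 * sigma2)))
        nu = 1.0 / rng.gamma(1.0, 1.0 / (1.0 + 1.0 / lam2))
        # Global scale tau^2 and its auxiliary xi.
        tau2 = 1.0 / rng.gamma(0.5 * (p + 1),
                               1.0 / (1.0 / xi + np.sum(beta**2 / lam2) / (2.0 * sigma2)))
        xi = 1.0 / rng.gamma(1.0, 1.0 / (1.0 + 1.0 / tau2))
        # Error variance sigma^2.
        resid = y - X @ beta
        rate = 0.5 * (resid @ resid) + 0.5 * np.sum(beta**2 / (lam2 * tau2))
        sigma2 = 1.0 / rng.gamma(0.5 * (n + p), 1.0 / rate)
        draws[t] = beta
    return draws

# Tiny usage example on simulated sparse data.
rng = np.random.default_rng(1)
X = rng.standard_normal((50, 10))
beta_true = np.zeros(10); beta_true[0] = 3.0
y = X @ beta_true + 0.5 * rng.standard_normal(50)
samples = horseshoe_gibbs(X, y, n_iter=200)
```

The Cholesky solve in the beta block costs O(p^3) per iteration, which is exactly the per-step cost the abstract says limits MCMC to problems far smaller than p ≈ 10^5; block updating plus kernel approximation is the talk's route around it.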