Beyond Log-concavity: Provable Guarantees for Sampling Multi-modal Distributions using Simulated Tempering Langevin Monte Carlo

A key task in Bayesian statistics is sampling from distributions that are only specified up to a partition function (i.e., constant of proportionality). However, without any assumptions, sampling (even approximately) can be #P-hard, and few works have provided "beyond worst-case" guarantees for such settings. For log-concave distributions, classical results going back to Bakry and Émery (1985) show that natural continuous-time Markov chains called Langevin diffusions mix in polynomial time. The most salient feature of log-concavity violated in practice is unimodality: commonly, the distributions we wish to sample from are multimodal. In the presence of multiple deep and well-separated modes, Langevin diffusion suffers from torpid mixing. We address this problem by combining Langevin diffusion with simulated tempering. The result is a Markov chain that mixes more rapidly by transitioning between different temperatures of the distribution. We analyze this Markov chain for the canonical multimodal distribution: a mixture of Gaussians (of equal variance). The algorithm based on our Markov chain provably samples from distributions that are close to mixtures of Gaussians, given access to the gradient of the log-pdf. For the analysis, we use a spectral decomposition theorem for graphs (Gharan and Trevisan, 2014) and a Markov chain decomposition technique (Madras and Randall, 2002).
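To illustrate the idea, here is a minimal sketch (not the authors' exact algorithm) of simulated tempering combined with discretized Langevin dynamics on a 1-D mixture of two equal-variance Gaussians. The temperature ladder, step size, and swap rule are assumptions chosen for illustration; in particular, a proper simulated tempering chain would also weight temperature swaps by estimates of the unknown partition functions at each level, which this toy version omits.

```python
import numpy as np

# Toy sketch of simulated tempering Langevin Monte Carlo (illustrative only).
# Target: unnormalized equal-weight mixture of N(-4, 1) and N(4, 1).
rng = np.random.default_rng(0)
MEANS, SIGMA = np.array([-4.0, 4.0]), 1.0  # deep, well-separated modes

def log_pdf(x):
    # log of the unnormalized mixture density
    return np.logaddexp(-(x - MEANS[0])**2 / (2 * SIGMA**2),
                        -(x - MEANS[1])**2 / (2 * SIGMA**2))

def grad_log_pdf(x):
    # gradient of log_pdf, computed via the mixture responsibilities
    w = np.exp(-(x - MEANS)**2 / (2 * SIGMA**2))
    w /= w.sum()
    return np.sum(w * (MEANS - x)) / SIGMA**2

def st_lmc(n_steps=20000, step=0.05, betas=(0.2, 0.5, 1.0)):
    """Alternate a Langevin step at the current inverse temperature beta
    (targeting p(x)^beta) with a Metropolis-style move between adjacent
    temperature levels. NOTE: the swap rule omits partition-function
    estimates, a simplification relative to real simulated tempering."""
    x, level, samples = 0.0, 0, []
    for _ in range(n_steps):
        beta = betas[level]
        # discretized Langevin update: gradient scaled by beta, fixed noise
        x += step * beta * grad_log_pdf(x) + np.sqrt(2 * step) * rng.normal()
        # propose moving to a neighboring temperature level
        new = level + rng.choice([-1, 1])
        if 0 <= new < len(betas):
            if np.log(rng.random()) < (betas[new] - beta) * log_pdf(x):
                level = new
        if level == len(betas) - 1:     # record samples only at beta = 1
            samples.append(x)
    return np.array(samples)

samples = st_lmc()
# At the flattest temperature the chain crosses between modes easily, so
# both modes should appear among the beta = 1 samples; a plain Langevin
# chain started in one mode would rarely cross in the same number of steps.
```

The key point the sketch captures is the one in the abstract: the low-temperature (small beta) levels flatten the landscape so the chain can move between modes, while the beta = 1 level produces samples from the actual target.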
by Rong Ge, Holden Lee, Andrej Risteski
https://arxiv.org/pdf/1710.02736v1.pdf