Asynchronous Stochastic Variational Inference

Stochastic variational inference (SVI) employs stochastic optimization to scale up Bayesian computation to massive data. Since SVI is at its core a stochastic gradient-based algorithm, horizontal parallelism can be harnessed to allow larger-scale inference. We propose a lock-free parallel implementation for SVI which allows distributed computations over multiple slaves in an asynchronous style. We show that our implementation leads to linear speedup while guaranteeing an asymptotic ergodic convergence rate of $O(1/\sqrt{T})$, given that the number of slaves is bounded by $\sqrt{T}$ (where $T$ is the total number of iterations). The implementation is done in a high-performance computing (HPC) environment using the message passing interface (MPI) for Python (mpi4py). The extensive empirical evaluation shows that our parallel SVI is lossless, performing comparably to its serial SVI counterpart while achieving linear speedup.
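As a rough illustration of the lock-free asynchronous scheme the abstract describes, the sketch below is a minimal toy simulation, not the authors' implementation: it uses Python threads in place of MPI slaves, a trivial model (estimating a Gaussian mean) in place of a real variational family, and Hogwild-style unsynchronized updates of a shared global parameter with a Robbins-Monro step size. All names and parameter choices (`worker`, the step-size exponent `0.7`, the thread count) are illustrative assumptions.

```python
import random
import threading

# Toy target: the "variational parameter" lam should converge to the
# data mean (5.0). Serial SVI would apply
#     lam <- (1 - rho_t) * lam + rho_t * lam_hat
# where lam_hat is the intermediate parameter computed from a sampled
# minibatch. Here several threads apply that update to shared state
# WITHOUT locks, mimicking asynchronous slaves with stale reads.

random.seed(0)
data = [random.gauss(5.0, 1.0) for _ in range(10_000)]

# Shared global state, deliberately unprotected (lock-free).
state = {"lam": 0.0, "t": 1}


def worker(n_steps: int) -> None:
    for _ in range(n_steps):
        t = state["t"]               # possibly stale read of the clock
        rho = (t + 1.0) ** -0.7      # Robbins-Monro step size (assumed)
        x = random.choice(data)      # minibatch of size 1
        lam_hat = x                  # intermediate parameter for this sample
        # Lock-free stochastic update of the shared parameter.
        state["lam"] = (1 - rho) * state["lam"] + rho * lam_hat
        state["t"] = t + 1           # racy increment, tolerated by design


threads = [threading.Thread(target=worker, args=(5_000,)) for _ in range(4)]
for th in threads:
    th.start()
for th in threads:
    th.join()

print(state["lam"])  # close to the true mean 5.0 despite races
```

Despite the unsynchronized reads and writes, the decaying step size damps the effect of stale updates, which is the intuition behind the bounded-staleness condition (number of slaves at most $\sqrt{T}$) in the convergence guarantee above.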
by Saad Mohamad, Abdelhamid Bouchachia, Moamar Sayed-Mouchaweh
https://arxiv.org/pdf/1801.04289v1.pdf