A GAMP Based Low Complexity Sparse Bayesian Learning Algorithm

In this paper, we present an algorithm for the sparse signal recovery problem that incorporates damped Gaussian generalized approximate message passing (GGAMP) into Expectation-Maximization (EM)-based sparse Bayesian learning (SBL). In particular, GGAMP is used to implement the E-step in SBL in place of matrix inversion, leveraging the fact that GGAMP is guaranteed to converge with appropriate damping. The resulting GGAMP-SBL algorithm is much more robust to arbitrary measurement matrices $\boldsymbol{A}$ than the standard damped GAMP algorithm, while being much lower in complexity than the standard SBL algorithm. We then extend the approach from the single measurement vector (SMV) case to the temporally correlated multiple measurement vector (MMV) case, leading to the GGAMP-TSBL algorithm. We verify the robustness and computational advantages of the proposed algorithms through numerical experiments.
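For context, a minimal sketch of the standard EM-based SBL baseline that the abstract refers to, in which the E-step computes the Gaussian posterior of the signal via an explicit matrix inversion (the step the paper replaces with damped GGAMP). This is an illustrative implementation of classical SBL only, not the paper's GGAMP-SBL algorithm; the step sizes, iteration count, and noise variance below are assumptions for the demo.

```python
import numpy as np

def em_sbl(A, y, sigma2=1e-4, iters=50):
    """Classical EM-based SBL for the SMV model y = A x + n.

    E-step: posterior covariance/mean of x under the current Gaussian
    prior variances gamma (this is the O(n^3) matrix inversion that
    GGAMP-SBL replaces with message passing).
    M-step: update each gamma_i from the posterior statistics.
    """
    m, n = A.shape
    gamma = np.ones(n)  # per-coefficient prior variances
    mu = np.zeros(n)
    for _ in range(iters):
        # E-step: Sigma = (A^T A / sigma2 + diag(1/gamma))^{-1}
        Sigma = np.linalg.inv(A.T @ A / sigma2 + np.diag(1.0 / gamma))
        mu = Sigma @ (A.T @ y) / sigma2
        # M-step: gamma_i = mu_i^2 + Sigma_ii, floored for stability
        gamma = np.maximum(mu**2 + np.diag(Sigma), 1e-12)
    return mu, gamma

# Small demo: recover a 3-sparse signal from 40 noisy measurements
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100)) / np.sqrt(40)
x_true = np.zeros(100)
x_true[[3, 20, 77]] = [1.0, -2.0, 1.5]
y = A @ x_true + 0.01 * rng.standard_normal(40)
x_hat, _ = em_sbl(A, y)
```

Each iteration here costs an n-by-n inversion, which is what makes standard SBL expensive at scale and motivates the GAMP-based E-step in the paper.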
by Maher Al-Shoukairi, Philip Schniter, Bhaskar D. Rao
https://arxiv.org/pdf/1703.03044v2.pdf