Algorithmic Regularization in Overparameterized Matrix Recovery

We study the problem of recovering a low-rank matrix $X^\star$ from linear measurements using an overparameterized model. We parameterize the rank-$r$ matrix $X^\star$ by $UU^\top$, where $U \in \mathbb{R}^{d \times d}$ is a square matrix, whereas the number of linear measurements is much less than $d^2$. We show that with $\tilde{O}(dr^{2})$ random linear measurements, gradient descent on the squared loss, starting from a small initialization, recovers $X^\star$ approximately in $\tilde{O}(\sqrt{r})$ iterations. These results resolve the conjecture of Gunasekar et al. under the restricted isometry property, and demonstrate that the training algorithm can provide an implicit regularization effect for non-linear matrix factorization models.
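The setup in the abstract can be sketched numerically: take $m \ll d^2$ random Gaussian measurements of a rank-$r$ matrix $X^\star$, then run gradient descent on the squared loss over the full $d \times d$ factor $U$ from a small random initialization. This is an illustrative sketch, not the authors' code; the problem sizes (`d`, `r`, `m`), step size `eta`, initialization scale `alpha`, and iteration count `T` are all assumed values chosen to make the demo run quickly.

```python
# Sketch of recovering a rank-r PSD matrix X* from m << d^2 random linear
# measurements via gradient descent on the overparameterized model U U^T.
# All constants here are illustrative choices, not values from the paper.
import numpy as np

rng = np.random.default_rng(0)
d, r, m = 30, 2, 300  # note m = 300 < d^2 = 900: the underdetermined regime

# Ground-truth rank-r PSD matrix, normalized to spectral norm 1.
V = rng.standard_normal((d, r))
X_star = V @ V.T
X_star /= np.linalg.norm(X_star, 2)

# Random Gaussian measurement matrices A_i and measurements b_i = <A_i, X*>.
A = rng.standard_normal((m, d, d))
b = np.einsum('mij,ij->m', A, X_star)

# Gradient descent on f(U) = (1/4m) * sum_i (<A_i, U U^T> - b_i)^2,
# whose gradient is (1/2m) * sum_i r_i (A_i + A_i^T) U with r_i the residual.
alpha, eta, T = 1e-3, 0.2, 500
U = alpha * rng.standard_normal((d, d))  # small initialization
rel_err_init = np.linalg.norm(U @ U.T - X_star) / np.linalg.norm(X_star)

for _ in range(T):
    P = U @ U.T
    resid = np.einsum('mij,ij->m', A, P) - b   # residuals r_i
    G = np.einsum('m,mij->ij', resid, A) / m   # (1/m) * sum_i r_i A_i
    U -= eta * (G + G.T) @ U / 2               # gradient step

rel_err_final = np.linalg.norm(U @ U.T - X_star) / np.linalg.norm(X_star)
print(rel_err_init, rel_err_final)
```

Although $U$ has $d^2$ free parameters and only $m < d^2$ measurements constrain it, the small initialization keeps the iterates close to low-rank, so the relative recovery error drops far below its initial value rather than overfitting the measurements.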
by Yuanzhi Li, Tengyu Ma, Hongyang Zhang
https://arxiv.org/pdf/1712.09203v1.pdf