
Algorithmic Regularization in Over-parameterized Matrix Recovery





    We study the problem of recovering a low-rank matrix $X^\star$ from linear measurements using an over-parameterized model. We parameterize the rank-$r$ matrix $X^\star$ by $UU^\top$, where $U \in \mathbb{R}^{d \times d}$ is a square matrix, while the number of linear measurements is much smaller than $d^2$. We show that with $\tilde{O}(dr^{2})$ random linear measurements, gradient descent on the squared loss, starting from a small initialization, recovers $X^\star$ approximately in $\tilde{O}(\sqrt{r})$ iterations. These results resolve the conjecture of Gunasekar et al. under the restricted isometry property, and demonstrate that the training algorithm can provide implicit regularization for non-linear matrix factorization models.

    Algorithmic Regularization in Over-parameterized Matrix Recovery
    by Yuanzhi Li, Tengyu Ma, Hongyang Zhang
    https://arxiv.org/pdf/1712.09203v1.pdf
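    The procedure the abstract describes can be sketched numerically: factor the estimate as $X = UU^\top$ with a full square $U$, take random Gaussian linear measurements of a low-rank ground truth, and run gradient descent on the squared loss from a small initialization. The sketch below is not the authors' code; the dimensions, measurement count, step size, and iteration budget are illustrative choices, not the paper's theoretical scalings.

```python
import numpy as np

rng = np.random.default_rng(0)
d, r, m = 30, 2, 500  # ambient dim, true rank, #measurements (m << d^2 = 900)

# Ground-truth rank-r PSD matrix X* = V V^T, scaled to O(1) spectral norm.
V = rng.normal(size=(d, r)) / np.sqrt(d)
X_star = V @ V.T

# Random Gaussian linear measurements y_i = <A_i, X*>; each row of A is a
# flattened measurement matrix A_i.
A = rng.normal(size=(m, d * d))
y = A @ X_star.ravel()

# Over-parameterized factor U in R^{d x d}, small random initialization.
U = 1e-3 * rng.normal(size=(d, d))
lr = 0.02

for _ in range(3000):
    X = U @ U.T
    resid = A @ X.ravel() - y              # <A_i, U U^T> - y_i
    G = (A.T @ resid).reshape(d, d) / m    # gradient of the loss w.r.t. X
    U -= lr * (G + G.T) @ U                # chain rule through X = U U^T

rel_err = np.linalg.norm(U @ U.T - X_star) / np.linalg.norm(X_star)
print(f"relative recovery error: {rel_err:.2e}")
```

    Even though $U$ has $d^2$ free parameters and there are fewer than $d^2$ measurements, gradient descent from the small initialization converges to (approximately) the low-rank ground truth rather than to an arbitrary interpolating solution, which is the implicit-regularization effect the paper analyzes.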
