Machine Learning

Tracking the gradients using the Hessian: A new look at variance reducing stochastic methods

Our goal is to improve variance reducing stochastic methods through better control variates. We first propose a modification of SVRG which uses the Hessian to track gradients over time, rather than to recondition, increasing the correlation of the control variates and leading to faster theoretical convergence close to the optimum. We then propose accurate and computationally efficient approximations to the Hessian, both using a diagonal and a low-rank matrix. Finally, we demonstrate the effectiveness of our method on a wide range of problems.
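
The abstract does not spell out the update rule, so the sketch below is only one plausible reading of "using the Hessian to track gradients over time", not the paper's algorithm: in standard SVRG the stale gradient at the snapshot is used directly as the control variate, whereas here it is replaced by its first-order Taylor expansion around the snapshot (gradient plus Hessian times the displacement), with a matching batch-level correction to keep the estimate unbiased. The least-squares setup, function names, and step sizes are all illustrative assumptions.

```python
# Illustrative sketch only: exact per-example Hessians on a tiny least-squares
# problem, to show the idea of a Hessian-corrected control variate.
import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 10
A = rng.normal(size=(n, d))
x_true = rng.normal(size=d)
b = A @ x_true + 0.1 * rng.normal(size=n)

def grad_i(w, i):
    # Per-example gradient of f_i(w) = 0.5 * (a_i^T w - b_i)^2
    return (A[i] @ w - b[i]) * A[i]

def hess_i(i):
    # Per-example Hessian, constant for least squares: a_i a_i^T
    return np.outer(A[i], A[i])

def full_grad(w):
    return A.T @ (A @ w - b) / n

def svrg_estimate(w, w_snap, mu, i):
    # Standard SVRG estimate: grad_i(w) - grad_i(w_snap) + mu,
    # where mu = full_grad(w_snap).
    return grad_i(w, i) - grad_i(w_snap, i) + mu

def hessian_tracked_estimate(w, w_snap, mu, H_bar, i):
    # Hessian-corrected estimate (assumed form): the stale gradient is
    # replaced by grad_i(w_snap) + H_i (w - w_snap); the averaged correction
    # H_bar (w - w_snap) is added back so the estimate stays unbiased.
    delta = w - w_snap
    return (grad_i(w, i)
            - (grad_i(w_snap, i) + hess_i(i) @ delta)
            + mu + H_bar @ delta)

# Small loop comparing the two estimators under the same step size.
step, epochs, inner = 0.01, 30, n
w_svrg = np.zeros(d)
w_track = np.zeros(d)
for _ in range(epochs):
    snap_s, mu_s = w_svrg.copy(), full_grad(w_svrg)
    snap_t, mu_t = w_track.copy(), full_grad(w_track)
    H_bar = A.T @ A / n          # average per-example Hessian at the snapshot
    for _ in range(inner):
        i = rng.integers(n)
        w_svrg -= step * svrg_estimate(w_svrg, snap_s, mu_s, i)
        w_track -= step * hessian_tracked_estimate(w_track, snap_t, mu_t, H_bar, i)

print("SVRG mean squared residual:   ", np.linalg.norm(A @ w_svrg - b) ** 2 / n)
print("Tracked mean squared residual:", np.linalg.norm(A @ w_track - b) ** 2 / n)
```

For least squares the per-example Hessians are constant, so the Taylor expansion is exact and the corrected control variate tracks the current gradient closely; the diagonal and low-rank approximations mentioned in the abstract would stand in for the exact Hessians when forming and applying them is too expensive.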

Tracking the gradients using the Hessian: A new look at variance reducing stochastic methods
by Robert M. Gower, Nicolas Le Roux, Francis Bach
https://arxiv.org/pdf/1710.07462v1.pdf
