
IHT dies hard: Provable accelerated Iterative Hard Thresholding




We study, both in theory and in practice, the use of momentum in classic iterative hard thresholding (IHT) methods. By simply adding a momentum step to plain IHT, we investigate its convergence behavior on convex optimization criteria with non-convex constraints, under standard assumptions. In diverse scenarios, we observe that acceleration in IHT leads to significant improvements over state-of-the-art projected gradient descent and Frank-Wolfe variants. As a byproduct of our inspection, we study the impact of the momentum parameter: as in convex settings, two modes of behavior emerge, "rippling" and linear convergence, depending on the level of momentum.
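
As a concrete illustration of the method the abstract describes, here is a minimal sketch of momentum-accelerated IHT for sparse least squares. The function names, the default step size, and the fixed momentum parameter `beta` are assumptions made for this sketch, not the authors' exact algorithm; setting `beta = 0` recovers plain IHT.

```python
import numpy as np

def hard_threshold(x, k):
    """Keep the k largest-magnitude entries of x; zero out the rest."""
    z = np.zeros_like(x)
    idx = np.argpartition(np.abs(x), -k)[-k:]
    z[idx] = x[idx]
    return z

def accelerated_iht(A, y, k, step=None, beta=0.5, iters=200):
    """Momentum-accelerated IHT for min ||Ax - y||^2 s.t. ||x||_0 <= k.

    beta is the momentum level; per the abstract, its setting determines
    whether the iterates "ripple" or converge linearly.
    """
    n = A.shape[1]
    if step is None:
        # Conservative step size: 1 / ||A||_2^2 (squared spectral norm).
        step = 1.0 / np.linalg.norm(A, 2) ** 2
    x_prev = np.zeros(n)
    x = np.zeros(n)
    for _ in range(iters):
        # Momentum extrapolation using the previous two iterates.
        u = x + beta * (x - x_prev)
        # Gradient step on the least-squares loss at the extrapolated point,
        # followed by projection onto the non-convex sparsity constraint.
        grad = A.T @ (A @ u - y)
        x_prev, x = x, hard_threshold(u - step * grad, k)
    return x
```

On a synthetic problem, e.g. `A = rng.standard_normal((100, 300))` with a 10-sparse ground truth and `y = A @ x_true`, calling `accelerated_iht(A, y, k=10)` should converge noticeably faster than the same loop with `beta = 0`, in line with the improvements the abstract reports.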

IHT dies hard: Provable accelerated Iterative Hard Thresholding
by Rajiv Khanna, Anastasios Kyrillidis
https://arxiv.org/pdf/1712.09379v1.pdf
