Machine Learning

Meta-SGD: Learning to Learn Quickly for Few-Shot Learning


    Few-shot learning is challenging for learning algorithms that learn each task in isolation and from scratch. In contrast, meta-learning learns from many related tasks a meta-learner that can learn a new task more accurately and faster with fewer examples, where the choice of meta-learners is crucial. In this paper, we develop Meta-SGD, an SGD-like, easily trainable meta-learner that can initialize and adapt any differentiable learner in just one step, on both supervised learning and reinforcement learning. Compared to the popular meta-learner LSTM, Meta-SGD is conceptually simpler, easier to implement, and can be learned more efficiently. Compared to the latest meta-learner MAML, Meta-SGD has a much higher capacity by learning to learn not just the learner initialization, but also the learner update direction and learning rate, all in a single meta-learning process. Meta-SGD shows highly competitive performance for few-shot learning on regression, classification, and reinforcement learning.

    Meta-SGD: Learning to Learn Quickly for Few-Shot Learning
    by Zhenguo Li, Fengwei Zhou, Fei Chen, Hang Li
    https://arxiv.org/pdf/1707.09835v2.pdf
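The abstract's key idea, that Meta-SGD meta-learns not only the initialization but also a per-parameter learning-rate vector, can be illustrated with a minimal sketch (not taken from the paper's code). Here `theta`, `alpha`, and the toy quadratic loss are illustrative assumptions; the point is that `alpha` is a vector the same shape as the parameters, so the elementwise product `alpha * grad` encodes both the learned update direction and the learned step size in a single one-step adaptation:

```python
import numpy as np

def meta_sgd_inner_update(theta, alpha, grad):
    """One-step Meta-SGD adaptation: theta' = theta - alpha * grad.

    alpha is a meta-learned vector with the same shape as theta, so it
    carries both an update direction (its sign pattern) and a
    per-parameter learning rate (its magnitude), in contrast to a
    single scalar step size.
    """
    return theta - alpha * grad

# Toy task: loss(theta) = 0.5 * ||theta - target||^2, so grad = theta - target.
theta = np.zeros(3)                   # stand-in for the meta-learned initialization
alpha = np.full(3, 0.5)               # stand-in for the meta-learned rate vector
target = np.array([1.0, -2.0, 3.0])   # hypothetical task-specific optimum
grad = theta - target                 # gradient of the toy loss at theta
theta_adapted = meta_sgd_inner_update(theta, alpha, grad)
```

In the full algorithm, `theta` and `alpha` would both be updated in an outer meta-training loop across tasks; this sketch shows only the inner, single-step adaptation the abstract describes.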
