Machine Learning

A Gentle Tutorial of Recurrent Neural Network with Error Backpropagation





    We describe recurrent neural networks (RNNs), which have attracted great attention for sequential tasks such as handwriting recognition, speech recognition, and image-to-text. However, unlike general feedforward neural networks, RNNs have feedback loops, which make the backpropagation step harder to understand. Thus, we focus on the basics, especially error backpropagation to compute gradients with respect to the model parameters. Further, we go into detail on how the error backpropagation algorithm is applied to long short-term memory (LSTM) by unfolding the memory unit.
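    The idea of unfolding the feedback loop and backpropagating through the resulting chain can be sketched in a few lines of NumPy. This is not code from the tutorial; it is a minimal illustration of backpropagation through time (BPTT) on a vanilla RNN, with assumed weight names (`Wxh`, `Whh`, `Why`), a tanh hidden unit, and a squared-error loss:

```python
import numpy as np

# Minimal vanilla RNN with backpropagation through time (BPTT).
# All sizes, names, and the loss are illustrative assumptions,
# not taken from the tutorial itself.
rng = np.random.default_rng(0)
H, D, O, T = 4, 3, 2, 5  # hidden, input, output sizes; sequence length

Wxh = rng.standard_normal((H, D)) * 0.1  # input -> hidden
Whh = rng.standard_normal((H, H)) * 0.1  # hidden -> hidden (the feedback loop)
Why = rng.standard_normal((O, H)) * 0.1  # hidden -> output

xs = rng.standard_normal((T, D))  # input sequence
ys = rng.standard_normal((T, O))  # target sequence

# Forward pass: unfold the recurrence h_t = tanh(Wxh x_t + Whh h_{t-1}).
hs, outs, loss = {-1: np.zeros(H)}, {}, 0.0
for t in range(T):
    hs[t] = np.tanh(Wxh @ xs[t] + Whh @ hs[t - 1])
    outs[t] = Why @ hs[t]
    loss += 0.5 * np.sum((outs[t] - ys[t]) ** 2)

# Backward pass (BPTT): walk the unfolded graph in reverse, carrying
# dh_next -- the gradient flowing back through the feedback loop.
dWxh, dWhh, dWhy = np.zeros_like(Wxh), np.zeros_like(Whh), np.zeros_like(Why)
dh_next = np.zeros(H)
for t in reversed(range(T)):
    dy = outs[t] - ys[t]                # dL/d(output at step t)
    dWhy += np.outer(dy, hs[t])
    dh = Why.T @ dy + dh_next           # gradient from output + future steps
    draw = (1.0 - hs[t] ** 2) * dh      # backprop through tanh
    dWxh += np.outer(draw, xs[t])
    dWhh += np.outer(draw, hs[t - 1])
    dh_next = Whh.T @ draw              # pass gradient back to step t-1
```

    Because each weight matrix is shared across time steps, its gradient is the sum of contributions from every unfolded step; the LSTM case in the paper follows the same pattern with the memory cell unfolded as well.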

    A Gentle Tutorial of Recurrent Neural Network with Error Backpropagation
    by Gang Chen
    https://arxiv.org/pdf/1610.02583v3.pdf
