Topic Tag: RNN


 Hierarchical Attentive Recurrent Tracking


Class-agnostic object tracking is particularly difficult in cluttered environments, as target-specific discriminative models cannot be learned a priori. Inspired by how the human visual cortex employs spatial attention and separate “where” and “what” processing pathways to ac…
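The core mechanism, a differentiable spatial attention window (the “where” pathway), can be sketched in a few lines. This is a minimal PyTorch illustration, not the paper’s HART implementation; the window parameterization and sizes are assumptions:

    import torch
    import torch.nn.functional as F

    def spatial_glimpse(frame, center, size, out_hw=(32, 32)):
        """Crop a differentiable attention window ('where' pathway).
        frame:  (1, C, H, W) image; center: (x, y) in [-1, 1];
        size:   half-extent of the window in normalized coordinates."""
        ys = torch.linspace(-size, size, out_hw[0]) + center[1]
        xs = torch.linspace(-size, size, out_hw[1]) + center[0]
        gy, gx = torch.meshgrid(ys, xs, indexing="ij")
        grid = torch.stack([gx, gy], dim=-1).unsqueeze(0)   # (1, h, w, 2)
        return F.grid_sample(frame, grid, align_corners=False)

    frame = torch.randn(1, 3, 128, 128)
    glimpse = spatial_glimpse(frame, center=(0.2, -0.1), size=0.25)
    print(glimpse.shape)  # torch.Size([1, 3, 32, 32]); a 'what' net would process this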


 Deep learning: Technical introduction


This note presents in a technical though hopefully pedagogical way the three most common forms of neural network architectures: Feedforward, Convolutional and Recurrent. For each network, their fundamental building blocks are detailed. The forward pass and the update rules for the backpropagation a…
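As a companion to the note’s treatment of the forward pass and backpropagation update rules, here is a minimal worked example for a one-hidden-layer feedforward network with squared-error loss (plain NumPy; the sizes and learning rate are arbitrary):

    import numpy as np

    rng = np.random.default_rng(0)
    d, h, o = 4, 8, 2                      # input, hidden, output sizes
    W1, W2 = rng.normal(size=(h, d)), rng.normal(size=(o, h))
    x, y = rng.normal(size=d), rng.normal(size=o)

    # Forward pass
    a1 = W1 @ x                            # hidden pre-activation
    z1 = np.tanh(a1)                       # hidden activation
    y_hat = W2 @ z1                        # linear output
    loss = 0.5 * np.sum((y_hat - y) ** 2)  # squared-error loss

    # Backpropagation: chain rule from the loss down to each weight
    delta2 = y_hat - y                        # dL/dy_hat
    dW2 = np.outer(delta2, z1)
    delta1 = (W2.T @ delta2) * (1 - z1 ** 2)  # tanh'(a1) = 1 - tanh(a1)^2
    dW1 = np.outer(delta1, x)

    # Gradient-descent update rule
    lr = 0.01
    W1 -= lr * dW1
    W2 -= lr * dW2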


 Telepath: Understanding Users from a Human Vision Perspective in Large-Scale Recommender Systems


Designing an e-commerce recommender system that serves hundreds of millions of active users is a daunting challenge. From a human vision perspective, there are two key factors that affect users’ behavior: items’ attractiveness and how well they match users’ interests.…


 Predicting Remaining Useful Life using Time Series Embeddings based on Recurrent Neural Networks

We consider the problem of estimating the remaining useful life (RUL) of a system or a machine from sensor data. Many approaches for RUL estimation based on sensor data make assumptions about how machines degrade. Additionally, sensor data from machines is noisy and often suffers from missing value…
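The general recipe, an RNN that embeds a sensor sequence into a fixed vector feeding a regressor, can be sketched as follows. Illustrative PyTorch only; the GRU encoder, sizes, and single-layer regressor are assumptions, not the paper’s architecture:

    import torch
    import torch.nn as nn

    class RULEstimator(nn.Module):
        """Embed a multivariate sensor sequence with a GRU, then
        regress remaining useful life from the final embedding."""
        def __init__(self, n_sensors, hidden=64):
            super().__init__()
            self.encoder = nn.GRU(n_sensors, hidden, batch_first=True)
            self.regressor = nn.Linear(hidden, 1)

        def forward(self, x):                 # x: (batch, time, n_sensors)
            _, h_last = self.encoder(x)       # h_last: (1, batch, hidden)
            return self.regressor(h_last[-1]).squeeze(-1)

    model = RULEstimator(n_sensors=14)        # e.g. 14 sensor channels
    x = torch.randn(8, 100, 14)               # 8 machines, 100 timesteps
    print(model(x).shape)                     # torch.Size([8]): one RUL each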


 CSI: A Hybrid Deep Model for Fake News Detection


The topic of fake news has drawn attention both from the public and the academic communities. Such misinformation has the potential of affecting public opinion, providing an opportunity for malicious parties to manipulate the outcomes of public events such as elections. Because such high stakes are…


 Neural Probabilistic Model for Non-projective MST Parsing


In this paper, we propose a probabilistic parsing model, which defines a proper conditional probability distribution over non-projective dependency trees for a given sentence, using neural representations as inputs. The neural network architecture is based on bi-directional LSTM-CNNs which benefits…
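A minimal sketch of the encoder-plus-arc-scoring step (the CNN character encoder and the probability distribution over non-projective trees are omitted; all sizes and the bilinear scorer are illustrative assumptions):

    import torch
    import torch.nn as nn

    class ArcScorer(nn.Module):
        """BiLSTM encoder with a bilinear head/dependent arc scorer."""
        def __init__(self, vocab, emb=100, hidden=128):
            super().__init__()
            self.embed = nn.Embedding(vocab, emb)
            self.bilstm = nn.LSTM(emb, hidden, bidirectional=True, batch_first=True)
            self.bilinear = nn.Bilinear(2 * hidden, 2 * hidden, 1)

        def forward(self, tokens):                   # tokens: (batch, n)
            h, _ = self.bilstm(self.embed(tokens))   # (batch, n, 2*hidden)
            n = h.size(1)
            heads = h.unsqueeze(2).expand(-1, -1, n, -1).contiguous()
            deps = h.unsqueeze(1).expand(-1, n, -1, -1).contiguous()
            return self.bilinear(heads, deps).squeeze(-1)

    scores = ArcScorer(vocab=1000)(torch.randint(0, 1000, (2, 7)))
    print(scores.shape)  # (2, 7, 7): score of token i heading token j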


 Gate Activation Signal Analysis for Gated Recurrent Neural Networks and Its Correlation with Phoneme Boundaries

In this paper we analyze the gate activation signals inside gated recurrent neural networks and find that the temporal structure of such signals is highly correlated with phoneme boundaries. This correlation is further verified by a set of experiments for phoneme segmentation, in which better r…
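Gate signals like these can be read out of a standard GRU by recomputing the gates from the cell’s weights at each step. A sketch that extracts the update-gate signal (the mean-over-units summary is an assumption; the paper may analyze the signals differently):

    import torch

    cell = torch.nn.GRUCell(input_size=13, hidden_size=32)   # e.g. MFCC input
    W_ih, W_hh = cell.weight_ih, cell.weight_hh   # rows: [reset | update | new]
    b_ih, b_hh = cell.bias_ih, cell.bias_hh
    H = cell.hidden_size

    x_seq = torch.randn(100, 13)                  # 100 acoustic frames
    h = torch.zeros(32)
    update_gate_signal = []
    with torch.no_grad():
        for x in x_seq:
            gi = W_ih @ x + b_ih
            gh = W_hh @ h + b_hh
            z = torch.sigmoid(gi[H:2 * H] + gh[H:2 * H])   # update gate z_t
            update_gate_signal.append(z.mean().item())      # one value per frame
            h = cell(x.unsqueeze(0), h.unsqueeze(0)).squeeze(0)
    # update_gate_signal can now be compared against phoneme boundaries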


 Modelling Protagonist Goals and Desires in First-Person Narrative


Many genres of natural language text are narratively structured, a testament to our predilection for organizing our experiences as narratives. There is broad consensus that understanding a narrative requires identifying and tracking the goals and desires of the characters and their narrative outcom…


 Gradual Learning of Deep Recurrent Neural Networks


Deep Recurrent Neural Networks (RNNs) achieve state-of-the-art results in many sequence-to-sequence tasks. However, deep RNNs are difficult to train and suffer from overfitting. We introduce a training method that trains the network gradually, and treats each layer individually, to achieve improved…
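A sketch of what gradual, layer-by-layer training can look like: grow the stack one recurrent layer at a time and give older layers a smaller learning rate. The schedule and rates below are illustrative assumptions, not the paper’s exact procedure:

    import torch
    import torch.nn as nn

    layers = nn.ModuleList([nn.LSTM(32, 32, batch_first=True)])
    head = nn.Linear(32, 10)

    def forward(x):
        for rnn in layers:
            x, _ = rnn(x)
        return head(x)

    for depth in range(3):                       # grow the stack to 3 layers
        if depth > 0:
            layers.append(nn.LSTM(32, 32, batch_first=True))
        groups = [{"params": layers[-1].parameters(), "lr": 1e-3},  # newest layer
                  {"params": head.parameters(), "lr": 1e-3}]
        older = [p for l in layers[:-1] for p in l.parameters()]
        if older:
            groups.append({"params": older, "lr": 1e-4})            # slow older layers
        opt = torch.optim.Adam(groups)
        for _ in range(100):                     # train at the current depth
            x = torch.randn(16, 20, 32)
            y = torch.randint(0, 10, (16, 20))
            loss = nn.functional.cross_entropy(forward(x).reshape(-1, 10),
                                               y.reshape(-1))
            opt.zero_grad(); loss.backward(); opt.step()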


 What is the Role of Recurrent Neural Networks (RNNs) in an Image Caption Generator?


In neural image captioning systems, a recurrent neural network (RNN) is typically viewed as the primary ‘generation’ component. This view suggests that the image features should be ‘injected’ into the RNN. This is in fact the dominant view in the literature. Alternatively, the RNN can inste…
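The two views correspond to two different wirings: ‘inject’ conditions the RNN on the image (e.g., via its initial state), while the alternative ‘merge’ keeps the RNN a pure language model and fuses image features after it. A minimal sketch of both (sizes and fusion details are assumptions):

    import torch
    import torch.nn as nn

    emb, hid, vocab, img_dim = 64, 128, 1000, 512

    class Inject(nn.Module):                  # image conditions the RNN state
        def __init__(self):
            super().__init__()
            self.embed = nn.Embedding(vocab, emb)
            self.init_h = nn.Linear(img_dim, hid)
            self.rnn = nn.GRU(emb, hid, batch_first=True)
            self.out = nn.Linear(hid, vocab)
        def forward(self, words, img):
            h0 = torch.tanh(self.init_h(img)).unsqueeze(0)  # image -> initial state
            h, _ = self.rnn(self.embed(words), h0)
            return self.out(h)

    class Merge(nn.Module):                   # RNN is a pure language model
        def __init__(self):
            super().__init__()
            self.embed = nn.Embedding(vocab, emb)
            self.rnn = nn.GRU(emb, hid, batch_first=True)   # never sees the image
            self.out = nn.Linear(hid + img_dim, vocab)      # fuse after the RNN
        def forward(self, words, img):
            h, _ = self.rnn(self.embed(words))
            img_t = img.unsqueeze(1).expand(-1, h.size(1), -1)
            return self.out(torch.cat([h, img_t], dim=-1))

    words, img = torch.randint(0, vocab, (2, 12)), torch.randn(2, img_dim)
    print(Inject()(words, img).shape, Merge()(words, img).shape)  # both (2, 12, 1000)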


 Phonetic Temporal Neural Model for Language Identification


Deep neural models, particularly the LSTM-RNN model, have shown great potential for language identification (LID). However, the use of phonetic information has been largely overlooked by most existing neural LID methods, although this information has been used very successfully in conventional phon…
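One way to make the phonetic information explicit: feed per-frame phone posteriors (assumed here to come from a separate phone recognizer) into an LSTM classifier over languages. A sketch, not the paper’s model:

    import torch
    import torch.nn as nn

    class PhoneticLID(nn.Module):
        def __init__(self, n_phones=40, hidden=64, n_langs=6):
            super().__init__()
            self.lstm = nn.LSTM(n_phones, hidden, batch_first=True)
            self.cls = nn.Linear(hidden, n_langs)
        def forward(self, phone_posteriors):   # (batch, frames, n_phones)
            _, (h, _) = self.lstm(phone_posteriors)
            return self.cls(h[-1])             # language logits

    post = torch.softmax(torch.randn(4, 200, 40), dim=-1)  # stand-in posteriors
    print(PhoneticLID()(post).shape)  # torch.Size([4, 6])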


 Learning the Enigma with Recurrent Neural Networks


Recurrent neural networks (RNNs) represent the state of the art in translation, image captioning, and speech recognition. They are also capable of learning algorithmic tasks such as long addition, copying, and sorting from a set of training examples. We demonstrate that RNNs can learn decryption al…
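The supervised setup can be illustrated with a toy cipher: generate (key + ciphertext) → plaintext pairs and fit an RNN per position. A shift cipher stands in for Enigma here, and everything below is illustrative:

    import random
    import torch
    import torch.nn as nn

    A = 26                                   # alphabet size

    def sample(n=16):
        """One pair for a toy shift cipher: input is the key symbol
        followed by the ciphertext; target is the plaintext."""
        key = random.randrange(A)
        plain = [random.randrange(A) for _ in range(n)]
        cipher = [(c + key) % A for c in plain]
        return torch.tensor([key] + cipher), torch.tensor(plain)

    class Decryptor(nn.Module):
        def __init__(self, hidden=64):
            super().__init__()
            self.embed = nn.Embedding(A, 32)
            self.lstm = nn.LSTM(32, hidden, batch_first=True)
            self.out = nn.Linear(hidden, A)
        def forward(self, x):                # x: (batch, 1 + n)
            h, _ = self.lstm(self.embed(x))
            return self.out(h[:, 1:])        # logits per plaintext symbol

    model = Decryptor()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    for step in range(500):
        xs, ys = zip(*[sample() for _ in range(32)])
        x, y = torch.stack(xs), torch.stack(ys)
        loss = nn.functional.cross_entropy(model(x).reshape(-1, A), y.reshape(-1))
        opt.zero_grad(); loss.backward(); opt.step()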


 Position-based Content Attention for Time Series Forecasting with Sequence-to-sequence RNNs

We propose an extended attention model for sequence-to-sequence recurrent neural networks (RNNs) designed to capture (pseudo-)periods in time series. This extended attention model can be deployed on top of any RNN and is shown to yield state-of-the-art performance for time series forecasting o…
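The idea, content attention plus a learned term on the lag (i - j) mod T so that encoder steps one pseudo-period back can be favored, can be sketched as follows. T is an assumed known period and the parameterization is illustrative, not the paper’s exact model:

    import torch
    import torch.nn as nn

    class PeriodicAttention(nn.Module):
        """Content attention plus a learned weight per lag mod T."""
        def __init__(self, hidden, T):
            super().__init__()
            self.content = nn.Bilinear(hidden, hidden, 1)
            self.pos = nn.Embedding(T, 1)    # one scalar per lag class
            self.T = T

        def forward(self, dec_h, enc_h, i):  # dec_h: (B, H); enc_h: (B, L, H)
            B, L, H = enc_h.shape
            q = dec_h.unsqueeze(1).expand(-1, L, -1).contiguous()
            c = self.content(q, enc_h).squeeze(-1)          # content score (B, L)
            lags = (i - torch.arange(L)) % self.T           # lag of each encoder step
            scores = c + self.pos(lags).squeeze(-1)         # add position term
            w = torch.softmax(scores, dim=-1)
            return (w.unsqueeze(-1) * enc_h).sum(dim=1)     # context vector

    att = PeriodicAttention(hidden=32, T=24)
    ctx = att(torch.randn(2, 32), torch.randn(2, 48, 32), i=50)  # decoding step 50
    print(ctx.shape)   # torch.Size([2, 32])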


 Boltzmann machines for time-series


We review Boltzmann machines extended for time-series. These models often have recurrent structure, and backpropagation through time (BPTT) is used to learn their parameters. The per-step computational complexity of BPTT in online learning, however, grows linearly with respect to the length of pr…
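For context, the standard workaround in online learning is truncated BPTT: detach the hidden state every k steps so the per-step cost stays bounded by k instead of growing with the full history. A generic sketch of that baseline, not the paper’s Boltzmann-machine learning rule:

    import torch
    import torch.nn as nn

    rnn = nn.RNN(8, 16, batch_first=True)
    readout = nn.Linear(16, 1)
    opt = torch.optim.SGD(list(rnn.parameters()) + list(readout.parameters()),
                          lr=1e-2)

    k, h = 20, None                              # truncation length
    stream = torch.randn(1, 1000, 8)             # one long "online" sequence
    targets = torch.randn(1, 1000, 1)
    for t0 in range(0, 1000, k):
        x, y = stream[:, t0:t0 + k], targets[:, t0:t0 + k]
        out, h = rnn(x, h)
        loss = nn.functional.mse_loss(readout(out), y)
        opt.zero_grad(); loss.backward(); opt.step()
        h = h.detach()                           # cut the graph: cost stays O(k)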


 Neural Networks Compression for Language Modeling


In this paper, we consider several compression techniques for the language modeling problem based on recurrent neural networks (RNNs). It is known that conventional RNNs, e.g., LSTM-based networks in language modeling, are characterized by either high space complexity or substantial inference time…
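One representative compression of this kind is a low-rank factorization of the large vocabulary-by-dimension matrices. A sketch on the input embedding (whether the paper uses this exact factorization is an assumption; sizes are illustrative):

    import torch
    import torch.nn as nn

    V, d, r = 50000, 512, 64                       # vocab, model dim, rank

    full = nn.Embedding(V, d)                      # V*d = 25.6M parameters
    low_rank = nn.Sequential(nn.Embedding(V, r),   # V*r + r*d ~= 3.23M parameters
                             nn.Linear(r, d, bias=False))

    tokens = torch.randint(0, V, (8, 35))
    assert full(tokens).shape == low_rank(tokens).shape   # both (8, 35, 512)
    count = lambda m: sum(p.numel() for p in m.parameters())
    print(count(full), "->", count(low_rank))      # 25600000 -> 3232768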


 Building Your Own Neural Machine Translation System in TensorFlow


Posted by Thang Luong, Research Scientist, and Eugene Brevdo, Staff Software Engineer, Google Brain Team

Machine translation – the task of automatically translating between languages – is one of the most active research areas in the machine learning community. Among the many approaches to machi…