Topic Tag: paper

 Phonetic Temporal Neural Model for Language Identification

Deep neural models, particularly the LSTM-RNN model, have shown great potential for language identification (LID). However, the use of phonetic information has been largely overlooked by most existing neural LID methods, although this information has been used very successfully in conventional phon…


 Learning the Enigma with Recurrent Neural Networks

Recurrent neural networks (RNNs) represent the state of the art in translation, image captioning, and speech recognition. They are also capable of learning algorithmic tasks such as long addition, copying, and sorting from a set of training examples. We demonstrate that RNNs can learn decryption al…


 Supervised Speech Separation Based on Deep Learning: An Overview

Speech separation is the task of separating target speech from background interference. Traditionally, speech separation is studied as a signal processing problem. A more recent approach formulates speech separation as a supervised learning problem, where the discriminative patterns of speech, spea…
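
The overview goes on to survey training targets for this supervised formulation; purely as an illustration (not the paper's models), a common target is an ideal ratio mask computed from parallel clean and noise spectrograms, sketched below with made-up array sizes and names.

    import numpy as np

    def ideal_ratio_mask(clean_mag, noise_mag, eps=1e-8):
        # One common definition of the ideal ratio mask (IRM):
        # square root of the clean-to-total power ratio per time-frequency bin.
        clean_pow, noise_pow = clean_mag ** 2, noise_mag ** 2
        return np.sqrt(clean_pow / (clean_pow + noise_pow + eps))

    # Toy magnitude spectrograms (frequency bins x time frames); in practice
    # these come from STFTs of parallel clean-speech and noise recordings.
    S = np.abs(np.random.randn(257, 100))
    N = np.abs(np.random.randn(257, 100))
    mask = ideal_ratio_mask(S, N)        # values in [0, 1], the regression target
    enhanced = mask * (S + N)            # crude stand-in for masking the mixture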


 Classification via Tensor Decompositions of Echo State Networks

This work introduces a tensor-based method to perform supervised classification on spatiotemporal data processed in an echo state network. Typically when performing supervised classification tasks on data processed in an echo state network, the entire collection of hidden layer node states from the…
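
For context, the hidden-layer node states referred to above come from the standard echo state network update; a minimal sketch (illustrative sizes, no leaky integration) that collects the full state sequence a downstream classifier would consume:

    import numpy as np

    rng = np.random.default_rng(0)
    n_in, n_res, T = 3, 100, 50                      # illustrative sizes
    W_in = 0.1 * rng.normal(size=(n_res, n_in))      # input weights
    W = rng.normal(size=(n_res, n_res))              # reservoir weights
    W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # keep spectral radius < 1

    u = rng.normal(size=(T, n_in))                   # one input time series
    x = np.zeros(n_res)
    states = np.empty((T, n_res))                    # collection of hidden node states
    for t in range(T):
        x = np.tanh(W_in @ u[t] + W @ x)             # reservoir state update
        states[t] = x
    # 'states' (time x reservoir units) is the object a tensor-based
    # classifier would operate on instead of, e.g., only the final state.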


 Super-Convergence: Very Fast Training of Residual Networks Using Large Learning Rates

In this paper, we show a phenomenon, which we named “super-convergence”, where residual networks can be trained using an order of magnitude fewer iterations than are used with standard training methods. One of the key elements of super-convergence is training with cyclical learning rates …
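
As background on the cyclical learning rate ingredient, here is a sketch of a triangular cycle between two bounds (the bounds and step size below are illustrative, not the paper's settings):

    def triangular_lr(step, base_lr=0.1, max_lr=3.0, step_size=1000):
        # Ramp linearly from base_lr up to max_lr over step_size iterations,
        # then back down, and repeat.
        cycle = step // (2 * step_size)
        pos = abs(step / step_size - 2 * cycle - 1)   # 1 -> 0 -> 1 within a cycle
        return base_lr + (max_lr - base_lr) * (1 - pos)

    # learning rate at a few points of the first cycle
    print([round(triangular_lr(s), 2) for s in (0, 500, 1000, 1500, 2000)])
    # -> [0.1, 1.55, 3.0, 1.55, 0.1]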


 DelugeNets: Deep Networks with Efficient and Flexible Cross-layer Information Inflows

Deluge Networks (DelugeNets) are deep neural networks which efficiently facilitate massive cross-layer information inflows from preceding layers to succeeding layers. The connections between layers in DelugeNets are established through cross-layer depthwise convolutional layers with learnable filte…
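
A rough sketch of the cross-layer, channel-wise coupling described above (a deliberate simplification with made-up names, not the exact DelugeNet block): each channel of a layer's input is a learned weighted combination of the same channel taken from every preceding layer's output.

    import numpy as np

    def cross_layer_depthwise(prev_outputs, weights):
        # prev_outputs: list of L feature maps, each shaped (C, H, W).
        # weights: (L, C) learnable per-layer, per-channel scalars.
        out = np.zeros_like(prev_outputs[0])
        for l, feat in enumerate(prev_outputs):
            out += weights[l][:, None, None] * feat   # channel-wise scaling
        return out

    feats = [np.random.randn(16, 8, 8) for _ in range(3)]  # 3 preceding layers
    w = np.random.randn(3, 16)
    combined = cross_layer_depthwise(feats, w)             # (16, 8, 8) input to the next layer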


 Neural Network-based Graph Embedding for Cross-Platform Binary Code Similarity Detection

The problem of cross-platform binary code similarity detection aims at detecting whether two binary functions coming from different platforms are similar or not. It has many security applications, including plagiarism detection, malware detection, vulnerability search, etc. Existing approaches rely…


 Neuro-RAM Unit with Applications to Similarity Testing and Compression in Spiking Neural Networks

We study distributed algorithms implemented in a simplified biologically inspired model for stochastic spiking neural networks. We focus on tradeoffs between computation time and network complexity, along with the role of randomness in efficient neural computation. It is widely accepted that neural…


 Notes: A Continuous Model of Neural Networks. Part I: Residual Networks

In this series of notes, we try to model neural networks as discretizations of continuous flows on the space of data, which we call the flow model. The idea comes from observing their similarity in mathematical structure. This conceptual analogy has not been proven useful yet, but it …
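
Concretely, the similarity the notes start from is the usual correspondence between a residual block and one forward-Euler step of a flow:

    x_{k+1} = x_k + h\, f(x_k)
    \qquad\text{is the forward-Euler discretization of}\qquad
    \dot{x}(t) = f\big(x(t)\big),

with the residual branch playing the role of the vector field f and the step size h absorbed into it.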


 On the approximation by single hidden layer feedforward neural networks with fixed weights

Feedforward neural networks have wide applicability in various disciplines of science due to their universal approximation property. Some authors have shown that single hidden layer feedforward neural networks (SLFNs) with fixed weights still possess the universal approximation property provided th…
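
For reference, a single hidden layer feedforward network with r hidden units and activation σ computes

    N(x) = \sum_{i=1}^{r} c_i\, \sigma(w_i \cdot x + b_i),

and the fixed-weight setting discussed above keeps the inner weights w_i (and possibly the biases b_i) fixed in advance, so that only the outer coefficients c_i remain to be chosen.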


 Position-based Content Attention for Time Series Forecasting with Sequence-to-sequence RNNs

We propose here an extended attention model for sequence-to-sequence recurrent neural networks (RNNs) designed to capture (pseudo-)periods in time series. This extended attention model can be deployed on top of any RNN and is shown to yield state-of-the-art performance for time series forecasting o…
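
As background, the content-based attention that such extensions build on scores every encoder state h_i against the previous decoder state s_{t-1} and forms a context vector; the position-based variant adds terms that depend on temporal position, which the snippet is cut off before describing:

    e_{t,i} = v^\top \tanh(W s_{t-1} + U h_i), \qquad
    \alpha_{t,i} = \frac{\exp(e_{t,i})}{\sum_{j} \exp(e_{t,j})}, \qquad
    c_t = \sum_{i} \alpha_{t,i}\, h_i.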


 A Capacity Scaling Law for Artificial Neural Networks

In this article, we derive the calculation of two critical numbers that quantify the capabilities of artificial neural networks with gating functions, such as sign, sigmoid, or rectified linear units. First, we derive the calculation of the Vapnik-Chervonenkis dimension of a network with binary out…
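
For readers unfamiliar with the first of these numbers: the Vapnik-Chervonenkis dimension of the function class F realized by a binary-output network is the size of the largest input set the network can shatter,

    \mathrm{VCdim}(\mathcal{F}) = \max\Big\{ d \;:\; \exists\, x_1,\dots,x_d \text{ with }
    \big|\{(f(x_1),\dots,f(x_d)) : f \in \mathcal{F}\}\big| = 2^{d} \Big\}.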


 Boltzmann machines and energy-based models

We review Boltzmann machines and energy-based models. A Boltzmann machine defines a probability distribution over binary-valued patterns. One can learn the parameters of a Boltzmann machine via gradient-based approaches so that the log-likelihood of the data is increased. The gradient and Laplacian of a …
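
A minimal instance of the objects involved: for a fully visible Boltzmann machine over binary patterns x ∈ {0,1}^n with symmetric weights W and biases b,

    E(x) = -\tfrac{1}{2}\, x^\top W x - b^\top x, \qquad
    p(x) = \frac{e^{-E(x)}}{\sum_{x'} e^{-E(x')}},

and the gradient of the average log-likelihood with respect to a weight is the familiar difference of data and model correlations,

    \frac{\partial}{\partial W_{ij}}\, \mathbb{E}_{\mathrm{data}}\big[\log p(x)\big]
    = \langle x_i x_j \rangle_{\mathrm{data}} - \langle x_i x_j \rangle_{\mathrm{model}}.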


 Neural Networks Compression for Language Modeling

In this paper, we consider several compression techniques for the language modeling problem based on recurrent neural networks (RNNs). It is known that conventional RNNs, e.g., LSTM-based networks in language modeling, are characterized by either high space complexity or substantial inference time…
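
One generic ingredient such compression schemes often use (shown here as a hedged sketch, not the paper's specific method) is low-rank factorization of a large recurrent weight matrix via truncated SVD:

    import numpy as np

    def low_rank_factorize(W, rank):
        # Replace W (m x n) by thin factors A (m x rank) and B (rank x n),
        # cutting the parameter count from m*n to rank*(m + n).
        U, s, Vt = np.linalg.svd(W, full_matrices=False)
        A = U[:, :rank] * s[:rank]           # singular values absorbed into A
        B = Vt[:rank, :]
        return A, B

    W = np.random.randn(1024, 1024)          # e.g. one LSTM gate's weight matrix
    A, B = low_rank_factorize(W, rank=64)
    print(W.size, A.size + B.size)           # 1048576 vs. 131072 parameters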


 Phoenix: A Self-Optimizing Chess Engine

Since the advent of computers, many tasks which required humans to spend a lot of time and energy have been trivialized by the computers’ ability to perform repetitive tasks extremely quickly. Playing chess is one such task. It was one of the first games which was ‘solved’ using AI. Wit…


 Pillar Networks++: Distributed non-parametric deep and wide networks

In recent work, it was shown that combining multi-kernel based support vector machines (SVMs) can lead to near state-of-the-art performance on an action recognition dataset (HMDB-51). This was 0.4% lower than frameworks that used hand-crafted features in addition to the deep convolutional f…
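
For context, "multi-kernel based SVMs" refers to combining several base kernels (e.g. one per feature type) into a single kernel before or while training the SVM; in the simplest form,

    K(x, x') = \sum_{m=1}^{M} \beta_m\, K_m(x, x'), \qquad \beta_m \ge 0,

where each K_m is computed from one feature channel and the nonnegative weights β_m are either fixed or learned.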


 Weight-based Fish School Search algorithm for Many-Objective Optimization

Optimization problems with more than one objective constitute a very attractive topic for researchers due to their applicability in real-world situations. Over the years, research effort in the Computational Intelligence area has resulted in algorithms able to achieve good results by solving problems wit…