Topic Tag: MNIST


Neural Networks Regularization Through Invariant Features Learning

Training deep neural networks is known to require a large number of training samples. However, in many applications only a few training samples are available. In this work, we tackle the issue of training neural networks for a classification task when only a few training samples are available. We attempt to s…


Embedded Binarized Neural Networks

We study embedded Binarized Neural Networks (eBNNs) with the aim of allowing current binarized neural networks (BNNs) in the literature to perform feedforward inference efficiently on small embedded devices. We focus on minimizing the required memory footprint, given that these devices often have m…
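
The core BNN idea this abstract builds on can be made concrete. Below is a minimal sketch (not the paper's eBNN implementation) of binarized feedforward inference: weights and activations are constrained to {-1, +1}, so each dot product reduces to XNOR-and-popcount over packed bits on real hardware. The layer shapes and names here are illustrative assumptions.

```python
# Minimal binarized-inference sketch; integer arithmetic stands in for
# the XNOR/popcount kernels an embedded deployment would use.
import numpy as np

def binarize(x):
    """Map real values to {-1, +1} via the sign function."""
    return np.where(x >= 0, 1, -1).astype(np.int8)

def binary_dense(x_bin, w_bin):
    """Binarized fully connected layer.

    With inputs and weights in {-1, +1}, x . w equals
    (matches - mismatches), which hardware computes as
    popcount(XNOR(x, w)); plain integer matmul here for clarity.
    """
    return x_bin.astype(np.int32) @ w_bin.astype(np.int32).T

rng = np.random.default_rng(0)
x = binarize(rng.standard_normal(784))          # e.g. a flattened MNIST image
w1 = binarize(rng.standard_normal((128, 784)))  # hypothetical hidden layer
w2 = binarize(rng.standard_normal((10, 128)))   # hypothetical output layer

h = binarize(binary_dense(x, w1))  # sign activation keeps values binary
logits = binary_dense(h, w2)
print("predicted class:", int(np.argmax(logits)))
```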


BranchyNet: Fast Inference via Early Exiting from Deep Neural Networks

Deep neural networks are state-of-the-art methods for many learning tasks due to their ability to extract increasingly better features at each network layer. However, the improved performance of additional layers in a deep network comes at the cost of added latency and energy usage in feedforward i…
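
BranchyNet attaches side-branch classifiers to intermediate layers and stops at the first branch whose prediction is confident enough, measured by the entropy of its softmax output. The sketch below illustrates that control flow; the toy linear stages, branch weights, and threshold are illustrative stand-ins, not the paper's architecture.

```python
# Early-exit inference sketch: run the network stage by stage and
# return as soon as a side branch is sufficiently confident.
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def entropy(p):
    return float(-(p * np.log(p + 1e-12)).sum())

def early_exit_inference(x, stages, branches, threshold=0.5):
    """Run stages in order; exit at the first confident branch."""
    h = x
    for i, (stage, branch) in enumerate(zip(stages, branches)):
        h = stage(h)                 # main-network computation up to this exit
        p = softmax(branch(h))       # branch classifier on intermediate features
        if entropy(p) < threshold:   # confident enough: stop early, save compute
            return np.argmax(p), i
    return np.argmax(p), len(stages) - 1  # fall through to the final exit

# Toy linear stages/branches standing in for convolutional blocks.
rng = np.random.default_rng(1)
W1, W2 = rng.standard_normal((32, 64)), rng.standard_normal((32, 32))
B1, B2 = rng.standard_normal((10, 32)), rng.standard_normal((10, 32))
stages = [lambda h, W=W1: W @ h, lambda h, W=W2: W @ h]
branches = [lambda h, B=B1: B @ h, lambda h, B=B2: B @ h]

label, exit_idx = early_exit_inference(rng.standard_normal(64), stages, branches)
print(f"class {label} from exit {exit_idx}")
```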


Unsupervised Generative Modeling Using Matrix Product States

Generative modeling, which learns a joint probability distribution from training data and generates samples according to it, is an important task in machine learning and artificial intelligence. Inspired by the probabilistic interpretation of quantum physics, we propose a generative model using matrix pr…
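
The probabilistic interpretation referred to is the Born rule: the model assigns p(x) ∝ |Ψ(x)|², where the amplitude Ψ(x) is a product of matrices indexed by each variable's value. Here is a minimal sketch of that construction with random (untrained) tensors and brute-force normalization; the problem size and bond dimension are assumptions chosen to keep it tiny.

```python
# Matrix-product-state "Born machine" sketch for n binary variables:
# Psi(x) = Tr(A1[x1] A2[x2] ... An[xn]),  p(x) = Psi(x)^2 / Z.
import itertools
import numpy as np

rng = np.random.default_rng(2)
n, bond = 4, 3  # 4 binary variables, bond dimension 3 (both assumptions)
# One tensor per site: shape (physical=2, bond, bond).
tensors = [rng.standard_normal((2, bond, bond)) for _ in range(n)]

def psi(x):
    """Amplitude of configuration x as a product of per-site matrices."""
    m = np.eye(bond)
    for site, value in enumerate(x):
        m = m @ tensors[site][value]
    return np.trace(m)

# Normalize by brute force over all 2^n configurations (fine for tiny n).
configs = list(itertools.product([0, 1], repeat=n))
weights = np.array([psi(x) ** 2 for x in configs])
probs = weights / weights.sum()

# Draw a sample from the resulting distribution.
idx = rng.choice(len(configs), p=probs)
print("sampled configuration:", configs[idx])
```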


Decision Stream: Cultivating Deep Decision Trees

Various modifications of decision trees have been used extensively in recent years due to their high efficiency and interpretability. Tree node splitting based on relevant feature selection is a key step of decision tree learning, and at the same time its major shortcoming: the recursive …
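
For context, the node-splitting step the abstract calls both central and problematic looks like this in its textbook greedy form: an exhaustive search over feature/threshold pairs minimizing the weighted child impurity. This is standard decision-tree learning, not the Decision Stream algorithm itself, and the toy data below is an assumption for illustration.

```python
# Greedy best-split search with the Gini impurity criterion.
import numpy as np

def gini(y):
    _, counts = np.unique(y, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

def best_split(X, y):
    """Search feature/threshold pairs for the lowest weighted child impurity."""
    best = (None, None, np.inf)
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            left = X[:, j] <= t
            if left.all() or not left.any():
                continue  # degenerate split: would not partition the data
            score = (left.mean() * gini(y[left])
                     + (1 - left.mean()) * gini(y[~left]))
            if score < best[2]:
                best = (j, t, score)
    return best

rng = np.random.default_rng(3)
X = rng.standard_normal((200, 5))
y = (X[:, 2] > 0).astype(int)  # class depends only on feature 2
feature, threshold, impurity = best_split(X, y)
print(f"split on feature {feature} at {threshold:.3f} (impurity {impurity:.3f})")
```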


Training Spiking Neural Networks for Cognitive Tasks: A Versatile Framework Compatible to Various Temporal Codes

Conventional modeling approaches have run into limitations in matching the increasingly detailed neural network structures and dynamics recorded in experiments to diverse brain functionalities. Along another line of work, studies have demonstrated how to train spiking neural networks for simple functions usi…
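
As background for the spiking-network setting, here is a minimal leaky integrate-and-fire (LIF) neuron simulation; the parameter values are illustrative assumptions, and the paper's training framework and temporal codes are not reproduced here.

```python
# Single LIF neuron: leaky integration of an input current, with a spike
# and reset whenever the membrane potential crosses threshold.
import numpy as np

def lif_simulate(input_current, dt=1.0, tau=20.0, v_thresh=1.0, v_reset=0.0):
    """Simulate one LIF neuron; returns the membrane trace and spike times."""
    v, trace, spikes = 0.0, [], []
    for t, i_t in enumerate(input_current):
        v += dt / tau * (-v + i_t)   # leaky integration of the input
        if v >= v_thresh:            # threshold crossing emits a spike
            spikes.append(t * dt)
            v = v_reset              # then the membrane potential resets
        trace.append(v)
    return np.array(trace), spikes

rng = np.random.default_rng(4)
current = 1.5 + 0.5 * rng.standard_normal(200)  # noisy constant drive
_, spike_times = lif_simulate(current)
print(f"{len(spike_times)} spikes, first at t={spike_times[0]:.0f} ms")
```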


Robustly representing inferential uncertainty in deep neural networks through sampling

As deep neural networks (DNNs) are applied to increasingly challenging problems, they will need to be able to represent their own uncertainty. Modelling uncertainty is one of the key features of Bayesian methods. Using Bernoulli dropout with sampling at prediction time has recently been proposed as…
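
The scheme the abstract refers to (often called Monte Carlo dropout) keeps Bernoulli dropout active at prediction time and treats the spread of repeated stochastic forward passes as an uncertainty estimate. A minimal sketch follows; the tiny two-layer network and its random weights are illustrative stand-ins.

```python
# Monte Carlo dropout sketch: many stochastic forward passes, then the
# mean and standard deviation of the class probabilities.
import numpy as np

rng = np.random.default_rng(5)
W1, W2 = rng.standard_normal((64, 784)), rng.standard_normal((10, 64))

def forward_with_dropout(x, p_drop=0.5):
    """One stochastic forward pass with Bernoulli dropout on the hidden layer."""
    h = np.maximum(W1 @ x, 0.0)            # ReLU hidden layer
    mask = rng.random(h.shape) >= p_drop   # Bernoulli keep-mask
    h = h * mask / (1.0 - p_drop)          # inverted-dropout scaling
    z = W2 @ h
    e = np.exp(z - z.max())
    return e / e.sum()                     # softmax class probabilities

x = rng.standard_normal(784)               # e.g. a flattened MNIST image
samples = np.stack([forward_with_dropout(x) for _ in range(100)])
mean, std = samples.mean(axis=0), samples.std(axis=0)
pred = int(mean.argmax())
print(f"class {pred}: mean prob {mean[pred]:.3f} +/- {std[pred]:.3f}")
```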