#### Scalable Estimation of Dirichlet Process Mixture Models on Distributed Data

We consider the estimation of Dirichlet Process Mixture Models (DPMMs) in distributed environments, where data are distributed across multiple computing nodes. A key advantage of Bayesian nonparametric models such as DPMMs is that they allow new components to be introduced on the fly as needed. Thi…
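The "new components on the fly" property comes from the Chinese Restaurant Process view of the DP prior. As a minimal single-machine sketch (not the paper's distributed estimator), drawing a partition from a CRP with concentration `alpha` looks like:

```python
import random

def crp_partition(n, alpha, seed=0):
    """Sample a partition of n items from a Chinese Restaurant Process.
    A new component opens with probability alpha / (i + alpha)."""
    rng = random.Random(seed)
    counts = []        # customers per table (component sizes)
    assignments = []   # table index for each item
    for i in range(n):
        r = rng.uniform(0.0, i + alpha)
        acc = 0.0
        for k, c in enumerate(counts):
            acc += c
            if r < acc:
                counts[k] += 1   # join an existing component
                assignments.append(k)
                break
        else:
            counts.append(1)     # open a new component on the fly
            assignments.append(len(counts) - 1)
    return assignments, counts

assignments, counts = crp_partition(100, alpha=2.0)
```

The number of occupied components grows roughly as `alpha * log(n)`, which is what makes coordinating component creation across nodes the interesting part of the distributed setting.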

#### DropoutDAgger: A Bayesian Approach to Safe Imitation Learning

While imitation learning is becoming common practice in robotics, this approach often suffers from data mismatch and compounding errors. DAgger is an iterative algorithm that addresses these issues by continually aggregating training data from both the expert and novice policies, but does not consi…
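For reference, the aggregation loop that DAgger builds on can be sketched as follows; `expert`, `train`, and `rollout` are hypothetical helpers for a toy discrete task, and this is the vanilla algorithm, not the DropoutDAgger variant:

```python
def dagger(expert, train, rollout, n_iters=5):
    """Vanilla DAgger sketch: roll out the current novice policy,
    have the expert label the visited states, aggregate, retrain."""
    dataset = []
    policy = train(dataset)              # initial (empty-data) policy
    for _ in range(n_iters):
        states = rollout(policy)         # states the novice actually visits
        dataset += [(s, expert(s)) for s in states]  # expert relabels them
        policy = train(dataset)          # retrain on the aggregate dataset
    return policy

# Toy instantiation: states are ints 0..9, the expert labels parity.
expert = lambda s: s % 2

def train(data):
    table = {s: a for s, a in data}
    return lambda s: table.get(s, 0)     # lookup policy, default action 0

def rollout(policy):
    return list(range(10))               # pretend the novice visits all states

policy = dagger(expert, train, rollout)
```

Because the expert relabels states from the novice's own distribution, the mismatch between training and test distributions shrinks over iterations; the cost is the repeated expert queries that safety-aware variants try to manage.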

#### ZhuSuan: A Library for Bayesian Deep Learning


In this paper we introduce ZhuSuan, a Python probabilistic programming library for Bayesian deep learning, which conjoins the complementary advantages of Bayesian methods and deep learning. ZhuSuan is built upon TensorFlow. Unlike existing deep learning libraries, which are mainly designed for dete…

#### Direction-Aware Semi-Dense SLAM

To aid simultaneous localization and mapping (SLAM), future perception systems will incorporate forms of scene understanding. In a step towards fully integrated probabilistic geometric scene understanding, localization, and mapping, we propose the first direction-aware semi-dense SLAM system. It joi…

#### A segmental framework for fully-unsupervised large-vocabulary speech recognition


Zero-resource speech technology is a growing research area that aims to develop methods for speech processing in the absence of transcriptions, lexicons, or language modelling text. Early term discovery systems focused on identifying isolated recurring patterns in a corpus, while more recent full-c…

#### Bridging the Gap between Probabilistic and Deterministic Models: A Simulation Study on a Variational Bayes Predictive Coding Recurrent Neural Network Model

The current paper proposes a novel variational Bayes predictive coding RNN model, which can learn to generate fluctuated temporal patterns from exemplars. The model learns to maximize the lower bound of the weighted sum of the regularization and reconstruction error terms. We examined how this weig…
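A generic sketch of such a weighted variational objective, using the standard closed-form KL divergence for a diagonal Gaussian against a unit-Gaussian prior (the paper's exact formulation may differ):

```python
import math

def gaussian_kl(mu, logvar):
    """KL( N(mu, diag(exp(logvar))) || N(0, I) ) for a diagonal Gaussian."""
    return 0.5 * sum(m * m + math.exp(lv) - lv - 1.0
                     for m, lv in zip(mu, logvar))

def weighted_free_energy(recon_err, mu, logvar, w):
    """Weighted sum of the reconstruction error and the KL regulariser;
    the meta-parameter w trades one term off against the other."""
    return recon_err + w * gaussian_kl(mu, logvar)
```

At `w = 0` the objective reduces to pure reconstruction (a deterministic autoencoder limit), while large `w` pushes the posterior towards the prior; varying `w` is exactly the knob the simulation study turns.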

#### Learning Unknown Markov Decision Processes: A Thompson Sampling Approach

We consider the problem of learning an unknown Markov Decision Process (MDP) that is weakly communicating in the infinite horizon setting. We propose a Thompson Sampling-based reinforcement learning algorithm with dynamic episodes (TSDE). At the beginning of each episode, the algorithm generates a …
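The per-episode Thompson-sampling step can be sketched for a tabular MDP: draw a transition model from the Dirichlet posterior over each (state, action) pair's next-state distribution, then act greedily under the sampled model for the episode. The snippet below shows only the sampling step, assuming a uniform Dirichlet(1, …, 1) prior and hypothetical visit-count bookkeeping:

```python
import random

def sample_transition_model(counts, rng):
    """Draw transition probabilities for each (state, action) pair from
    its Dirichlet posterior, via normalised Gamma draws."""
    sampled = {}
    for sa, visit_counts in counts.items():
        # Dirichlet(1 + n_1, ..., 1 + n_K) sample: normalise Gamma draws
        draws = [rng.gammavariate(1.0 + n, 1.0) for n in visit_counts]
        total = sum(draws)
        sampled[sa] = [d / total for d in draws]
    return sampled

rng = random.Random(0)
counts = {("s0", "a0"): [5, 1], ("s0", "a1"): [0, 3]}
model = sample_transition_model(counts, rng)
```

Resampling only at episode boundaries (with episode lengths controlled by a dynamic stopping rule, as in TSDE) keeps the policy stable within an episode while the posterior concentrates across episodes.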

#### MOLTE: a Modular Optimal Learning Testing Environment

We address the relative paucity of empirical testing of learning algorithms (of any type) by introducing a new public-domain, Modular, Optimal Learning Testing Environment (MOLTE) for Bayesian ranking and selection problems, stochastic bandits, or sequential experimental design problems. The Matlab-b…

#### A Comparison of Public Causal Search Packages on Linear, Gaussian Data with No Latent Variables

We compare Tetrad (Java) algorithms to the other public software packages BNT (Bayes Net Toolbox, Matlab), pcalg (R), bnlearn (R) on the “vanilla” task of recovering DAG structure to the extent possible from data generated recursively from linear, Gaussian structural equation models (SEMs) with…

#### Upper Bound of Bayesian Generalization Error in Stochastic Matrix Factorization

Stochastic matrix factorization (SMF) has been proposed; it can be understood as a restriction of non-negative matrix factorization (NMF). SMF is useful for inference in topic models, NMF for binary data matrices, and Bayesian networks. However, it needs some strong assumptions to reach a unique factoriz…

#### Deep Mean-Shift Priors for Image Restoration

In this paper we introduce a natural image prior that directly represents a Gaussian-smoothed version of the natural image distribution. We include our prior in a formulation of image restoration as a Bayes estimator that also allows us to solve noise-blind image restoration problems. We show that …

#### Comparative Benchmarking of Causal Discovery Techniques

In this paper we present a comprehensive view of prominent causal discovery algorithms, grouped into two main categories: (1) those assuming acyclicity and no latent variables, and (2) those allowing both cycles and latent variables, along with experimental results comparing them from three perspectives: (a) s…

#### Uncertainty measurement with belief entropy on interference effect in Quantum-Like Bayesian Networks

Social dilemmas have been regarded as the essence of evolutionary game theory, in which the prisoner’s dilemma game is the most famous metaphor for the problem of cooperation. Recent findings revealed that people’s behavior violated the Sure Thing Principle in such games. Classic probability me…

#### Causality-Aided Falsification

Falsification is drawing attention in the quality assurance of heterogeneous systems whose complexities are beyond the scalability of most verification techniques. In this paper we introduce the idea of causality-aided falsification: by providing a falsification solver — that relies on stochasti…

#### Distributed Bayesian Learning with Stochastic Natural-gradient Expectation Propagation and the Posterior Server

This paper makes two contributions to Bayesian machine learning algorithms. Firstly, we propose stochastic natural gradient expectation propagation (SNEP), a novel alternative to expectation propagation (EP), a popular variational inference algorithm. SNEP is a black box variational algorithm, in t…

#### Bayesian Optimisation for Safe Navigation under Localisation Uncertainty

In outdoor environments, mobile robots are required to navigate through terrain with varying characteristics, some of which might significantly affect the integrity of the platform. Ideally, the robot should be able to identify areas that are safe for navigation based on its own percepts about the …

#### Active Exploration for Learning Symbolic Representations

We introduce an online active exploration algorithm for data-efficiently learning an abstract symbolic model of an environment. Our algorithm is divided into two parts: the first part quickly generates an intermediate Bayesian symbolic model from the data that the agent has collected so far, which …

#### An embedded segmental K-means model for unsupervised segmentation and clustering of speech

Unsupervised segmentation and clustering of unlabelled speech are core problems in zero-resource speech processing. Most approaches lie at methodological extremes: some use probabilistic Bayesian models with convergence guarantees, while others opt for more efficient heuristic techniques. Despite c…

#### Deep Learning: A Bayesian Perspective

Deep learning is a form of machine learning for nonlinear high-dimensional pattern matching and prediction. By taking a Bayesian probabilistic perspective, we gain a number of advantages, including more efficient algorithms for optimisation and hyper-parameter tuning, and an explanation of predictive…

#### Reward-based stochastic self-configuration of neural circuits

Synaptic connections between neurons in the brain are dynamic because of continuously ongoing spine dynamics, axonal sprouting, and other processes. In fact, it was recently shown that the spontaneous synapse-autonomous component of spine dynamics is at least as large as the component that depends …

#### Robustly representing inferential uncertainty in deep neural networks through sampling


As deep neural networks (DNNs) are applied to increasingly challenging problems, they will need to be able to represent their own uncertainty. Modelling uncertainty is one of the key features of Bayesian methods. Using Bernoulli dropout with sampling at prediction time has recently been proposed as…
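The core of the Bernoulli-dropout approach is simple: keep dropout active at prediction time and summarise the resulting sample of outputs. A framework-free sketch with a hypothetical two-unit `toy_forward` pass (not the paper's networks):

```python
import random

def mc_dropout_predict(forward, x, n_samples=1000, seed=0):
    """Monte Carlo dropout sketch: run a stochastic forward pass many
    times and report the predictive mean and variance."""
    rng = random.Random(seed)
    preds = [forward(x, rng) for _ in range(n_samples)]
    mean = sum(preds) / n_samples
    var = sum((y - mean) ** 2 for y in preds) / n_samples
    return mean, var

def toy_forward(x, rng, p=0.5):
    """Hypothetical one-layer net: two hidden units, each dropped with
    probability p and rescaled by 1/(1-p) (inverted dropout)."""
    units = [1.0 * x, 2.0 * x]
    return sum(u / (1.0 - p) for u in units if rng.random() > p)

mean, var = mc_dropout_predict(toy_forward, 1.0)
```

The sample mean approximates the deterministic prediction, while the sample variance serves as the uncertainty estimate; inputs far from the training distribution tend to produce higher variance, which is the behaviour such evaluations probe on MNIST- and CIFAR-style benchmarks.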

#### Disintegration and Bayesian Inversion, Both Abstractly and Concretely

The notions of disintegration and Bayesian inversion are fundamental in conditional probability theory. They produce channels, as conditional probabilities, from a joint state, or from an already given channel (in opposite direction). These notions exist in the literature, in concrete situations, b…