#### Finite-dimensional Gaussian approximation with linear inequality constraints

Tags: Gaussian Process, Genetic Programming

Introducing inequality constraints in Gaussian process (GP) models can lead to more realistic uncertainties in learning a great variety of real-world problems. We consider the finite-dimensional Gaussian approach from Maatouk and Bay (2017) which can satisfy inequality conditions everywhere (either…
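The finite-dimensional idea can be sketched in a few lines. The toy below is our own construction, not Maatouk and Bay's algorithm: a GP is approximated by a hat-basis expansion f(x) = Σⱼ cⱼ φⱼ(x), and because hat functions are non-negative, f ≥ 0 everywhere whenever every coefficient cⱼ ≥ 0, so the functional constraint reduces to a finite set of coefficient constraints, enforced here by simple rejection sampling:

```python
import numpy as np

# Illustrative sketch (not the paper's exact algorithm): draw coefficient
# vectors from the GP prior at the knots and keep only draws whose
# coefficients are all non-negative, which guarantees f(x) >= 0 everywhere
# under a hat-function basis.
rng = np.random.default_rng(0)
knots = np.linspace(0.0, 1.0, 8)                  # hat-basis knot locations
K = np.exp(-0.5 * (knots[:, None] - knots[None, :]) ** 2 / 0.5 ** 2)
L = np.linalg.cholesky(K + 1e-8 * np.eye(len(knots)))

samples = []
while len(samples) < 5:
    c = L @ rng.standard_normal(len(knots))       # coefficients ~ N(0, K)
    if np.all(c >= 0.0):                          # keep constraint-satisfying draws
        samples.append(c)
samples = np.array(samples)
```

Rejection sampling is only practical in low dimensions; the point of the sketch is that the infinite-dimensional constraint becomes finitely many linear inequalities on c.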

#### Scalable Gaussian Processes with Billions of Inducing Inputs via Tensor Train Decomposition

Tags: CIFAR, DNN, Gaussian Process, Genetic Programming, MNIST

We propose a method (TT-GP) for approximate inference in Gaussian Process (GP) models. We build on previous scalable GP research including stochastic variational inference based on inducing inputs, kernel interpolation, and structure exploiting algebra. The key idea of our method is to use Tensor T…
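A small sketch of the kind of structure such methods exploit (sizes and kernels here are ours): with a product kernel and inducing inputs on a Cartesian grid, the inducing covariance matrix is a Kronecker product of small per-dimension matrices, so matrix-vector products never need the full matrix:

```python
import numpy as np

# Kronecker-structured covariance over a 10 x 12 grid of inducing inputs.
def rbf(a, b, ls=0.5):
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ls ** 2)

g1 = np.linspace(0, 1, 10)          # grid points along dimension 1
g2 = np.linspace(0, 1, 12)          # grid points along dimension 2
K1, K2 = rbf(g1, g1), rbf(g2, g2)

v = np.random.default_rng(1).standard_normal(10 * 12)

# (K1 kron K2) @ v computed without forming the 120 x 120 matrix:
out_small = (K1 @ v.reshape(10, 12) @ K2.T).ravel()
out_full = np.kron(K1, K2) @ v      # dense reference, for comparison only
```

The structured matvec costs O(m(m1 + m2)) instead of O(m²) for m = m1·m2 grid points; TT-GP pushes further with tensor-train representations of the variational parameters.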

#### Bayesian Alignments of Warped Multi-Output Gaussian Processes

We present a Bayesian extension to convolution processes which defines a representation between multiple functions by an embedding in a shared latent space. The proposed model allows for both arbitrary alignments of the inputs and also non-parametric output warpings to transform the observation…

#### A Unifying Framework for Gaussian Process Pseudo-Point Approximations using Power Expectation Propagation

Tags: Gaussian Process, Genetic Programming

Gaussian processes (GPs) are flexible distributions over functions that enable high-level assumptions about unknown functions to be encoded in a parsimonious, flexible and general way. Although elegant, the application of GPs is limited by computational and analytical intractabilities that arise wh…

#### Learning Scalable Deep Kernels with Recurrent Structure

Tags: Bayes, Gaussian Process, gradient, LSTM, RNN, speech

Many applications in speech, robotics, finance, and biology deal with sequential data, where ordering matters and recurrent structures are common. However, this structure cannot be easily captured by standard kernel functions. To model such structure, we propose expressive closed-form kernel functi…
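A stripped-down stand-in for such a kernel (a fixed random tanh RNN here, where the paper uses learned LSTMs): a recurrent network maps a variable-length sequence to a fixed embedding, and a standard RBF kernel is applied to the embeddings:

```python
import numpy as np

# Illustrative recurrent kernel: weights are random, not learned.
rng = np.random.default_rng(2)
W_in = rng.normal(size=4) * 0.5     # input-to-hidden weights
W_h = rng.normal(size=(4, 4)) * 0.3 # hidden-to-hidden weights

def embed(seq):
    h = np.zeros(4)
    for x in seq:                    # simple tanh RNN over the sequence
        h = np.tanh(W_in * x + W_h @ h)
    return h

def k_rec(seq_a, seq_b, ls=1.0):
    d = embed(seq_a) - embed(seq_b)  # RBF kernel on the final hidden states
    return float(np.exp(-0.5 * d @ d / ls ** 2))

k_same = k_rec([0.1, 0.2, 0.3], [0.1, 0.2, 0.3])
k_diff = k_rec([0.1, 0.2, 0.3], [1.0, -1.0, 2.0])
```

Because the embedding consumes the whole sequence in order, the resulting kernel is sensitive to ordering in a way no fixed-length feature kernel is.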

#### Remote Sensing Image Classification with Large Scale Gaussian Processes

Tags: Bayes, Gaussian Process, Genetic Programming, image, Support Vector Machine

Current remote sensing image classification problems have to deal with an unprecedented amount of heterogeneous and complex data sources. Upcoming missions will soon provide large data streams that will make land cover/use classification difficult. Machine learning classifiers can help here, and…

#### Adaptive Generation-Based Evolution Control for Gaussian Process Surrogate Models

The interest in accelerating black-box optimizers has resulted in several surrogate model-assisted versions of the Covariance Matrix Adaptation Evolution Strategy, a state-of-the-art continuous black-box optimizer. The version called Surrogate CMA-ES uses Gaussian processes or random forests surroga…

#### Ensemble Multi-task Gaussian Process Regression with Multiple Latent Processes

Tags: Gaussian Process, Genetic Programming

Multi-task/Multi-output learning seeks to exploit correlation among tasks to enhance performance over learning or solving each task independently. In this paper, we investigate this problem in the context of Gaussian Processes (GPs) and propose a new model which learns a mixture of latent processes…

#### A probabilistic data-driven model for planar pushing

This paper presents a data-driven approach to model planar pushing interaction to predict both the most likely outcome of a push and its expected variability. The learned models rely on a variation of Gaussian processes with input-dependent noise called Variational Heteroscedastic Gaussian processe…

#### GP-SUM. Gaussian Processes Filtering of non-Gaussian Beliefs

Tags: Bayes, Gaussian Process, Genetic Programming

This work centers on the problem of stochastic filtering for systems that yield complex beliefs. The main contribution is GP-SUM, a filtering algorithm for dynamic systems expressed as Gaussian Processes (GP), that does not rely on linearizations or Gaussian approximations of the belief. The algori…
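A minimal sketch of the belief representation involved (our simplification, not the GP-SUM algorithm): the belief is a weighted sum of Gaussians, and each component is pushed through the nonlinear dynamics by sampling, so the propagated belief can remain non-Gaussian, for example bimodal:

```python
import numpy as np

# Weighted Gaussian-mixture belief, each component propagated separately
# through a nonlinear transition by moment-matching Monte Carlo samples.
rng = np.random.default_rng(3)
weights = np.array([0.5, 0.5])
means, stds = np.array([-1.0, 1.0]), np.array([0.1, 0.1])

f = lambda x: x + 0.5 * np.sin(3.0 * x)   # toy nonlinear dynamics

new_means, new_stds = [], []
for m, s in zip(means, stds):
    xs = f(rng.normal(m, s, size=2000))   # push one component through f
    new_means.append(xs.mean())
    new_stds.append(xs.std())
new_means, new_stds = np.array(new_means), np.array(new_stds)
```

A single-Gaussian filter would collapse the two modes into one uninformative blob; keeping the mixture preserves both.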

#### Bayesian Optimization for Parameter Tuning of the XOR Neural Network

When applying Machine Learning techniques to problems, one must select model parameters so that the system converges without becoming stuck in a local minimum of the objective function. Tuning these parameters becomes a non-trivial task for large models, and it is not always appare…
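The standard Bayesian-optimization loop underlying this kind of tuning can be sketched as follows (objective, kernel, and all settings here are ours, not the paper's): fit a GP surrogate to evaluated points, then pick the next query by maximizing expected improvement:

```python
import numpy as np
from math import erf

def rbf(a, b, ls=0.2):
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ls ** 2)

def norm_cdf(z):
    return 0.5 * (1.0 + np.vectorize(erf)(z / np.sqrt(2.0)))

objective = lambda x: (x - 0.3) ** 2      # toy function to minimize

X = np.array([0.0, 0.5, 1.0])             # initial design
y = objective(X)
grid = np.linspace(0, 1, 201)             # candidate query points

for _ in range(5):
    K = rbf(X, X) + 1e-6 * np.eye(len(X))
    Ks = rbf(grid, X)
    mu = Ks @ np.linalg.solve(K, y)       # GP posterior mean on the grid
    var = 1.0 - np.sum(Ks * np.linalg.solve(K, Ks.T).T, axis=1)
    sd = np.sqrt(np.clip(var, 1e-12, None))
    z = (y.min() - mu) / sd               # expected improvement (minimization)
    ei = sd * (z * norm_cdf(z) + np.exp(-0.5 * z ** 2) / np.sqrt(2 * np.pi))
    x_next = grid[np.argmax(ei)]
    X = np.append(X, x_next)
    y = np.append(y, objective(x_next))

best = X[np.argmin(y)]
```

Each iteration trades off low posterior mean (exploitation) against high posterior variance (exploration) through the single EI acquisition value.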

#### Perturbative Black Box Variational Inference

Black box variational inference (BBVI) with reparameterization gradients triggered the exploration of divergence measures other than the Kullback-Leibler (KL) divergence, such as alpha divergences. These divergences can be tuned to be more mass-covering (preventing overfitting in complex models), b…

#### 3D Deformable Object Manipulation using Fast Online Gaussian Process Regression

In this paper, we present a general approach to automatic visual servo control of the position and shape of a deformable object whose deformation parameters are unknown. The servo control is achieved by online learning of a model mapping between the robotic end-effector’s movement and the objec…

#### On the Design of LQR Kernels for Efficient Controller Learning

Tags: Bayes, Gaussian Process, Genetic Programming

Finding optimal feedback controllers for nonlinear dynamic systems from data is hard. Recently, Bayesian optimization (BO) has been proposed as a powerful framework for direct controller tuning from experimental trials. For selecting the next query point and finding the global optimum, BO relies on…

#### Analogical-based Bayesian Optimization

Tags: Bayes, Gaussian Process, Genetic Programming

Some real-world problems reduce to solving the optimization problem $\max_{x \in \mathcal{X}} f(x)$, where $f(\cdot)$ is a black-box function and $\mathcal{X}$ may be a set of non-vectorial objects (e.g., distributions) on which we can only define a symmetric and non-negative similarity score. This se…
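One concrete way such a similarity score can be turned into a kernel (our construction for illustration, not necessarily the paper's): take the objects to be discrete distributions and use the Bhattacharyya coefficient as the symmetric, non-negative similarity:

```python
import numpy as np

# Kernel over distributions built from a similarity score rather than
# a vector representation.
def similarity(p, q):
    return float(np.sum(np.sqrt(p * q)))   # Bhattacharyya coefficient, in [0, 1]

def kernel(p, q, gamma=1.0):
    # Similarity 1 (identical distributions) gives kernel value 1.
    return float(np.exp(-gamma * (1.0 - similarity(p, q))))

p = np.array([0.5, 0.3, 0.2])
q = np.array([0.2, 0.3, 0.5])

k_pp = kernel(p, p)
k_pq = kernel(p, q)
```

With such a kernel in hand, the usual GP surrogate and acquisition machinery of Bayesian optimization applies unchanged, even though the inputs are not vectors.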

#### Latent Gaussian Process Regression

We introduce Latent Gaussian Process Regression which is a latent variable extension allowing modelling of non-stationary multi-modal processes using GPs. The approach is built on extending the input space of a regression problem with a latent variable that is used to modulate the covariance functi…
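The core mechanism, as we read it, can be illustrated in a few lines (details here are ours): augment each input x with a latent coordinate z and apply a standard RBF kernel on the joint space, so that points assigned to different latent modes decorrelate even at identical x:

```python
import numpy as np

# RBF kernel on the augmented input (x, z); z modulates the covariance.
def k_aug(x1, z1, x2, z2, ls_x=0.5, ls_z=0.5):
    return float(np.exp(-0.5 * ((x1 - x2) ** 2 / ls_x ** 2
                                + (z1 - z2) ** 2 / ls_z ** 2)))

# Same x, same latent mode -> full correlation; different modes -> much less.
k_same_mode = k_aug(0.4, 0.0, 0.4, 0.0)
k_other_mode = k_aug(0.4, 0.0, 0.4, 2.0)
```

This is how a stationary base kernel can yield multi-modal, non-stationary behaviour once the latent coordinate is inferred per data point.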

#### Learning from lions: inferring the utility of agents from their trajectories

We build a model using Gaussian processes to infer a spatio-temporal vector field from observed agent trajectories. Significant landmarks or influence points in agent surroundings are jointly derived through vector calculus operations that indicate presence of sources and sinks. We evaluate these i…
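The vector-calculus step mentioned here is simple to demonstrate (field and grid below are synthetic, not from the paper): the divergence of an inferred 2-D vector field flags sources (positive) and sinks (negative). The field V = (x, y) is a pure source at the origin with divergence 2 everywhere:

```python
import numpy as np

# Divergence of a 2-D vector field via central finite differences.
xs = np.linspace(-1, 1, 21)
X, Y = np.meshgrid(xs, xs, indexing="ij")
Vx, Vy = X, Y                        # radial outflow: a source at the origin

# div V = dVx/dx + dVy/dy
div = np.gradient(Vx, xs, axis=0) + np.gradient(Vy, xs, axis=1)
```

In the paper's setting the field comes from a GP posterior rather than a closed form, but the same operator identifies the influence points.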

#### Bayesian Optimisation for Safe Navigation under Localisation Uncertainty

In outdoor environments, mobile robots are required to navigate through terrain with varying characteristics, some of which might significantly affect the integrity of the platform. Ideally, the robot should be able to identify areas that are safe for navigation based on its own percepts about the …

#### Convolutional Gaussian Processes

We present a practical way of introducing convolutional structure into Gaussian processes, making them more suited to high-dimensional inputs like images. The main contribution of our work is the construction of an inter-domain inducing point approximation that is well-tailored to the convolutional…
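A patch-response kernel in the spirit of this construction can be sketched as follows (patch size, base kernel, and averaging are our choices): two images are compared by averaging an RBF base kernel over all pairs of their patches, building translation-related structure into the prior:

```python
import numpy as np

# Convolutional-style kernel: average a base RBF kernel over patch pairs.
def patches(img, p=2):
    h, w = img.shape
    return np.array([img[i:i + p, j:j + p].ravel()
                     for i in range(h - p + 1) for j in range(w - p + 1)])

def k_conv(img_a, img_b, ls=1.0):
    Pa, Pb = patches(img_a), patches(img_b)
    d2 = ((Pa[:, None, :] - Pb[None, :, :]) ** 2).sum(-1)
    return float(np.exp(-0.5 * d2 / ls ** 2).mean())

rng = np.random.default_rng(4)
a = rng.normal(size=(4, 4))
k_aa = k_conv(a, a)
k_ab = k_conv(a, rng.normal(size=(4, 4)))
```

The inter-domain inducing points of the paper live in patch space, which keeps the number of inducing variables small even for large images; the sketch above only shows the kernel itself.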

#### Spectral Mixture Kernels for Multi-Output Gaussian Processes

Initially, multi-output Gaussian process models (MOGPs) were constructed as linear combinations of independent, latent, single-output Gaussian processes (GPs). This resulted in cross-covariance functions with limited parametric interpretation, thus conflicting with single-output GPs and their …
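For reference, the single-output spectral mixture kernel that this line of work generalises to multiple outputs has the form k(τ) = Σ_q w_q exp(−2π²τ²v_q) cos(2πμ_q τ); the parameters below are ours, chosen only for illustration:

```python
import numpy as np

# Evaluate a spectral mixture kernel: each component is a Gaussian bump in
# the spectral domain with weight w_q, frequency mean mu_q, and variance v_q.
def sm_kernel(tau, weights, means, variances):
    tau = np.asarray(tau, dtype=float)
    k = np.zeros_like(tau)
    for w, mu, v in zip(weights, means, variances):
        k += w * np.exp(-2.0 * np.pi ** 2 * tau ** 2 * v) \
               * np.cos(2.0 * np.pi * mu * tau)
    return k

taus = np.linspace(0.0, 2.0, 5)
k_vals = sm_kernel(taus, weights=[0.7, 0.3], means=[0.5, 1.5],
                   variances=[0.1, 0.2])
```

At τ = 0 the kernel equals the sum of the weights, and each component contributes a quasi-periodic pattern; the multi-output extension adds cross-spectral terms with interpretable phase and delay parameters between outputs.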