GP Regression with Uncertain Inputs
Introduction
In this notebook, we're going to demonstrate one way of dealing with uncertainty in our training data. Let's say that we're collecting training data that models the following function:

\[y = \sin(2\pi x) + \epsilon, \qquad \epsilon \sim \mathcal{N}(0, 0.04)\]

However, now assume that we're a bit uncertain about our features. In particular, we're going to assume that every \(x_i\) value is not a point but a distribution instead, e.g.

\[x_i \sim \mathcal{N}(\mu_i, \sigma_i^2).\]
Using a distributional kernel to deal with uncertain inputs
Rather than using a variational method (see the GP Regression with Uncertain Inputs tutorial in the variational examples), if we explicitly know the type of uncertainty in our inputs, we can pass that information directly into our kernel.
More specifically, assuming Gaussian inputs, we will compute the symmetrized KL divergence between each pair of input distributions and use it as the distance measure in the kernel.
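For reference, the symmetrized KL divergence between two univariate Gaussians \(p = \mathcal{N}(\mu_1, \sigma_1^2)\) and \(q = \mathcal{N}(\mu_2, \sigma_2^2)\) is

\[\mathrm{KL}_{\mathrm{sym}}(p, q) = \mathrm{KL}(p \,\|\, q) + \mathrm{KL}(q \,\|\, p) = \frac{\sigma_1^2 + (\mu_1 - \mu_2)^2}{2\sigma_2^2} + \frac{\sigma_2^2 + (\mu_1 - \mu_2)^2}{2\sigma_1^2} - 1,\]

which is zero exactly when the two distributions coincide. How GaussianSymmetrizedKLKernel turns this distance into a covariance value (e.g. any lengthscale scaling or exponentiation) is handled internally by GPyTorch; see the kernel's documentation for the precise form.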
[2]:
import math
import torch
import tqdm
import gpytorch
from matplotlib import pyplot as plt
%matplotlib inline
%load_ext autoreload
%autoreload 2
[3]:
# Training data is 20 points in [0,1] inclusive regularly spaced
train_x_mean = torch.linspace(0, 1, 20)
# We'll assume the variance shrinks the closer we get to 1
train_x_stdv = torch.linspace(0.03, 0.01, 20)
# True function is sin(2*pi*x) with Gaussian noise
train_y = torch.sin(train_x_mean * (2 * math.pi)) + torch.randn(train_x_mean.size()) * 0.2
To pass the distributional training data into the kernel, we stack the means and log variances into a single tensor.
[4]:
train_x_distributional = torch.stack((train_x_mean, (train_x_stdv**2).log()), dim=1)
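As a quick sanity check (an illustration added here, not part of the original cells), the stacked tensor has one row per training point, with the input mean in the first column and the log of the input variance in the second:

# Column 0: input mean, column 1: log of the input variance
print(train_x_distributional.shape)   # torch.Size([20, 2])
print(train_x_distributional[0])      # approximately tensor([ 0.0000, -7.0131]), since log(0.03**2) ≈ -7.01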
[5]:
f, ax = plt.subplots(1, 1, figsize=(8, 3))
ax.errorbar(train_x_mean, train_y, xerr=(train_x_stdv * 2), fmt="k*", label="Train Data")
ax.legend()
[5]:
<matplotlib.legend.Legend at 0x7f0646c7ceb0>

We train the hyperparameters of the resulting distributional GP via type-II maximum likelihood (gradient descent on the marginal log likelihood), as is standard in many settings. We could also do fully Bayesian inference over the hyperparameters.
[6]:
from gpytorch.models import ExactGP
from gpytorch.kernels import GaussianSymmetrizedKLKernel, ScaleKernel
from gpytorch.means import ConstantMean
class ExactGPModel(ExactGP):
    def __init__(self, train_x, train_y, likelihood):
        super(ExactGPModel, self).__init__(train_x, train_y, likelihood)
        self.mean_module = ConstantMean()
        self.covar_module = ScaleKernel(GaussianSymmetrizedKLKernel())

    def forward(self, x):
        mean_x = self.mean_module(x)
        covar_x = self.covar_module(x)
        return gpytorch.distributions.MultivariateNormal(mean_x, covar_x)
# initialize likelihood and model
likelihood = gpytorch.likelihoods.GaussianLikelihood()
model = ExactGPModel(train_x_distributional, train_y, likelihood)
[7]:
# this is for running the notebook in our testing framework
import os
smoke_test = ('CI' in os.environ)
training_iter = 2 if smoke_test else 500
# Find optimal model hyperparameters
model.train()
likelihood.train()
# Use the adam optimizer
optimizer = torch.optim.Adam(model.parameters(), lr=0.25) # Includes GaussianLikelihood parameters
# "Loss" for GPs - the marginal log likelihood
mll = gpytorch.mlls.ExactMarginalLogLikelihood(likelihood, model)
for i in range(training_iter):
    # Zero gradients from previous iteration
    optimizer.zero_grad()
    # Output from model
    output = model(train_x_distributional)
    # Calc loss and backprop gradients
    loss = -mll(output, train_y)
    loss.backward()
    print('Iter %d/%d - Loss: %.3f lengthscale: %.3f noise: %.3f' % (
        i + 1, training_iter, loss.item(),
        model.covar_module.base_kernel.lengthscale.item(),
        model.likelihood.noise.item()
    ))
    optimizer.step()
Iter 1/500 - Loss: 1.290 lengthscale: 0.693 noise: 0.693
Iter 2/500 - Loss: 1.253 lengthscale: 0.826 noise: 0.576
Iter 3/500 - Loss: 1.194 lengthscale: 0.953 noise: 0.475
Iter 4/500 - Loss: 1.165 lengthscale: 1.098 noise: 0.389
Iter 5/500 - Loss: 1.148 lengthscale: 1.260 noise: 0.318
Iter 6/500 - Loss: 1.133 lengthscale: 1.431 noise: 0.264
Iter 7/500 - Loss: 1.141 lengthscale: 1.605 noise: 0.225
Iter 8/500 - Loss: 1.154 lengthscale: 1.783 noise: 0.203
Iter 9/500 - Loss: 1.147 lengthscale: 1.970 noise: 0.193
Iter 10/500 - Loss: 1.134 lengthscale: 2.172 noise: 0.193
Iter 11/500 - Loss: 1.122 lengthscale: 2.386 noise: 0.198
Iter 12/500 - Loss: 1.105 lengthscale: 2.611 noise: 0.207
Iter 13/500 - Loss: 1.089 lengthscale: 2.843 noise: 0.217
Iter 14/500 - Loss: 1.079 lengthscale: 3.083 noise: 0.225
Iter 15/500 - Loss: 1.074 lengthscale: 3.329 noise: 0.229
Iter 16/500 - Loss: 1.069 lengthscale: 3.579 noise: 0.227
Iter 17/500 - Loss: 1.061 lengthscale: 3.832 noise: 0.220
Iter 18/500 - Loss: 1.052 lengthscale: 4.087 noise: 0.208
Iter 19/500 - Loss: 1.041 lengthscale: 4.343 noise: 0.193
Iter 20/500 - Loss: 1.029 lengthscale: 4.600 noise: 0.175
Iter 21/500 - Loss: 1.015 lengthscale: 4.858 noise: 0.157
Iter 22/500 - Loss: 1.000 lengthscale: 5.115 noise: 0.138
Iter 23/500 - Loss: 0.984 lengthscale: 5.374 noise: 0.121
Iter 24/500 - Loss: 0.970 lengthscale: 5.633 noise: 0.105
Iter 25/500 - Loss: 0.958 lengthscale: 5.894 noise: 0.090
Iter 26/500 - Loss: 0.947 lengthscale: 6.155 noise: 0.078
Iter 27/500 - Loss: 0.937 lengthscale: 6.417 noise: 0.067
Iter 28/500 - Loss: 0.928 lengthscale: 6.679 noise: 0.058
Iter 29/500 - Loss: 0.920 lengthscale: 6.940 noise: 0.051
Iter 30/500 - Loss: 0.913 lengthscale: 7.199 noise: 0.045
Iter 31/500 - Loss: 0.907 lengthscale: 7.455 noise: 0.040
Iter 32/500 - Loss: 0.902 lengthscale: 7.706 noise: 0.036
Iter 33/500 - Loss: 0.899 lengthscale: 7.952 noise: 0.033
Iter 34/500 - Loss: 0.897 lengthscale: 8.191 noise: 0.031
Iter 35/500 - Loss: 0.896 lengthscale: 8.421 noise: 0.029
Iter 36/500 - Loss: 0.894 lengthscale: 8.643 noise: 0.028
Iter 37/500 - Loss: 0.893 lengthscale: 8.855 noise: 0.028
Iter 38/500 - Loss: 0.891 lengthscale: 9.058 noise: 0.028
Iter 39/500 - Loss: 0.888 lengthscale: 9.252 noise: 0.028
Iter 40/500 - Loss: 0.884 lengthscale: 9.438 noise: 0.029
Iter 41/500 - Loss: 0.879 lengthscale: 9.618 noise: 0.031
Iter 42/500 - Loss: 0.873 lengthscale: 9.793 noise: 0.034
Iter 43/500 - Loss: 0.867 lengthscale: 9.965 noise: 0.037
Iter 44/500 - Loss: 0.862 lengthscale: 10.136 noise: 0.041
Iter 45/500 - Loss: 0.856 lengthscale: 10.306 noise: 0.046
Iter 46/500 - Loss: 0.851 lengthscale: 10.478 noise: 0.051
Iter 47/500 - Loss: 0.848 lengthscale: 10.651 noise: 0.057
Iter 48/500 - Loss: 0.845 lengthscale: 10.826 noise: 0.063
Iter 49/500 - Loss: 0.843 lengthscale: 11.003 noise: 0.069
Iter 50/500 - Loss: 0.841 lengthscale: 11.181 noise: 0.075
Iter 51/500 - Loss: 0.840 lengthscale: 11.361 noise: 0.079
Iter 52/500 - Loss: 0.838 lengthscale: 11.542 noise: 0.082
Iter 53/500 - Loss: 0.836 lengthscale: 11.723 noise: 0.083
Iter 54/500 - Loss: 0.833 lengthscale: 11.905 noise: 0.083
Iter 55/500 - Loss: 0.830 lengthscale: 12.086 noise: 0.081
Iter 56/500 - Loss: 0.826 lengthscale: 12.268 noise: 0.078
Iter 57/500 - Loss: 0.822 lengthscale: 12.449 noise: 0.074
Iter 58/500 - Loss: 0.819 lengthscale: 12.630 noise: 0.070
Iter 59/500 - Loss: 0.816 lengthscale: 12.811 noise: 0.066
Iter 60/500 - Loss: 0.813 lengthscale: 12.991 noise: 0.063
Iter 61/500 - Loss: 0.811 lengthscale: 13.170 noise: 0.060
Iter 62/500 - Loss: 0.809 lengthscale: 13.347 noise: 0.057
Iter 63/500 - Loss: 0.807 lengthscale: 13.524 noise: 0.055
Iter 64/500 - Loss: 0.805 lengthscale: 13.700 noise: 0.055
Iter 65/500 - Loss: 0.803 lengthscale: 13.874 noise: 0.055
Iter 66/500 - Loss: 0.801 lengthscale: 14.046 noise: 0.055
Iter 67/500 - Loss: 0.798 lengthscale: 14.217 noise: 0.057
Iter 68/500 - Loss: 0.796 lengthscale: 14.387 noise: 0.058
Iter 69/500 - Loss: 0.793 lengthscale: 14.556 noise: 0.061
Iter 70/500 - Loss: 0.791 lengthscale: 14.724 noise: 0.063
Iter 71/500 - Loss: 0.789 lengthscale: 14.890 noise: 0.065
Iter 72/500 - Loss: 0.788 lengthscale: 15.056 noise: 0.068
Iter 73/500 - Loss: 0.786 lengthscale: 15.221 noise: 0.069
Iter 74/500 - Loss: 0.784 lengthscale: 15.385 noise: 0.070
Iter 75/500 - Loss: 0.782 lengthscale: 15.548 noise: 0.071
Iter 76/500 - Loss: 0.781 lengthscale: 15.710 noise: 0.071
Iter 77/500 - Loss: 0.779 lengthscale: 15.871 noise: 0.070
Iter 78/500 - Loss: 0.777 lengthscale: 16.031 noise: 0.069
Iter 79/500 - Loss: 0.775 lengthscale: 16.190 noise: 0.067
Iter 80/500 - Loss: 0.773 lengthscale: 16.348 noise: 0.066
Iter 81/500 - Loss: 0.771 lengthscale: 16.505 noise: 0.064
Iter 82/500 - Loss: 0.770 lengthscale: 16.662 noise: 0.063
Iter 83/500 - Loss: 0.768 lengthscale: 16.817 noise: 0.062
Iter 84/500 - Loss: 0.767 lengthscale: 16.971 noise: 0.062
Iter 85/500 - Loss: 0.765 lengthscale: 17.124 noise: 0.062
Iter 86/500 - Loss: 0.764 lengthscale: 17.276 noise: 0.062
Iter 87/500 - Loss: 0.762 lengthscale: 17.427 noise: 0.063
Iter 88/500 - Loss: 0.761 lengthscale: 17.577 noise: 0.064
Iter 89/500 - Loss: 0.759 lengthscale: 17.726 noise: 0.065
Iter 90/500 - Loss: 0.758 lengthscale: 17.874 noise: 0.066
Iter 91/500 - Loss: 0.756 lengthscale: 18.021 noise: 0.066
Iter 92/500 - Loss: 0.755 lengthscale: 18.168 noise: 0.067
Iter 93/500 - Loss: 0.754 lengthscale: 18.313 noise: 0.067
Iter 94/500 - Loss: 0.752 lengthscale: 18.458 noise: 0.068
Iter 95/500 - Loss: 0.751 lengthscale: 18.602 noise: 0.067
Iter 96/500 - Loss: 0.750 lengthscale: 18.745 noise: 0.067
Iter 97/500 - Loss: 0.748 lengthscale: 18.888 noise: 0.066
Iter 98/500 - Loss: 0.747 lengthscale: 19.029 noise: 0.066
Iter 99/500 - Loss: 0.746 lengthscale: 19.170 noise: 0.065
Iter 100/500 - Loss: 0.745 lengthscale: 19.310 noise: 0.065
Iter 101/500 - Loss: 0.744 lengthscale: 19.449 noise: 0.064
Iter 102/500 - Loss: 0.742 lengthscale: 19.587 noise: 0.064
Iter 103/500 - Loss: 0.741 lengthscale: 19.725 noise: 0.064
Iter 104/500 - Loss: 0.740 lengthscale: 19.862 noise: 0.064
Iter 105/500 - Loss: 0.739 lengthscale: 19.998 noise: 0.065
Iter 106/500 - Loss: 0.738 lengthscale: 20.133 noise: 0.065
Iter 107/500 - Loss: 0.737 lengthscale: 20.268 noise: 0.066
Iter 108/500 - Loss: 0.736 lengthscale: 20.402 noise: 0.066
Iter 109/500 - Loss: 0.734 lengthscale: 20.535 noise: 0.066
Iter 110/500 - Loss: 0.733 lengthscale: 20.668 noise: 0.066
Iter 111/500 - Loss: 0.732 lengthscale: 20.800 noise: 0.067
Iter 112/500 - Loss: 0.731 lengthscale: 20.931 noise: 0.066
Iter 113/500 - Loss: 0.730 lengthscale: 21.062 noise: 0.066
Iter 114/500 - Loss: 0.729 lengthscale: 21.192 noise: 0.066
Iter 115/500 - Loss: 0.728 lengthscale: 21.321 noise: 0.066
Iter 116/500 - Loss: 0.727 lengthscale: 21.450 noise: 0.065
Iter 117/500 - Loss: 0.726 lengthscale: 21.578 noise: 0.065
Iter 118/500 - Loss: 0.725 lengthscale: 21.706 noise: 0.065
Iter 119/500 - Loss: 0.724 lengthscale: 21.833 noise: 0.065
Iter 120/500 - Loss: 0.723 lengthscale: 21.959 noise: 0.065
Iter 121/500 - Loss: 0.722 lengthscale: 22.085 noise: 0.065
Iter 122/500 - Loss: 0.722 lengthscale: 22.210 noise: 0.066
Iter 123/500 - Loss: 0.721 lengthscale: 22.335 noise: 0.066
Iter 124/500 - Loss: 0.720 lengthscale: 22.459 noise: 0.066
Iter 125/500 - Loss: 0.719 lengthscale: 22.582 noise: 0.066
Iter 126/500 - Loss: 0.718 lengthscale: 22.705 noise: 0.066
Iter 127/500 - Loss: 0.717 lengthscale: 22.828 noise: 0.066
Iter 128/500 - Loss: 0.716 lengthscale: 22.950 noise: 0.066
Iter 129/500 - Loss: 0.715 lengthscale: 23.071 noise: 0.066
Iter 130/500 - Loss: 0.715 lengthscale: 23.192 noise: 0.066
Iter 131/500 - Loss: 0.714 lengthscale: 23.312 noise: 0.066
Iter 132/500 - Loss: 0.713 lengthscale: 23.432 noise: 0.066
Iter 133/500 - Loss: 0.712 lengthscale: 23.552 noise: 0.066
Iter 134/500 - Loss: 0.711 lengthscale: 23.671 noise: 0.066
Iter 135/500 - Loss: 0.710 lengthscale: 23.789 noise: 0.066
Iter 136/500 - Loss: 0.710 lengthscale: 23.907 noise: 0.066
Iter 137/500 - Loss: 0.709 lengthscale: 24.024 noise: 0.066
Iter 138/500 - Loss: 0.708 lengthscale: 24.141 noise: 0.066
Iter 139/500 - Loss: 0.707 lengthscale: 24.258 noise: 0.066
Iter 140/500 - Loss: 0.707 lengthscale: 24.374 noise: 0.066
Iter 141/500 - Loss: 0.706 lengthscale: 24.490 noise: 0.066
Iter 142/500 - Loss: 0.705 lengthscale: 24.605 noise: 0.066
Iter 143/500 - Loss: 0.704 lengthscale: 24.720 noise: 0.066
Iter 144/500 - Loss: 0.704 lengthscale: 24.834 noise: 0.066
Iter 145/500 - Loss: 0.703 lengthscale: 24.948 noise: 0.066
Iter 146/500 - Loss: 0.702 lengthscale: 25.062 noise: 0.066
Iter 147/500 - Loss: 0.702 lengthscale: 25.175 noise: 0.066
Iter 148/500 - Loss: 0.701 lengthscale: 25.288 noise: 0.066
Iter 149/500 - Loss: 0.700 lengthscale: 25.400 noise: 0.066
Iter 150/500 - Loss: 0.700 lengthscale: 25.512 noise: 0.066
Iter 151/500 - Loss: 0.699 lengthscale: 25.624 noise: 0.066
Iter 152/500 - Loss: 0.698 lengthscale: 25.735 noise: 0.066
Iter 153/500 - Loss: 0.698 lengthscale: 25.846 noise: 0.066
Iter 154/500 - Loss: 0.697 lengthscale: 25.956 noise: 0.066
Iter 155/500 - Loss: 0.696 lengthscale: 26.067 noise: 0.066
Iter 156/500 - Loss: 0.696 lengthscale: 26.176 noise: 0.066
Iter 157/500 - Loss: 0.695 lengthscale: 26.286 noise: 0.066
Iter 158/500 - Loss: 0.694 lengthscale: 26.395 noise: 0.066
Iter 159/500 - Loss: 0.694 lengthscale: 26.503 noise: 0.066
Iter 160/500 - Loss: 0.693 lengthscale: 26.612 noise: 0.066
Iter 161/500 - Loss: 0.692 lengthscale: 26.720 noise: 0.066
Iter 162/500 - Loss: 0.692 lengthscale: 26.827 noise: 0.066
Iter 163/500 - Loss: 0.691 lengthscale: 26.935 noise: 0.066
Iter 164/500 - Loss: 0.691 lengthscale: 27.042 noise: 0.066
Iter 165/500 - Loss: 0.690 lengthscale: 27.149 noise: 0.066
Iter 166/500 - Loss: 0.689 lengthscale: 27.255 noise: 0.066
Iter 167/500 - Loss: 0.689 lengthscale: 27.361 noise: 0.066
Iter 168/500 - Loss: 0.688 lengthscale: 27.467 noise: 0.066
Iter 169/500 - Loss: 0.688 lengthscale: 27.572 noise: 0.066
Iter 170/500 - Loss: 0.687 lengthscale: 27.677 noise: 0.066
Iter 171/500 - Loss: 0.686 lengthscale: 27.782 noise: 0.066
Iter 172/500 - Loss: 0.686 lengthscale: 27.887 noise: 0.066
Iter 173/500 - Loss: 0.685 lengthscale: 27.991 noise: 0.066
Iter 174/500 - Loss: 0.685 lengthscale: 28.095 noise: 0.066
Iter 175/500 - Loss: 0.684 lengthscale: 28.199 noise: 0.066
Iter 176/500 - Loss: 0.684 lengthscale: 28.302 noise: 0.066
Iter 177/500 - Loss: 0.683 lengthscale: 28.405 noise: 0.066
Iter 178/500 - Loss: 0.683 lengthscale: 28.508 noise: 0.066
Iter 179/500 - Loss: 0.682 lengthscale: 28.611 noise: 0.066
Iter 180/500 - Loss: 0.682 lengthscale: 28.713 noise: 0.066
Iter 181/500 - Loss: 0.681 lengthscale: 28.816 noise: 0.066
Iter 182/500 - Loss: 0.680 lengthscale: 28.917 noise: 0.066
Iter 183/500 - Loss: 0.680 lengthscale: 29.019 noise: 0.066
Iter 184/500 - Loss: 0.679 lengthscale: 29.120 noise: 0.066
Iter 185/500 - Loss: 0.679 lengthscale: 29.221 noise: 0.066
Iter 186/500 - Loss: 0.678 lengthscale: 29.322 noise: 0.066
Iter 187/500 - Loss: 0.678 lengthscale: 29.423 noise: 0.067
Iter 188/500 - Loss: 0.677 lengthscale: 29.523 noise: 0.067
Iter 189/500 - Loss: 0.677 lengthscale: 29.624 noise: 0.067
Iter 190/500 - Loss: 0.676 lengthscale: 29.723 noise: 0.067
Iter 191/500 - Loss: 0.676 lengthscale: 29.823 noise: 0.067
Iter 192/500 - Loss: 0.675 lengthscale: 29.923 noise: 0.067
Iter 193/500 - Loss: 0.675 lengthscale: 30.022 noise: 0.067
Iter 194/500 - Loss: 0.674 lengthscale: 30.121 noise: 0.067
Iter 195/500 - Loss: 0.674 lengthscale: 30.220 noise: 0.067
Iter 196/500 - Loss: 0.674 lengthscale: 30.318 noise: 0.067
Iter 197/500 - Loss: 0.673 lengthscale: 30.417 noise: 0.067
Iter 198/500 - Loss: 0.673 lengthscale: 30.515 noise: 0.067
Iter 199/500 - Loss: 0.672 lengthscale: 30.613 noise: 0.067
Iter 200/500 - Loss: 0.672 lengthscale: 30.711 noise: 0.067
Iter 201/500 - Loss: 0.671 lengthscale: 30.808 noise: 0.067
Iter 202/500 - Loss: 0.671 lengthscale: 30.906 noise: 0.067
Iter 203/500 - Loss: 0.670 lengthscale: 31.003 noise: 0.067
Iter 204/500 - Loss: 0.670 lengthscale: 31.100 noise: 0.067
Iter 205/500 - Loss: 0.669 lengthscale: 31.196 noise: 0.067
Iter 206/500 - Loss: 0.669 lengthscale: 31.293 noise: 0.067
Iter 207/500 - Loss: 0.668 lengthscale: 31.389 noise: 0.067
Iter 208/500 - Loss: 0.668 lengthscale: 31.486 noise: 0.067
Iter 209/500 - Loss: 0.668 lengthscale: 31.582 noise: 0.067
Iter 210/500 - Loss: 0.667 lengthscale: 31.678 noise: 0.067
Iter 211/500 - Loss: 0.667 lengthscale: 31.773 noise: 0.067
Iter 212/500 - Loss: 0.666 lengthscale: 31.869 noise: 0.067
Iter 213/500 - Loss: 0.666 lengthscale: 31.964 noise: 0.067
Iter 214/500 - Loss: 0.665 lengthscale: 32.059 noise: 0.067
Iter 215/500 - Loss: 0.665 lengthscale: 32.154 noise: 0.067
Iter 216/500 - Loss: 0.665 lengthscale: 32.249 noise: 0.067
Iter 217/500 - Loss: 0.664 lengthscale: 32.344 noise: 0.067
Iter 218/500 - Loss: 0.664 lengthscale: 32.438 noise: 0.067
Iter 219/500 - Loss: 0.663 lengthscale: 32.532 noise: 0.067
Iter 220/500 - Loss: 0.663 lengthscale: 32.627 noise: 0.067
Iter 221/500 - Loss: 0.662 lengthscale: 32.721 noise: 0.067
Iter 222/500 - Loss: 0.662 lengthscale: 32.814 noise: 0.067
Iter 223/500 - Loss: 0.662 lengthscale: 32.908 noise: 0.067
Iter 224/500 - Loss: 0.661 lengthscale: 33.002 noise: 0.067
Iter 225/500 - Loss: 0.661 lengthscale: 33.095 noise: 0.067
Iter 226/500 - Loss: 0.660 lengthscale: 33.188 noise: 0.067
Iter 227/500 - Loss: 0.660 lengthscale: 33.281 noise: 0.067
Iter 228/500 - Loss: 0.660 lengthscale: 33.374 noise: 0.067
Iter 229/500 - Loss: 0.659 lengthscale: 33.467 noise: 0.067
Iter 230/500 - Loss: 0.659 lengthscale: 33.560 noise: 0.067
Iter 231/500 - Loss: 0.658 lengthscale: 33.652 noise: 0.067
Iter 232/500 - Loss: 0.658 lengthscale: 33.744 noise: 0.067
Iter 233/500 - Loss: 0.658 lengthscale: 33.837 noise: 0.067
Iter 234/500 - Loss: 0.657 lengthscale: 33.929 noise: 0.067
Iter 235/500 - Loss: 0.657 lengthscale: 34.021 noise: 0.067
Iter 236/500 - Loss: 0.657 lengthscale: 34.112 noise: 0.067
Iter 237/500 - Loss: 0.656 lengthscale: 34.204 noise: 0.067
Iter 238/500 - Loss: 0.656 lengthscale: 34.296 noise: 0.067
Iter 239/500 - Loss: 0.655 lengthscale: 34.387 noise: 0.067
Iter 240/500 - Loss: 0.655 lengthscale: 34.478 noise: 0.067
Iter 241/500 - Loss: 0.655 lengthscale: 34.569 noise: 0.067
Iter 242/500 - Loss: 0.654 lengthscale: 34.660 noise: 0.067
Iter 243/500 - Loss: 0.654 lengthscale: 34.751 noise: 0.067
Iter 244/500 - Loss: 0.654 lengthscale: 34.842 noise: 0.067
Iter 245/500 - Loss: 0.653 lengthscale: 34.933 noise: 0.067
Iter 246/500 - Loss: 0.653 lengthscale: 35.023 noise: 0.067
Iter 247/500 - Loss: 0.652 lengthscale: 35.113 noise: 0.067
Iter 248/500 - Loss: 0.652 lengthscale: 35.204 noise: 0.067
Iter 249/500 - Loss: 0.652 lengthscale: 35.294 noise: 0.067
Iter 250/500 - Loss: 0.651 lengthscale: 35.384 noise: 0.067
Iter 251/500 - Loss: 0.651 lengthscale: 35.474 noise: 0.067
Iter 252/500 - Loss: 0.651 lengthscale: 35.564 noise: 0.067
Iter 253/500 - Loss: 0.650 lengthscale: 35.653 noise: 0.067
Iter 254/500 - Loss: 0.650 lengthscale: 35.743 noise: 0.067
Iter 255/500 - Loss: 0.650 lengthscale: 35.832 noise: 0.067
Iter 256/500 - Loss: 0.649 lengthscale: 35.922 noise: 0.067
Iter 257/500 - Loss: 0.649 lengthscale: 36.011 noise: 0.067
Iter 258/500 - Loss: 0.649 lengthscale: 36.100 noise: 0.067
Iter 259/500 - Loss: 0.648 lengthscale: 36.189 noise: 0.067
Iter 260/500 - Loss: 0.648 lengthscale: 36.278 noise: 0.067
Iter 261/500 - Loss: 0.648 lengthscale: 36.366 noise: 0.067
Iter 262/500 - Loss: 0.647 lengthscale: 36.455 noise: 0.067
Iter 263/500 - Loss: 0.647 lengthscale: 36.544 noise: 0.067
Iter 264/500 - Loss: 0.647 lengthscale: 36.632 noise: 0.067
Iter 265/500 - Loss: 0.646 lengthscale: 36.720 noise: 0.067
Iter 266/500 - Loss: 0.646 lengthscale: 36.809 noise: 0.067
Iter 267/500 - Loss: 0.646 lengthscale: 36.897 noise: 0.067
Iter 268/500 - Loss: 0.645 lengthscale: 36.985 noise: 0.067
Iter 269/500 - Loss: 0.645 lengthscale: 37.073 noise: 0.067
Iter 270/500 - Loss: 0.645 lengthscale: 37.161 noise: 0.067
Iter 271/500 - Loss: 0.644 lengthscale: 37.248 noise: 0.067
Iter 272/500 - Loss: 0.644 lengthscale: 37.336 noise: 0.067
Iter 273/500 - Loss: 0.644 lengthscale: 37.424 noise: 0.067
Iter 274/500 - Loss: 0.643 lengthscale: 37.511 noise: 0.067
Iter 275/500 - Loss: 0.643 lengthscale: 37.598 noise: 0.067
Iter 276/500 - Loss: 0.643 lengthscale: 37.686 noise: 0.067
Iter 277/500 - Loss: 0.642 lengthscale: 37.773 noise: 0.067
Iter 278/500 - Loss: 0.642 lengthscale: 37.860 noise: 0.067
Iter 279/500 - Loss: 0.642 lengthscale: 37.947 noise: 0.067
Iter 280/500 - Loss: 0.641 lengthscale: 38.034 noise: 0.067
Iter 281/500 - Loss: 0.641 lengthscale: 38.120 noise: 0.067
Iter 282/500 - Loss: 0.641 lengthscale: 38.207 noise: 0.067
Iter 283/500 - Loss: 0.640 lengthscale: 38.294 noise: 0.067
Iter 284/500 - Loss: 0.640 lengthscale: 38.380 noise: 0.067
Iter 285/500 - Loss: 0.640 lengthscale: 38.467 noise: 0.067
Iter 286/500 - Loss: 0.639 lengthscale: 38.553 noise: 0.067
Iter 287/500 - Loss: 0.639 lengthscale: 38.639 noise: 0.067
Iter 288/500 - Loss: 0.639 lengthscale: 38.725 noise: 0.067
Iter 289/500 - Loss: 0.639 lengthscale: 38.811 noise: 0.067
Iter 290/500 - Loss: 0.638 lengthscale: 38.897 noise: 0.067
Iter 291/500 - Loss: 0.638 lengthscale: 38.983 noise: 0.067
Iter 292/500 - Loss: 0.638 lengthscale: 39.069 noise: 0.067
Iter 293/500 - Loss: 0.637 lengthscale: 39.155 noise: 0.067
Iter 294/500 - Loss: 0.637 lengthscale: 39.240 noise: 0.067
Iter 295/500 - Loss: 0.637 lengthscale: 39.326 noise: 0.067
Iter 296/500 - Loss: 0.636 lengthscale: 39.411 noise: 0.067
Iter 297/500 - Loss: 0.636 lengthscale: 39.497 noise: 0.067
Iter 298/500 - Loss: 0.636 lengthscale: 39.582 noise: 0.067
Iter 299/500 - Loss: 0.636 lengthscale: 39.667 noise: 0.067
Iter 300/500 - Loss: 0.635 lengthscale: 39.752 noise: 0.067
Iter 301/500 - Loss: 0.635 lengthscale: 39.837 noise: 0.067
Iter 302/500 - Loss: 0.635 lengthscale: 39.922 noise: 0.067
Iter 303/500 - Loss: 0.634 lengthscale: 40.007 noise: 0.067
Iter 304/500 - Loss: 0.634 lengthscale: 40.092 noise: 0.067
Iter 305/500 - Loss: 0.634 lengthscale: 40.177 noise: 0.067
Iter 306/500 - Loss: 0.634 lengthscale: 40.261 noise: 0.067
Iter 307/500 - Loss: 0.633 lengthscale: 40.346 noise: 0.067
Iter 308/500 - Loss: 0.633 lengthscale: 40.430 noise: 0.067
Iter 309/500 - Loss: 0.633 lengthscale: 40.515 noise: 0.067
Iter 310/500 - Loss: 0.632 lengthscale: 40.599 noise: 0.067
Iter 311/500 - Loss: 0.632 lengthscale: 40.683 noise: 0.067
Iter 312/500 - Loss: 0.632 lengthscale: 40.767 noise: 0.067
Iter 313/500 - Loss: 0.632 lengthscale: 40.851 noise: 0.067
Iter 314/500 - Loss: 0.631 lengthscale: 40.935 noise: 0.067
Iter 315/500 - Loss: 0.631 lengthscale: 41.019 noise: 0.067
Iter 316/500 - Loss: 0.631 lengthscale: 41.103 noise: 0.067
Iter 317/500 - Loss: 0.630 lengthscale: 41.187 noise: 0.067
Iter 318/500 - Loss: 0.630 lengthscale: 41.271 noise: 0.067
Iter 319/500 - Loss: 0.630 lengthscale: 41.354 noise: 0.067
Iter 320/500 - Loss: 0.630 lengthscale: 41.438 noise: 0.067
Iter 321/500 - Loss: 0.629 lengthscale: 41.521 noise: 0.068
Iter 322/500 - Loss: 0.629 lengthscale: 41.605 noise: 0.068
Iter 323/500 - Loss: 0.629 lengthscale: 41.688 noise: 0.068
Iter 324/500 - Loss: 0.629 lengthscale: 41.771 noise: 0.068
Iter 325/500 - Loss: 0.628 lengthscale: 41.855 noise: 0.068
Iter 326/500 - Loss: 0.628 lengthscale: 41.938 noise: 0.068
Iter 327/500 - Loss: 0.628 lengthscale: 42.021 noise: 0.068
Iter 328/500 - Loss: 0.627 lengthscale: 42.104 noise: 0.068
Iter 329/500 - Loss: 0.627 lengthscale: 42.187 noise: 0.068
Iter 330/500 - Loss: 0.627 lengthscale: 42.269 noise: 0.068
Iter 331/500 - Loss: 0.627 lengthscale: 42.352 noise: 0.068
Iter 332/500 - Loss: 0.626 lengthscale: 42.435 noise: 0.068
Iter 333/500 - Loss: 0.626 lengthscale: 42.517 noise: 0.068
Iter 334/500 - Loss: 0.626 lengthscale: 42.600 noise: 0.068
Iter 335/500 - Loss: 0.626 lengthscale: 42.682 noise: 0.068
Iter 336/500 - Loss: 0.625 lengthscale: 42.765 noise: 0.068
Iter 337/500 - Loss: 0.625 lengthscale: 42.847 noise: 0.068
Iter 338/500 - Loss: 0.625 lengthscale: 42.929 noise: 0.068
Iter 339/500 - Loss: 0.625 lengthscale: 43.012 noise: 0.068
Iter 340/500 - Loss: 0.624 lengthscale: 43.094 noise: 0.068
Iter 341/500 - Loss: 0.624 lengthscale: 43.176 noise: 0.068
Iter 342/500 - Loss: 0.624 lengthscale: 43.258 noise: 0.068
Iter 343/500 - Loss: 0.624 lengthscale: 43.340 noise: 0.068
Iter 344/500 - Loss: 0.623 lengthscale: 43.422 noise: 0.068
Iter 345/500 - Loss: 0.623 lengthscale: 43.503 noise: 0.068
Iter 346/500 - Loss: 0.623 lengthscale: 43.585 noise: 0.068
Iter 347/500 - Loss: 0.623 lengthscale: 43.667 noise: 0.068
Iter 348/500 - Loss: 0.622 lengthscale: 43.748 noise: 0.068
Iter 349/500 - Loss: 0.622 lengthscale: 43.830 noise: 0.068
Iter 350/500 - Loss: 0.622 lengthscale: 43.911 noise: 0.068
Iter 351/500 - Loss: 0.622 lengthscale: 43.993 noise: 0.068
Iter 352/500 - Loss: 0.621 lengthscale: 44.074 noise: 0.068
Iter 353/500 - Loss: 0.621 lengthscale: 44.155 noise: 0.068
Iter 354/500 - Loss: 0.621 lengthscale: 44.237 noise: 0.068
Iter 355/500 - Loss: 0.621 lengthscale: 44.318 noise: 0.068
Iter 356/500 - Loss: 0.620 lengthscale: 44.399 noise: 0.068
Iter 357/500 - Loss: 0.620 lengthscale: 44.480 noise: 0.068
Iter 358/500 - Loss: 0.620 lengthscale: 44.561 noise: 0.068
Iter 359/500 - Loss: 0.620 lengthscale: 44.642 noise: 0.068
Iter 360/500 - Loss: 0.619 lengthscale: 44.722 noise: 0.068
Iter 361/500 - Loss: 0.619 lengthscale: 44.803 noise: 0.068
Iter 362/500 - Loss: 0.619 lengthscale: 44.884 noise: 0.068
Iter 363/500 - Loss: 0.619 lengthscale: 44.964 noise: 0.068
Iter 364/500 - Loss: 0.618 lengthscale: 45.045 noise: 0.068
Iter 365/500 - Loss: 0.618 lengthscale: 45.125 noise: 0.068
Iter 366/500 - Loss: 0.618 lengthscale: 45.206 noise: 0.067
Iter 367/500 - Loss: 0.618 lengthscale: 45.286 noise: 0.067
Iter 368/500 - Loss: 0.617 lengthscale: 45.367 noise: 0.067
Iter 369/500 - Loss: 0.617 lengthscale: 45.447 noise: 0.067
Iter 370/500 - Loss: 0.617 lengthscale: 45.527 noise: 0.067
Iter 371/500 - Loss: 0.617 lengthscale: 45.607 noise: 0.067
Iter 372/500 - Loss: 0.617 lengthscale: 45.687 noise: 0.067
Iter 373/500 - Loss: 0.616 lengthscale: 45.767 noise: 0.067
Iter 374/500 - Loss: 0.616 lengthscale: 45.847 noise: 0.067
Iter 375/500 - Loss: 0.616 lengthscale: 45.927 noise: 0.067
Iter 376/500 - Loss: 0.616 lengthscale: 46.007 noise: 0.067
Iter 377/500 - Loss: 0.615 lengthscale: 46.087 noise: 0.067
Iter 378/500 - Loss: 0.615 lengthscale: 46.166 noise: 0.067
Iter 379/500 - Loss: 0.615 lengthscale: 46.246 noise: 0.067
Iter 380/500 - Loss: 0.615 lengthscale: 46.325 noise: 0.067
Iter 381/500 - Loss: 0.614 lengthscale: 46.405 noise: 0.067
Iter 382/500 - Loss: 0.614 lengthscale: 46.484 noise: 0.067
Iter 383/500 - Loss: 0.614 lengthscale: 46.564 noise: 0.067
Iter 384/500 - Loss: 0.614 lengthscale: 46.643 noise: 0.067
Iter 385/500 - Loss: 0.614 lengthscale: 46.722 noise: 0.067
Iter 386/500 - Loss: 0.613 lengthscale: 46.802 noise: 0.067
Iter 387/500 - Loss: 0.613 lengthscale: 46.881 noise: 0.067
Iter 388/500 - Loss: 0.613 lengthscale: 46.960 noise: 0.067
Iter 389/500 - Loss: 0.613 lengthscale: 47.039 noise: 0.067
Iter 390/500 - Loss: 0.612 lengthscale: 47.118 noise: 0.067
Iter 391/500 - Loss: 0.612 lengthscale: 47.197 noise: 0.067
Iter 392/500 - Loss: 0.612 lengthscale: 47.276 noise: 0.067
Iter 393/500 - Loss: 0.612 lengthscale: 47.354 noise: 0.067
Iter 394/500 - Loss: 0.612 lengthscale: 47.433 noise: 0.067
Iter 395/500 - Loss: 0.611 lengthscale: 47.512 noise: 0.067
Iter 396/500 - Loss: 0.611 lengthscale: 47.590 noise: 0.067
Iter 397/500 - Loss: 0.611 lengthscale: 47.669 noise: 0.067
Iter 398/500 - Loss: 0.611 lengthscale: 47.747 noise: 0.067
Iter 399/500 - Loss: 0.610 lengthscale: 47.826 noise: 0.067
Iter 400/500 - Loss: 0.610 lengthscale: 47.904 noise: 0.067
Iter 401/500 - Loss: 0.610 lengthscale: 47.983 noise: 0.067
Iter 402/500 - Loss: 0.610 lengthscale: 48.061 noise: 0.067
Iter 403/500 - Loss: 0.610 lengthscale: 48.139 noise: 0.067
Iter 404/500 - Loss: 0.609 lengthscale: 48.217 noise: 0.067
Iter 405/500 - Loss: 0.609 lengthscale: 48.295 noise: 0.067
Iter 406/500 - Loss: 0.609 lengthscale: 48.373 noise: 0.067
Iter 407/500 - Loss: 0.609 lengthscale: 48.451 noise: 0.067
Iter 408/500 - Loss: 0.609 lengthscale: 48.529 noise: 0.067
Iter 409/500 - Loss: 0.608 lengthscale: 48.607 noise: 0.067
Iter 410/500 - Loss: 0.608 lengthscale: 48.685 noise: 0.067
Iter 411/500 - Loss: 0.608 lengthscale: 48.763 noise: 0.067
Iter 412/500 - Loss: 0.608 lengthscale: 48.840 noise: 0.067
Iter 413/500 - Loss: 0.608 lengthscale: 48.918 noise: 0.067
Iter 414/500 - Loss: 0.607 lengthscale: 48.996 noise: 0.067
Iter 415/500 - Loss: 0.607 lengthscale: 49.073 noise: 0.067
Iter 416/500 - Loss: 0.607 lengthscale: 49.151 noise: 0.067
Iter 417/500 - Loss: 0.607 lengthscale: 49.228 noise: 0.067
Iter 418/500 - Loss: 0.607 lengthscale: 49.305 noise: 0.067
Iter 419/500 - Loss: 0.606 lengthscale: 49.383 noise: 0.067
Iter 420/500 - Loss: 0.606 lengthscale: 49.460 noise: 0.067
Iter 421/500 - Loss: 0.606 lengthscale: 49.537 noise: 0.067
Iter 422/500 - Loss: 0.606 lengthscale: 49.614 noise: 0.067
Iter 423/500 - Loss: 0.606 lengthscale: 49.691 noise: 0.067
Iter 424/500 - Loss: 0.605 lengthscale: 49.768 noise: 0.067
Iter 425/500 - Loss: 0.605 lengthscale: 49.845 noise: 0.067
Iter 426/500 - Loss: 0.605 lengthscale: 49.922 noise: 0.067
Iter 427/500 - Loss: 0.605 lengthscale: 49.999 noise: 0.067
Iter 428/500 - Loss: 0.605 lengthscale: 50.076 noise: 0.067
Iter 429/500 - Loss: 0.604 lengthscale: 50.153 noise: 0.067
Iter 430/500 - Loss: 0.604 lengthscale: 50.230 noise: 0.067
Iter 431/500 - Loss: 0.604 lengthscale: 50.306 noise: 0.067
Iter 432/500 - Loss: 0.604 lengthscale: 50.383 noise: 0.067
Iter 433/500 - Loss: 0.604 lengthscale: 50.459 noise: 0.067
Iter 434/500 - Loss: 0.603 lengthscale: 50.536 noise: 0.067
Iter 435/500 - Loss: 0.603 lengthscale: 50.612 noise: 0.067
Iter 436/500 - Loss: 0.603 lengthscale: 50.689 noise: 0.067
Iter 437/500 - Loss: 0.603 lengthscale: 50.765 noise: 0.067
Iter 438/500 - Loss: 0.603 lengthscale: 50.841 noise: 0.067
Iter 439/500 - Loss: 0.602 lengthscale: 50.917 noise: 0.067
Iter 440/500 - Loss: 0.602 lengthscale: 50.994 noise: 0.067
Iter 441/500 - Loss: 0.602 lengthscale: 51.070 noise: 0.067
Iter 442/500 - Loss: 0.602 lengthscale: 51.146 noise: 0.067
Iter 443/500 - Loss: 0.602 lengthscale: 51.222 noise: 0.067
Iter 444/500 - Loss: 0.601 lengthscale: 51.298 noise: 0.067
Iter 445/500 - Loss: 0.601 lengthscale: 51.374 noise: 0.067
Iter 446/500 - Loss: 0.601 lengthscale: 51.449 noise: 0.067
Iter 447/500 - Loss: 0.601 lengthscale: 51.525 noise: 0.067
Iter 448/500 - Loss: 0.601 lengthscale: 51.601 noise: 0.067
Iter 449/500 - Loss: 0.600 lengthscale: 51.677 noise: 0.067
Iter 450/500 - Loss: 0.600 lengthscale: 51.752 noise: 0.067
Iter 451/500 - Loss: 0.600 lengthscale: 51.828 noise: 0.067
Iter 452/500 - Loss: 0.600 lengthscale: 51.903 noise: 0.067
Iter 453/500 - Loss: 0.600 lengthscale: 51.979 noise: 0.067
Iter 454/500 - Loss: 0.600 lengthscale: 52.054 noise: 0.067
Iter 455/500 - Loss: 0.599 lengthscale: 52.130 noise: 0.067
Iter 456/500 - Loss: 0.599 lengthscale: 52.205 noise: 0.067
Iter 457/500 - Loss: 0.599 lengthscale: 52.280 noise: 0.067
Iter 458/500 - Loss: 0.599 lengthscale: 52.355 noise: 0.067
Iter 459/500 - Loss: 0.599 lengthscale: 52.431 noise: 0.067
Iter 460/500 - Loss: 0.598 lengthscale: 52.506 noise: 0.067
Iter 461/500 - Loss: 0.598 lengthscale: 52.581 noise: 0.067
Iter 462/500 - Loss: 0.598 lengthscale: 52.656 noise: 0.067
Iter 463/500 - Loss: 0.598 lengthscale: 52.731 noise: 0.067
Iter 464/500 - Loss: 0.598 lengthscale: 52.806 noise: 0.067
Iter 465/500 - Loss: 0.598 lengthscale: 52.880 noise: 0.067
Iter 466/500 - Loss: 0.597 lengthscale: 52.955 noise: 0.067
Iter 467/500 - Loss: 0.597 lengthscale: 53.030 noise: 0.067
Iter 468/500 - Loss: 0.597 lengthscale: 53.105 noise: 0.067
Iter 469/500 - Loss: 0.597 lengthscale: 53.179 noise: 0.067
Iter 470/500 - Loss: 0.597 lengthscale: 53.254 noise: 0.067
Iter 471/500 - Loss: 0.596 lengthscale: 53.328 noise: 0.067
Iter 472/500 - Loss: 0.596 lengthscale: 53.403 noise: 0.067
Iter 473/500 - Loss: 0.596 lengthscale: 53.477 noise: 0.067
Iter 474/500 - Loss: 0.596 lengthscale: 53.552 noise: 0.067
Iter 475/500 - Loss: 0.596 lengthscale: 53.626 noise: 0.067
Iter 476/500 - Loss: 0.596 lengthscale: 53.700 noise: 0.067
Iter 477/500 - Loss: 0.595 lengthscale: 53.774 noise: 0.067
Iter 478/500 - Loss: 0.595 lengthscale: 53.849 noise: 0.067
Iter 479/500 - Loss: 0.595 lengthscale: 53.923 noise: 0.067
Iter 480/500 - Loss: 0.595 lengthscale: 53.997 noise: 0.067
Iter 481/500 - Loss: 0.595 lengthscale: 54.071 noise: 0.067
Iter 482/500 - Loss: 0.595 lengthscale: 54.145 noise: 0.067
Iter 483/500 - Loss: 0.594 lengthscale: 54.219 noise: 0.067
Iter 484/500 - Loss: 0.594 lengthscale: 54.293 noise: 0.067
Iter 485/500 - Loss: 0.594 lengthscale: 54.366 noise: 0.067
Iter 486/500 - Loss: 0.594 lengthscale: 54.440 noise: 0.067
Iter 487/500 - Loss: 0.594 lengthscale: 54.514 noise: 0.067
Iter 488/500 - Loss: 0.594 lengthscale: 54.587 noise: 0.067
Iter 489/500 - Loss: 0.593 lengthscale: 54.661 noise: 0.067
Iter 490/500 - Loss: 0.593 lengthscale: 54.735 noise: 0.067
Iter 491/500 - Loss: 0.593 lengthscale: 54.808 noise: 0.067
Iter 492/500 - Loss: 0.593 lengthscale: 54.882 noise: 0.067
Iter 493/500 - Loss: 0.593 lengthscale: 54.955 noise: 0.067
Iter 494/500 - Loss: 0.592 lengthscale: 55.028 noise: 0.067
Iter 495/500 - Loss: 0.592 lengthscale: 55.102 noise: 0.067
Iter 496/500 - Loss: 0.592 lengthscale: 55.175 noise: 0.067
Iter 497/500 - Loss: 0.592 lengthscale: 55.248 noise: 0.067
Iter 498/500 - Loss: 0.592 lengthscale: 55.321 noise: 0.067
Iter 499/500 - Loss: 0.592 lengthscale: 55.394 noise: 0.067
Iter 500/500 - Loss: 0.592 lengthscale: 55.467 noise: 0.067
Now, we test predictions. For simplicity, we will assume a fixed input variance of \(10^{-3}\) for the test points.
[8]:
# Get into evaluation (predictive posterior) mode
model.eval()
likelihood.eval()
# Test points are regularly spaced along [0,1]
# Make predictions by feeding model through likelihood
with torch.no_grad(), gpytorch.settings.fast_pred_var():
    test_x = torch.linspace(0, 1, 51)
    test_x_distributional = torch.stack((test_x, (1e-3 * torch.ones_like(test_x)).log()), dim=1)
    observed_pred = likelihood(model(test_x_distributional))
with torch.no_grad():
    # Initialize plot
    f, ax = plt.subplots(1, 1, figsize=(8, 3))

    # Get upper and lower confidence bounds
    lower, upper = observed_pred.confidence_region()
    # Plot training data as black stars
    ax.errorbar(train_x_mean.numpy(), train_y.numpy(), xerr=train_x_stdv, fmt='k*')
    # Plot predictive means as blue line
    ax.plot(test_x.numpy(), observed_pred.mean.numpy(), 'b')
    # Shade between the lower and upper confidence bounds
    ax.fill_between(test_x.numpy(), lower.numpy(), upper.numpy(), alpha=0.5)
    ax.set_ylim([-3, 3])
    ax.legend(['Observed Data', 'Mean', 'Confidence'])

As a final note, the distributional kernel class is easy to extend: GPyTorch exposes a generic DistributionalInputKernel class that takes as input any distance function over probability distributions.
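For example, here is a minimal sketch of an alternative distance one might plug in: the squared 2-Wasserstein distance between univariate Gaussians, written over the same stacked (mean, log variance) representation used above. The exact callable signature that DistributionalInputKernel expects should be checked against the GPyTorch documentation; the function below is only illustrative.

import torch

def gaussian_w2_squared(x1, x2):
    # x1: (..., n, 2), x2: (..., m, 2); columns are [mean, log variance].
    # Squared 2-Wasserstein distance between univariate Gaussians:
    #   W2^2(N(m1, s1^2), N(m2, s2^2)) = (m1 - m2)^2 + (s1 - s2)^2
    mean1, std1 = x1[..., :1], (0.5 * x1[..., 1:]).exp()   # std = exp(log_var / 2)
    mean2, std2 = x2[..., :1], (0.5 * x2[..., 1:]).exp()
    mean_diff = mean1 - mean2.transpose(-2, -1)
    std_diff = std1 - std2.transpose(-2, -1)
    return mean_diff.pow(2) + std_diff.pow(2)

Passing a function like this in place of the symmetrized KL distance would give a different notion of similarity between input distributions while reusing the same model definition and training loop.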