Gaussian process

Probability distribution

We have the following (admittedly impractical) definition:

A Gaussian process is a collection of random variables, any finite number of which have a joint Gaussian distribution.

It is completely defined by its mean function \(m(x)\) and covariance function \(\Sigma(x, x')\). We write:

\begin{equation} f(x) \sim \mathcal{GP}\left(m(x), \Sigma(x, x')\right) \end{equation}
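The definition can be made concrete by evaluating the process at a finite set of inputs: by definition, the resulting values follow a multivariate Gaussian with mean \(m(x)\) and covariance given by the kernel. A minimal NumPy sketch of drawing prior samples (the RBF kernel and its parameter names are illustrative assumptions, not prescribed by the text):

```python
import numpy as np

def rbf_kernel(x1, x2, lengthscale=1.0, variance=1.0):
    """Squared-exponential (RBF) kernel; hypothetical parameter names."""
    sqdist = (x1[:, None] - x2[None, :]) ** 2
    return variance * np.exp(-0.5 * sqdist / lengthscale**2)

# Evaluating the GP at finitely many inputs gives a multivariate Gaussian
# with mean m(x) and covariance K(x, x').
x = np.linspace(-3, 3, 50)
mean = np.zeros_like(x)   # m(x) = 0, a common default choice
cov = rbf_kernel(x, x)

rng = np.random.default_rng(0)
# A small diagonal "jitter" keeps the covariance numerically positive definite.
samples = rng.multivariate_normal(mean, cov + 1e-9 * np.eye(len(x)), size=3)
print(samples.shape)  # (3, 50)
```

Each row of `samples` is one draw of the function evaluated at the 50 input points.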

A very important special case of Gaussian processes is the Gaussian Markov process.


We often express the covariance function \(\Sigma(x, x')\) in terms of a kernel \(K(x,x')\) plus an observation-noise term with variance \(\sigma^{2}_{y}\):

\begin{align*} \Sigma(x,x') = K(x, x') + \sigma^{2}_{y}\;\mathbb{I} \end{align*}
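A quick sketch of building this covariance matrix at a set of observed inputs (the RBF kernel and the noise level `sigma_y` are assumed for illustration). The noise term makes the matrix strictly positive definite, so a Cholesky factorization succeeds:

```python
import numpy as np

def rbf_kernel(x1, x2, lengthscale=1.0):
    """Squared-exponential kernel; hypothetical parameter name."""
    return np.exp(-0.5 * (x1[:, None] - x2[None, :]) ** 2 / lengthscale**2)

x = np.linspace(0, 1, 20)
sigma_y = 0.1                           # assumed observation-noise std dev
K = rbf_kernel(x, x)
Sigma = K + sigma_y**2 * np.eye(len(x))  # Sigma = K + sigma_y^2 * I

# The diagonal noise term guarantees strict positive definiteness.
L = np.linalg.cholesky(Sigma)
print(np.allclose(L @ L.T, Sigma))  # True
```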

We can combine kernels: the product of two covariance functions is a valid covariance function, and so is their sum.





Here is a list of libraries implementing different flavors of Gaussian processes. The catch is that Gaussian processes can quickly become computationally intensive.
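The cost comes from exact inference: the posterior mean requires factorizing the \(n \times n\) covariance matrix, which is \(O(n^3)\) in time and \(O(n^2)\) in memory. A minimal sketch of naive GP regression showing where that solve appears (the kernel, noise level, and toy data are assumptions for illustration):

```python
import numpy as np

def rbf(x1, x2, ell=1.0):
    """Squared-exponential kernel; hypothetical parameter name."""
    return np.exp(-0.5 * (x1[:, None] - x2[None, :]) ** 2 / ell**2)

rng = np.random.default_rng(0)
x_train = np.sort(rng.uniform(-3, 3, 40))
y_train = np.sin(x_train) + 0.1 * rng.standard_normal(40)  # toy data
x_test = np.linspace(-3, 3, 100)

sigma_y = 0.1
# Exact inference factorizes the n x n matrix Sigma: O(n^3) time, O(n^2)
# memory -- the scalability bottleneck the libraries below try to address.
Sigma = rbf(x_train, x_train) + sigma_y**2 * np.eye(len(x_train))
L = np.linalg.cholesky(Sigma)
alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))

K_star = rbf(x_test, x_train)
mean_pred = K_star @ alpha   # posterior mean at the test inputs
print(mean_pred.shape)       # (100,)
```

Sparse and variational approximations trade exactness for lower complexity, which is the main axis along which the libraries differ.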
