Graph-based Bayesian semi-supervised learning: prior design and posterior contraction.
In this talk I will introduce graphical representations of stochastic partial differential equations that make it possible to approximate Matérn Gaussian fields on manifolds and to generalize the Matérn model to abstract point clouds. Under a manifold assumption, approximation error guarantees will be established by building on the theory of spectral convergence of graph Laplacians. Graph-based Matérn prior models facilitate computationally efficient inference and sampling by exploiting the sparsity of the precision matrix. Moreover, we will show that they are natural priors for Bayesian semi-supervised learning and can yield optimal posterior contraction rates. This is joint work with Ruiyi Yang.
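To make the construction concrete, here is a minimal sketch of one common form of graph-based Matérn prior: build a graph Laplacian from a point cloud, take the precision matrix to be an integer power of a shifted Laplacian, and draw a prior sample via a Cholesky factor of the precision. The kernel, bandwidth eps, and parameters tau and s are illustrative choices for this sketch, not values from the talk, and a dense solve is used for clarity (in practice one would exploit the sparsity of the precision).

```python
import numpy as np

rng = np.random.default_rng(0)

# Point cloud: n points near the unit circle, a 1-D manifold embedded in R^2.
n = 200
theta = rng.uniform(0, 2 * np.pi, n)
X = np.c_[np.cos(theta), np.sin(theta)] + 0.01 * rng.normal(size=(n, 2))

# Weighted graph and unnormalized graph Laplacian L = D - W from a
# Gaussian kernel truncated at radius 2*eps (illustrative choices).
eps = 0.3
D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
W = np.exp(-D2 / eps**2) * (D2 < (2 * eps) ** 2)
np.fill_diagonal(W, 0.0)
L = np.diag(W.sum(axis=1)) - W

# Graph Matérn precision: Q = (tau^2 I + L)^s with integer smoothness s.
# Q inherits (structured) sparsity from L when stored as a sparse matrix.
tau, s = 1.0, 2
Q = np.linalg.matrix_power(tau**2 * np.eye(n) + L, s)

# Sample u ~ N(0, Q^{-1}): if Q = C C^T (Cholesky), then u = C^{-T} z
# with z ~ N(0, I) has covariance C^{-T} C^{-1} = Q^{-1}.
C = np.linalg.cholesky(Q)
z = rng.normal(size=n)
u = np.linalg.solve(C.T, z)

print(u.shape)  # (200,)
```

Because the precision (rather than the covariance) is the sparse object, sampling and Gaussian conditioning scale with the graph's sparsity pattern, which is what makes these priors attractive for semi-supervised learning on large point clouds.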