Random walks in graph-based learning
I will discuss several applications of random walks to graph-based learning, both for theoretical analysis and for algorithm development. Graph-based learning is a field within machine learning that uses similarities between data points to create efficient representations of high-dimensional data for tasks like semi-supervised classification, clustering, and dimension reduction. Our first application uses the random walk interpretation of the graph Laplacian to characterize the lowest label rate at which Laplacian regularized semi-supervised learning is well-posed. Second, we will show how analysis via random walks leads to a new algorithm, which we call Poisson learning, for semi-supervised learning at very low label rates. Finally, we will show how stochastic coupling of random walks can be used to prove Lipschitz estimates for graph Laplacian eigenfunctions on random geometric graphs, leading to new spectral convergence results.
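To make the Poisson learning idea concrete, here is a minimal sketch on a toy graph. The abstract does not specify implementation details, so the graph, weights, and label placement below are illustrative assumptions: the method replaces the hard label constraints of Laplacian regularization with mean-zero point sources at the labeled nodes and solves a graph Poisson equation.

```python
import numpy as np

# Toy similarity graph: 6 nodes in two clusters {0,1,2} and {3,4,5},
# joined by one weak edge (weight 0.1) between nodes 2 and 3.
W = np.array([
    [0, 1, 1, 0,   0, 0],
    [1, 0, 1, 0,   0, 0],
    [1, 1, 0, 0.1, 0, 0],
    [0, 0, 0.1, 0, 1, 1],
    [0, 0, 0,   1, 0, 1],
    [0, 0, 0,   1, 1, 0],
], dtype=float)
L = np.diag(W.sum(axis=1)) - W  # unnormalized graph Laplacian L = D - W

# One labeled node per class (illustrative choice): node 0 -> class 0,
# node 3 -> class 1, encoded as one-hot rows.
labeled = [0, 3]
Y = np.array([[1, 0], [0, 1]], dtype=float)

# Poisson learning source term: point masses (y_j - ybar) at labeled
# nodes, centered so each column of b sums to zero (needed since L is
# singular with constant kernel on a connected graph).
b = np.zeros((6, 2))
b[labeled] = Y - Y.mean(axis=0)

# Least-squares solve picks the mean-zero solution of L u = b.
u, *_ = np.linalg.lstsq(L, b, rcond=None)
pred = u.argmax(axis=1)
print(pred)  # -> [0 0 0 1 1 1]
```

Because the only flow between the clusters must cross the weak bridge edge, the potential drop concentrates there, so each cluster inherits the label of its labeled node.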
This talk will cover joint work with many people, including Brendan Cook (UMN), Nicolás García Trillos (Wisconsin-Madison), Marta Lewicka (Pittsburgh), Dejan Slepčev (CMU), and Matthew Thorpe (University of Manchester).