Researchers: Matthew Thorpe, Jeff Calder, Riccardo Cristoferi, Olly Crook, Matt Dunlop, Nicolás García Trillos, Tim Hurst, Ryan Murray, Carola-Bibiane Schönlieb, Dejan Slepcev, Andrew Stuart, Florian Theil, Kostas Zygalakis

In this project we look at variational problems on random geometric graphs which utilise Laplacian regularisation, for example probit and Bayesian level set [1], kriging [2], and Ginzburg-Landau models [3,4]. The key idea is to pass from a discrete problem (on the graph) to a continuum problem (defined on a subset of R^d) in the data-rich limit. The variational methods we study come from machine learning, and the discrete-to-continuum convergence is metrised in the TLp space introduced in [5]. Current research includes better understanding the semi-supervised learning problem [2], and in particular how much training data is needed for asymptotic consistency.
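As a concrete illustration of Laplacian regularisation in semi-supervised learning, the sketch below builds a random geometric graph on a sampled point cloud and computes the harmonic extension of two training labels by minimising the Dirichlet energy u^T L u subject to the label constraints. This is a minimal, hedged example, not the specific algorithms studied in the project; the radius, sample size, and label placement are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Sample a point cloud in [0, 1]^2; these are the graph vertices.
n = 200
X = rng.random((n, 2))

# Random geometric graph: connect points within radius eps.
eps = 0.25
D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
W = (D < eps).astype(float)
np.fill_diagonal(W, 0.0)

# Unnormalised graph Laplacian L = diag(W @ 1) - W.
L = np.diag(W.sum(axis=1)) - W

# Semi-supervised setup: two labelled points with labels -1 and +1.
labeled = np.array([0, 1])
y = np.array([-1.0, 1.0])
unlabeled = np.setdiff1d(np.arange(n), labeled)

# Laplacian regularisation (harmonic extension): minimise u^T L u
# subject to u = y on the labelled set, i.e. solve L_uu u_u = -L_ul y.
L_uu = L[np.ix_(unlabeled, unlabeled)]
L_ul = L[np.ix_(unlabeled, labeled)]
u = np.empty(n)
u[labeled] = y
u[unlabeled] = np.linalg.solve(L_uu, -L_ul @ y)

# Classify the unlabelled points by the sign of the extension.
labels = np.sign(u)
```

By the discrete maximum principle, the harmonic extension stays within the range of the training labels; the discrete-to-continuum results referenced above concern the behaviour of such minimisers as n grows.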

Related Publications 

Large Data and Zero Noise Limits of Graph-Based Semi-Supervised Learning Algorithms
MM Dunlop, AM Stuart, D Slepcev, ME Thorpe
Analysis of p-Laplacian Regularization in Semi-Supervised Learning
ME Thorpe, D Slepcev
Asymptotic Analysis of the Ginzburg-Landau Functional on Point Clouds
ME Thorpe, F Theil
Large data limit for a phase transition model with the p-Laplacian on point clouds
R Cristoferi, ME Thorpe