Description
Discrete-to-continuum convergence results for graph-based learning have attracted increasing interest in recent years. In particular, connections between discrete machine learning and continuum partial differential equations or variational problems lead to new insights and better algorithms.
This talk considers Lipschitz learning, the limit of $p$-Laplacian learning as $p$ tends to infinity, and introduces new proof strategies for the discrete-to-continuum limit. Our framework provides a convergence result in a sparse graph regime and additionally yields convergence rates. Employing a homogenized non-local operator with a much larger bandwidth allows us to extend uniform convergence rates to any graph length scale strictly above the connectivity scale. We will sketch the ideas of the proof and indicate how the approach may be used in other problems, such as spectral convergence of the graph Laplacian.
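For orientation, the two learning problems can be written as follows in one common convention (the choice of weights $w_{xy}$, constraint set $\mathcal{O}$, and normalization is an assumption here and varies across the literature): given labels $g$ prescribed on $\mathcal{O}$,
\[
  \min_{u = g \text{ on } \mathcal{O}} \; \sum_{x,y} w_{xy}^p \, \lvert u(x) - u(y) \rvert^p
  \qquad \xrightarrow{\; p \to \infty \;} \qquad
  \min_{u = g \text{ on } \mathcal{O}} \; \max_{x,y} \, w_{xy} \, \lvert u(x) - u(y) \rvert ,
\]
so that Lipschitz learning seeks an extension of the given labels with minimal (weighted) Lipschitz constant on the graph.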