Jul 4 – 6, 2022
Laboratoire Paul Painlevé
Europe/Paris timezone

Uniform Convergence Rates for Lipschitz Learning Down to Graph Connectivity

Jul 5, 2022, 10:00 AM
30m
M2 building, Cité Scientifique - Meeting room, 1st floor (Laboratoire Paul Painlevé)

Speaker

Tim Roith (Friedrich-Alexander-Universität Erlangen-Nürnberg)

Description

Discrete-to-continuum convergence results for graph-based learning have attracted increasing interest in recent years. In particular, the connections between discrete machine learning problems and continuum partial differential equations or variational problems lead to new insights and better algorithms.

This talk considers Lipschitz learning — the limit of $p$-Laplacian learning as $p$ tends to infinity — and introduces new proof strategies for the discrete-to-continuum limit. Our framework provides a convergence result in a sparse graph regime and additionally yields convergence rates. Employing a homogenized non-local operator with a much larger bandwidth allows us to extend uniform convergence rates to any graph length scale strictly above graph connectivity. We will sketch the ideas of the proof and indicate how the approach may be used in other problems, such as spectral convergence of the graph Laplacian.
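To make the setting concrete, Lipschitz learning on a graph amounts to solving the graph infinity-Laplace equation on the unlabeled vertices with the labels held fixed. The following is a minimal illustrative sketch on an unweighted path graph (the function `lipschitz_learning` and all parameters are hypothetical choices for illustration, not the homogenized scheme or rates discussed in the talk):

```python
# Sketch of Lipschitz learning on an unweighted path graph.
# On the unlabeled (interior) vertices we enforce the graph
# infinity-Laplace equation, which for unit edge weights reads
#   u(x) = 1/2 * ( max_{y ~ x} u(y) + min_{y ~ x} u(y) ),
# with the labeled boundary values held fixed. We solve it by a
# simple Gauss-Seidel-type fixed-point iteration.

def lipschitz_learning(n, u_left, u_right, iters=10000, tol=1e-10):
    # Vertices 0 and n-1 carry the labels; the rest are unlabeled.
    u = [u_left] + [0.0] * (n - 2) + [u_right]
    for _ in range(iters):
        diff = 0.0
        for i in range(1, n - 1):
            neighbors = (u[i - 1], u[i + 1])
            new = 0.5 * (max(neighbors) + min(neighbors))
            diff = max(diff, abs(new - u[i]))
            u[i] = new
        if diff < tol:  # stop once the sweep no longer changes u
            break
    return u

u = lipschitz_learning(11, 0.0, 1.0)
```

On a path graph the solution is the linear interpolant of the two labels, i.e. the absolutely minimal Lipschitz extension; on general weighted graphs the same update uses weighted differences to the max- and min-neighbors.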

Primary author

Tim Roith (Friedrich-Alexander-Universität Erlangen-Nürnberg)

Presentation materials

There are no materials yet.