28 July 2025 to 1 August 2025
Time zone: Europe/Paris

Asymptotic log-Sobolev constants and the Polyak-Łojasiewicz gradient domination condition

28 Jul 2025, 14:30
30m
F207

Invited talk · Machine learning (ML)

Speaker

Austin Stromme (ENSAE Paris)

Description

The Polyak-Łojasiewicz (PL) constant for a given function exactly characterizes the exponential rate of convergence of gradient flow uniformly over initializations, and has been of major recent interest in optimization and machine learning because it is strictly weaker than strong convexity yet implies many of the same results. In the world of sampling, the log-Sobolev inequality plays an analogous role, governing the convergence of Langevin dynamics from arbitrary initialization in Kullback-Leibler divergence. In this talk, we present a new connection between optimization and sampling by showing that the PL constant is exactly the low temperature limit of the re-scaled log-Sobolev constant, under mild assumptions. Based on joint work with Sinho Chewi.
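For readers less familiar with the two inequalities the abstract compares, the following is a schematic sketch in one standard set of conventions; the symbols $C_{\mathrm{PL}}$, $C_{\mathrm{LS}}$, and the temperature parameter $\varepsilon$ are illustrative notation, not taken from the talk, and normalization constants vary across the literature.

```latex
% PL inequality for a smooth function f with minimum value f^*:
\frac{1}{2}\,\|\nabla f(x)\|^{2} \;\ge\; C_{\mathrm{PL}}\,\bigl(f(x) - f^{\ast}\bigr)
\qquad \text{for all } x.

% Log-Sobolev inequality for the Gibbs measure
% \pi_\varepsilon \propto e^{-f/\varepsilon} with constant C_{\mathrm{LS}}(\pi_\varepsilon):
\operatorname{Ent}_{\pi_\varepsilon}\!\bigl(g^{2}\bigr)
\;\le\; \frac{2}{C_{\mathrm{LS}}(\pi_\varepsilon)} \int \|\nabla g\|^{2}\, d\pi_\varepsilon .

% The talk's result, stated schematically: under mild assumptions the
% re-scaled log-Sobolev constant recovers the PL constant at low temperature,
%   \lim_{\varepsilon \to 0} \varepsilon\, C_{\mathrm{LS}}(\pi_\varepsilon)
%     = C_{\mathrm{PL}}   (up to the chosen normalization conventions).
```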

Authors

Prof. Sinho Chewi (Yale), Austin Stromme (ENSAE Paris)

Presentation materials

No documents.