28 July 2025 to 1 August 2025
Europe/Paris timezone

An Improved Analysis of the Clipped Stochastic subGradient Method under Heavy-Tailed Noise

30 Jul 2025, 11:45
30 min
F102

Invited talk, Stochastic Programming

Speaker

Saverio Salzo (Sapienza Università di Roma)

Description

In this talk, we show novel optimal (or near-optimal) convergence rates for a clipped version of the projected stochastic subgradient method. We consider nonsmooth convex problems in Hilbert spaces over possibly unbounded domains, under heavy-tailed noise that possesses only the first $p$ moments for $p \in \left]1,2\right]$. For the last iterate, we establish convergence in expectation with rates of order $(\log^{1/p} k)/k^{(p-1)/p}$ and $1/k^{(p-1)/p}$ in the infinite- and finite-horizon settings, respectively. We also derive new convergence rates, in expectation and with high probability, for the average iterate, improving on the state of the art. These results are applied to the problem of supervised learning with kernels, demonstrating the effectiveness of our theory. Finally, we present preliminary experiments.
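
For intuition, here is a minimal sketch of the clipped projected stochastic subgradient iteration described above: the noisy subgradient is rescaled to norm at most a threshold $\lambda_k$ before the projected step. This is not the authors' implementation; the function names, the test problem, and the step-size and clipping schedules below are illustrative assumptions, not the schedules analyzed in the talk.

import numpy as np

def clipped_projected_subgradient(x0, subgrad, project, gammas, lambdas):
    # x0: initial point; subgrad(x): noisy subgradient estimate (possibly heavy-tailed)
    # project(x): projection onto the feasible set C
    # gammas, lambdas: step sizes gamma_k and clipping thresholds lambda_k (illustrative)
    x = np.asarray(x0, dtype=float)
    for gamma, lam in zip(gammas, lambdas):
        g = subgrad(x)
        norm = np.linalg.norm(g)
        if norm > lam:
            g = g * (lam / norm)    # clip: rescale g to norm lambda_k
        x = project(x - gamma * g)  # projected subgradient step
    return x

# Illustrative use: minimize f(x) = ||x||_1 over the unit Euclidean ball, with
# additive Student-t noise (df = 1.5), whose p-th moments are finite only for p < 1.5.
rng = np.random.default_rng(0)
d, K = 5, 2000
subgrad = lambda x: np.sign(x) + rng.standard_t(1.5, size=d)
project = lambda x: x / max(1.0, np.linalg.norm(x))
gammas = [0.5 / (k + 1) ** 0.5 for k in range(K)]
lambdas = [10.0 * (k + 1) ** 0.25 for k in range(K)]
x_last = clipped_projected_subgradient(np.ones(d), subgrad, project, gammas, lambdas)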

Authors

Prof. Andrea Paudice (Aarhus University)
Daniela A. Parletta (Università degli Studi di Genova)
Saverio Salzo (Sapienza Università di Roma)

Presentation materials

No documents.