28 July 2025 to 1 August 2025
Timezone: Europe/Paris

Contrasting and combining Wasserstein and Fisher-Rao flows for relative entropy minimization

31 Jul 2025, 11:45
30m
F202

Invited talk · Machine learning (ML)

Speaker

Jia-Jie Zhu (Weierstrass Institute, Berlin)

Description

Many problems in machine learning can be framed as variational problems that minimize the relative entropy between two probability measures. Many recent works have exploited the connection between the (Otto-)Wasserstein gradient flow of the Kullback–Leibler (KL) divergence and various sampling, Bayesian inference, and generative modeling algorithms. In this talk, I will first contrast the Wasserstein flow with the Fisher-Rao flows of those divergences, and showcase their distinct analysis properties when working with different relative entropy driving energies, including the reverse and forward KL divergences. Building upon recent advances in the mathematical foundation of the Hellinger-Kantorovich (HK, a.k.a. Wasserstein-Fisher-Rao) gradient flows, I will then present the analysis of the HK flows and their implications for computational algorithms for machine learning and optimization.
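For context (a sketch, not part of the abstract): for the reverse-KL energy E(ρ) = KL(ρ‖π), with ρ the evolving measure and π the target, the flows contrasted in the talk are commonly written in the literature as follows.

```latex
% Gradient flows of E(\rho) = \mathrm{KL}(\rho \,\|\, \pi),
% whose first variation is \log(\rho/\pi) + 1.

% Otto-Wasserstein gradient flow (transport mechanism; a Fokker-Planck equation):
\partial_t \rho = \nabla \cdot \Big( \rho \, \nabla \log \tfrac{\rho}{\pi} \Big)

% Fisher-Rao gradient flow (reaction mechanism; mass reweighting,
% with the mean subtracted to preserve total mass):
\partial_t \rho = -\rho \Big( \log \tfrac{\rho}{\pi}
    - \mathbb{E}_{\rho}\big[ \log \tfrac{\rho}{\pi} \big] \Big)

% Hellinger-Kantorovich (Wasserstein-Fisher-Rao) gradient flow:
% the sum of the transport and reaction mechanisms.
\partial_t \rho = \nabla \cdot \Big( \rho \, \nabla \log \tfrac{\rho}{\pi} \Big)
    - \rho \Big( \log \tfrac{\rho}{\pi}
    - \mathbb{E}_{\rho}\big[ \log \tfrac{\rho}{\pi} \big] \Big)
```

The Wasserstein flow moves mass along gradients of the potential, while the Fisher-Rao flow creates and destroys mass locally; the HK flow combines both mechanisms.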

Author

Jia-Jie Zhu (Weierstrass Institute, Berlin)

Presentation materials

No documents.