28 July 2025 to 1 August 2025
Europe/Paris timezone

Communication-efficient distributed optimization algorithms

28 July 2025, 14:00
45m
Navier

Invited talk (Mini-symposium: Communication-Efficient methods for distributed optimization and federated learning)

Speaker

Dr Laurent Condat (KAUST)

Description

In distributed optimization and machine learning, a large number of machines perform computations in parallel and communicate back and forth with a server. In particular, in federated learning, the distributed training process runs on personal devices such as mobile phones. In this context, communication, which can be slow, costly, and unreliable, is the main bottleneck. Two strategies are popular for reducing communication: 1) local training, which consists in communicating less frequently; 2) compression of the exchanged messages. In addition, a robust algorithm should allow for partial participation. I will present several randomized algorithms we developed recently, with proven convergence guarantees and accelerated complexity. Our most recent paper, “LoCoDL: Communication-Efficient Distributed Learning with Local Training and Compression”, was presented at ICLR 2025 as a Spotlight.
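To make the three ingredients mentioned in the abstract concrete (local training, compression, partial participation), here is a minimal Python sketch of a generic FedAvg-style loop with rand-k sparsification on a toy least-squares problem. It is an illustration under assumed settings, not the LoCoDL algorithm or any specific method from the talk; all names, step sizes, and the rand-k compressor choice are for demonstration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy setup: n clients, each holding a local least-squares objective
# f_i(x) = 0.5/m * ||A_i x - b_i||^2; the global objective is the average over clients.
n_clients, dim, samples = 5, 20, 30
A = [rng.standard_normal((samples, dim)) for _ in range(n_clients)]
b = [rng.standard_normal(samples) for _ in range(n_clients)]

def local_grad(i, x):
    return A[i].T @ (A[i] @ x - b[i]) / samples

def rand_k(v, k):
    """Unbiased rand-k sparsifier: keep k random coordinates, rescale by dim/k."""
    idx = rng.choice(v.size, size=k, replace=False)
    out = np.zeros_like(v)
    out[idx] = v[idx] * (v.size / k)
    return out

x = np.zeros(dim)            # server model
lr, local_steps, k = 1e-2, 5, 4
participation = 0.6          # each client joins a round with this probability

for rnd in range(200):
    # Partial participation: only a random subset of clients is active.
    active = [i for i in range(n_clients) if rng.random() < participation]
    if not active:
        continue
    deltas = []
    for i in active:
        y = x.copy()
        for _ in range(local_steps):       # local training: several steps between communications
            y -= lr * local_grad(i, y)
        deltas.append(rand_k(y - x, k))    # compression: send a sparsified model update
    x += np.mean(deltas, axis=0)           # server averages the compressed updates

print("final average loss:",
      np.mean([0.5 * np.linalg.norm(A[i] @ x - b[i])**2 / samples
               for i in range(n_clients)]))
```

In this sketch, each round transmits only k of the dim coordinates per active client, and the dim/k rescaling keeps the compressed update unbiased; the randomized algorithms presented in the talk combine such mechanisms with stronger, proven convergence guarantees.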

Author

Dr Laurent Condat (KAUST)

Presentation materials

No documents.