Speaker
Description
In this talk we address the minimization of a continuously differentiable convex function subject to linear equality constraints. We attach to this problem a second-order dynamical system with asymptotically vanishing damping, formulated in terms of the augmented Lagrangian associated with the minimization problem. A time discretization of this system yields an inertial algorithm with a general rule for the inertial parameters that covers the classical choices of Nesterov, Chambolle–Dossal and Attouch–Cabot used in the context of fast gradient methods. In both the continuous and the discrete setting we prove fast convergence of the primal-dual gap, the feasibility measure and the objective function value along the generated trajectory/iterates, as well as weak convergence of the primal-dual trajectory/iterates to a primal-dual optimal solution.
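For reference, the standard objects involved can be written down as follows; this is only a sketch of the classical definitions, and the precise damping coefficients and Lagrangian scaling used in the talk may differ:

```latex
% Problem: minimize f(x) subject to Ax = b, with f convex and C^1.
% Augmented Lagrangian with penalty parameter beta >= 0:
\mathcal{L}_\beta(x,\lambda) \;=\; f(x) \;+\; \langle \lambda,\, Ax - b \rangle
  \;+\; \frac{\beta}{2}\,\| Ax - b \|^2 .

% Prototype of a second-order dynamics with asymptotically vanishing
% damping (shown here for the unconstrained case, alpha > 0):
\ddot{x}(t) \;+\; \frac{\alpha}{t}\,\dot{x}(t) \;+\; \nabla f\bigl(x(t)\bigr) \;=\; 0 .
```

In the constrained setting of the talk, the gradient of $f$ is replaced by partial gradients of $\mathcal{L}_\beta$ with respect to both primal and dual variables, yielding a coupled primal-dual system.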
For the unconstrained minimization of a convex differentiable function we recover all convergence statements obtained in the literature for Nesterov’s accelerated gradient method. In addition, an alternative approach relying on fast methods for solving monotone equations will be presented.
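To fix ideas, a generic inertial (accelerated) gradient step has the form below; the parameter rules named above correspond to particular choices of the extrapolation coefficient. This is a sketch with step size $s$ not taken from the talk itself:

```latex
% Inertial gradient step with extrapolation coefficient (t_k - 1)/t_{k+1}:
y_k     \;=\; x_k \;+\; \frac{t_k - 1}{t_{k+1}}\,(x_k - x_{k-1}),
\qquad
x_{k+1} \;=\; y_k \;-\; s\,\nabla f(y_k).

% Classical parameter choices covered by the general rule:
%   Nesterov:           t_1 = 1,\; t_{k+1} = \tfrac{1}{2}\bigl(1 + \sqrt{1 + 4 t_k^2}\bigr)
%   Chambolle--Dossal:  t_k = \tfrac{k + a - 1}{a},\; a > 2
```

The Attouch–Cabot setting allows still more general inertial sequences subject to suitable growth conditions, which is what the general rule in the talk is designed to cover.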