Séminaire de Statistique et Optimisation

Exact Continuous Relaxations of L0-Regularized Problems

by Emmanuel Soubies (CNRS IRIT)

Salle K. Johnson (1R3, 1st floor)

Description
Sparse models are widely used in fields such as statistics, computer vision, signal/image processing, and machine learning. The natural sparsity-promoting regularizer is the l0 pseudo-norm, which is discontinuous and non-convex. In this talk, we will present the l0-Bregman relaxation (B-Rex), a general framework for computing exact continuous relaxations of such l0-regularized criteria. Although still non-convex in general, these continuous relaxations are said to be exact in the sense that they leave the set of global minimizers unchanged while enjoying a better optimization landscape. In particular, we will show that some local minimizers of the initial functional are eliminated by these relaxations. Finally, these properties will be illustrated on both sparse Kullback-Leibler regression and sparse logistic regression problems.
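
For context, a generic sketch of the problem class at hand (the notation f, lambda, Phi below is illustrative; the precise B-Rex construction is the subject of the talk):

\[
  \min_{x \in \mathbb{R}^n} \; F_0(x) := f(x) + \lambda \|x\|_0,
  \qquad \|x\|_0 := \#\{\, i : x_i \neq 0 \,\},
\]

where f is a data-fidelity term (e.g., a least-squares, Kullback-Leibler, or logistic loss) and \lambda > 0 balances fidelity against sparsity. A continuous relaxation replaces \lambda \|x\|_0 by a continuous, in general still non-convex, penalty \Phi, giving

\[
  F_\Phi(x) := f(x) + \Phi(x),
\]

constructed so that F_\Phi has the same global minimizers as F_0 while possibly removing some of its local minimizers.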

 

This is joint work with M'hamed Essafri and Luca Calatroni.