Promoting sparse connections in neural networks is a natural objective to control their complexity and allow flexible tradeoffs between performance and "frugal" resource consumption. In inverse problems and variable selection, sparsity has indeed led to well-mastered algorithms with provably good performance and bounded complexity. To what extent can we leverage this body of knowledge in a deep context? Through an overview of recent explorations around the theme of deep sparsity, I will compare and contrast classical sparse regularization for inverse problems with multilayer sparse regularization. Along the way, I will notably highlight the role of rescaling invariances in deep ReLU parameterizations.
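The rescaling invariance mentioned above can be illustrated concretely. Because the ReLU is positively homogeneous (relu(c·x) = c·relu(x) for c > 0), multiplying one layer's weights by c and dividing the next layer's weights by c leaves the network's function unchanged. The following sketch (a toy two-layer network; the sizes and random weights are illustrative, not from the talk) demonstrates this:

```python
import numpy as np

def relu(x):
    # ReLU is positively homogeneous: relu(c * x) == c * relu(x) for c > 0
    return np.maximum(x, 0.0)

def net(W1, W2, x):
    # A minimal two-layer ReLU network (no biases, for simplicity)
    return W2 @ relu(W1 @ x)

rng = np.random.default_rng(0)
W1 = rng.standard_normal((4, 3))  # first-layer weights
W2 = rng.standard_normal((2, 4))  # second-layer weights
x = rng.standard_normal(3)        # an arbitrary input

c = 3.0
# Rescale: scale the first layer by c and the second by 1/c.
# The realized function is unchanged, even though the parameters
# (and, e.g., their norms) differ.
out_original = net(W1, W2, x)
out_rescaled = net(c * W1, W2 / c, x)
print(np.allclose(out_original, out_rescaled))
```

This invariance means many distinct parameter vectors realize the same function, which complicates naive per-weight regularization of deep ReLU networks.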