Séminaire de probabilités et statistiques

Learning heteroscedastic models via SOCP under group sparsity

par Katia Meziani (Paris Dauphine)




2 boulevard Lavoisier, 49000 Angers
Sparse estimation methods based on $\ell_1$ relaxation, such as the Lasso and the Dantzig selector, are powerful tools for estimating high dimensional linear models. However, in order to properly tune these methods, the variance of the noise is often required.
This constitutes a major obstacle for practical applications of these methods in various frameworks (such as time series, random fields, and inverse problems) in which the noise is rarely homoscedastic and its level is hard to know in advance. In this talk, we propose a new approach to the joint estimation of the conditional mean and the conditional variance in a high-dimensional (auto-)regression setting.
An attractive feature of our proposed estimator is that it is computable by solving a second-order cone program (SOCP).
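As a rough illustration of how such pivotal (noise-level-free) estimators reduce to a cone program, the sketch below solves the square-root Lasso, a close relative of the approach described here, though not the estimator of the talk itself. The second-order cone constraint $\|y - Xb\|_2 \le t$ appears explicitly in the epigraph reformulation; for simplicity a generic nonlinear solver stands in for a dedicated SOCP solver, and all data and parameter values are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical toy data: sparse regression with a few active coefficients.
rng = np.random.default_rng(0)
n, p = 50, 8
X = rng.normal(size=(n, p))
beta_true = np.array([2.0, -1.5, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0])
y = X @ beta_true + 0.5 * rng.normal(size=n)

lam = 1.0  # illustrative tuning parameter, NOT taken from the talk

# Epigraph reformulation of the square-root Lasso:
#   minimize   t + lam * sum_j u_j
#   subject to ||y - X b||_2 <= t   (second-order cone constraint)
#              -u_j <= b_j <= u_j   (linearized |b_j| <= u_j)
# Decision vector z = (b, u, t).
def objective(z):
    return z[-1] + lam * np.sum(z[p:2 * p])

constraints = [
    {"type": "ineq", "fun": lambda z: z[-1] - np.linalg.norm(y - X @ z[:p])},
    {"type": "ineq", "fun": lambda z: z[p:2 * p] - z[:p]},  # u_j - b_j >= 0
    {"type": "ineq", "fun": lambda z: z[p:2 * p] + z[:p]},  # u_j + b_j >= 0
]

z0 = np.concatenate([np.zeros(p), np.ones(p), [np.linalg.norm(y)]])
res = minimize(objective, z0, constraints=constraints, method="SLSQP")
beta_hat = res.x[:p]
print(np.round(beta_hat, 2))
```

Because the objective and the cone constraint scale jointly with the residual norm, the resulting estimator does not require a preliminary estimate of the noise level, which is the feature that motivates the SOCP formulation discussed in the talk.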
We present numerical results assessing the performance of the proposed procedure on both simulated and real data. We also establish non-asymptotic risk bounds that are nearly as strong as those for the original $\ell_1$-penalized estimators.