Séminaire de Statistique et Optimisation
# Hermite regression estimation in noisy convolution model


Salle K. Johnson, 1st floor (1R3)

Description

In this work, we consider the following regression model: $y(x_k)=f\star g(x_k)+\varepsilon_k$, $x_k=kT/n$, $k=-n, \dots, n-1$, with $T$ fixed; $g$ is known and $f$ is the unknown function to be estimated. The errors $(\varepsilon_k)_{-n\le k\le n-1}$ are i.i.d., centered, with finite known variance. We propose two estimation procedures that exploit the properties of the Hermite basis. The first is a *deconvolution-projection* method, based on the decomposition of $h=f\star g$ in the Hermite basis and an inverse Fourier transform step to recover $f$. The second is a *projection-projection* approach: it consists in decomposing both functions $f$ and $h$ in the Hermite basis. For each method, a risk bound is proved, and when $f$ belongs to Sobolev regularity spaces we derive rates of convergence. Adaptive procedures, inspired by the Goldenshluger and Lepski (2011) method, are also proposed to select the relevant parameters, and we prove that the resulting estimators satisfy oracle inequalities for sub-Gaussian errors. Finally, numerical studies are performed to illustrate the theoretical results.
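As a rough illustration of the projection step, the sketch below evaluates the normalized Hermite functions by their standard three-term recurrence and approximates the Hermite coefficients of $h=f\star g$ by Riemann sums of the noisy observations over the grid $x_k=kT/n$. This is a minimal sketch under our own assumptions, not the authors' procedure: the function names, the toy choice $h=\varphi_0$, and all tuning values are ours.

```python
import math
import random

def hermite_functions(m, x):
    """First m normalized Hermite functions at x, via the stable recurrence
       phi_0(x) = pi^{-1/4} exp(-x^2/2),
       phi_{j+1}(x) = x*sqrt(2/(j+1))*phi_j(x) - sqrt(j/(j+1))*phi_{j-1}(x)."""
    vals = []
    phi_prev, phi = 0.0, math.pi ** (-0.25) * math.exp(-x * x / 2.0)
    for j in range(m):
        vals.append(phi)
        phi_prev, phi = phi, (x * math.sqrt(2.0 / (j + 1)) * phi
                              - math.sqrt(j / (j + 1)) * phi_prev)
    return vals

def projection_estimate(y, grid, T, n, m):
    """Hermite projection estimator of h = f*g from noisy samples y(x_k):
       hat_a_j = (T/n) * sum_k y_k * phi_j(x_k)  (Riemann sum for <h, phi_j>),
       hat_h   = sum_{j<m} hat_a_j * phi_j."""
    coeffs = [0.0] * m
    for yk, xk in zip(y, grid):
        phis = hermite_functions(m, xk)
        for j in range(m):
            coeffs[j] += (T / n) * yk * phis[j]
    def hat_h(x):
        return sum(a * p for a, p in zip(coeffs, hermite_functions(m, x)))
    return coeffs, hat_h

# Toy experiment (our choice): take h = phi_0, whose true coefficient
# vector is (1, 0, 0, ...), and add small i.i.d. Gaussian noise.
random.seed(0)
T, n, m = 10.0, 400, 5
grid = [k * T / n for k in range(-n, n)]
h = lambda x: math.pi ** (-0.25) * math.exp(-x * x / 2.0)
y = [h(xk) + random.gauss(0.0, 0.01) for xk in grid]
coeffs, hat_h = projection_estimate(y, grid, T, n, m)
```

In practice the dimension $m$ is the parameter the adaptive (Goldenshluger–Lepski type) procedure must select; here it is simply fixed.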