Prof.
Arnak Dalalyan
(ENSAE / CREST)
21/06/2018 09:30
We study the maximum likelihood estimator of the density of n independent observations, under the assumption that it is well approximated by a mixture with a large number of components. The main focus is on statistical properties with respect to the Kullback-Leibler loss. We establish risk bounds taking the form of sharp oracle inequalities, both in deviation and in expectation. A simple...
Dr.
Kaniav Kamary
(Université Paris-Dauphine / CEREMADE / INRIA, Saclay)
21/06/2018 10:00
While mixtures of Gaussian distributions have been studied for more than a century, the construction of a reference Bayesian analysis of these models remains unsolved, with a general prohibition of improper priors due to the ill-posed nature of such statistical objects. This difficulty is usually bypassed by an empirical Bayes resolution. By creating a new parameterisation centred on the...
Prof.
Sébastien Gadat
(TSE)
21/06/2018 11:30
In this theoretical work, we study the estimation problem in a contamination-by-translation model. We observe an i.i.d. sample with density on $\mathbb{R}^d$
$$f^\star = (1-\lambda^\star) \phi + \lambda^\star \phi(\cdot-\mu^\star)$$
and wish to study a method for estimating the contamination probability $\lambda^\star$ and its effect $\mu^\star$.
We propose...
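For intuition only (not the speakers' estimation procedure), the contamination model above can be simulated directly: each observation comes from the standard Gaussian $\phi$ with probability $1-\lambda^\star$, and from the translated copy $\phi(\cdot-\mu^\star)$ with probability $\lambda^\star$. A minimal sketch, with illustrative parameter values:

```python
import numpy as np

def sample_contaminated(n, lam, mu, seed=None):
    """Draw n i.i.d. points from (1 - lam) * phi + lam * phi(. - mu),
    where phi is the standard Gaussian density on R^d (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    mu = np.atleast_1d(np.asarray(mu, dtype=float))
    d = mu.shape[0]
    # Latent Bernoulli(lam) indicator: True if the point is contaminated
    contaminated = rng.random(n) < lam
    x = rng.standard_normal((n, d))
    x[contaminated] += mu  # shift the contaminated points by mu
    return x, contaminated

# lam = 0.2 and mu = (3, 0) are arbitrary illustrative choices
x, z = sample_contaminated(10_000, lam=0.2, mu=[3.0, 0.0], seed=0)
# the empirical fraction of contaminated points concentrates around lam
```

In the actual statistical problem the indicator `contaminated` is of course unobserved; only `x` is available to the estimator.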
Prof.
Béatrice Laurent
(IMT/INSA)
21/06/2018 12:00
We consider a d-dimensional i.i.d. sample from a distribution with unknown density f. The problem of detecting a two-component mixture is considered. Our aim is to decide whether f is the density of a standard Gaussian random d-vector ($f=\phi_d$), against the alternative that f is a two-component mixture $f=(1-\varepsilon)\phi_{d}+\varepsilon \phi_{d}(\cdot-\mu)$, where $(\varepsilon,\mu)$ are unknown parameters....
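As a point of comparison only (this is a naive baseline, not the speaker's test), note that under the alternative the mixture shifts the mean by $\varepsilon\mu$, so a crude detector can reject $H_0: f=\phi_d$ when the sample mean is too far from the origin: under $H_0$, $n\|\bar X\|^2$ follows a $\chi^2_d$ distribution. A minimal sketch, with illustrative values $\varepsilon=0.3$ and $\mu=e_1$:

```python
import numpy as np
from scipy.stats import chi2

def naive_mean_shift_test(x, alpha=0.05):
    """Naive detector: under H0 the x_i are i.i.d. N(0, I_d), so the
    statistic n * ||xbar||^2 is chi2(d); reject H0 above its
    (1 - alpha) quantile. Illustrative baseline only."""
    n, d = x.shape
    stat = n * np.sum(x.mean(axis=0) ** 2)
    return stat > chi2.ppf(1 - alpha, df=d)

rng = np.random.default_rng(1)
mix = rng.standard_normal((500, 5))
shift = rng.random(500) < 0.3                 # epsilon = 0.3
mix[shift] += np.array([1.0, 0.0, 0.0, 0.0, 0.0])  # mu = e_1
rejected = naive_mean_shift_test(mix)
```

Such a test loses power when $\varepsilon\mu$ is small relative to $d$ and $n$, which is precisely the regime where the minimax separation rates studied in the talk become relevant.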