Speaker
Description
We propose a distributionally robust formulation for the simultaneous estimation of the covariance and precision matrix of a random vector. The proposed model minimizes the worst-case weighted sum of Stein's loss of the precision matrix estimator and the Frobenius loss of the covariance estimator over all distributions in an ambiguity set centered at the empirical distribution. The radius of the ambiguity set is measured via convex spectral divergences. We show that the proposed distributionally robust estimation model reduces to convex optimization problems and thus gives rise to computationally tractable estimators. As an outcome of distributional robustness, the estimators are shown to be nonlinear shrinkage estimators: their eigenvalues are shrunk nonlinearly towards a common scalar, determined by the weight coefficient of the two loss terms. We show that this shrinkage effect improves the condition number of the estimator. We provide explicit expressions for the shrinkage estimators induced by the Kullback-Leibler, Wasserstein, symmetrized Stein, and Frobenius divergences. We also provide an asymptotically optimal parameter-tuning scheme that adjusts the shrinkage target and intensity. Numerical experiments on synthetic and real data show that our shrinkage estimators perform competitively against state-of-the-art estimators in practical applications.
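The shrinkage mechanism described above can be illustrated with a minimal sketch. The snippet below shrinks the eigenvalues of a sample covariance matrix toward a scalar target and checks that the condition number improves; the target `mu` and intensity `rho` are hypothetical placeholders, and the simple linear rule used here only conveys the idea, not the paper's actual nonlinear estimators.

```python
import numpy as np

rng = np.random.default_rng(0)

# Ill-conditioned regime: few observations relative to the dimension.
p, n = 30, 40
X = rng.standard_normal((n, p))
S = np.cov(X, rowvar=False)  # sample covariance

# Eigendecomposition of the (symmetric) sample covariance.
eigvals, eigvecs = np.linalg.eigh(S)

# Illustrative shrinkage toward a scalar target: pull each eigenvalue
# toward mu with intensity rho in (0, 1). Both values are assumptions
# for this sketch; the paper derives them from the ambiguity set.
mu = eigvals.mean()   # hypothetical shrinkage target
rho = 0.5             # hypothetical shrinkage intensity
shrunk = (1 - rho) * eigvals + rho * mu

# Reassemble the shrunk covariance estimator.
Sigma_hat = eigvecs @ np.diag(shrunk) @ eigvecs.T

cond_before = eigvals.max() / eigvals.min()
cond_after = shrunk.max() / shrunk.min()
```

Because `mu` lies between the extreme eigenvalues, the shrunk spectrum always has `cond_after < cond_before`, mirroring the condition-number improvement claimed for the robust estimators.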