Simultaneous estimation of the mean and the variance in heteroscedastic Gaussian regression
Abstract
Let Y be a Gaussian vector of ℝn with mean s and diagonal covariance matrix Γ. Our aim is to estimate both s and the entries σi = Γi,i, for i = 1, …, n, on the basis of two independent copies of Y. Our approach is free of any prior assumption on s but requires that we know some upper bound γ on the ratio max_i σi / min_i σi. For example, the choice γ = 1 corresponds to the homoscedastic case, where the components of Y are assumed to have a common (unknown) variance. Conversely, the choice γ > 1 corresponds to the heteroscedastic case, where the variances of the components of Y are allowed to vary within some range. Our estimation strategy is based on model selection. We consider a family {Sm × Σm, m ∈ ℳ} of parameter sets, where Sm and Σm are linear spaces. To each m ∈ ℳ, we associate a pair of estimators (ŝm, σ̂m) of (s, σ) with values in Sm × Σm. We then design a model selection procedure to select some m̂ in ℳ in such a way that the Kullback risk of (ŝm̂, σ̂m̂) is as close as possible to the minimum of the Kullback risks over the family of estimators {(ŝm, σ̂m), m ∈ ℳ}. We then derive uniform rates of convergence for the estimator (ŝm̂, σ̂m̂) over Hölderian balls. Finally, we carry out a simulation study to illustrate the performance of our estimators in practice.
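As a concrete starting point, the two-copies setup already yields simple unbiased componentwise estimators of (s, σ): averaging the copies estimates the mean, and half the squared difference estimates each variance, since Var(Y1_i − Y2_i) = 2σi. The sketch below illustrates this baseline only; it is not the authors' model selection procedure, and the particular values of s and σ are illustrative assumptions.

```python
import numpy as np

def naive_estimates(y1, y2):
    """Baseline estimators from two independent copies of Y ~ N(s, diag(sigma)).

    s_hat_i     = (y1_i + y2_i) / 2     -- unbiased for s_i
    sigma_hat_i = (y1_i - y2_i)**2 / 2  -- unbiased for sigma_i,
                                           since Var(y1_i - y2_i) = 2 * sigma_i
    """
    s_hat = (y1 + y2) / 2.0
    sigma_hat = (y1 - y2) ** 2 / 2.0
    return s_hat, sigma_hat

# Monte Carlo check of unbiasedness (n = 5 components; values are illustrative).
rng = np.random.default_rng(0)
s = np.array([0.0, 1.0, -2.0, 0.5, 3.0])
sigma = np.array([1.0, 2.0, 0.5, 1.5, 1.0])  # max/min ratio is 4, so any gamma >= 4 works
reps = 200_000
y1 = rng.normal(s, np.sqrt(sigma), size=(reps, 5))
y2 = rng.normal(s, np.sqrt(sigma), size=(reps, 5))
s_hat, sigma_hat = naive_estimates(y1, y2)
print(np.round(s_hat.mean(axis=0), 2))      # close to s
print(np.round(sigma_hat.mean(axis=0), 2))  # close to sigma
```

Each σ̂i here has only one degree of freedom, which is why pooling components via model selection over the spaces Σm can substantially reduce the risk when the variance profile is structured.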
Origin: Publisher files allowed on an open archive