A new approach to estimator selection
Abstract
In the framework of an abstract statistical model, we discuss how to use the solution of one estimation problem (Problem A) in order to construct an estimator in another, completely different, Problem B. By a solution of Problem A we mean a data-driven selection from a given family of estimators $\mathcal{A}(\mathcal{H}) = \{\hat{A}_h,\ h \in \mathcal{H}\}$, together with a so-called oracle inequality established for the selected estimator. If $\hat{h} \in \mathcal{H}$ is the selected parameter and $\mathcal{B}(\mathcal{H}) = \{\hat{B}_h,\ h \in \mathcal{H}\}$ is a collection of estimators built for Problem B, we suggest using the estimator $\hat{B}_{\hat{h}}$. We present a very general selection rule leading to the selector $\hat{h}$ and find conditions under which the estimator $\hat{B}_{\hat{h}}$ is reasonable. Our approach is illustrated by several examples related to adaptive estimation.
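As a minimal sketch of the plug-in idea, the toy example below selects a tuning parameter $\hat{h}$ in one problem and reuses it in another. The families, the bandwidth grid, and the use of leave-one-out cross-validation as the data-driven selection rule are all illustrative assumptions, not the paper's actual construction.

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(size=200)           # sample from N(0, 1)
H = [0.1, 0.2, 0.4, 0.8]              # hypothetical candidate set of bandwidths

def kde(h, x, pts):
    """Hypothetical family {A_h}: Gaussian kernel density estimators."""
    z = (pts[:, None] - x[None, :]) / h
    return np.exp(-0.5 * z**2).mean(axis=1) / (h * np.sqrt(2 * np.pi))

def cv_risk(h, x):
    """Stand-in for the selection rule of Problem A: an unbiased
    leave-one-out cross-validation estimate of the L2 risk."""
    n = len(x)
    fhat = kde(h, x, x)
    # leave-one-out values: remove each point's own kernel contribution
    loo = (fhat - 1.0 / (n * h * np.sqrt(2 * np.pi))) * n / (n - 1)
    grid = np.linspace(-4.0, 4.0, 400)
    f = kde(h, x, grid)
    integral = np.sum(f**2) * (grid[1] - grid[0])
    return integral - 2.0 * loo.mean()

# Problem A: data-driven selection of h from the family {A_h}
h_hat = min(H, key=lambda h: cv_risk(h, data))

# Problem B (different target, same index set H): here, hypothetically,
# estimating the density at the single point 0, using the selected h_hat
B_hat = kde(h_hat, data, np.array([0.0]))[0]
```

The point of the construction is that Problem B is never used for selection: the selector $\hat{h}$ comes entirely from Problem A, and the oracle inequality for Problem A is what must transfer to $\hat{B}_{\hat{h}}$.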