PCA-Kernel Estimation

Abstract: Many statistical estimation techniques for high-dimensional or functional data are based on a preliminary dimension reduction step, which consists in projecting the sample $\mathbf{X}_1, \ldots, \mathbf{X}_n$ onto the first $D$ eigenvectors of the Principal Component Analysis (PCA) associated with the empirical projector $\hat \Pi_D$. Classical nonparametric inference methods such as kernel density estimation or kernel regression analysis are then performed in the (usually small) $D$-dimensional space. However, the mathematical analysis of this data-driven dimension reduction scheme raises technical problems, due to the fact that the random variables of the projected sample $(\hat \Pi_D \mathbf{X}_1, \ldots, \hat \Pi_D \mathbf{X}_n)$ are no longer independent. As a reference for further studies, we offer in this paper several results showing the asymptotic equivalence between important kernel-related quantities based on the empirical projector and their theoretical counterparts. As an illustration, we provide an in-depth analysis of the nonparametric kernel regression case.
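The two-step scheme described in the abstract can be sketched in a few lines of NumPy. This is a hypothetical illustration, not the authors' code: it builds the empirical projector $\hat \Pi_D$ from the top $D$ eigenvectors of the empirical covariance, then runs a Nadaraya-Watson kernel regression (a standard choice of kernel method; the Gaussian kernel and bandwidth `h` are assumptions of this sketch) on the projected sample.

```python
import numpy as np

def empirical_projector(X, D):
    """Projector onto the span of the first D eigenvectors of the
    empirical covariance of the sample X (n x p)."""
    Xc = X - X.mean(axis=0)
    cov = Xc.T @ Xc / X.shape[0]
    # eigh returns eigenvalues in ascending order; take the top-D eigenvectors
    _, vecs = np.linalg.eigh(cov)
    V = vecs[:, ::-1][:, :D]
    return V @ V.T                      # p x p rank-D orthogonal projector

def nw_regression(Xp, y, xp, h):
    """Nadaraya-Watson estimate at point xp with a Gaussian kernel."""
    d2 = ((Xp - xp) ** 2).sum(axis=1)   # squared distances to the sample
    w = np.exp(-d2 / (2 * h ** 2))      # Gaussian kernel weights
    return (w @ y) / w.sum()

# Toy data: the regression function depends on one direction only
rng = np.random.default_rng(0)
n, p, D = 200, 10, 2
X = rng.normal(size=(n, p))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=n)

Pi = empirical_projector(X, D)          # empirical projector, estimated from X
Xp = X @ Pi                             # projected sample (rank D)
est = nw_regression(Xp, y, X[0] @ Pi, h=1.0)
```

Note that `Pi` is estimated from the same sample it projects, which is exactly why the projected observations `Xp` are no longer independent; the paper's results quantify how close kernel quantities built from `Pi` are to those built from the theoretical projector.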
Document type:
Journal article
Statistics and Risk Modeling, De Gruyter, 2012, 29, pp. 19-46

Cited literature: [28 references]

Contributor: André Mas <>
Submitted on: Thursday, March 25, 2010 - 15:40:32
Last modified on: Saturday, June 23, 2018 - 01:21:21
Long-term archiving on: Sunday, June 27, 2010 - 20:01:22


Files produced by the author(s)


  • HAL Id : hal-00467013, version 1
  • ARXIV : 1003.5089


Gérard Biau, André Mas. PCA-Kernel Estimation. Statistics and Risk Modeling, De Gruyter, 2012, 29, pp. 19-46. 〈hal-00467013〉


