PCA-Kernel Estimation

Abstract: Many statistical estimation techniques for high-dimensional or functional data are based on a preliminary dimension reduction step, which consists in projecting the sample $\mathbf{X}_1, \ldots, \mathbf{X}_n$ onto the first $D$ eigenvectors of the Principal Component Analysis (PCA) associated with the empirical projector $\hat \Pi_D$. Classical nonparametric inference methods such as kernel density estimation or kernel regression analysis are then performed in the (usually small) $D$-dimensional space. However, the mathematical analysis of this data-driven dimension reduction scheme raises technical problems, due to the fact that the random variables of the projected sample $(\hat \Pi_D \mathbf{X}_1, \ldots, \hat \Pi_D \mathbf{X}_n)$ are no longer independent. As a reference for further studies, we offer in this paper several results showing the asymptotic equivalence between important kernel-related quantities based on the empirical projector and those based on its theoretical counterpart. As an illustration, we provide an in-depth analysis of the nonparametric kernel regression case.
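
To give a concrete picture of the scheme described in the abstract, the following Python sketch builds the empirical projector $\hat \Pi_D$ from the sample covariance and then runs a kernel regression on the projected sample. This is only an illustrative toy under our own assumptions (the Gaussian Nadaraya-Watson form, the bandwidth h, the function names, and the synthetic data are ours), not the exact estimator analyzed in the paper.

```python
import numpy as np

def empirical_projector(X, D):
    """Projector onto the first D eigenvectors of the empirical
    covariance of the (centered) sample X of shape (n, p)."""
    Xc = X - X.mean(axis=0)
    cov = Xc.T @ Xc / X.shape[0]
    # eigh returns eigenvalues in ascending order; keep the D largest
    _, vecs = np.linalg.eigh(cov)
    V = vecs[:, -D:]          # p x D orthonormal basis of the PCA subspace
    return V @ V.T            # hat Pi_D, a p x p orthogonal projector

def nadaraya_watson(X_proj, y, x_proj, h):
    """Gaussian-kernel Nadaraya-Watson estimate at x_proj, computed on
    the projected sample (hat Pi_D X_1, ..., hat Pi_D X_n)."""
    dists = np.linalg.norm(X_proj - x_proj, axis=1)
    w = np.exp(-0.5 * (dists / h) ** 2)
    return np.sum(w * y) / np.sum(w) if w.sum() > 0 else np.mean(y)

# Toy usage on synthetic data; all parameter choices are illustrative.
rng = np.random.default_rng(0)
n, p, D, h = 200, 50, 3, 0.5
X = rng.normal(size=(n, p))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=n)
Pi_D = empirical_projector(X, D)
X_proj = X @ Pi_D             # projected (no longer independent) sample
estimate = nadaraya_watson(X_proj, y, X[0] @ Pi_D, h)
```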

https://hal.archives-ouvertes.fr/hal-00467013
Contributor : André Mas
Submitted on : Thursday, March 25, 2010 - 3:40:32 PM
Last modification on : Tuesday, May 28, 2019 - 1:54:03 PM
Long-term archiving on : Sunday, June 27, 2010 - 8:01:22 PM

Files

KPCAv3.pdf
Files produced by the author(s)

Identifiers

  • HAL Id : hal-00467013, version 1
  • ARXIV : 1003.5089

Citation

Gérard Biau, André Mas. PCA-Kernel Estimation. Statistics & Risk Modeling with Applications in Finance and Insurance, De Gruyter, 2012, 29, pp. 19-46. ⟨hal-00467013⟩
