Journal articles

PCA-Kernel Estimation

Abstract: Many statistical estimation techniques for high-dimensional or functional data are based on a preliminary dimension reduction step, which consists in projecting the sample $\mathbf{X}_1, \dots, \mathbf{X}_n$ onto the first $D$ eigenvectors of the Principal Component Analysis (PCA) associated with the empirical projector $\hat{\Pi}_D$. Classical nonparametric inference methods such as kernel density estimation or kernel regression analysis are then performed in the (usually small) $D$-dimensional space. However, the mathematical analysis of this data-driven dimension reduction scheme raises technical problems, due to the fact that the random variables of the projected sample $(\hat{\Pi}_D \mathbf{X}_1, \dots, \hat{\Pi}_D \mathbf{X}_n)$ are no longer independent. As a reference for further studies, we offer in this paper several results showing the asymptotic equivalences between important kernel-related quantities based on the empirical projector and those based on its theoretical counterpart. As an illustration, we provide an in-depth analysis of the nonparametric kernel regression case.
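
The two-step scheme described in the abstract (project the sample onto the first $D$ empirical principal directions, then run a kernel method in the projected space) can be sketched in a few lines. The following is a minimal, illustrative numpy sketch, not the authors' code: it assumes a Gaussian kernel, a fixed bandwidth h, a Nadaraya-Watson form for the regression estimate, and synthetic toy data; all function names and parameter values are choices made here for illustration.

import numpy as np

def pca_projector(X, D):
    # Eigenvectors of the empirical covariance matrix; the first D columns
    # span the range of the empirical projector \hat\Pi_D.
    Xc = X - X.mean(axis=0)
    cov = Xc.T @ Xc / X.shape[0]
    eigval, eigvec = np.linalg.eigh(cov)      # eigenvalues in ascending order
    return eigvec[:, ::-1][:, :D]             # first D principal directions

def nadaraya_watson(Z_train, y_train, Z_query, h):
    # Gaussian-kernel regression estimate evaluated in the projected space.
    d2 = ((Z_query[:, None, :] - Z_train[None, :, :]) ** 2).sum(axis=-1)
    W = np.exp(-d2 / (2.0 * h ** 2))
    return (W @ y_train) / W.sum(axis=1)

# Toy data (hypothetical): n observations in dimension p, scalar response y.
rng = np.random.default_rng(0)
n, p, D, h = 200, 50, 3, 0.5
X = rng.normal(size=(n, p))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=n)

V = pca_projector(X, D)              # data-driven: computed from the sample itself
Z = (X - X.mean(axis=0)) @ V         # coordinates of the projected sample \hat\Pi_D X_i
y_hat = nadaraya_watson(Z, y, Z, h)  # kernel regression in the D-dimensional space

In the paper's setting, the key point is that the projection is built from the data-dependent projector $\hat{\Pi}_D$, so the projected observations are no longer independent; the asymptotic equivalence results relate kernel quantities computed with $\hat{\Pi}_D$ to those computed with its theoretical counterpart.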

Cited literature [28 references]

https://hal.archives-ouvertes.fr/hal-00467013
Contributor: André Mas
Submitted on: Thursday, March 25, 2010 - 3:40:32 PM
Last modification on: Wednesday, October 14, 2020 - 3:58:01 AM
Long-term archiving on: Sunday, June 27, 2010 - 8:01:22 PM

Files

KPCAv3.pdf
Files produced by the author(s)

Citation

Gérard Biau, André Mas. PCA-Kernel Estimation. Statistics & Risk Modeling with Applications in Finance and Insurance, De Gruyter, 2012, 29 (1), pp.19-46. ⟨10.1524/strm.2012.1084⟩. ⟨hal-00467013⟩


Metrics

Record views: 628
File downloads: 238