# PCA-Kernel Estimation

CLASSIC - Computational Learning, Aggregation, Supervised Statistical Inference, and Classification
DMA - Département de Mathématiques et Applications, École normale supérieure - Paris; Inria Paris-Rocquencourt
Abstract: Many statistical estimation techniques for high-dimensional or functional data are based on a preliminary dimension reduction step, which consists in projecting the sample $\mathbf{X}_1, \hdots, \mathbf{X}_n$ onto the first $D$ eigenvectors of the Principal Component Analysis (PCA) associated with the empirical projector $\hat \Pi_D$. Classical nonparametric inference methods such as kernel density estimation or kernel regression analysis are then performed in the (usually small) $D$-dimensional space. However, the mathematical analysis of this data-driven dimension reduction scheme raises technical problems, due to the fact that the random variables of the projected sample $(\hat \Pi_D \mathbf{X}_1, \hdots, \hat \Pi_D \mathbf{X}_n)$ are no longer independent. As a reference for further studies, we offer in this paper several results showing the asymptotic equivalences between important kernel-related quantities based on the empirical projector and its theoretical counterpart. As an illustration, we provide an in-depth analysis of the nonparametric kernel regression case.
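The scheme described in the abstract has two steps: project each observation onto the span of the first $D$ empirical PCA eigenvectors, then run a standard kernel method on the projected sample. A minimal sketch of this pipeline, using a Gaussian-kernel Nadaraya-Watson regressor (this estimator choice and all variable names are illustrative assumptions, not the paper's exact construction):

```python
import numpy as np

def pca_projector(X, D):
    # Top-D eigenvectors of the empirical covariance: the columns of V
    # span the range of the empirical projector \hat Pi_D.
    Xc = X - X.mean(axis=0)
    cov = Xc.T @ Xc / len(X)
    eigvals, eigvecs = np.linalg.eigh(cov)          # ascending eigenvalues
    V = eigvecs[:, np.argsort(eigvals)[::-1][:D]]   # keep the D largest
    return V

def nw_regress(Z, Y, z0, h):
    # Nadaraya-Watson kernel regression at z0, Gaussian kernel, bandwidth h.
    w = np.exp(-np.sum((Z - z0) ** 2, axis=1) / (2 * h ** 2))
    return np.sum(w * Y) / np.sum(w)

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 50))            # high-dimensional sample
Y = X[:, 0] + 0.1 * rng.normal(size=200)  # toy response
V = pca_projector(X, D=2)
Z = (X - X.mean(axis=0)) @ V              # projected sample \hat Pi_D X_i
yhat = nw_regress(Z, Y, Z[0], h=0.5)      # prediction at the first point
```

Note that because $\hat \Pi_D$ is estimated from the same sample, the projected observations $Z_i$ are dependent, which is precisely the technical difficulty the paper addresses.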
Document type: Journal article
Cited literature [28 references]

https://hal.archives-ouvertes.fr/hal-00467013
Contributor: André Mas
Submitted on: Thursday, March 25, 2010 - 3:40:32 PM
Last modification on: Wednesday, October 14, 2020 - 3:58:01 AM
Long-term archiving on: Sunday, June 27, 2010 - 8:01:22 PM

### Files

KPCAv3.pdf
Files produced by the author(s)

### Citation

Gérard Biau, André Mas. PCA-Kernel Estimation. Statistics & Risk Modeling with Applications in Finance and Insurance, De Gruyter, 2012, 29 (1), pp.19-46. ⟨10.1524/strm.2012.1084⟩. ⟨hal-00467013⟩
