Journal articles

Consistency of functional learning methods based on derivatives

Abstract : In some real-world applications, such as spectrometry, functional models achieve better predictive performance if they operate on the order-m derivatives of their inputs rather than on the original functions. Consequently, the use of derivatives is common practice in Functional Data Analysis, despite a lack of theoretical guarantees on the asymptotically achievable performance of a derivative-based model. In this paper, we show that a smoothing spline approach can be used to preprocess multivariate observations, obtained by sampling functions on a discrete and finite sampling grid, in a way that leads to a consistent scheme on the original infinite-dimensional functional problem. This work extends (Mas and Pumo, 2009) to nonparametric approaches and incomplete knowledge. More precisely, the paper tackles two difficulties in a nonparametric framework: the information loss due to using the derivatives instead of the original functions, and the information loss due to the functions being observed through discrete sampling and thus being imperfectly known. The smoothing spline based approach solves both problems. Finally, the proposed approach is tested on two real-world datasets and is experimentally shown to be a good solution in the case of noisy functional predictors.
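As a rough illustration of the preprocessing step described in the abstract (not the paper's exact procedure), the sketch below fits a smoothing spline to noisy samples of a function observed on a discrete, finite grid and then differentiates the spline rather than the raw samples. The grid size, noise level, smoothing factor `s`, and test function are all illustrative assumptions; it uses `scipy.interpolate.UnivariateSpline`.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

# Hypothetical setup: a function observed with noise on a discrete,
# finite sampling grid (all values below are illustrative choices).
rng = np.random.default_rng(0)
grid = np.linspace(0.0, 1.0, 100)                 # sampling grid
noisy = np.sin(2 * np.pi * grid) + 0.05 * rng.standard_normal(grid.size)

# Fit a smoothing spline to the noisy samples; `s` controls the
# trade-off between fidelity and smoothness (here ~ n * sigma^2).
spline = UnivariateSpline(grid, noisy, k=5, s=grid.size * 0.05**2)

# Differentiate the spline, not the raw samples: this estimated
# derivative is what a derivative-based functional model would use.
deriv = spline.derivative(n=1)(grid)

# For this synthetic example the true derivative is known, so we can
# check how close the spline-based estimate is.
true_deriv = 2 * np.pi * np.cos(2 * np.pi * grid)
```

Differentiating the fitted spline instead of taking finite differences of the noisy samples is what keeps the derivative estimate stable: naive differencing amplifies the observation noise, while the spline's smoothness penalty suppresses it.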
Complete list of metadata
Contributor : Nathalie Vialaneix
Submitted on : Sunday, May 1, 2011 - 3:16:41 PM
Last modification on : Friday, July 31, 2020 - 10:44:07 AM
Long-term archiving on : Tuesday, August 2, 2011 - 2:28:00 AM

Fabrice Rossi, Nathalie Villa-Vialaneix. Consistency of functional learning methods based on derivatives. Pattern Recognition Letters, Elsevier, 2011, 32 (8), pp.1197-1209. ⟨10.1016/j.patrec.2011.03.001⟩. ⟨hal-00589738⟩