Learning with infinitely many features

Abstract: We propose a principled framework for learning with infinitely many features, a situation usually induced by continuously parametrized feature extraction methods. Such cases occur, for instance, when considering Gabor-based features in computer vision problems or when dealing with Fourier features for kernel approximations. We cast the problem as that of finding a finite subset of features that minimizes a regularized empirical risk. After analyzing the optimality conditions of this problem, we propose a simple algorithm with the flavour of a column-generation technique. We also show that, using Fourier-based features, it is possible to perform approximate infinite kernel learning. Our experimental results on several datasets show the benefits of the proposed approach in settings including texture classification and large-scale kernelized problems (involving about 100,000 examples).
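To illustrate the Fourier-based kernel approximation mentioned in the abstract, here is a minimal random Fourier feature sketch (in the spirit of Rahimi and Recht's method, not the paper's column-generation algorithm); the function name and parameter choices below are illustrative assumptions.

```python
# Minimal sketch: random Fourier features approximating an RBF kernel.
# This is illustrative only and not the authors' exact algorithm.
import numpy as np

def random_fourier_features(X, n_features=500, gamma=1.0, rng=None):
    """Map X (n_samples, d) to an explicit feature space whose inner
    products approximate the RBF kernel exp(-gamma * ||x - y||^2)."""
    rng = np.random.default_rng(rng)
    d = X.shape[1]
    # Frequencies sampled from the Fourier transform of the RBF kernel
    W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(d, n_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

# Usage: a linear model trained on Z approximates a kernelized model,
# which is what makes large-scale (approximate) kernel learning tractable.
X = np.random.randn(200, 10)
Z = random_fourier_features(X, n_features=300, gamma=0.5)
K_approx = Z @ Z.T  # approximates the RBF kernel matrix
```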
Document type: Journal articles

Cited literature: 40 references

https://hal.archives-ouvertes.fr/hal-00735926
Contributor: Rémi Flamary
Submitted on: Thursday, September 27, 2012 - 11:19:08 AM
Last modification on: Wednesday, October 14, 2020 - 4:22:58 AM
Long-term archiving on: Friday, December 16, 2016 - 5:22:25 PM

File

ml2012.pdf
Files produced by the author(s)

Identifiers

  • HAL Id: hal-00735926, version 1

Citation

Alain Rakotomamonjy, Rémi Flamary, Florian Yger. Learning with infinitely many features. Machine Learning, Springer Verlag, 2012. ⟨hal-00735926⟩
