Learning Kernel Perceptrons on Noisy Data and Random Projections - HAL Open Archive
Preprint, Working Paper. Year: 2007

Learning Kernel Perceptrons on Noisy Data and Random Projections

Abstract

In this paper, we address the issue of learning nonlinearly separable concepts with a kernel classifier when the data at hand are altered by uniform classification noise. Our proposed approach relies on combining the technique of random or deterministic projections with a classification-noise-tolerant perceptron learning algorithm that assumes distributions defined over finite-dimensional spaces. Provided a sufficient separation margin characterizes the problem, this strategy makes it possible to learn from a noisy distribution in any separable Hilbert space, regardless of its dimension; learning with any appropriate Mercer kernel is therefore possible. We prove that the required sample complexity and running time of our algorithm are polynomial in the classical PAC learning parameters. Numerical simulations on toy datasets and on data from the UCI repository support the validity of our approach.
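The strategy summarized in the abstract, projecting the (possibly high- or infinite-dimensional) feature representation into a random finite-dimensional space and then training a perceptron there, can be sketched as follows. This is an illustrative sketch only: it uses a plain perceptron rather than the authors' noise-tolerant variant, and all function names, dimensions, and parameters below are our own assumptions, not taken from the paper.

```python
import numpy as np

def random_projection(X, d, rng):
    # Gaussian (Johnson-Lindenstrauss-style) projection of the rows
    # of X into d dimensions; scaling by 1/sqrt(d) roughly preserves
    # distances, and hence separation margins, with high probability.
    R = rng.standard_normal((X.shape[1], d)) / np.sqrt(d)
    return X @ R

def perceptron(X, y, epochs=50):
    # Plain perceptron updates in the projected space. The paper's
    # algorithm replaces this with a classification-noise-tolerant
    # update rule; this version is only for illustration.
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            if yi * (xi @ w) <= 0:
                w += yi * xi
    return w

# Hypothetical toy data: linearly separable points in 50 dimensions,
# projected down to 10 dimensions before learning.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 50))
w_true = rng.standard_normal(50)
y = np.sign(X @ w_true)

Xp = random_projection(X, 10, rng)
w = perceptron(Xp, y)
acc = np.mean(np.sign(Xp @ w) == y)
```

Because the projection is dimension-reducing, perfect separability is not guaranteed in the projected space; the paper's margin assumption is precisely what controls how small the target dimension can be while keeping the projected problem learnable.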
Main file: kernelnoise.pdf (802.52 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-00137941 , version 1 (22-03-2007)

Identifiers

  • HAL Id : hal-00137941 , version 1

Cite

Guillaume Stempfel, Liva Ralaivola. Learning Kernel Perceptrons on Noisy Data and Random Projections. 2007. ⟨hal-00137941⟩
105 Views
343 Downloads