On the Consistency of Max-Margin Losses - Archive ouverte HAL
Conference paper, 2022

On the Consistency of Max-Margin Losses

Abstract

The foundational concept of Max-Margin in machine learning is ill-posed for output spaces with more than two labels, such as in structured prediction. In this paper, we show that the Max-Margin loss can be consistent with the classification task only under highly restrictive assumptions on the discrete loss measuring the error between outputs. These conditions are satisfied by distances defined on tree graphs, for which we prove consistency; these are the first losses shown to be consistent for Max-Margin beyond the binary setting. Finally, we address these limitations by correcting the concept of Max-Margin and introducing the Restricted-Max-Margin, in which the maximization of the loss-augmented scores is retained but performed over a subset of the original domain. The resulting loss also generalizes the binary support vector machine and is consistent under milder conditions on the discrete loss.
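For orientation, a standard way to write the structured Max-Margin (loss-augmented hinge) objective and the restricted variant described above is sketched below. This is a generic formulation inferred from the abstract, not the paper's exact notation; in particular, the choice of the restricted set is an assumption, and the paper specifies how it is constructed.

\[
\ell_{\mathrm{MM}}(f; x, y) \;=\; \max_{y' \in \mathcal{Y}} \big[\, L(y, y') + f(x, y') \,\big] \;-\; f(x, y),
\]

where $L$ is the discrete loss between outputs and $f(x, \cdot)$ is the score function. The Restricted-Max-Margin keeps the same loss-augmented maximization but takes it over a subset $\mathcal{Y}_r \subseteq \mathcal{Y}$ of the original output domain:

\[
\ell_{\mathrm{RMM}}(f; x, y) \;=\; \max_{y' \in \mathcal{Y}_r} \big[\, L(y, y') + f(x, y') \,\big] \;-\; f(x, y).
\]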
Main file: main.pdf (3.13 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-03615096 , version 1 (21-03-2022)

Identifiers

  • HAL Id : hal-03615096 , version 1

Cite

Alex Nowak-Vila, Alessandro Rudi, Francis Bach. On the Consistency of Max-Margin Losses. AISTATS 2022 - 25th International Conference on Artificial Intelligence and Statistics, Mar 2022, València, Spain. ⟨hal-03615096⟩