
Algorithmic Robustness for Semi-Supervised (ε, γ, τ)-Good Metric Learning

Abstract : The importance of metrics in machine learning has attracted growing interest in distance and similarity learning, and especially in the Mahalanobis distance. However, this research field still lacks theoretical guarantees on the generalization capacity of the classifier associated with a learned metric. The theoretical framework of (ε, γ, τ)-good similarity functions [1] was one of the first attempts to draw a link between the properties of a similarity function and those of a linear classifier making use of it. In this paper, we extend this theory to a method where the metric and the separator are jointly learned in a semi-supervised way, a setting that has not been explored before. We furthermore provide a generalization bound for the associated classifier based on the algorithmic robustness framework. The behavior of our method is illustrated via some experimental results.
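To make the abstract's setting concrete, the following minimal Python sketch illustrates the classical construction behind (ε, γ, τ)-good similarity learning: a bilinear similarity K_M(x, x') = xᵀ M x' maps each example to a vector of similarities to a set of landmark points, and a linear separator is then learned in that space. This is not the paper's algorithm; in particular, M is fixed to the identity and the landmarks are sampled at random here, whereas the paper learns the metric and the separator jointly in a semi-supervised way. The toy data, the use of scikit-learn's LinearSVC, and all parameter values are assumptions made purely for illustration.

    # Minimal sketch (not the paper's method): landmark-based linear
    # classification with a bilinear similarity K_M(x, x') = x^T M x'.
    import numpy as np
    from sklearn.svm import LinearSVC

    def similarity_features(X, landmarks, M):
        """Map each example x to the vector (K_M(x, l_1), ..., K_M(x, l_d))."""
        return X @ M @ landmarks.T

    # Toy data: two Gaussian blobs (hypothetical, for illustration only).
    rng = np.random.default_rng(0)
    X_pos = rng.normal(loc=+1.0, scale=1.0, size=(100, 5))
    X_neg = rng.normal(loc=-1.0, scale=1.0, size=(100, 5))
    X = np.vstack([X_pos, X_neg])
    y = np.array([1] * 100 + [-1] * 100)

    # M is fixed to the identity here; the paper instead learns M jointly
    # with the linear separator, using labelled and unlabelled data.
    M = np.eye(X.shape[1])

    # Landmarks play the role of the "reasonable points" of the framework.
    landmarks = X[rng.choice(len(X), size=20, replace=False)]

    phi = similarity_features(X, landmarks, M)
    clf = LinearSVC(C=1.0).fit(phi, y)
    print("training accuracy:", clf.score(phi, y))

In the (ε, γ, τ)-good framework, a good similarity guarantees that such a landmark-based linear classifier achieves low error with margin; the paper's contribution is to bound the generalization error of this classifier when the metric itself is learned, via algorithmic robustness.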
Document type :
Conference papers

https://hal.archives-ouvertes.fr/hal-01223411
Contributor : Amaury Habrard
Submitted on : Monday, November 2, 2015 - 3:51:32 PM
Last modification on : Thursday, July 9, 2020 - 9:45:14 AM

Identifiers

  • HAL Id : hal-01223411, version 1

Citation

Nicolae Irina, Marc Sebban, Amaury Habrard, Eric Gaussier, Massih-Reza Amini. Algorithmic Robustness for Semi-Supervised (ε, γ, τ)-Good Metric Learning. International Conference on Neural Information Processing (ICONIP), Nov 2015, Istanbul, Turkey. pp. 10. ⟨hal-01223411⟩
