
Joint Semi-supervised Similarity Learning for Linear Classification

Abstract : The importance of metrics in machine learning has attracted growing interest in distance and similarity learning. We study this problem in the setting where few labeled data (and potentially few unlabeled data as well) are available, a situation that arises in several practical contexts. We also provide a complete theoretical analysis of the proposed approach. This is worth noting, as the metric learning field lacks theoretical guarantees on the generalization capacity of the classifier associated with a learned metric. The theoretical framework of (ε, γ, τ)-good similarity functions was one of the first attempts to draw a link between the properties of a similarity function and those of a linear classifier making use of it. In this paper, we extend this theory to a method in which the metric and the separator are jointly learned in a semi-supervised way, a setting that has not been explored before, and we provide a theoretical analysis of this joint learning via Rademacher complexity. Experiments performed on standard datasets show the benefits of our approach over state-of-the-art methods.
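The (ε, γ, τ)-good similarity framework builds a linear classifier by mapping each example to its similarities with a small set of landmark points, then learning a separator in that similarity space. The sketch below illustrates this landmark construction on hypothetical toy data (two Gaussian blobs); the cosine similarity and the simple mean-difference separator are illustrative stand-ins, not the learned metric or the sparse separator of the paper.

```python
import numpy as np

def similarity_map(X, landmarks, sim):
    """Project each example onto its similarities to a set of landmark points."""
    return np.array([[sim(x, l) for l in landmarks] for x in X])

# Toy similarity, bounded in [-1, 1] as the framework requires (illustrative choice).
def cosine_sim(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

rng = np.random.default_rng(0)
# Two Gaussian blobs as a toy binary problem (hypothetical data, for illustration only).
X = np.vstack([rng.normal(loc=2.0, size=(20, 2)),
               rng.normal(loc=-2.0, size=(20, 2))])
y = np.array([1] * 20 + [-1] * 20)

# A few randomly chosen landmarks play the role of the "reasonable points".
landmarks = X[rng.choice(len(X), size=5, replace=False)]
phi = similarity_map(X, landmarks, cosine_sim)  # shape (40, 5)

# A linear separator in the similarity space; a simple mean-difference
# direction stands in here for the jointly learned sparse separator.
w = phi[y == 1].mean(axis=0) - phi[y == -1].mean(axis=0)
pred = np.sign(phi @ w)
accuracy = float((pred == y).mean())
```

On such well-separated blobs the landmark representation is already linearly separable, which is exactly the guarantee the (ε, γ, τ)-goodness of the similarity is meant to provide.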
Document type :
Conference papers

Contributor : Marc Sebban
Submitted on : Thursday, September 24, 2015 - 12:14:05 PM
Last modification on : Thursday, October 21, 2021 - 3:56:44 AM



Irina Nicolae, Eric Gaussier, Amaury Habrard, Marc Sebban. Joint Semi-supervised Similarity Learning for Linear Classification. ECML-PKDD 2015, Sep 2015, Porto, Portugal. ⟨10.1007/978-3-319-23528-8_37⟩. ⟨hal-01204642⟩


