Discriminative Transfer Learning Using Similarities and Dissimilarities

Abstract: Correctly estimating the discrepancy between two data distributions has always been an important task in machine learning. Recently, Cuturi proposed the Sinkhorn distance, which uses an approximate Optimal Transport cost between two distributions as a measure of their discrepancy. Although it has since been successfully adopted in various machine learning applications (e.g., in Natural Language Processing and Computer Vision), the Sinkhorn distance suffers from two non-negligible limitations. First, it only approximates the true Wasserstein distance. Second, a `divide by zero' problem often occurs during the matrix-scaling iterations when the entropy regularization coefficient is set to a small value. In this paper, we introduce a new Brenier approach for computing a more accurate Wasserstein distance between two discrete distributions. This approach avoids both limitations of the Sinkhorn distance and provides an alternative way to estimate distribution discrepancy.
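The two limitations mentioned above can be seen directly in the Sinkhorn matrix-scaling iterations. The sketch below (not the authors' method; an illustrative pure-Python implementation of the standard Sinkhorn algorithm, with hypothetical function and parameter names) shows that the returned cost depends on the regularization coefficient `eps`, and that for small `eps` the Gibbs kernel `exp(-C/eps)` underflows to zero, triggering the divide-by-zero problem:

```python
import math

def sinkhorn_cost(a, b, C, eps, n_iter=200):
    """Entropy-regularized OT via Sinkhorn matrix scaling (illustrative sketch).

    a, b : source/target marginals (lists of floats summing to 1)
    C    : cost matrix (list of lists)
    eps  : entropy regularization coefficient
    Returns the regularized transport cost <P, C>, which only approximates
    the true Wasserstein distance (approximation worsens as eps grows).
    """
    n, m = len(a), len(b)
    # Gibbs kernel K = exp(-C / eps). For small eps these entries underflow
    # to 0.0 in floating point -- the source of the 'divide by zero' issue.
    K = [[math.exp(-C[i][j] / eps) for j in range(m)] for i in range(n)]
    u = [1.0] * n
    v = [1.0] * m
    for _ in range(n_iter):
        # Scaling steps: u = a / (K v), then v = b / (K^T u).
        Kv = [sum(K[i][j] * v[j] for j in range(m)) for i in range(n)]
        u = [a[i] / Kv[i] for i in range(n)]   # ZeroDivisionError if K underflowed
        KTu = [sum(K[i][j] * u[i] for i in range(n)) for j in range(m)]
        v = [b[j] / KTu[j] for j in range(m)]
    # Transport plan P_ij = u_i * K_ij * v_j; return <P, C>.
    return sum(u[i] * K[i][j] * v[j] * C[i][j]
               for i in range(n) for j in range(m))
```

For uniform marginals with cost matrix `[[0, 1], [1, 0]]` and a moderate `eps`, the returned cost is close to the true Wasserstein distance of 0; with a strictly positive cost matrix and a very small `eps` (e.g. `1e-3`), every kernel entry underflows to zero and the first scaling step raises `ZeroDivisionError`.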
Document type: Journal article

https://hal.archives-ouvertes.fr/hal-02351587
Contributor: Alexandre Saidi
Submitted on: Wednesday, November 6, 2019 - 2:42:54 PM
Last modification on: Tuesday, November 19, 2019 - 2:37:46 AM


Citation

Ying Lu, Liming Chen, Alexandre Saidi, Emmanuel Dellandréa, Yunhong Wang. Discriminative Transfer Learning Using Similarities and Dissimilarities. IEEE Transactions on Neural Networks and Learning Systems, IEEE, 2018, pp.1-14. ⟨10.1109/TNNLS.2017.2705760⟩. ⟨hal-02351587⟩
