Optimal Transport for Deep Joint Transfer Learning

Ying Lu ¹, Liming Chen ², Alexandre Saidi ²
² imagine - Extraction de Caractéristiques et Identification, LIRIS - Laboratoire d'InfoRmatique en Image et Systèmes d'information
Abstract: Training a Deep Neural Network (DNN) from scratch requires a large amount of labeled data. For a classification task where only a small amount of training data is available, a common solution is to fine-tune a DNN that has been pre-trained on related source data. This consecutive training process is time-consuming and does not explicitly consider the relatedness between the source and target tasks. In this paper, we propose a novel method to jointly fine-tune a Deep Neural Network with source data and target data. By adding an Optimal Transport loss (OT loss) between the source and target classifier predictions as a constraint on the source classifier, the proposed Joint Transfer Learning Network (JTLN) can effectively learn useful knowledge for target classification from the source data. Furthermore, by using different kinds of metrics as the cost matrix for the OT loss, JTLN can incorporate different forms of prior knowledge about the relatedness between target categories and source categories. We carried out experiments with JTLN based on AlexNet on image classification datasets, and the results verify the effectiveness of the proposed JTLN in comparison with standard consecutive fine-tuning. This joint transfer learning with an OT loss is general and can also be applied to other kinds of neural networks.
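The core quantity in the abstract, an OT loss between two classifier prediction distributions under a chosen cost matrix, can be sketched with the standard entropy-regularized Sinkhorn iterations. This is a minimal illustrative sketch, not the paper's implementation: the function name, the regularization strength `eps`, and the toy 0/1 cost matrix are all assumptions for the example.

```python
import numpy as np

def sinkhorn_ot_loss(p, q, C, eps=0.1, n_iters=200):
    """Entropy-regularized OT loss <T, C> between discrete distributions
    p and q under cost matrix C, via Sinkhorn fixed-point iterations.
    (Illustrative sketch; not the exact JTLN training loss.)"""
    K = np.exp(-C / eps)               # Gibbs kernel
    u = np.ones_like(p)
    for _ in range(n_iters):
        v = q / (K.T @ u)              # scale columns to match marginal q
        u = p / (K @ v)                # scale rows to match marginal p
    T = np.diag(u) @ K @ np.diag(v)    # approximate transport plan
    return float(np.sum(T * C))        # transport cost = loss value

# Toy example: two 3-class prediction distributions, with a cost matrix
# that is 0 on the diagonal (same class) and 1 otherwise.
p = np.array([0.7, 0.2, 0.1])
q = np.array([0.1, 0.3, 0.6])
C = 1.0 - np.eye(3)
loss = sinkhorn_ot_loss(p, q, C)
```

In JTLN the cost matrix encodes prior knowledge about how related each source category is to each target category, so a mass transported between closely related categories is penalized less than mass moved between unrelated ones.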
Document type: Preprints, Working Papers
Contributor: Alexandre Saidi
Submitted on: Thursday, March 21, 2019 - 2:10:10 PM
Last modification on: Tuesday, June 1, 2021 - 2:08:09 PM



  • HAL Id : hal-02075446, version 1
  • ARXIV : 1709.02995


Ying Lu, Liming Chen, Alexandre Saidi. Optimal Transport for Deep Joint Transfer Learning. 2019. ⟨hal-02075446⟩


