
Transfer Learning by Weighting Convolution

Stéphane Ayache 1, Ronan Sicre 1, Thierry Artières 1
1 QARMA - éQuipe d'AppRentissage de MArseille
LIS - Laboratoire d'Informatique et Systèmes
Abstract: Transferring pretrained deep architectures to datasets with few labels remains a challenge in many real-world situations. This paper presents a new framework for understanding convolutional neural networks by establishing connections between Kronecker factorization and convolutional layers. We then introduce Convolution Weighting Layers, which learn a vector of weights for each channel, allowing efficient transfer learning in small training settings as well as pruning of the transferred models. Experiments are conducted on two main settings with few labeled data: transfer learning for classification and transfer learning for retrieval. Two well-known convolutional architectures are evaluated on five public datasets. We show that weighting convolutions is an efficient way to adapt pretrained models to new tasks and that the pruned networks retain good performance.
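The abstract describes learning a single scalar weight per channel on top of pretrained convolutions. The following PyTorch sketch is a minimal illustration only, assuming the weighting amounts to a per-channel multiplication applied to the output of a frozen pretrained convolution; the paper's actual implementation is not shown on this page, and the class name ConvWeighting is hypothetical.

import torch
import torch.nn as nn

class ConvWeighting(nn.Module):
    # Hypothetical sketch: one learnable scalar per channel, applied to
    # the output of a (typically frozen) pretrained convolution.
    def __init__(self, num_channels):
        super().__init__()
        # Initialize to 1 so the layer starts as an identity mapping.
        self.weights = nn.Parameter(torch.ones(num_channels))

    def forward(self, x):
        # x: (batch, channels, height, width); broadcast the per-channel
        # weights over the batch and spatial dimensions.
        return x * self.weights.view(1, -1, 1, 1)

# Usage sketch: freeze a convolution standing in for a pretrained layer
# and train only the channel weights (plus, in practice, a task head).
conv = nn.Conv2d(3, 64, kernel_size=3, padding=1)
for p in conv.parameters():
    p.requires_grad = False
layer = nn.Sequential(conv, ConvWeighting(64))
out = layer(torch.randn(1, 3, 32, 32))  # only the 64 channel weights are trainable

Under this reading, channels whose learned weight shrinks toward zero contribute little to the output and can be removed, which is one plausible way the pruning described in the abstract could be realized.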

Cited literature: 49 references

https://hal.archives-ouvertes.fr/hal-02544099
Contributor: Stéphane Ayache
Submitted on: Thursday, April 16, 2020, 1:41:34 AM
Last modification on: Friday, May 15, 2020, 11:40:50 AM

File

wconv.pdf
Files produced by the author(s)

Identifiers

  • HAL Id: hal-02544099, version 1

Citation

Stéphane Ayache, Ronan Sicre, Thierry Artières. Transfer Learning by Weighting Convolution. International Joint Conference on Neural Networks (IJCNN), 2020, Glasgow, United Kingdom. ⟨hal-02544099⟩
