
Pruning Convolutional Neural Networks with Self-Supervision

Abstract: Convolutional neural networks trained without supervision come close to matching the performance of their supervised counterparts, but sometimes at the cost of a considerably larger number of parameters. Extracting subnetworks from these large unsupervised convnets while preserving performance is of particular interest for making them less computationally intensive. Typical pruning methods operate during training on a task while trying to maintain the performance of the pruned network on the same task. However, in self-supervised feature learning, the training objective is agnostic to how well the learned representation transfers to downstream tasks. Thus, preserving performance on this objective does not ensure that the pruned subnetwork remains effective for solving downstream tasks. In this work, we investigate the use of standard pruning methods, developed primarily for supervised learning, on networks trained without labels (i.e., on self-supervised tasks). We show that pruning masks obtained with or without labels reach comparable performance when retrained with labels, suggesting that pruning operates similarly for self-supervised and supervised learning. Interestingly, we also find that pruning preserves the transfer performance of self-supervised subnetwork representations.
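As background for the "standard pruning methods" the abstract refers to, the sketch below illustrates one common baseline: global magnitude pruning, which zeroes out the smallest-magnitude weights across a convnet and yields a binary mask defining a subnetwork that can then be retrained. This is a minimal sketch assuming PyTorch's torch.nn.utils.prune utilities and a ResNet-50 as a stand-in network; it is illustrative, not the authors' exact procedure.

import torch
import torch.nn.utils.prune as prune
import torchvision.models as models

# Stand-in for a convnet pretrained with self-supervision (assumption:
# the paper's actual networks and checkpoints are not reproduced here).
model = models.resnet50(weights=None)

# Prune the 50% smallest-magnitude weights across all conv layers at once.
parameters_to_prune = [
    (module, "weight")
    for module in model.modules()
    if isinstance(module, torch.nn.Conv2d)
]
prune.global_unstructured(
    parameters_to_prune,
    pruning_method=prune.L1Unstructured,
    amount=0.5,
)

# Each pruned module now holds a binary `weight_mask` buffer; collecting
# these masks defines the subnetwork that would be retrained or transferred.
masks = {
    name: module.weight_mask.clone()
    for name, module in model.named_modules()
    if hasattr(module, "weight_mask")
}
total_kept = sum(int(m.sum()) for m in masks.values())
print(f"weights kept after pruning: {total_kept}")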
Document type: Preprints, Working Papers, ...

Contributor: Mathilde Caron
Submitted on: Monday, June 29, 2020 - 2:05:23 PM
Last modification on: Friday, February 4, 2022 - 3:21:15 AM




  • HAL Id: hal-02883772, version 1



Mathilde Caron, Ari Morcos, Piotr Bojanowski, Julien Mairal, Armand Joulin. Pruning Convolutional Neural Networks with Self-Supervision. 2020. ⟨hal-02883772⟩


