
Sparsifying Networks via Subdifferential Inclusion

Sagar Verma ¹, Jean-Christophe Pesquet ¹ ²
¹ OPIS - OPtimisation Imagerie et Santé
Inria Saclay - Ile de France, CVN - Centre de vision numérique
Abstract: Sparsifying deep neural networks is of paramount interest in many areas, especially when those networks have to be implemented on low-memory devices. In this article, we propose a new formulation of the problem of generating sparse weights for a pre-trained neural network. By leveraging the properties of standard nonlinear activation functions, we show that the problem is equivalent to an approximate subdifferential inclusion problem, where the accuracy of the approximation controls the sparsity. We show that the proposed approach is valid for a broad class of activation functions (ReLU, sigmoid, softmax). We propose an iterative optimization algorithm with guaranteed convergence to induce sparsity. Owing to the algorithm's flexibility, sparsity can be ensured from partial training data in a minibatch manner. To demonstrate the effectiveness of our method, we perform experiments on various networks in different application contexts: image classification, speech recognition, natural language processing, and time-series forecasting.
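
For the ReLU case mentioned in the abstract, the underlying idea can be made concrete: ReLU is the proximity operator of the indicator function of the nonnegative orthant, so y = ReLU(Wx + b) holds exactly when Wx + b − y lies in the subdifferential of that indicator at y (equality on coordinates where y_i > 0, and (Wx + b)_i ≤ 0 where y_i = 0). The NumPy sketch below illustrates one way to look for sparse weights that satisfy this inclusion approximately, using a plain proximal-gradient (ISTA-style) loop with an l1 penalty. The function names (sparsify_layer, inclusion_violation_grad, soft_threshold), the penalty weight lam, and the choice of solver are illustrative assumptions and not the authors' algorithm.

import numpy as np

# Illustrative sketch only (not the paper's implementation).
# For ReLU, y = relu(W @ x + b) is equivalent to the inclusion
#   W @ x + b - y ∈ ∂ι_{≥0}(y),
# i.e. (W @ x + b)_i = y_i on coordinates with y_i > 0, and
#      (W @ x + b)_i <= 0 on coordinates with y_i = 0.
# We search for a sparse V satisfying this inclusion approximately.

def inclusion_violation_grad(V, X, b, Y):
    """Gradient (w.r.t. V) of 0.5 * squared violation of the ReLU inclusion.

    X: (d, n) batch of layer inputs, Y: (m, n) outputs of the pre-trained layer,
    V: (m, d) candidate sparse weights, b: (m,) bias kept fixed.
    """
    Z = V @ X + b[:, None]                            # candidate pre-activations
    R = np.where(Y > 0, Z - Y, np.maximum(Z, 0.0))    # per-entry inclusion violation
    return R @ X.T

def soft_threshold(V, t):
    """Proximity operator of t * ||.||_1; this is what creates exact zeros."""
    return np.sign(V) * np.maximum(np.abs(V) - t, 0.0)

def sparsify_layer(W, b, X, lam=1e-3, iters=500):
    """Return sparse weights V whose pre-activations approximately satisfy
    the subdifferential inclusion on the batch X; lam trades sparsity
    against accuracy (illustrative default)."""
    Y = np.maximum(W @ X + b[:, None], 0.0)           # targets from the pre-trained layer
    step = 1.0 / (np.linalg.norm(X, 2) ** 2 + 1e-12)  # 1 / Lipschitz constant of the gradient
    V = W.copy()
    for _ in range(iters):
        V = soft_threshold(V - step * inclusion_violation_grad(V, X, b, Y), step * lam)
    return V

# Toy usage on random data (shapes are arbitrary):
rng = np.random.default_rng(0)
W, b, X = rng.standard_normal((64, 128)), rng.standard_normal(64), rng.standard_normal((128, 256))
V = sparsify_layer(W, b, X, lam=5e-3)
print("fraction of zero weights:", np.mean(V == 0.0))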
Document type: Conference papers

https://hal.archives-ouvertes.fr/hal-03294543
Contributor: Sagar Verma
Submitted on: Wednesday, July 21, 2021 - 4:23:27 PM
Last modification on: Monday, September 6, 2021 - 3:19:55 PM

File

ICML2021_SIS.pdf
Files produced by the author(s)

Identifiers

  • HAL Id: hal-03294543, version 1

Citation

Sagar Verma, Jean-Christophe Pesquet. Sparsifying Networks via Subdifferential Inclusion. ICML 2021 - Thirty-eighth International Conference on Machine Learning, Jul 2021, Virtual, France. ⟨hal-03294543⟩
