Max-plus Operators Applied to Filter Selection and Model Pruning in Neural Networks

Abstract: Following recent advances in morphological neural networks, we propose to study in more depth how Max-plus operators can be exploited to define morphological units, and how these behave when incorporated into layers of conventional neural networks. Besides showing that they are easily implemented with modern machine learning frameworks, we confirm and extend the observation that a Max-plus layer can be used to select important filters and reduce redundancy in the preceding layer, without incurring performance loss. Experimental results demonstrate that the filter selection strategy enabled by a Max-plus layer is highly efficient and robust, and we use it to successfully prune models across different neural network architectures. We also point out a close connection between Maxout networks and our pruned Max-plus networks by comparing their respective characteristics. The code for reproducing our experiments is available online.
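As a minimal sketch of the kind of unit the abstract refers to: a Max-plus unit replaces the sum-product of a dense layer with a max of sums, computing out_j = max_i(x_i + w_ij). The function and variable names below are illustrative, not taken from the authors' released code.

```python
import numpy as np

def maxplus_layer(x, W):
    """Max-plus (morphological dilation) layer.

    x : input vector of shape (n,)
    W : weight matrix of shape (n, m)
    Returns out of shape (m,), with out[j] = max_i (x[i] + W[i, j]).
    """
    # Broadcasting adds x to every column of W, then we take the
    # maximum over the input dimension instead of summing.
    return np.max(x[:, None] + W, axis=0)

x = np.array([0.5, -1.0, 2.0])
W = np.zeros((3, 2))  # illustrative weights; real weights are learned
out = maxplus_layer(x, W)  # with zero weights, each output is max(x) = 2.0
```

Because each output takes its maximum over a single input coordinate at a time, the argmax indices indicate which filters of the previous layer the unit actually relies on, which is the basis of the filter-selection behaviour the paper studies.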

https://hal.archives-ouvertes.fr/hal-02071450
Contributor: Samy Blusseau
Submitted on: Monday, April 1, 2019 - 10:01:53 AM
Last modification on: Wednesday, April 10, 2019 - 1:23:50 AM

Files

zhang-et-al.pdf
Files produced by the author(s)

Identifiers

  • HAL Id: hal-02071450, version 2
  • ARXIV: 1903.08072

Citation

Yunxiang Zhang, Samy Blusseau, Santiago Velasco-Forero, Isabelle Bloch, Jesus Angulo. Max-plus Operators Applied to Filter Selection and Model Pruning in Neural Networks. International Symposium on Mathematical Morphology, Jul 2019, Saarbrücken, Germany. ⟨hal-02071450v2⟩
