Conference papers

Max-plus Operators Applied to Filter Selection and Model Pruning in Neural Networks

Abstract: Following recent advances in morphological neural networks, we propose to study in more depth how Max-plus operators can be exploited to define morphological units and how they behave when incorporated in layers of conventional neural networks. Besides showing that they can be easily implemented with modern machine learning frameworks, we confirm and extend the observation that a Max-plus layer can be used to select important filters and reduce redundancy in its previous layer, without incurring any performance loss. Experimental results demonstrate that the filter selection strategy enabled by a Max-plus layer is highly efficient and robust, and we use it to successfully prune models based on different neural network architectures. We also point out a close connection between Maxout networks and our pruned Max-plus networks by comparing their respective characteristics. The code for reproducing our experiments is available online.
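As a rough illustration of the kind of Max-plus unit the abstract refers to (not the authors' released code; the class name, initialization, and PyTorch usage below are assumptions), a dense Max-plus layer computes y_j = max_i (x_i + w_ji), replacing the usual weighted sum with an additive max:

import torch
import torch.nn as nn

class MaxPlusLayer(nn.Module):
    """Max-plus (morphological) unit: y_j = max_i (x_i + w_ji)."""
    def __init__(self, in_features: int, out_features: int):
        super().__init__()
        # Additive weights play the role of structuring elements.
        self.weight = nn.Parameter(torch.randn(out_features, in_features) * 0.01)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, in_features) -> broadcast to (batch, out_features, in_features)
        s = x.unsqueeze(1) + self.weight.unsqueeze(0)
        # Max over the input dimension replaces the usual sum of products.
        return s.max(dim=2).values

# Example: a linear layer followed by a Max-plus unit.
if __name__ == "__main__":
    x = torch.randn(8, 128)
    block = nn.Sequential(nn.Linear(128, 64), MaxPlusLayer(64, 10))
    print(block(x).shape)  # torch.Size([8, 10])

For reproducing the reported results, the code released by the authors (linked from the paper) should be used instead of this sketch.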

https://hal.archives-ouvertes.fr/hal-02071450
Contributor: Samy Blusseau
Submitted on : Monday, April 1, 2019 - 10:01:53 AM
Last modification on : Wednesday, November 17, 2021 - 12:27:17 PM
Long-term archiving on: Tuesday, July 2, 2019 - 1:16:14 PM

Files

zhang-et-al.pdf
Files produced by the author(s)

Identifiers

  • HAL Id: hal-02071450, version 2
  • arXiv: 1903.08072

Citation

Yunxiang Zhang, Samy Blusseau, Santiago Velasco-Forero, Isabelle Bloch, Jesus Angulo. Max-plus Operators Applied to Filter Selection and Model Pruning in Neural Networks. International Symposium on Mathematical Morphology, Jul 2019, Saarbrücken, Germany. ⟨hal-02071450v2⟩

Metrics

  • Record views: 2783
  • File downloads: 132