Tensor-based approach for training flexible neural networks - Archive ouverte HAL
Conference paper · Year: 2021

Tensor-based approach for training flexible neural networks

Abstract

Activation functions (AFs) are an important part of the design of neural networks (NNs), and their choice plays a predominant role in the performance of an NN. In this work, we are particularly interested in the estimation of flexible activation functions using tensor-based solutions, where the AFs are expressed as a weighted sum of predefined basis functions. To do so, we propose a new learning algorithm which solves a constrained coupled matrix-tensor factorization (CMTF) problem. This technique fuses the first-order and zeroth-order information of the NN, where the first-order information is contained in a Jacobian tensor that follows a constrained canonical polyadic decomposition (CPD). The proposed algorithm can handle different decomposition bases. The goal of this method is to compress large pretrained NN models by replacing subnetworks, i.e., one or multiple layers of the original network, with a new flexible layer. The approach is applied to a pretrained convolutional neural network (CNN) used for character classification.
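To make the "flexible activation" idea concrete, here is a minimal sketch of a dense layer whose per-neuron activation is a learnable weighted sum of predefined basis functions, as described in the abstract. The polynomial basis, the layer shapes, and all names (`FlexibleLayer`, `polynomial_basis`) are illustrative assumptions, not the authors' exact setup or the CMTF learning algorithm itself.

```python
import numpy as np

def polynomial_basis(z, degree=3):
    """Stack predefined basis functions phi_k(z) = z**k for k = 0..degree.

    A polynomial basis is only one possible choice; the paper's method
    can handle different decomposition bases.
    """
    return np.stack([z**k for k in range(degree + 1)], axis=-1)

class FlexibleLayer:
    """Dense layer with a per-neuron flexible activation
    sigma_j(z) = sum_k C[j, k] * phi_k(z), where C holds the
    learnable AF coefficients (hypothetical structure for illustration)."""

    def __init__(self, n_in, n_out, degree=3, rng=None):
        rng = np.random.default_rng(rng)
        self.W = rng.standard_normal((n_in, n_out)) / np.sqrt(n_in)
        self.C = rng.standard_normal((n_out, degree + 1))  # AF coefficients
        self.degree = degree

    def forward(self, X):
        Z = X @ self.W                           # pre-activations, (batch, n_out)
        Phi = polynomial_basis(Z, self.degree)   # (batch, n_out, degree + 1)
        # Contract each neuron's basis expansion with its own coefficients.
        return np.einsum("bjk,jk->bj", Phi, self.C)

layer = FlexibleLayer(n_in=4, n_out=2, degree=3, rng=0)
X = np.random.default_rng(1).standard_normal((5, 4))
Y = layer.forward(X)
print(Y.shape)  # (5, 2)
```

In the paper's compression setting, such a flexible layer would replace one or more layers of a pretrained network, with `W` and `C` estimated jointly from the original subnetwork's zeroth-order (function values) and first-order (Jacobian) information.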
No file deposited

Dates and versions

hal-03518648 , version 1 (10-01-2022)

Identifiers

  • HAL Id : hal-03518648 , version 1

Cite

Yassine Zniyed, Konstantin Usevich, Sebastian Miron, David Brie. Tensor-based approach for training flexible neural networks. 55th Asilomar Conference on Signals, Systems and Computers, Oct 2021, Pacific Grove, CA, United States. ⟨hal-03518648⟩