Note on Backpropagation in Neural Networks - HAL Open Archive
Preprint / Working Paper, Year: 2019

Note on Backpropagation in Neural Networks

Xin Jin
  • Function: Author
  • PersonId: 1052290

Abstract

This note aims to facilitate low-level implementation by providing an analytical perspective on neural networks. Several feedforward and recurrent neural networks are dissected through a derivation of their backpropagation updates. We choose the Multilayer Perceptron (MLP), which possesses the basic architecture of a deep artificial neural network, as our point of departure. Sigmoid cross-entropy loss is applied to the MLP as an exemplification of multi-label classification. We then introduce the Convolutional Neural Network (CNN), an intricate architecture that adopts filtering and sub-sampling to realize a form of regularization. Finally, we illustrate Backpropagation Through Time (BPTT) to elicit the exploding/vanishing gradients problem and motivate Long Short-Term Memory (LSTM).
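
As a companion to the abstract, here is a minimal sketch of the backpropagation update for a one-hidden-layer MLP trained with sigmoid cross-entropy loss (the multi-label setting mentioned above). This is an illustration written for this page, not code from the note itself; all names, layer sizes, and the learning rate are assumptions.

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def forward(x, W1, b1, W2, b2):
        # Hidden sigmoid layer, then one sigmoid output per label.
        h = sigmoid(W1 @ x + b1)
        y_hat = sigmoid(W2 @ h + b2)
        return h, y_hat

    def backprop(x, y, W1, b1, W2, b2):
        # Gradients of L = -sum(y*log(y_hat) + (1-y)*log(1-y_hat)).
        h, y_hat = forward(x, W1, b1, W2, b2)
        # Sigmoid output + cross-entropy collapse to the simple delta y_hat - y.
        delta2 = y_hat - y
        dW2, db2 = np.outer(delta2, h), delta2
        # Chain rule through W2 and the sigmoid derivative h * (1 - h).
        delta1 = (W2.T @ delta2) * h * (1.0 - h)
        dW1, db1 = np.outer(delta1, x), delta1
        return dW1, db1, dW2, db2

    # Usage: plain gradient-descent updates on random data (shapes are illustrative).
    rng = np.random.default_rng(0)
    x = rng.normal(size=4)                      # 4 input features
    y = np.array([1.0, 0.0, 1.0])               # 3 independent binary labels
    W1, b1 = 0.1 * rng.normal(size=(5, 4)), np.zeros(5)
    W2, b2 = 0.1 * rng.normal(size=(3, 5)), np.zeros(3)
    for _ in range(100):
        dW1, db1, dW2, db2 = backprop(x, y, W1, b1, W2, b2)
        W1 -= 0.1 * dW1; b1 -= 0.1 * db1
        W2 -= 0.1 * dW2; b2 -= 0.1 * db2

The cancellation of the sigmoid derivative against the cross-entropy gradient, leaving the output delta y_hat - y, is the main analytical convenience of this loss pairing and is the kind of step the note derives in full.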
Main file

BackpropagationNeuralNet.pdf (287.55 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-02265247, version 1 (09-08-2019)

Identifiers

  • HAL Id: hal-02265247, version 1

Cite

Xin Jin. Note on Backpropagation in Neural Networks. 2019. ⟨hal-02265247⟩
133 Views
926 Downloads
