Note on Backpropagation in Neural Networks

Abstract: This note aims to facilitate low-level implementation by providing an analytical perspective on neural networks. Several feedforward and recurrent architectures are dissected through derivations of their backpropagation updates. We take the Multilayer Perceptron (MLP), which possesses the basic architecture of a deep artificial neural network, as the point of departure. A sigmoid cross-entropy loss is applied to the MLP to exemplify multi-label classification. We then introduce the Convolutional Neural Network (CNN), a more intricate architecture that uses convolutional filters and sub-sampling to realize a form of regularization. Finally, we illustrate Backpropagation Through Time (BPTT) to elicit the exploding/vanishing gradients problem and Long Short-Term Memory (LSTM).
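The page carries only the abstract, but since the note's central object is the MLP backpropagation update under a sigmoid cross-entropy loss, a minimal NumPy sketch of that update may help fix ideas. The layer sizes, variable names, and learning rate below are illustrative assumptions, not taken from the paper itself; the gradient formulas (output error p - y for sigmoid plus cross-entropy, chain rule through the sigmoid hidden layer) are the standard ones the abstract alludes to.

```python
# Minimal sketch: backprop for a one-hidden-layer MLP with sigmoid outputs
# and a cross-entropy loss over independent labels (multi-label setting).
# Sizes, names, and the learning rate are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Architecture: x -> h = sigmoid(W1 x + b1) -> p = sigmoid(W2 h + b2)
n_in, n_hid, n_out = 4, 8, 3
W1 = rng.normal(0, 0.1, (n_hid, n_in)); b1 = np.zeros(n_hid)
W2 = rng.normal(0, 0.1, (n_out, n_hid)); b2 = np.zeros(n_out)

x = rng.normal(size=n_in)          # one input example
y = np.array([1.0, 0.0, 1.0])      # multi-label target: labels not exclusive
lr = 0.1

for step in range(100):
    # Forward pass
    h = sigmoid(W1 @ x + b1)
    p = sigmoid(W2 @ h + b2)

    # Sigmoid cross-entropy loss, summed over the independent output units
    loss = -np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))

    # Backward pass: with sigmoid + cross-entropy the output error is p - y
    delta_out = p - y                                # dL/dz at the output
    dW2 = np.outer(delta_out, h); db2 = delta_out
    delta_hid = (W2.T @ delta_out) * h * (1 - h)     # chain rule through sigmoid
    dW1 = np.outer(delta_hid, x); db1 = delta_hid

    # Gradient-descent update
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

print("final loss:", loss)  # near zero after fitting this single example
```

Because the output error reduces to p - y, the sigmoid's derivative never appears at the output layer, which is one reason this pairing of loss and activation is numerically convenient.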
Document type: Preprints, Working Papers, ...

https://hal.archives-ouvertes.fr/hal-02265247
Contributor: Xin Jin
Submitted on: Friday, August 9, 2019 - 3:20:44 AM
Last modification on: Saturday, August 10, 2019 - 1:20:36 AM

File

BackpropagationNeuralNet.pdf
Files produced by the author(s)

Identifiers

  • HAL Id: hal-02265247, version 1

Citation

Xin Jin. Note on Backpropagation in Neural Networks. 2019. ⟨hal-02265247⟩
