
Generalized Stochastic Backpropagation

Amine Echraibi 1, 2, 3, Joachim Cholet 2, Stéphane Gosselin 2, Sandrine Vaton 1, 3
3 Lab-STICC_MATHNET - Math & Net Team
Lab-STICC - Laboratoire des sciences et techniques de l'information, de la communication et de la connaissance: UMR 6285
Abstract: Backpropagating gradients through random variables is at the heart of numerous machine learning applications. In this paper, we present a general framework for deriving stochastic backpropagation rules for any distribution, discrete or continuous. Our approach exploits the link between the characteristic function and the Fourier transform to transport the derivatives from the parameters of the distribution to the random variable. Our method generalizes previously known estimators and results in new estimators for the gamma, beta, Dirichlet, and Laplace distributions. Furthermore, we show that the classical deterministic backpropagation rule and the discrete random variable case can also be interpreted through stochastic backpropagation.
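
The simplest previously known special case of such a rule is the Gaussian reparameterization trick, which the framework in the abstract generalizes. The sketch below illustrates that special case only, not the paper's characteristic-function construction: it estimates the gradient of E[f(z)] with respect to the Gaussian parameters by writing z = mu + sigma * eps, so the derivative is transported from the parameters to the sampled variable. The integrand f(z) = z^2 and all variable names are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch: stochastic backpropagation for a Gaussian via the
# classical reparameterization trick z = mu + sigma * eps. This is the
# well-known special case the paper generalizes, not its new estimators.
import numpy as np

rng = np.random.default_rng(0)

def f(z):
    return z ** 2  # illustrative integrand; E[z^2] = mu^2 + sigma^2

mu, sigma, n = 1.5, 0.8, 200_000
eps = rng.standard_normal(n)   # parameter-free noise
z = mu + sigma * eps           # pathwise sample, differentiable in (mu, sigma)

# Transport the derivatives from the parameters to the random variable:
#   d/dmu    E[f(z)] = E[f'(z) * dz/dmu]    = E[2z * 1]
#   d/dsigma E[f(z)] = E[f'(z) * dz/dsigma] = E[2z * eps]
grad_mu = np.mean(2.0 * z)
grad_sigma = np.mean(2.0 * z * eps)

# Analytic reference: gradients of mu^2 + sigma^2 are 2*mu and 2*sigma.
print(grad_mu, 2 * mu)         # ~3.0 vs 3.0
print(grad_sigma, 2 * sigma)   # ~1.6 vs 1.6
```

For discrete distributions no such differentiable path exists, which is why general-purpose rules of the kind the abstract describes are needed beyond the Gaussian case.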
Document type: Poster communications

https://hal.archives-ouvertes.fr/hal-02968975
Contributor: Amine Echraibi
Submitted on: Wednesday, January 13, 2021 - 10:31:13 AM
Last modification on: Wednesday, November 3, 2021 - 6:14:57 AM

File

Generalized_Stochastic_Backpro...
Files produced by the author(s)

Identifiers

  • HAL Id: hal-02968975, version 3

Citation

Amine Echraibi, Joachim Cholet, Stéphane Gosselin, Sandrine Vaton. Generalized Stochastic Backpropagation. Beyond Backpropagation: Novel Ideas for Training Neural Architectures, Workshop at NeurIPS 2020 (2020 Conference on Neural Information Processing Systems), Dec 2020, Virtual Conference, France. ⟨hal-02968975v3⟩
