TRADI: Tracking deep neural network weight distributions

Abstract: During training, the weights of a Deep Neural Network (DNN) are optimized from a random initialization towards a near-optimal value that minimizes a loss function. Typically, only this final state of the weights is kept for testing, while the wealth of information on the geometry of the weight space accumulated during the descent towards the minimum is discarded. In this work we propose to leverage this knowledge to compute the distributions of the weights of the DNN. These distributions can in turn be used to estimate the epistemic uncertainty of the DNN by aggregating predictions from an ensemble of networks sampled from them. To this end we introduce a method for tracking the trajectory of the weights during optimization that requires no change to the architecture or to the training procedure. We evaluate our method, TRADI, on standard classification and regression benchmarks, and on out-of-distribution detection for classification and semantic segmentation. We achieve competitive results while preserving computational efficiency in comparison to ensemble approaches.
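The idea sketched in the abstract, tracking per-weight statistics along the optimization trajectory and then sampling an ensemble from the resulting distributions, can be illustrated with a minimal toy example. This is only a hedged sketch, not the authors' actual TRADI algorithm (which uses a more principled tracking scheme): here a hypothetical smoothing rate `alpha` maintains exponential moving averages of each weight and its square during plain SGD on a toy quadratic loss, and an ensemble is then drawn from the implied per-weight Gaussian.

```python
import numpy as np

# Toy sketch of the idea behind TRADI (NOT the paper's exact method):
# track a running mean and variance for every weight during training,
# then sample an ensemble of networks from those distributions.

rng = np.random.default_rng(0)
n_weights, n_steps = 5, 200
alpha = 0.1          # hypothetical smoothing rate for the running statistics
lr = 0.05            # SGD learning rate

w = rng.normal(size=n_weights)   # random initialization
mean = w.copy()                  # running mean of each weight
sq_mean = w ** 2                 # running mean of each squared weight

for _ in range(n_steps):
    grad = w - 1.0               # gradient of toy loss 0.5 * ||w - 1||^2
    w = w - lr * grad            # plain SGD step, training loop unchanged
    mean = (1 - alpha) * mean + alpha * w
    sq_mean = (1 - alpha) * sq_mean + alpha * w ** 2

# Per-weight variance from the tracked first and second moments
var = np.maximum(sq_mean - mean ** 2, 1e-12)

# Sample an ensemble of weight vectors; prediction spread across the
# ensemble serves as a proxy for epistemic uncertainty.
ensemble = rng.normal(mean, np.sqrt(var), size=(10, n_weights))
pred_std = ensemble.std(axis=0)
```

Because the statistics are updated alongside ordinary SGD, no change to the architecture or training procedure is needed, which is the computational advantage over training several independent networks.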
Contributor: Gianni Franchi
Submitted on : Wednesday, August 26, 2020 - 9:53:33 AM
Last modification on : Friday, January 14, 2022 - 3:41:32 AM
Long-term archiving on: : Friday, November 27, 2020 - 12:16:56 PM


Files produced by the author(s)


  • HAL Id: hal-02922336, version 1


Gianni Franchi, Andrei Bursuc, Emanuel Aldea, Séverine Dubuisson, Isabelle Bloch. TRADI: Tracking deep neural network weight distributions. 16th European Conference on Computer Vision (ECCV 2020), Aug 2020, Online Event, France. ⟨hal-02922336⟩
