Conference papers

Deep Geometric Knowledge Distillation with Graphs

Abstract: In most cases, deep learning architectures are trained without regard to the number of operations or the energy consumption. However, some applications, such as embedded systems, can be resource-constrained during inference. A popular approach to reducing the size of a deep learning architecture consists in distilling knowledge from a bigger network (the teacher) to a smaller one (the student). Directly training the student to mimic the teacher's representation can be effective, but it requires that both share the same latent space dimensions. In this work, we focus instead on relative knowledge distillation (RKD), which considers the geometry of the respective latent spaces and thus allows for dimension-agnostic transfer of knowledge. Specifically, we introduce a graph-based RKD method in which graphs are used to capture the geometry of latent spaces. Using classical computer vision benchmarks, we demonstrate the ability of the proposed method to efficiently distill knowledge from the teacher to the student, leading to better accuracy for the same budget compared to existing RKD alternatives.
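The abstract describes the method only at a high level; as a rough sketch of the idea, the PyTorch snippet below matches a similarity graph built over the student's batch of latent vectors to the one built over the teacher's. The function names (latent_graph, graph_rkd_loss), the cosine-similarity edge weights, and the mean-squared penalty are illustrative assumptions, not the construction from the paper.

import torch
import torch.nn.functional as F

def latent_graph(features: torch.Tensor) -> torch.Tensor:
    # Dense similarity graph over a batch of latent vectors: edge weights
    # are cosine similarities, so the graph reflects only the geometry of
    # the batch, not the embedding dimension. (Illustrative choice.)
    normed = F.normalize(features, dim=1)  # (batch, dim), unit-norm rows
    return normed @ normed.t()             # (batch, batch) adjacency

def graph_rkd_loss(teacher_feats: torch.Tensor,
                   student_feats: torch.Tensor) -> torch.Tensor:
    # Penalize the discrepancy between the two latent graphs. The teacher
    # is frozen, so its graph is detached from autograd.
    g_teacher = latent_graph(teacher_feats).detach()
    g_student = latent_graph(student_feats)
    return F.mse_loss(g_student, g_teacher)

# Hypothetical training step: the distillation term would be added to the
# usual task loss; teacher and student latent dimensions may differ (64 vs. 32).
teacher_feats = torch.randn(16, 64)
student_feats = torch.randn(16, 32, requires_grad=True)
loss = graph_rkd_loss(teacher_feats, student_feats)
loss.backward()

Because both graphs are batch-by-batch matrices, the comparison is dimension-agnostic, which is what allows the teacher and student latent spaces to have different sizes.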

https://hal.archives-ouvertes.fr/hal-02871309
Contributor: Vincent Gripon
Submitted on: Wednesday, June 17, 2020 - 11:11:48 AM
Last modification on: Tuesday, November 29, 2022 - 11:50:04 AM


Citation

Carlos Lassance, Myriam Bontonou, Ghouthi Boukli Hacene, Vincent Gripon, Jian Tang, et al. Deep Geometric Knowledge Distillation with Graphs. ICASSP 2020: IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), May 2020, Barcelona, Spain. pp.8484-8488, ⟨10.1109/ICASSP40776.2020.9053986⟩. ⟨hal-02871309⟩
