Conference Paper, Year: 2022

Federated Continual Learning through distillation in pervasive computing

Abstract

Federated Learning (FL) has been introduced as a new machine learning paradigm that enhances the use of local devices. At the server level, FL regularly aggregates models learned locally on distributed clients to obtain a more general model. Current solutions rely on the availability of large amounts of stored data at the client side in order to fine-tune the models sent by the server. Such a setting is not realistic in mobile pervasive computing, where data storage must be kept low and data characteristics can change dramatically. To account for this variability, one solution is to use the data regularly collected by the client to progressively adapt the received model. However, such a naive approach exposes clients to the well-known problem of catastrophic forgetting. To address this problem, we have defined a Federated Continual Learning approach based mainly on distillation. Our approach allows a better use of resources, eliminating the need to retrain from scratch when new data arrives and reducing memory usage by limiting the amount of data to be stored. This proposal has been evaluated in the Human Activity Recognition (HAR) domain and has been shown to effectively reduce the catastrophic forgetting effect.
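The paper itself details the exact distillation scheme; purely as a rough illustration, the sketch below shows one common way a client could adapt a received global model on newly collected data while distilling from a frozen copy of that model to limit forgetting. The function names, the `alpha` weighting, and the `temperature` parameter are assumptions for illustration, not the authors' method.

```python
# Illustrative sketch only: distillation-regularized client update.
# Assumes a PyTorch classifier `global_model` and a DataLoader of
# freshly collected, labeled samples; not the paper's exact algorithm.
import copy
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """Soft-target KL divergence between student and frozen teacher."""
    return F.kl_div(
        F.log_softmax(student_logits / temperature, dim=1),
        F.softmax(teacher_logits / temperature, dim=1),
        reduction="batchmean",
    ) * temperature ** 2

def client_update(global_model, new_data_loader, epochs=1, alpha=0.5, lr=1e-3):
    """Adapt the received global model on new local data while distilling
    from a frozen copy of it, instead of retraining from scratch."""
    teacher = copy.deepcopy(global_model).eval()  # frozen reference model
    student = global_model                        # model being adapted
    optimizer = torch.optim.SGD(student.parameters(), lr=lr)
    for _ in range(epochs):
        for x, y in new_data_loader:
            with torch.no_grad():
                t_logits = teacher(x)
            s_logits = student(x)
            # Balance fitting the new data against staying close to the
            # teacher's predictions (the anti-forgetting term).
            loss = (1 - alpha) * F.cross_entropy(s_logits, y) \
                 + alpha * distillation_loss(s_logits, t_logits)
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
    return student
```

In this kind of setup, only the adapted model (not the raw data) would be sent back for server-side aggregation, which is what keeps client storage requirements low.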

Dates and versions

hal-03727252, version 1 (19-07-2022)

Identifiers

Cite

Anastasiia Usmanova, François Portet, Philippe Lalanda, German Vega. Federated Continual Learning through distillation in pervasive computing. SMARTCOMP 2022, Jun 2022, Espoo, Finland. pp. 86-91. ⟨hal-03727252⟩