Studying Catastrophic Forgetting in Neural Ranking Models - Archive ouverte HAL
Conference paper. Year: 2021

Studying Catastrophic Forgetting in Neural Ranking Models

Oubli catastrophique et approches neuronales pour la Recherche d’Information (Catastrophic forgetting and neural approaches to Information Retrieval)

Abstract

Several deep neural ranking models have been proposed in the recent IR literature. While their transferability to a single target domain, represented by a dataset, has been widely addressed using traditional domain adaptation strategies, the question of their cross-domain transferability is still under-studied. We study here to what extent neural ranking models catastrophically forget old knowledge acquired from previously observed domains after acquiring new knowledge, leading to a performance decrease on those domains. Our experiments show that the effectiveness of neural IR ranking models is achieved at the cost of catastrophic forgetting, and that a lifelong learning strategy using a cross-domain regularizer successfully mitigates the problem. Using an explanatory approach built on a regression model, we also show the effect of domain characteristics on the rise of catastrophic forgetting. We believe that the obtained results can be useful for both theoretical and practical future work in neural IR.
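The cross-domain regularizer mentioned in the abstract can be illustrated, for intuition only, by an Elastic Weight Consolidation (EWC)-style penalty, a common lifelong-learning regularizer that discourages parameters important to a previous domain from drifting while training on a new one. This is a minimal sketch under that assumption; the function names, the use of NumPy, and the choice of EWC as the concrete regularizer are illustrative, not taken from the paper, whose exact formulation is in the full text.

```python
import numpy as np

def ewc_penalty(params, old_params, fisher, lam=1.0):
    """EWC-style cross-domain penalty (illustrative).

    params      -- current model parameters (flat array)
    old_params  -- parameters snapshotted after training on the old domain
    fisher      -- per-parameter importance (diagonal Fisher information
                   estimated on the old domain)
    lam         -- strength of the regularizer
    """
    # Quadratic penalty on drift, weighted by parameter importance:
    # lam/2 * sum_i F_i * (theta_i - theta*_i)^2
    return 0.5 * lam * float(np.sum(fisher * (params - old_params) ** 2))

def total_loss(new_domain_loss, params, old_params, fisher, lam=1.0):
    # Overall objective when fine-tuning on the new domain:
    # the ranking loss on the new domain plus the forgetting penalty.
    return new_domain_loss + ewc_penalty(params, old_params, fisher, lam)
```

For example, with `params = [1.0, 2.0]`, `old_params = [0.0, 2.0]`, and `fisher = [2.0, 1.0]`, only the first parameter has drifted, so the penalty is `0.5 * 2.0 * 1.0**2 = 1.0`; parameters the old domain never relied on (low Fisher values) can move freely.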
Main file: ECIR_2021_LL.pdf (421.99 KB)
Origin: files produced by the author(s)

Dates and versions

hal-03156630 , version 1 (02-03-2021)

Identifiers

Cite

Jesus Lovon, Laure Soulier, Karen Pinel-Sauvagnat, Lynda Tamine. Studying Catastrophic Forgetting in Neural Ranking Models. 43rd European Conference on Information Retrieval (ECIR 2021), Apr 2021, Lucca (online), Italy. pp.375-390, ⟨10.1007/978-3-030-72113-8_25⟩. ⟨hal-03156630⟩
281 views
162 downloads

