Does constituency analysis enhance domain-specific pre-trained BERT models for relation extraction?
Conference paper, 2021


Abstract

Many recent studies have addressed the topic of relation extraction. The DrugProt track at BioCreative VII provides a manually annotated corpus of interactions between chemicals and genes for developing and evaluating relation extraction systems. We describe the ensemble system we used for our submission, which combines the predictions of fine-tuned bioBERT, sciBERT, and const-bioBERT models by majority voting. We specifically tested the contribution of syntactic information to relation extraction with BERT. We observed that adding constituent-based syntactic information to BERT improved precision but decreased recall, since relations rarely seen in the training set were less likely to be predicted by BERT models infused with syntactic information. Our code is available online [https://github.com/Maple177/drugprot-relation-extraction].
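The combination step mentioned in the abstract can be sketched as follows. This is a minimal illustration of majority voting over per-instance labels from the three models, not the authors' implementation (their code is in the linked repository); the label names and the `majority_vote` helper are hypothetical.

```python
from collections import Counter

def majority_vote(predictions):
    """Combine per-instance label predictions from several models.

    predictions: a list of label sequences, one per model, all of the
    same length. Returns the most frequent label for each instance
    (ties are broken by the order in which Counter first sees a label).
    """
    return [Counter(labels).most_common(1)[0][0]
            for labels in zip(*predictions)]

# Hypothetical predictions for four chemical-gene candidate pairs
biobert       = ["INHIBITOR", "NONE", "ACTIVATOR", "NONE"]
scibert       = ["INHIBITOR", "NONE", "NONE", "SUBSTRATE"]
const_biobert = ["NONE", "NONE", "ACTIVATOR", "SUBSTRATE"]

print(majority_vote([biobert, scibert, const_biobert]))
# → ['INHIBITOR', 'NONE', 'ACTIVATOR', 'SUBSTRATE']
```

Each model casts one vote per candidate pair, so a relation label survives only if at least two of the three fine-tuned models predict it.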

Dates and versions

hal-03447774, version 1 (24-11-2021)

License

Attribution (CC BY)

Identifiers

Cite

Anfu Tang, Louise Deléger, Robert Bossy, Pierre Zweigenbaum, Claire Nédellec. Does constituency analysis enhance domain-specific pre-trained BERT models for relation extraction?. BioCreative VII Challenge Evaluation Workshop, Nov 2021, online, Spain. ⟨hal-03447774⟩