
CharacterBERT: Reconciling ELMo and BERT for Word-Level Open-Vocabulary Representations From Characters

Abstract: Due to the compelling improvements brought by BERT, many recent representation models have adopted the Transformer architecture as their main building block, consequently inheriting the wordpiece tokenization system even though it is not intrinsically linked to Transformers. While this system is thought to strike a good balance between the flexibility of characters and the efficiency of full words, using predefined wordpiece vocabularies from the general domain is not always suitable, especially when building models for specialized domains (e.g., the medical domain). Moreover, adopting wordpiece tokenization shifts the focus from the word level to the subword level, making models conceptually more complex and arguably less convenient in practice. For these reasons, we propose CharacterBERT, a new variant of BERT that drops the wordpiece system altogether and instead uses a Character-CNN module to represent entire words from their characters. We show that this new model improves the performance of BERT on a variety of medical-domain tasks while producing robust, word-level, open-vocabulary representations.
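The Character-CNN module at the heart of CharacterBERT builds a single embedding per word from that word's characters. The sketch below is a minimal PyTorch rendition of such a module, in the spirit of ELMo's character encoder: the class name, vocabulary size, filter widths, and layer sizes are illustrative assumptions, not the exact configuration from the paper.

import torch
import torch.nn as nn

class CharacterCNN(nn.Module):
    """Sketch of a character-level word encoder (ELMo-style); sizes are illustrative."""
    def __init__(self, n_chars=256, char_dim=16, word_dim=768,
                 filters=((1, 32), (2, 32), (3, 64), (4, 128),
                          (5, 256), (6, 512), (7, 1024))):
        super().__init__()
        self.char_emb = nn.Embedding(n_chars, char_dim, padding_idx=0)
        # One 1-D convolution per filter width; max-pooling over character
        # positions gives a fixed-size feature vector per width.
        self.convs = nn.ModuleList(
            nn.Conv1d(char_dim, n_out, kernel_size=width)
            for width, n_out in filters
        )
        total = sum(n_out for _, n_out in filters)
        # Two highway layers, then a projection to the Transformer hidden size.
        self.highways = nn.ModuleList(nn.Linear(total, 2 * total) for _ in range(2))
        self.proj = nn.Linear(total, word_dim)

    def forward(self, char_ids):
        # char_ids: (batch, n_words, n_chars) integer character ids per word.
        b, w, c = char_ids.shape
        x = self.char_emb(char_ids.reshape(b * w, c))  # (b*w, c, char_dim)
        x = x.transpose(1, 2)                          # (b*w, char_dim, c)
        feats = [torch.max(torch.relu(conv(x)), dim=-1).values for conv in self.convs]
        h = torch.cat(feats, dim=-1)                   # (b*w, total)
        for hw in self.highways:
            t, g = hw(h).chunk(2, dim=-1)
            g = torch.sigmoid(g)
            h = g * torch.relu(t) + (1 - g) * h        # gated highway update
        return self.proj(h).view(b, w, -1)             # (batch, n_words, word_dim)

# Example: 2 sentences, 5 words each, up to 12 characters per word.
# out = CharacterCNN()(torch.randint(1, 256, (2, 5, 12)))  # -> (2, 5, 768)

A word-level embedding of this kind replaces BERT's wordpiece embedding lookup, so any word, in-vocabulary or not, receives a representation; the Transformer layers above it can remain a standard BERT stack.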
Contributor: Pierre Zweigenbaum
Submitted on: Wednesday, January 6, 2021
Last modified on: Monday, February 22, 2021


  • HAL Id: hal-03100665, version 1


Hicham El Boukkouri, Olivier Ferret, Thomas Lavergne, Hiroshi Noji, Pierre Zweigenbaum, et al. CharacterBERT: Reconciling ELMo and BERT for Word-Level Open-Vocabulary Representations From Characters. International Conference on Computational Linguistics (COLING), Dec 2020, Barcelona, Spain (online). pp. 6903-6915. ⟨hal-03100665⟩


