Conference Paper, Year: 2019

Dynamic Neural Language Models

Abstract

Language evolves over time with trends and shifts in technological, political, or cultural contexts. Capturing these variations is important for building better language models. While recent work tackles temporal drift by learning diachronic embeddings, we instead propose to integrate a temporal component into a recurrent language model. It takes the form of global latent variables, structured in time by a learned non-linear transition function. We perform experiments on three time-annotated corpora. Experimental results on language modeling and classification tasks show that our model consistently outperforms temporal word embedding methods in two temporal evaluation settings: prediction and modeling. Moreover, we empirically show that the system is able to predict informative latent representations for future time steps.
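To make the architecture described above concrete, the sketch below shows one plausible way to condition a recurrent language model on a global per-time-step latent state that is unrolled by a learned non-linear transition function. This is a simplified, deterministic illustration under assumed dimensions and names (DynamicLatentLM, latent_dim, etc.), not the authors' implementation, which trains the latents probabilistically.

```python
import torch
import torch.nn as nn

class DynamicLatentLM(nn.Module):
    """Illustrative sketch: a word-level recurrent LM conditioned on a
    global temporal latent, with a learned non-linear transition between
    consecutive time-step latents. All names and sizes are assumptions."""

    def __init__(self, vocab_size=1000, embed_dim=128, hidden_dim=256, latent_dim=32):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Word-level recurrence; the time-step latent is concatenated to each input.
        self.rnn = nn.LSTM(embed_dim + latent_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)
        # Learned non-linear transition z_{t+1} = f(z_t).
        self.transition = nn.Sequential(
            nn.Linear(latent_dim, latent_dim),
            nn.Tanh(),
            nn.Linear(latent_dim, latent_dim),
        )
        self.z0 = nn.Parameter(torch.zeros(latent_dim))

    def latents(self, num_steps):
        """Unroll the latent trajectory z_1..z_T with the transition function,
        which also allows extrapolating latents for unseen future time steps."""
        zs, z = [], self.z0
        for _ in range(num_steps):
            z = self.transition(z)
            zs.append(z)
        return torch.stack(zs)  # (num_steps, latent_dim)

    def forward(self, tokens, z):
        """tokens: (batch, seq_len) word ids from documents of one time step;
        z: (latent_dim,) latent state of that time step."""
        emb = self.embed(tokens)                        # (B, L, E)
        z_rep = z.expand(emb.size(0), emb.size(1), -1)  # broadcast latent over batch and length
        hidden, _ = self.rnn(torch.cat([emb, z_rep], dim=-1))
        return self.out(hidden)                         # next-word logits


# Minimal usage: score a batch of sequences from the 3rd time step.
model = DynamicLatentLM()
zs = model.latents(num_steps=5)
logits = model(torch.randint(0, 1000, (4, 20)), zs[2])
print(logits.shape)  # torch.Size([4, 20, 1000])
```

The key design point illustrated here is that the latent trajectory is generated by a shared transition network rather than learned independently per time step, so the model can produce a latent for a future period simply by applying the transition once more.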
No file deposited

Dates and versions

hal-02459622, version 1 (29-01-2020)

Identifiers

Cite

Edouard Delasalles, Sylvain Lamprier, Ludovic Denoyer. Dynamic Neural Language Models. ICONIP 2019 - 26th International Conference on Neural Information Processing, Dec 2019, Sydney, Australia. pp.282-294, ⟨10.1007/978-3-030-36718-3_24⟩. ⟨hal-02459622⟩