Conference paper, Year: 2016

Breaking Sticks and Ambiguities with Adaptive Skip-gram

Abstract

The recently proposed Skip-gram model is a powerful method for learning high-dimensional word representations that capture rich semantic relationships between words. However, Skip-gram, like most prior work on learning word representations, does not take word ambiguity into account and maintains only a single representation per word. Although a number of Skip-gram modifications have been proposed to overcome this limitation and learn multi-prototype word representations, they either require the number of word meanings to be known in advance or learn it using greedy heuristic approaches. In this paper we propose the Adaptive Skip-gram model, a nonparametric Bayesian extension of Skip-gram capable of automatically learning the required number of representations for all words at the desired semantic resolution. We derive an efficient online variational learning algorithm for the model and empirically demonstrate its effectiveness on the word-sense induction task.
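
The "breaking sticks" in the title refers to the stick-breaking construction of the Dirichlet process, which is the standard way to place a nonparametric prior over a potentially unbounded number of word meanings. As a minimal illustrative sketch (not the paper's code; the function name, alpha value, and truncation level below are assumptions chosen for the example), a truncated stick-breaking draw of per-word meaning probabilities looks like this in Python:

    import numpy as np

    def stick_breaking_probs(alpha, truncation, rng):
        # Truncated stick-breaking (GEM) construction:
        # pi_k = beta_k * prod_{r<k} (1 - beta_r), with beta_k ~ Beta(1, alpha).
        betas = rng.beta(1.0, alpha, size=truncation)
        remaining = np.concatenate([[1.0], np.cumprod(1.0 - betas[:-1])])
        return betas * remaining

    rng = np.random.default_rng(0)
    pi = stick_breaking_probs(alpha=0.1, truncation=10, rng=rng)
    print(pi, pi.sum())  # small alpha concentrates mass on a few meanings

With a small concentration parameter alpha, most of the probability mass falls on the first few sticks, so a word is effectively assigned only as many meaning prototypes as the data supports; larger alpha yields a finer semantic resolution with more meanings per word.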

Dates and versions

hal-01404056, version 1 (28-11-2016)

Identifiers

Cite

Sergey Bartunov, Dmitry Kondrashkin, Anton Osokin, Dmitry Vetrov. Breaking Sticks and Ambiguities with Adaptive Skip-gram. Proceedings of the 19th International Conference on Artificial Intelligence and Statistics (AISTATS), May 2016, Cadiz, Spain. pp.130-138. ⟨hal-01404056⟩