Conference papers

Unsupervised Concept Annotation Using Latent Dirichlet Allocation and Segmental Methods

Abstract: Training efficient statistical approaches for natural language understanding generally requires data with segmental semantic annotations. Unfortunately, building such resources is costly. In this paper, we propose an approach that produces annotations in an unsupervised way. The first step is an implementation of latent Dirichlet allocation that produces a set of topics, with a probability for each topic to be associated with a word in a sentence. This knowledge is then used as a bootstrap to infer a segmentation of a word sequence into topics, using either integer linear optimisation or stochastic word alignment models (IBM models), to produce the final semantic annotation. The relation between automatically derived topics and task-dependent concepts is evaluated on a spoken dialogue task with an available reference annotation.
Submitted on : Wednesday, February 27, 2019 - 2:01:19 PM
Last modification on : Monday, November 16, 2020 - 11:58:02 AM




  • HAL Id : hal-01314555, version 1



Nathalie Camelin, Boris Detienne, Stéphane Huet, Dominique Quadri, Fabrice Lefèvre. Unsupervised Concept Annotation Using Latent Dirichlet Allocation and Segmental Methods. EMNLP Workshop on Unsupervised Learning in NLP (UNSUP), Jul 2011, Edinburgh, United Kingdom. pp.72-81. ⟨hal-01314555⟩


