Conference papers

Unsupervised Concept Annotation Using Latent Dirichlet Allocation and Segmental Methods

Abstract: Training efficient statistical approaches for natural language understanding generally requires data with segmental semantic annotations. Unfortunately, building such resources is costly. In this paper, we propose an approach that produces annotations in an unsupervised way. The first step is an implementation of latent Dirichlet allocation that produces a set of topics, together with the probability of each topic being associated with each word in a sentence. This knowledge is then used as a bootstrap to infer a segmentation of the words of a sentence into topics, using either integer linear optimisation or stochastic word alignment models (IBM models) to produce the final semantic annotation. The relation between automatically derived topics and task-dependent concepts is evaluated on a spoken dialogue task with an available reference annotation.
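As a rough illustration of the first step described in the abstract, the sketch below fits LDA on a toy corpus and derives, for each word, a distribution over topics — the kind of bootstrap knowledge the segmentation step would then consume. The corpus, the use of scikit-learn, and the uniform-prior Bayes inversion are illustrative assumptions, not the authors' setup; the integer linear optimisation / IBM-model segmentation itself is not shown.

```python
# Minimal sketch (assumed setup, not the paper's implementation):
# fit LDA on a tiny corpus and turn the topic-word matrix into
# per-word topic probabilities that could bootstrap a segmentation step.
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

corpus = [
    "i want a flight from boston to denver tomorrow morning",
    "show me hotels near the airport in denver",
    "book a table for two at an italian restaurant tonight",
]

vectorizer = CountVectorizer()
counts = vectorizer.fit_transform(corpus)

lda = LatentDirichletAllocation(n_components=3, random_state=0)
lda.fit(counts)

# components_ is proportional to P(word | topic); normalise each topic row,
# then invert with Bayes' rule under a uniform topic prior to get P(topic | word).
word_given_topic = lda.components_ / lda.components_.sum(axis=1, keepdims=True)
topic_given_word = word_given_topic / word_given_topic.sum(axis=0, keepdims=True)

# Assign each known word of each sentence its most probable topic.
for sentence in corpus:
    for word in sentence.split():
        if word in vectorizer.vocabulary_:
            idx = vectorizer.vocabulary_[word]
            best_topic = int(np.argmax(topic_given_word[:, idx]))
            print(f"{word:12s} -> topic {best_topic}")
    print()
```

On real dialogue data these per-word distributions would not be used directly as labels; as the abstract explains, they serve as input to a segmental model that aligns contiguous word spans with topics.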


https://hal.archives-ouvertes.fr/hal-01314555
Contributor: Bibliothèque Universitaire Déposants Hal-Avignon
Submitted on: Wednesday, February 27, 2019 - 2:01:19 PM
Last modification on: Monday, March 9, 2020 - 10:23:08 AM

File

UNSUP11b.pdf
Files produced by the author(s)

Identifiers

  • HAL Id: hal-01314555, version 1

Citation

Nathalie Camelin, Boris Detienne, Stéphane Huet, Dominique Quadri, Fabrice Lefèvre. Unsupervised Concept Annotation Using Latent Dirichlet Allocation and Segmental Methods. EMNLP Workshop on Unsupervised Learning in NLP (UNSUP), Jul 2011, Edinburgh, United Kingdom. pp.72-81. ⟨hal-01314555⟩

Metrics

Record views: 248
File downloads: 19