Representing Multimodal Linguistics Annotated Data - Archive ouverte HAL
Conference paper, 2014

Representing Multimodal Linguistics Annotated Data

Brigitte Bigi
Tatsuya Watanabe
Laurent Prévot

Abstract

The question of interoperability for annotated linguistic resources covers several aspects. First, it requires a representation framework that makes it possible to compare, and potentially merge, different annotation schemas. In this paper, a general description level for representing multimodal linguistic annotations is proposed, focusing on the representation of time and of data content: the paper reconsiders and enhances the current, generalized representation of annotations. An XML schema for such annotations is proposed, together with a Python API. This framework is implemented in multi-platform software distributed under the terms of the GNU General Public License.
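To illustrate the kind of representation the abstract describes, here is a minimal, hypothetical sketch of a time-and-content annotation model in Python. All class and attribute names are illustrative assumptions, not the actual API of the paper; the sketch assumes time points carry a radius expressing annotation uncertainty, so two points compare equal when their uncertainty intervals overlap.

```python
class TimePoint:
    """A time value (seconds) with a radius expressing uncertainty."""

    def __init__(self, midpoint, radius=0.0):
        self.midpoint = midpoint
        self.radius = radius

    def __eq__(self, other):
        # Two points are considered equal if their uncertainty
        # intervals overlap.
        delta = abs(self.midpoint - other.midpoint)
        return delta <= self.radius + other.radius


class Annotation:
    """An interval annotation: begin/end time points and a text label."""

    def __init__(self, begin, end, label):
        self.begin = begin
        self.end = end
        self.label = label


class Tier:
    """An ordered list of annotations of one annotation type."""

    def __init__(self, name):
        self.name = name
        self.annotations = []

    def append(self, annotation):
        self.annotations.append(annotation)


# Usage: two nearby boundaries produced by different annotation tools
# can be treated as the same point thanks to the radius.
tier = Tier("tokens")
tier.append(Annotation(TimePoint(1.50, 0.02), TimePoint(1.80, 0.02), "hello"))
print(TimePoint(1.50, 0.02) == TimePoint(1.51, 0.02))  # True: intervals overlap
```

This radius-based equality is one way a representation framework can compare or merge annotations whose time boundaries come from tools with different precisions.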
No file deposited

Dates and versions

hal-01500719, version 1 (03-04-2017)

Identifiers

  • HAL Id: hal-01500719, version 1

Cite

Brigitte Bigi, Tatsuya Watanabe, Laurent Prévot. Representing Multimodal Linguistics Annotated Data. The 9th edition of the Language Resources and Evaluation Conference, May 2014, Reykjavik, Iceland. pp.3386-3392. ⟨hal-01500719⟩
