What can we learn from natural and artificial dependency trees - Archive ouverte HAL
Conference paper, Year: 2019

What can we learn from natural and artificial dependency trees

Abstract

This paper is centered around two main contributions. The first consists in introducing several procedures for generating random dependency trees under constraints; the second uses these artificial trees to compare their properties with those of natural trees (i.e., trees extracted from treebanks) and to analyze the relationships between these properties in natural and artificial settings, in order to determine which relationships are formally constrained and which are linguistically motivated. We take into consideration five metrics: tree length, height, maximum arity, mean dependency distance and mean flux weight, and also look into the distribution of local configurations of nodes. This analysis is based on UD treebanks (version 2.3, Nivre et al. 2018) for four languages: Chinese, English, French and Japanese.
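The five metrics named in the abstract can be illustrated with a minimal sketch on a toy dependency tree. The representation (a head vector with 1-based word positions, 0 marking the root) and the function name are illustrative assumptions, not code from the paper; mean flux weight is computed here as the average number of dependencies spanning each inter-word position, one common reading of that metric.

```python
# Minimal sketch (assumed representation, not the authors' code):
# a dependency tree over n words is given as a head vector, where
# heads[i] is the 1-based position of the head of word i+1, and 0
# marks the root.
from collections import Counter

def tree_metrics(heads):
    """Return (length, height, max arity, mean dep. distance, mean flux weight)."""
    n = len(heads)

    # Tree length: number of nodes.
    length = n

    # Height: longest path from the root to a leaf (root at depth 0).
    def depth(i):
        d = 0
        while heads[i - 1] != 0:
            i = heads[i - 1]
            d += 1
        return d
    height = max(depth(i) for i in range(1, n + 1))

    # Maximum arity: largest number of dependents governed by any node.
    max_arity = max(Counter(h for h in heads if h != 0).values())

    # Mean dependency distance: average |head - dependent| over all
    # non-root dependencies.
    deps = [(i, heads[i - 1]) for i in range(1, n + 1) if heads[i - 1] != 0]
    mdd = sum(abs(h - d) for d, h in deps) / len(deps)

    # Mean flux weight: average number of dependencies spanning each
    # inter-word position (between word k and word k+1).
    flux = [sum(1 for d, h in deps if min(d, h) <= k < max(d, h))
            for k in range(1, n)]
    mfw = sum(flux) / len(flux)

    return length, height, max_arity, mdd, mfw

# Toy example, "the cat sleeps": the -> cat -> sleeps (root).
print(tree_metrics([2, 3, 0]))  # (3, 2, 1, 1.0, 1.0)
```

A random-tree generator of the kind described in the abstract would produce such head vectors under structural constraints; the metrics can then be compared between generated and treebank-extracted trees.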
Main file
W19-7915.pdf (297.04 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-02416656, version 1 (17-12-2019)

Identifiers

Cite

Marine Courtin, Chunxiao Yan. What can we learn from natural and artificial dependency trees. Quasy 2019, Quantitative Syntax, Syntaxfest, Aug 2019, Paris, France. pp.125-135, ⟨10.18653/v1/W19-7915⟩. ⟨hal-02416656⟩
110 views
80 downloads

