Compositional Languages Emerge in a Neural Iterated Learning Model - Archive ouverte HAL
Conference paper, Year: 2020

Compositional Languages Emerge in a Neural Iterated Learning Model

Abstract

The principle of compositionality, which enables natural language to represent complex concepts via a structured combination of simpler ones, allows us to convey an open-ended set of messages using a limited vocabulary. If compositionality is indeed a natural property of language, we may expect it to appear in communication protocols that are created by neural agents via grounded language learning. Inspired by the iterated learning framework, which simulates the process of language evolution, we propose an effective neural iterated learning algorithm that, when applied to interacting neural agents, facilitates the emergence of a more structured type of language. Indeed, these languages provide specific advantages to neural agents during training, which translate into a larger posterior probability that is then incrementally amplified via the iterated learning procedure. Our experiments confirm our analysis, and also demonstrate that the emerged languages substantially improve the generalization ability of the neural agents' communication.
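The iterated learning loop that the abstract refers to can be illustrated with a toy symbolic model: a language is transmitted through a chain of learners, each of which sees only a bottlenecked sample of (meaning, signal) pairs and generalizes with a bias toward structured mappings. This is a minimal sketch under assumed details (two-attribute meanings, two-character signals, a majority-vote learner); it is not the paper's neural implementation, in which the learner and speaker are neural agents trained via grounded communication.

```python
import random
from collections import Counter

# Toy iterated-learning sketch (illustrative assumptions, not the paper's model).
# Meanings are (color, shape) pairs; signals are two-character strings,
# so a compositional language uses one character per attribute value.
COLORS = ["a", "b", "c"]
SHAPES = ["x", "y", "z"]
MEANINGS = [(c, s) for c in COLORS for s in SHAPES]
ALPHABET = "pqrstuvw"

def random_language(rng):
    """A holistic starting language: an arbitrary signal per meaning."""
    return {m: rng.choice(ALPHABET) + rng.choice(ALPHABET) for m in MEANINGS}

def learn(examples, rng):
    """Learner with a compositional bias: infer one symbol per attribute
    value by majority vote, generalize to unseen meanings, and invent a
    fresh symbol for attribute values absent from the sample."""
    color_votes = {c: Counter() for c in COLORS}
    shape_votes = {s: Counter() for s in SHAPES}
    for (c, s), sig in examples.items():
        color_votes[c][sig[0]] += 1
        shape_votes[s][sig[1]] += 1
    color_sym = {c: v.most_common(1)[0][0] if v else rng.choice(ALPHABET)
                 for c, v in color_votes.items()}
    shape_sym = {s: v.most_common(1)[0][0] if v else rng.choice(ALPHABET)
                 for s, v in shape_votes.items()}
    language = {(c, s): color_sym[c] + shape_sym[s]
                for c in COLORS for s in SHAPES}
    language.update(examples)  # reproduce memorized examples verbatim
    return language

def iterate(generations=30, bottleneck=5, seed=0):
    """Transmit the language through a chain of learners, each seeing
    only `bottleneck` randomly sampled (meaning, signal) examples."""
    rng = random.Random(seed)
    language = random_language(rng)
    for _ in range(generations):
        sample = rng.sample(MEANINGS, bottleneck)
        language = learn({m: language[m] for m in sample}, rng)
    return language

def is_compositional(language):
    """True if every signal factorizes into a color symbol plus a shape symbol."""
    c_sym = {c: language[(c, SHAPES[0])][0] for c in COLORS}
    s_sym = {s: language[(COLORS[0], s)][1] for s in SHAPES}
    return all(language[(c, s)] == c_sym[c] + s_sym[s]
               for c in COLORS for s in SHAPES)
```

The transmission bottleneck is what does the work: a learner can only reproduce a holistic signal it actually saw, whereas the compositional generalizations it produces for unseen meanings survive into the next generation, so structure is incrementally amplified over the chain.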
Main file: compositional_languages_emerge.pdf (1.35 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-02914840, version 1 (07-09-2021)

Identifiers

  • HAL Id: hal-02914840, version 1

Cite

Yi Ren, Shangmin Guo, Matthieu Labeau, Shay B Cohen, Simon Kirby. Compositional Languages Emerge in a Neural Iterated Learning Model. 8th International Conference on Learning Representations, Apr 2020, Addis Ababa, Ethiopia. ⟨hal-02914840⟩