Conference paper, 2017

Expressive Power of Evolving Neural Networks Working on Infinite Input Streams

Abstract

Evolving recurrent neural networks represent a natural model of computation beyond the Turing limits. Here, we consider evolving recurrent neural networks working on infinite input streams. The expressive power of these networks is related to their attractor dynamics and is measured by the topological complexity of their underlying neural ω-languages. In this context, the deterministic and non-deterministic evolving neural networks recognize the (boldface) topological classes of BC(Π^0_2) and Σ^1_1 ω-languages, respectively. These results can be significantly refined: the deterministic and non-deterministic evolving networks which employ α ∈ 2^ω as their sole binary evolving weight recognize the (lightface) relativized topological classes of BC(Π^0_2)(α) and Σ^1_1(α) ω-languages, respectively. As a consequence, a proper hierarchy of classes of evolving neural nets, based on the complexity of their underlying evolving weights, can be obtained. The hierarchy contains chains of length ω_1 as well as uncountable antichains.
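
For readers unfamiliar with the model, the following is a minimal sketch of the network dynamics, assuming the saturated-linear formulation standard in the authors' earlier work on evolving recurrent neural networks; the symbols N, M, a_{ij}, b_{ij}, c_i and σ below are illustrative, and the exact definitions are those given in the paper:

x_i(t+1) = \sigma\!\left( \sum_{j=1}^{N} a_{ij}(t)\, x_j(t) + \sum_{j=1}^{M} b_{ij}(t)\, u_j(t) + c_i(t) \right), \quad 1 \le i \le N,

\sigma(x) = \begin{cases} 0 & \text{if } x < 0, \\ x & \text{if } 0 \le x \le 1, \\ 1 & \text{if } x > 1. \end{cases}

Here u(t) denotes the t-th letter of the infinite input stream, and the weights a_{ij}(t), b_{ij}(t), c_i(t) are allowed to change over time. In the refined (lightface) results, a single weight is binary and evolving, taking value α(t) at time t for some α ∈ 2^ω, the remaining weights presumably being static. Acceptance of an infinite input stream is then determined by the attractor dynamics of the corresponding infinite run, and the neural ω-language of a network is the set of streams it accepts.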
Main file: FCT_2017_v5.pdf (383.08 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-01590650, version 1 (19-09-2017)

Identifiers

Cite

Jérémie Cabessa, Olivier Finkel. Expressive Power of Evolving Neural Networks Working on Infinite Input Streams. 21st International Symposium on Fundamentals of Computation Theory (FCT 2017), Sep 2017, Bordeaux, France. pp.150-163, ⟨10.1007/978-3-662-55751-8_13⟩. ⟨hal-01590650⟩