Expressive power of first-order recurrent neural networks determined by their attractor dynamics

Details

Serval ID
serval:BIB_C1E72C518FB6
Type
Article: article from a periodical or magazine.
Collection
Publications
Title
Expressive power of first-order recurrent neural networks determined by their attractor dynamics
Journal
Journal of Computer and System Sciences
Authors
Cabessa J., Villa A.E.P.
ISSN
0022-0000
Editorial status
Published
Publication date
12/2016
Peer-reviewed
Yes
Volume
82
Issue
8
Pages
1232-1250
Language
English
Abstract
We provide a characterization of the expressive powers of several models of deterministic and nondeterministic first-order recurrent neural networks according to their attractor dynamics. The expressive power of neural nets is expressed as the topological complexity of their underlying neural ω-languages, and refers to the ability of the networks to perform more or less complicated classification tasks via the manifestation of specific attractor dynamics. In this context, we prove that most neural models under consideration are strictly more powerful than Muller Turing machines. These results provide new insights into the computational capabilities of recurrent neural networks.
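To make the acceptance mechanism described in the abstract concrete, the following minimal Python sketch shows a Boolean first-order recurrent network reading an infinite binary stream and classifying it by a Muller-style condition on its attractor dynamics: the run is accepted if the set of activation patterns visited infinitely often belongs to a prescribed table. The network size, weights, accepting table, and the finite-horizon approximation of an omega-run are all hypothetical choices made for illustration; they are not the construction from the article.

import numpy as np
from itertools import chain, cycle, repeat

def step(state, inp, W, U, b):
    # One first-order update: hard threshold of an affine map of state and input.
    return (W @ state + U @ inp + b > 0).astype(int)

def recurring_states(history, burn_in):
    # Finite-horizon stand-in for "the set of states visited infinitely often":
    # keep only the activation patterns seen after a long burn-in prefix.
    return {tuple(s) for s in history[burn_in:]}

def accepts(stream, W, U, b, table, horizon=10_000, burn_in=9_000):
    # Muller-style acceptance on the attractor: accept iff the (approximated)
    # recurring set of activation patterns belongs to the accepting table.
    state = np.zeros(W.shape[0], dtype=int)
    history = []
    for _ in range(horizon):
        state = step(state, np.array([next(stream)]), W, U, b)
        history.append(state.copy())
    return recurring_states(history, burn_in) in table

# Hypothetical example: a single neuron that mirrors its input bit; the table
# [{(1,)}] accepts exactly those streams whose recurring pattern is "always 1".
W, U, b = np.array([[0.0]]), np.array([[1.0]]), np.array([-0.5])
table = [{(1,)}]
print(accepts(chain([0, 1, 0], repeat(1)), W, U, b, table))  # True: eventually all 1s
print(accepts(cycle([0, 1]), W, U, b, table))                # False: 0 and 1 both recur

The example is only meant to convey how classification of infinite inputs can hinge on which attractor the dynamics settle into; the article itself studies several deterministic and nondeterministic variants of such networks and compares their omega-languages with those of Muller Turing machines.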
Keywords
Theoretical Computer Science, Computer Networks and Communications, Computational Theory and Mathematics, Applied Mathematics
Record created
03/08/2017 13:38
Record last modified
20/08/2019 15:36