Computational capabilities of recurrent neural networks based on their attractor dynamics

Details

Serval ID
serval:BIB_D6092D88FAF5
Type
Inproceedings: an article in conference proceedings.
Collection
Publications
Title
Computational capabilities of recurrent neural networks based on their attractor dynamics
Title of the conference
2015 International Joint Conference on Neural Networks (IJCNN)
Author(s)
Cabessa J., Villa A.E.P.
Publisher
IEEE
Address
Killarney, Ireland
ISBN
978-1-4799-1960-4
ISSN
2161-4393
Publication state
Published
Issued date
07/2015
Peer-reviewed
Yes
Language
English
Abstract
We consider a model of so-called hybrid recurrent neural networks composed of Boolean input and output cells as well as sigmoid internal cells. When subjected to an infinite binary input stream, the Boolean output cells necessarily exhibit some attractor dynamics, assumed to be of two possible kinds, namely meaningful or spurious, which underlies the emergence of spatiotemporal patterns of output discharges. In this context, we show that rational-weighted neural networks are computationally equivalent to deterministic Muller Turing machines, whereas all other models of real-weighted or evolving neural networks are equivalent to each other and strictly more powerful than deterministic Muller Turing machines. In this precise sense, the analog and evolving neural networks are super-Turing. We further provide a precise mathematical characterization of the expressive power of each of these neural models. These results generalize to the present computational context those obtained in the cases of classical as well as interactive computation. They support the idea that recurrent neural networks represent a natural model of computation beyond the Turing limits.
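The sketch below is not taken from the paper; it is a minimal illustration, with assumed sizes, weights, and threshold, of the kind of hybrid network the abstract describes: Boolean input and output cells, sigmoid internal cells, driven by a binary input stream, with the output patterns recurring in the tail of the run standing in for the attractor reached by the output cells.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class HybridRNN:
    """Illustrative hybrid recurrent network: Boolean input/output cells,
    sigmoid internal cells. Sizes, weights, and the 0.5 output threshold
    are assumptions for the sketch, not the paper's construction."""

    def __init__(self, n_in=2, n_hidden=5, n_out=2, seed=0):
        rng = np.random.default_rng(seed)
        # Rational weights (floats standing in for rationals here)
        self.W_in = rng.uniform(-1, 1, (n_hidden, n_in))
        self.W_rec = rng.uniform(-1, 1, (n_hidden, n_hidden))
        self.W_out = rng.uniform(-1, 1, (n_out, n_hidden))
        self.h = np.zeros(n_hidden)  # sigmoid internal state

    def step(self, u):
        """One update: u is a Boolean (0/1) input vector;
        returns the Boolean output vector at this time step."""
        u = np.asarray(u, dtype=float)
        self.h = sigmoid(self.W_rec @ self.h + self.W_in @ u)
        return (self.W_out @ self.h > 0.5).astype(int)  # Boolean output cells

# Drive the network with a finite prefix of a binary stream and record the
# set of output patterns still occurring late in the run (a crude proxy for
# the set of outputs visited infinitely often, i.e. the attractor).
net = HybridRNN()
seen_late = set()
for t in range(10_000):
    u = [t % 2, (t // 2) % 2]          # illustrative periodic input
    y = tuple(net.step(u))
    if t > 9_000:                      # "tail" of the run
        seen_late.add(y)
print("output patterns recurring in the tail:", seen_late)
```

In the paper's setting it is this set of output patterns visited infinitely often that plays the role of the attractor, mirroring the acceptance condition of Muller Turing machines.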
Keywords
Computational modeling, Algorithms
Create date
03/08/2017 16:04
Last modification date
21/08/2019 6:12