Book contents
- Frontmatter
- Contents
- Preface
- Acknowledgments
- 1 Introduction
- 2 The biology of neural networks: a few features for the sake of non-biologists
- 3 The dynamics of neural networks: a stochastic approach
- 4 Hebbian models of associative memory
- 5 Temporal sequences of patterns
- 6 The problem of learning in neural networks
- 7 Learning dynamics in ‘visible’ neural networks
- 8 Solving the problem of credit assignment
- 9 Self-organization
- 10 Neurocomputation
- 11 Neurocomputers
- 12 A critical view of the modeling of neural networks
- References
- Index
5 - Temporal sequences of patterns
Published online by Cambridge University Press: 30 November 2009
Summary
Something essential is missing from the description of memory introduced in the previous chapters. A neural network, even in isolation, is a continuously evolving system that never settles indefinitely into a steady state. We are able to retrieve not only single patterns but also ordered strings of patterns: a few notes are enough for an entire song to be recalled, and, after training, one can go through the complete set of movements needed to serve in tennis. Several schemes have been proposed to account for the production of memorized strings of patterns. Simulations show that they perform well, but good performance in simulation says nothing about the biological relevance of the mechanisms they involve. In fact, no observation supporting one scheme over another has been reported so far.
Parallel dynamics
Up to now the dynamics has been built so as to make the memorized patterns its fixed points: once the network settles into one pattern it stays there indefinitely, at least at low noise levels. We have seen, however, that fixed points are the asymptotic behavior of rather special neural networks, namely those that are symmetrically connected. In asymmetrically connected networks with deterministic parallel dynamics (the Little dynamics at zero noise level), limit cycles are the rule. It is therefore tempting to imagine that temporal sequences of patterns are retrieved through limit cycles.
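To make the idea concrete, here is a minimal numerical sketch, not taken from the book: it stores a cyclic sequence of random patterns in purely asymmetric couplings J_ij = (1/N) Σ_μ ξ_i^(μ+1) ξ_j^μ (a standard choice in the sequence-retrieval literature; the variable names and parameters below are ours) and iterates the deterministic parallel dynamics at zero noise. Under these assumptions the network steps from each stored pattern to the next, tracing out a limit cycle.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

N, p = 400, 4                          # neurons, patterns in the stored cycle
xi = rng.choice([-1, 1], size=(p, N))  # random +/-1 patterns xi^1 ... xi^p

# Asymmetric 'sequence' couplings: J_ij = (1/N) * sum_mu xi_i^(mu+1) xi_j^mu,
# with mu+1 taken modulo p so the stored sequence closes into a cycle.
J = (np.roll(xi, -1, axis=0).T @ xi) / N

# Deterministic parallel (Little) dynamics at zero noise level:
# every neuron updates synchronously, S_i(t+1) = sign(sum_j J_ij S_j(t)).
S = xi[0].copy()                       # start the network on pattern xi^1
for t in range(2 * p):
    S = np.where(J @ S >= 0, 1, -1)    # synchronous sign update
    m = xi @ S / N                     # overlaps m^mu with the stored patterns
    print(f"t={t + 1}  overlaps={np.round(m, 2)}")
```

Printing the overlaps m^μ = (1/N) Σ_i ξ_i^μ S_i at each step shows a single overlap close to 1 moving cyclically through μ = 2, 3, ..., p, 1, ...: the limit-cycle retrieval alluded to above. The crosstalk between random patterns is only negligible when N is large compared with p.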
An Introduction to the Modeling of Neural Networks, pp. 153–172. Publisher: Cambridge University Press. Print publication year: 1992.