Book contents
- Frontmatter
- Contents
- Preface
- Dedication
- 1 Introduction
- 2 The Basic Attractor Neural Network
- 3 General Ideas Concerning Dynamics
- 4 Symmetric Neural Networks at Low Memory Loading
- 5 Storage and Retrieval of Temporal Sequences
- 6 Storage Capacity of ANN's
- 7 Robustness - Getting Closer to Biology
- 8 Memory Data Structures
- 9 Learning
- 10 Hardware Implementations of Neural Networks
- Glossary
- Index
5 - Storage and Retrieval of Temporal Sequences
Published online by Cambridge University Press: 05 August 2012
Summary
Motivations: Introspective, Biological, Philosophical
The introspective motivation
The type of neural network described in the previous chapter is a first prototype in the sense that:
it stores a small number of patterns;
it recalls single patterns only;
once a pattern has been recalled, the system will linger on it until the coming of some unspecified dramatic event.
Such a system may find useful technical application as a rapid, robust and reliable pattern recognizer; devices of this kind are discussed in Chapter 10. It seems rather unlikely, however, that it can satisfy one's expectations of a cognitive system.
Very rudimentary introspection gives rise to the impression that, with or without explicit instruction, a single stimulus (or a very short string of stimuli) usually gives rise to the retrieval (or recall) of a whole cascade of connected ‘patterns’. Most striking are effects such as the recall of a tune, which can be provoked by a very simple stimulus not directly related to the tune itself. Similarly, rather simple stimuli bring about the recall of sequences of numbers, especially in children, or of the alphabet. Moreover, much of the input into the cognitive system seems to arrive in the form of temporal sequences, rather than single patterns. This appears to be accepted in the study of speech recognition (see e.g., ref. [1]), as well as in vision, where a strong paradigm has it that form is deciphered from motion (see e.g., ref. [2]).
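The transition from single-pattern attractors to cascades of recalled patterns can be illustrated with a minimal numerical sketch. It is not the book's own formulation, but a commonly used variant of the idea developed in this chapter: in place of the symmetric Hebbian couplings of the basic network, one stores asymmetric "transition" couplings that map each pattern onto its successor, so that cueing the network with the first pattern makes the dynamics step through the whole sequence. All names and parameter values below are illustrative choices, not quantities taken from the text.

```python
import numpy as np

rng = np.random.default_rng(0)
N, p = 500, 5                              # neurons; length of the stored sequence
xi = rng.choice([-1, 1], size=(p, N))      # p random +/-1 patterns xi^1 ... xi^p

# Asymmetric couplings J_ij = (1/N) sum_mu xi^{mu+1}_i xi^{mu}_j (cyclic in mu),
# so that the network state is pushed from pattern mu toward pattern mu+1.
J = sum(np.outer(xi[(mu + 1) % p], xi[mu]) for mu in range(p)) / N

s = xi[0].copy()                           # cue: present only the first pattern
overlaps = []
for t in range(1, p):
    s = np.sign(J @ s)                     # synchronous zero-temperature update
    overlaps.append(float(xi[t] @ s) / N)  # overlap with the expected next pattern

print(overlaps)                            # each overlap should be close to 1
```

At this low loading (p much smaller than N) the cross-talk noise is small, so each update carries the state almost exactly onto the next pattern in the sequence: a single cue retrieves the entire cascade, in the spirit of the tune-recall example above.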
Modeling Brain Function: The World of Attractor Neural Networks, pp. 215–270. Publisher: Cambridge University Press. Print publication year: 1989.