Book contents
- Frontmatter
- Contents
- Acknowledgements
- Conventions on notation
- Tour d'Horizon
- Part I Distributional networks
- Part II Artificial neural networks
- Part III Processing networks
- Part IV Communication networks
- Appendix 1 Spatial integrals for the telephone problem
- Appendix 2 Bandit and tax processes
- Appendix 3 Random graphs and polymer models
- References
- Index
Part II - Artificial neural networks
Published online by Cambridge University Press: 23 November 2009
Summary
Artificial neural networks (with the recognised abbreviation of ANN) embody a fascinating concept: a stripped-down idealisation of the biological neural network, with the promise of genuine application. The state of the net is described by the activities at its nodes, rather than, as hitherto, by the flows on its arcs. There are such flows, however: the output of a node is a particular nonlinear function (the activation function) of a linear combination of its inputs from other nodes. It is the coefficients of these linear functions (the weights) that parameterise the net. The specimen aim to which we shall largely restrict ourselves is: to choose the weights so that the system output of the network shall approximate as well as possible a prescribed function of system input. Here by ‘system output’ we mean the output from a recognised subset of ‘output nodes’, and by ‘system input’ the environmental input to a recognised subset of ‘input nodes’. However, this design problem is solved, not by a deliberate optimisation algorithm, but by an adaptive procedure that modifies the weights iteratively in such a way as to bring net input and net output into the desired correspondence.
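To make the summary concrete, the sketch below shows a single node of this kind in Python. The text does not fix an activation function or a training rule, so a sigmoid activation, a squared-error criterion, and a plain gradient-descent weight update are assumed here as stand-ins for the adaptive procedure; the target function, learning rate, and data are hypothetical.

```python
# Illustrative sketch only: sigmoid activation and gradient-descent
# weight updates are assumptions, not the book's specific procedure.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(u):
    """Assumed activation function f(u) = 1 / (1 + exp(-u))."""
    return 1.0 / (1.0 + np.exp(-u))

def node_output(w, x):
    # A node's output is a nonlinear function (the activation function)
    # of a linear combination of its inputs, parameterised by weights w.
    return sigmoid(x @ w)

def target(x):
    # Hypothetical prescribed function of system input that the
    # net should learn to approximate.
    return sigmoid(1.5 * x[:, 0] - 0.5 * x[:, 1])

# Environmental inputs presented to the input nodes.
X = rng.normal(size=(200, 2))
y = target(X)

# Adaptive procedure: modify the weights iteratively so that the node's
# output is brought into correspondence with the prescribed output.
w = rng.normal(scale=0.1, size=2)
eta = 0.5                                   # learning rate (assumed)
for epoch in range(500):
    out = node_output(w, X)
    err = out - y
    grad = X.T @ (err * out * (1 - out)) / len(X)   # gradient of squared error
    w -= eta * grad

print("learned weights:", w)                # should approach (1.5, -0.5)
print("mean squared error:", np.mean((node_output(w, X) - y) ** 2))
```

The same loop structure carries over to multi-node nets: each node computes its activation from weighted inputs, and the weight updates are applied iteratively as inputs are presented, rather than by solving the design problem in one deliberate optimisation step.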
To clarify the biological parallel: the activation function provides a crude version of the operation of a biological neuron; the weightings represent the strengths of interneuronal connections, the axons, and so define the net. The animal neural system is a remarkable processor, converting sensory inputs into what one might term inferences on the environment and then into stimuli for appropriate action.
- Type: Chapter
- Information: Networks: Optimisation and Evolution, pp. 135-136
- Publisher: Cambridge University Press
- Print publication year: 2007