
Graphs and Markov chains

Published online by Cambridge University Press:  22 September 2016

J. V. Greenman*
Affiliation:
University of Essex, Wivenhoe Park, Colchester CO4 3SQ

Extract

In its simplest form the theory of Markov chains states that, if the (i, j) element of matrix M is the probability p_ij of a one-step transition from state i to state j, then the (i, j) element of the matrix M^k gives the probability of transition from state i to state j in k steps. Since the probabilities of transition to the different states add to 1, the sum of the elements in any row of M is necessarily equal to 1. (Readers are warned that in some elementary treatments the alternative convention is adopted of writing p_ij in the (j, i) position, so that the column sums are 1, in order that the probability vectors should be column matrices, which are more familiar in school mathematics.)
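The row convention described above can be illustrated with a small numerical sketch (the 3-state transition matrix below is a made-up example, not taken from the article): each row of M sums to 1, and the k-step transition probabilities are the entries of the matrix power M^k.

```python
import numpy as np

# Hypothetical 3-state chain: element (i, j) of M is the probability p_ij
# of a one-step transition from state i to state j (row convention).
M = np.array([
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.4, 0.4, 0.2],
])

# Each row is a probability distribution, so every row sum is 1.
assert np.allclose(M.sum(axis=1), 1.0)

# Element (i, j) of M^k is the probability of moving from state i
# to state j in exactly k steps.
M3 = np.linalg.matrix_power(M, 3)

# M^k is again a stochastic matrix: its row sums are also 1.
assert np.allclose(M3.sum(axis=1), 1.0)

print(M3)
```

Under the alternative (column) convention mentioned in the extract, one would instead work with the transpose of M, so that column sums are 1 and probability vectors are column matrices.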

Type: Research Article

Copyright © Mathematical Association 1977

