Introduction
Summary
A brief overview of state space analysis
Mathematical background
In probability theory and statistics, a random variable is a variable whose value is subject to variation due to chance; it can take on any of a set of possible values. The mathematical function describing the possible values of a random variable and their associated probabilities is known as a probability distribution. Random variables can be discrete, taking any value in a specified finite or countable list and endowed with a probability mass function (pmf); continuous, taking any numerical value in an interval or collection of intervals and described by a probability density function (pdf); or a mixture of both types. From either the pmf or the pdf, one can characterize the moment or cumulant statistics of a random variable, such as the mean, variance, covariance, skewness and kurtosis.
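As a minimal numerical sketch (the support and pmf below are hypothetical, chosen purely for illustration), these moment statistics can be computed directly from a pmf; for a continuous random variable, the sums would be replaced by integrals over the pdf.

```python
import numpy as np

# Hypothetical discrete random variable: its support and pmf are assumed
# purely for illustration (the probabilities sum to one).
values = np.array([0.0, 1.0, 2.0, 3.0])
pmf = np.array([0.1, 0.4, 0.3, 0.2])

mean = np.sum(values * pmf)                               # first moment
var = np.sum((values - mean) ** 2 * pmf)                  # variance
skew = np.sum((values - mean) ** 3 * pmf) / var ** 1.5    # skewness
kurt = np.sum((values - mean) ** 4 * pmf) / var ** 2      # kurtosis

print(mean, var, skew, kurt)
```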
To represent the evolution of a random variable over time, a random process is further introduced. A stochastic process is a collection of random variables and is the probabilistic counterpart of a deterministic process. Whereas a deterministic process is governed by an ordinary differential equation, a stochastic process involves some indeterminacy: given an identical initial condition, the evolution of the process may vary due to the presence of noise. In discrete time, a stochastic process consists of a sequence of random variables and the time series associated with them. A stochastic process is said to be strictly stationary if its joint probability distribution does not change when shifted in time, and wide-sense stationary (WSS) if its first moment and covariance statistics do not vary with time. Any strictly stationary process with finite mean and covariance is also WSS.
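The sketch below, assuming a first-order autoregressive (AR(1)) process with an illustrative coefficient a = 0.8, shows one way to probe wide-sense stationarity empirically: the sample mean is roughly constant across halves of the realization, and the covariance depends only on the lag (for a stationary AR(1) process, R(1) = a·R(0)).

```python
import numpy as np

rng = np.random.default_rng(0)

# Stationary AR(1) process x[t] = a * x[t-1] + w[t] with |a| < 1, driven by
# Gaussian white noise; the coefficient and length are assumed for illustration.
a, n = 0.8, 100_000
x = np.zeros(n)
for t in range(1, n):
    x[t] = a * x[t - 1] + rng.normal()

def autocov(x, lag):
    """Sample autocovariance at a given lag."""
    xc = x - x.mean()
    return np.mean(xc * xc) if lag == 0 else np.mean(xc[:-lag] * xc[lag:])

# For a WSS process the mean is (roughly) constant over time and the
# covariance depends only on the lag; for AR(1), R(1) is approximately a * R(0).
print(x[: n // 2].mean(), x[n // 2 :].mean())
print(autocov(x, 1), a * autocov(x, 0))
```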
A Markov chain (or Markov process), named after the Russian mathematician Andrey Markov (1856–1922), is a random process that undergoes transitions from one state to another on a state space. The Markov chain is memoryless: the next state depends only on the current state and not on the sequence of events that preceded it. This memoryless property is called the Markov property.
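As a minimal sketch of the Markov property, the simulation below (with a hypothetical two-state transition matrix P) draws each new state using only the row of P indexed by the current state, so the earlier trajectory plays no role in the transition.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical transition matrix of a two-state Markov chain;
# P[i, j] is the probability of moving from state i to state j.
P = np.array([[0.9, 0.1],
              [0.3, 0.7]])

state = 0
trajectory = [state]
for _ in range(20):
    # Markov property: the next state is sampled from the row of P indexed
    # by the current state only; earlier states are never consulted.
    state = rng.choice(len(P), p=P[state])
    trajectory.append(state)

print(trajectory)
```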