Markov chain
A Markov chain (named in honor of Andrei Andreevich Markov) is a stochastic process with what is called the Markov property, of which there are "discrete-time" and "continuous-time" versions. In the discrete-time case, the process consists of a sequence X_{1}, X_{2}, X_{3}, ... of random variables taking values in a "state space", the value of X_{n} being "the state of the system at time n". The (discrete-time) Markov property says that the conditional distribution of the "future" X_{n+1}, X_{n+2}, X_{n+3}, ..., given the "past" X_{1}, ..., X_{n}, depends on the past only through X_{n}. In other words, knowledge of the most recent past state of the system renders knowledge of less recent history irrelevant. Each particular Markov chain may be identified with its matrix of "transition probabilities", often called simply its
transition matrix. The entries in the transition matrix are given by p_{ij} = the probability that the system will be in state j "tomorrow" given that it is in state i "today". The (i, j) entry in the kth power of the transition matrix is the conditional probability that k "days" in the future the system will be in state j, given that it is in state i "today". A matrix is a stochastic matrix if and only if it is the matrix of transition probabilities of some Markov chain.
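The relationship between the transition matrix and multi-step probabilities can be sketched in a few lines of Python. The two-state "weather" chain below is a hypothetical example, not taken from the text; it checks that the entries of the squared matrix give the two-step transition probabilities and that each row of a power of a stochastic matrix still sums to 1.

```python
# A hypothetical two-state chain: state 0 = "sunny", state 1 = "rainy".
# Row i lists the probabilities of moving from state i "today" to each
# state "tomorrow"; every row of a transition matrix sums to 1.
P = [[0.9, 0.1],
     [0.5, 0.5]]

def matmul(A, B):
    """Multiply two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][m] * B[m][j] for m in range(n)) for j in range(n)]
            for i in range(n)]

def k_step(P, k):
    """k-th power of P: entry (i, j) is the probability that the system
    is in state j after k steps, given that it starts in state i."""
    Q = P
    for _ in range(k - 1):
        Q = matmul(Q, P)
    return Q

P2 = k_step(P, 2)
# Two-step probability sunny -> rainy: 0.9*0.1 + 0.1*0.5 = 0.14
```

For instance, `P2[0][1]` combines the two ways of being rainy two days after a sunny day: staying sunny then turning rainy, or turning rainy and staying rainy.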
Markov chains are used to model various processes in queuing theory and statistics, and can also be used as a signal model in entropy-coding techniques such as arithmetic coding. Markov chains also have many biological applications, particularly to population processes, which are useful in modelling processes that are (at least) analogous to biological populations. Furthermore, the concept of Markov chains has been used in bioinformatics; an example is the GeneMark algorithm for coding-region/gene prediction.
Markov processes can also be used to generate superficially "real-looking" text given a sample document: they are used in various pieces of recreational "parody generator" software (see Jeff Harrison).
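A minimal word-level text generator of this kind treats each word as a state and the words that follow it in the sample as its possible successors. The sketch below is an illustration only (the sample sentence and function names are invented, and real parody generators typically use longer contexts than a single word):

```python
import random

def build_chain(text):
    """Map each word to the list of words that follow it in the sample.
    Repeats are kept, so more frequent successors are more likely."""
    words = text.split()
    chain = {}
    for cur, nxt in zip(words, words[1:]):
        chain.setdefault(cur, []).append(nxt)
    return chain

def generate(chain, start, length, seed=None):
    """Walk the chain from a start word, sampling each next word from
    the successors of the current one."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length - 1):
        followers = chain.get(out[-1])
        if not followers:  # dead end: the last word of the sample
            break
        out.append(rng.choice(followers))
    return " ".join(out)

chain = build_chain("the cat sat on the mat and the cat ran")
text = generate(chain, "the", 8, seed=1)
```

Because the next word depends only on the current word, the output is locally plausible but globally meaningless, which is exactly the "superficially real-looking" effect described above.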