A Markov chain is a collection of random variables {X_t} (where the index t runs through 0, 1, ...) having the property that, given the present, the future is conditionally independent of the past.
In other words, if a Markov sequence of random variates X_n takes the discrete values a_1, ..., a_N, then

P(x_n = a_{i_n} | x_{n-1} = a_{i_{n-1}}, ..., x_1 = a_{i_1}) = P(x_n = a_{i_n} | x_{n-1} = a_{i_{n-1}}),

and the sequence x_n is called a Markov chain (Papoulis 1984, p. 532).
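The defining property above — that the next state is drawn from a distribution depending only on the current state — can be illustrated with a short simulation. The following sketch uses a hypothetical two-state transition matrix (the states, probabilities, and function name are illustrative assumptions, not from the source):

```python
import random

def simulate_markov_chain(transition, start, steps, rng=None):
    """Simulate a finite-state Markov chain.

    transition[i][j] is P(X_{n+1} = j | X_n = i); note the sampling
    below consults only the current state, never earlier history,
    which is exactly the Markov property.
    """
    rng = rng or random.Random()
    state = start
    path = [start]
    for _ in range(steps):
        r = rng.random()
        cumulative = 0.0
        for j, p in enumerate(transition[state]):
            cumulative += p
            if r < cumulative:
                state = j
                break
        path.append(state)
    return path

# Hypothetical two-state chain: state 0 = "sunny", state 1 = "rainy".
transition = [[0.9, 0.1],
              [0.5, 0.5]]
path = simulate_markov_chain(transition, start=0, steps=10,
                             rng=random.Random(42))
print(path)
```

Each row of the transition matrix sums to 1, so each row is a valid conditional distribution over next states.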
A simple random walk is an example of a Markov chain.
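A simple symmetric random walk can be sketched directly: from position x the walker moves to x+1 or x-1 with probability 1/2 each, so the next position depends only on the current one. The function name below is an illustrative choice:

```python
import random

def simple_random_walk(steps, rng=None):
    """Simple symmetric random walk on the integers.

    From position x, move to x+1 or x-1 with probability 1/2 each.
    The next position depends only on the current position, so the
    walk is a Markov chain.
    """
    rng = rng or random.Random()
    position = 0
    path = [position]
    for _ in range(steps):
        position += rng.choice((-1, 1))
        path.append(position)
    return path

walk = simple_random_walk(20, rng=random.Random(0))
print(walk)
```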
The Season 1 episode "Man Hunt" (2005) of the television crime drama NUMB3RS
features Markov chains.