A sequence $x_1$, $x_2$, ... of random variates is called Markov (or Markoff) if, for any $n$,

$$F(x_n \mid x_{n-1}, x_{n-2}, \ldots, x_1) = F(x_n \mid x_{n-1}),$$

i.e., if the conditional distribution $F$ of $x_n$ assuming $x_{n-1}$, $x_{n-2}$, ..., $x_1$ equals the conditional distribution of $x_n$ assuming only $x_{n-1}$ (Papoulis 1984, pp. 528-529). The transition densities of a Markov sequence satisfy the Chapman-Kolmogorov equation.
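As an illustrative sketch (not from the source), the defining property can be checked empirically: in a simulated two-state Markov chain, the conditional frequency of the next state given only $x_{n-1}$ should agree with the conditional frequency given a longer history. All function names below (`simulate_markov`, `cond_freq`) and the transition probability `p_stay` are hypothetical choices for this example.

```python
import random

def simulate_markov(n_steps, p_stay=0.7, seed=0):
    """Simulate a two-state (0/1) Markov chain in which the next state
    equals the current state with probability p_stay (an assumed
    transition rule for illustration only)."""
    rng = random.Random(seed)
    x = [0]
    for _ in range(n_steps - 1):
        x.append(x[-1] if rng.random() < p_stay else 1 - x[-1])
    return x

def cond_freq(x, history):
    """Empirical frequency of x_n == 1 given that the immediately
    preceding states match `history` (a tuple ending at x_{n-1})."""
    k = len(history)
    hits = [x[i] for i in range(k, len(x)) if tuple(x[i - k:i]) == history]
    return sum(hits) / len(hits)

chain = simulate_markov(200_000)

# Conditioning on x_{n-1} = 1 alone ...
p_short = cond_freq(chain, (1,))
# ... versus conditioning on the longer history (x_{n-2}, x_{n-1}) = (0, 1):
p_long = cond_freq(chain, (0, 1))

# For a Markov sequence the extra history is irrelevant, so the two
# estimates should both be close to p_stay = 0.7.
print(p_short, p_long)
```

The two printed frequencies agree (up to sampling noise), which is exactly the defining statement above: conditioning on the full past gives the same distribution as conditioning on $x_{n-1}$ alone.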