Definition

Markov Chain

A Markov chain is a discrete-time stochastic process that satisfies the Markov property: the conditional distribution of the next state depends only on the current state, not on the full history.

Its evolution is fully specified by:

  • A finite or countably infinite state space S.
  • Transition probabilities p_ij = P(X_{n+1} = j | X_n = i) for all i, j ∈ S.
  • An initial distribution μ on S, with μ(i) = P(X_0 = i).

The transition probabilities are often collected in a transition matrix P with entries P_ij = p_ij, where each row sums to 1.
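As a concrete illustration (a hypothetical two-state weather chain, not taken from the source), the transition matrix and a simulation of a sample path can be sketched in Python:

```python
import numpy as np

# Hypothetical two-state chain: 0 = "sunny", 1 = "rainy".
# Row i holds the transition probabilities out of state i,
# so each row must sum to 1.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

def simulate(P, start, n_steps, rng):
    """Draw a sample path X_0, ..., X_n; each step uses only the current state."""
    states = [start]
    for _ in range(n_steps):
        current = states[-1]
        # The Markov property: the next state is drawn from row `current`
        # of P, with no reference to earlier states in the path.
        states.append(rng.choice(len(P), p=P[current]))
    return states

rng = np.random.default_rng(0)
path = simulate(P, start=0, n_steps=10, rng=rng)  # X_0 through X_10
```

The rows of P being probability distributions is exactly the requirement that, from every state, the chain moves somewhere with total probability 1.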

Properties

Time-homogeneity

A Markov chain is time-homogeneous if the transition probabilities do not change over time: P(X_{n+1} = j | X_n = i) is the same for all n. Otherwise it is time-inhomogeneous.

Stationary distribution

A probability distribution π over S is stationary if it satisfies π = πP, i.e. π(j) = Σ_{i∈S} π(i) p_ij for all j ∈ S. If the chain is irreducible and aperiodic (and, for an infinite state space, positive recurrent), it converges to its unique stationary distribution regardless of the initial state.
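For a finite chain, a stationary distribution can be computed as a left eigenvector of the transition matrix for eigenvalue 1. A sketch with NumPy, reusing the same hypothetical two-state matrix:

```python
import numpy as np

P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# pi P = pi means pi is a left eigenvector of P with eigenvalue 1,
# equivalently a right eigenvector of P transposed.
eigvals, eigvecs = np.linalg.eig(P.T)
idx = np.argmin(np.abs(eigvals - 1.0))   # pick the eigenvalue closest to 1
pi = np.real(eigvecs[:, idx])
pi = pi / pi.sum()                       # normalize to a probability vector

# For this matrix, pi = [5/6, 1/6]: the chain spends 5/6 of its
# time in state 0 in the long run.

# Convergence check: repeated multiplication by P drives any starting
# distribution toward pi (this chain is irreducible and aperiodic).
mu = np.array([0.0, 1.0])                # start in state 1 with certainty
for _ in range(100):
    mu = mu @ P
```

Here `mu` ends up numerically indistinguishable from `pi`, illustrating the convergence claim above.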