Absorbing Markov chain

Figure: A (finite) drunkard's walk is an example of an absorbing Markov chain.[1]

In the mathematical theory of probability, an absorbing Markov chain is a Markov chain in which every state can reach an absorbing state. An absorbing state is a state that, once entered, cannot be left.
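The following minimal sketch illustrates this definition on the drunkard's walk mentioned above. It assumes Python with NumPy, and the five-state chain (with absorbing endpoints at states 0 and 4) is chosen purely for illustration; a state i is absorbing exactly when the transition matrix has p_ii = 1.

    # Minimal sketch (assumes NumPy): a five-state drunkard's walk.
    # States 0 and 4 are absorbing; from an interior state the walker
    # moves one step left or right with probability 1/2 each.
    import numpy as np

    P = np.array([
        [1.0, 0.0, 0.0, 0.0, 0.0],   # state 0: absorbing (p_00 = 1)
        [0.5, 0.0, 0.5, 0.0, 0.0],
        [0.0, 0.5, 0.0, 0.5, 0.0],
        [0.0, 0.0, 0.5, 0.0, 0.5],
        [0.0, 0.0, 0.0, 0.0, 1.0],   # state 4: absorbing (p_44 = 1)
    ])

    # A state i is absorbing iff P[i, i] == 1.
    absorbing = [i for i in range(len(P)) if P[i, i] == 1.0]
    print("absorbing states:", absorbing)  # [0, 4]

    # Every state can reach an absorbing state, so iterating the chain
    # drives all probability mass into states 0 and 4.
    dist = np.array([0.0, 0.0, 1.0, 0.0, 0.0])  # start in the middle
    for _ in range(200):
        dist = dist @ P
    print(dist.round(3))  # approximately [0.5, 0, 0, 0, 0.5]

Starting from the middle state, the chain is absorbed at either endpoint with probability 1/2 by symmetry, as the iterated distribution above shows.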

As with general Markov chains, there can be continuous-time absorbing Markov chains with an infinite state space. However, this article concentrates on the discrete-time, discrete-state-space case.

  1. Grinstead, Charles M.; Snell, J. Laurie (July 1997). "Ch. 11: Markov Chains" (PDF). Introduction to Probability. American Mathematical Society. ISBN 978-0-8218-0749-1.