
Absorbing Markov chain

In the mathematical theory of probability, an absorbing Markov chain is a Markov chain in which every state can reach an absorbing state. An absorbing state is a state that, once entered, cannot be left.
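A standard way to analyze such a chain is to write its transition matrix in canonical form, with transient states listed before absorbing states, and compute the fundamental matrix N = (I − Q)⁻¹, where Q holds the transitions among transient states. The sketch below illustrates this on a small hypothetical three-state chain (the specific matrix is an invented example, not from the source); it derives the expected number of steps before absorption and the absorption probabilities.

```python
import numpy as np

# Hypothetical transition matrix in canonical form:
# states 0 and 1 are transient, state 2 is absorbing.
P = np.array([
    [0.5, 0.3, 0.2],
    [0.2, 0.5, 0.3],
    [0.0, 0.0, 1.0],
])

Q = P[:2, :2]   # transitions among transient states
R = P[:2, 2:]   # transitions from transient to absorbing states

# Fundamental matrix: N[i, j] is the expected number of visits
# to transient state j when starting from transient state i.
N = np.linalg.inv(np.eye(2) - Q)

# Expected number of steps before absorption from each transient state.
t = N @ np.ones(2)

# Absorption probabilities B = N R; with a single absorbing
# state every row sums to 1, so each entry here is 1.
B = N @ R

print(t)  # expected steps to absorption from states 0 and 1
print(B)  # probability of ending in the absorbing state
```

Because state 2 is the only absorbing state and both transient states can reach it, absorption happens with probability 1, which is why every entry of B equals 1.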

Related topics: Variable-order Markov model, Balance equation, Markov property