
Markov renewal process

In probability and statistics, a Markov renewal process (MRP) is a random process that generalizes the notion of Markov jump processes. Other random processes such as Markov chains, Poisson processes, and renewal processes can be derived as special cases of an MRP.

Consider a state space $S$ and a sequence of random variables $(X_n, T_n)$, where the $T_n$ are the jump times and the $X_n$ are the associated states in the Markov chain (see figure). Let the inter-arrival times be $\tau_n = T_n - T_{n-1}$. Then the sequence $(X_n, T_n)$ is called a Markov renewal process if

$$\Pr(\tau_{n+1} \le t,\ X_{n+1} = j \mid (X_0, T_0), (X_1, T_1), \ldots, (X_n, T_n)) = \Pr(\tau_{n+1} \le t,\ X_{n+1} = j \mid X_n)$$

for all $n \ge 1$, $t \ge 0$, and $j \in S$.
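
The following is a minimal simulation sketch in Python that makes the definition concrete. All specifics are illustrative assumptions rather than part of the source: a two-state space {"a", "b"}, a hypothetical transition matrix `P`, and gamma-distributed holding times whose parameters depend on the current and next state. The point is that, unlike in a Markov jump process, the sojourn times of an MRP need not be exponential; the joint law of the next state and the inter-arrival time only has to depend on the history through the current state.

```python
import random

# Hypothetical two-state example: S = {"a", "b"}.
# P[x][y] is the probability that the next state is y given current state x.
P = {"a": {"a": 0.3, "b": 0.7},
     "b": {"a": 0.6, "b": 0.4}}

# Gamma (shape, scale) parameters for the holding time, chosen per
# (current state, next state) pair -- purely illustrative numbers.
HOLD = {("a", "a"): (2.0, 0.5), ("a", "b"): (1.0, 1.0),
        ("b", "a"): (3.0, 0.2), ("b", "b"): (1.5, 0.8)}

def simulate_mrp(n_jumps, x0="a", seed=0):
    """Return the sequences X_n (states) and T_n (jump times), with T_0 = 0."""
    rng = random.Random(seed)
    X, T = [x0], [0.0]
    for _ in range(n_jumps):
        x = X[-1]
        # Next state X_{n+1} depends only on the current state X_n ...
        nxt = rng.choices(list(P[x]), weights=list(P[x].values()))[0]
        # ... and the inter-arrival time tau_{n+1} = T_{n+1} - T_n depends
        # only on (X_n, X_{n+1}), never on the earlier history.
        shape, scale = HOLD[(x, nxt)]
        tau = rng.gammavariate(shape, scale)
        X.append(nxt)
        T.append(T[-1] + tau)
    return X, T

if __name__ == "__main__":
    X, T = simulate_mrp(5)
    for x, t in zip(X, T):
        print(f"state {x!r} entered at time {t:.3f}")
```

Specializing this sketch recovers the special cases mentioned above: exponential holding times whose rate depends only on the current state give a Markov jump process, constant holding times of 1 give a discrete-time Markov chain, and a single state with exponential holding times gives a Poisson process.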

[ "Variable-order Markov model", "Markov property" ]