
Stopping time

In probability theory, in particular in the study of stochastic processes, a stopping time (also Markov time, Markov moment, optional stopping time or optional time) is a specific type of “random time”: a random variable whose value is interpreted as the time at which a given stochastic process exhibits a certain behavior of interest. A stopping time is often defined by a stopping rule, a mechanism for deciding whether to continue or stop a process on the basis of the present position and past events, and which will almost always lead to a decision to stop at some finite time. Stopping times occur in decision theory, and the optional stopping theorem is an important result in this context. Stopping times are also frequently applied in mathematical proofs to “tame the continuum of time”, as Chung put it in his book (1982).

Stopping times are also used to localize properties of stochastic processes:

Local martingale process. A process X is a local martingale if it is càdlàg and there exists a sequence of stopping times τ_n increasing to infinity, such that the stopped process X^{τ_n} 1_{{τ_n > 0}} is a martingale for each n.

Locally integrable process. A non-negative and increasing process X is locally integrable if there exists a sequence of stopping times τ_n increasing to infinity, such that E[X_{τ_n} 1_{{τ_n > 0}}] < ∞ for each n.
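As a concrete illustration of a localizing sequence (a sketch in Python, with all names chosen here for illustration): the first-exit times τ_n = inf{t : |X_t| ≥ n} of a simple symmetric random walk form a nondecreasing sequence of stopping times that increases to infinity.

```python
import random

random.seed(0)

# Simulate a simple symmetric random walk (a martingale) over a fixed horizon.
walk = [0]
for _ in range(10_000):
    walk.append(walk[-1] + random.choice([-1, 1]))

def first_exit(path, level):
    """tau_n = inf{t : |X_t| >= level}, a stopping time: the decision to
    stop at time t inspects only the path up to time t."""
    for t, x in enumerate(path):
        if abs(x) >= level:
            return t
    return None  # level never reached within the simulated horizon

# The localizing sequence tau_1 <= tau_2 <= ... is nondecreasing, because a
# unit-step walk must pass through level n before it can reach level n + 1.
taus = [first_exit(walk, n) for n in range(1, 6)]
print(taus)
```

Running the simulation for a long horizon makes it overwhelmingly likely that every level up to 5 is reached, so each τ_n is finite here.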
Definition

Discrete time. Let τ be a random variable defined on the filtered probability space (Ω, F, (F_n)_{n∈ℕ}, P) with values in ℕ ∪ {+∞}. Then τ is called a stopping time (with respect to the filtration 𝔽 = (F_n)_{n∈ℕ}) if the following condition holds:

{τ = n} ∈ F_n for all n ∈ ℕ.

Intuitively, this condition means that the “decision” of whether to stop at time n must be based only on the information present at time n, not on any future information.

Continuous time. Let τ be a random variable defined on the filtered probability space (Ω, F, (F_t)_{t∈T}, P) with values in T. In most cases, T = [0, +∞). Then τ is called a stopping time (with respect to the filtration 𝔽 = (F_t)_{t∈T}) if the following condition holds:

{τ ≤ t} ∈ F_t for all t ∈ T.

As adapted process. Let τ be a random variable defined on the filtered probability space (Ω, F, (F_t)_{t∈T}, P) with values in T. Then τ is called a stopping time if and only if the stochastic process X = (X_t)_{t∈T}, defined by

X_t := 0 if t < τ and X_t := 1 if t ≥ τ,

is adapted to the filtration 𝔽 = (F_t)_{t∈T}.

Some authors explicitly exclude cases where τ can be +∞, whereas other authors allow τ to take any value in the closure of T.
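The measurability condition can be made concrete in code: a stopping rule is a function that, when deciding whether to stop at time n, may inspect only the path prefix X_0, …, X_n. The following Python sketch (function names are chosen here for illustration) contrasts a first-hitting time, which is a stopping time, with a last-hitting time, which is not.

```python
def first_hitting_time(path, target):
    """tau = inf{n : X_n = target}. The decision at time n uses only
    path[:n + 1], so this is a stopping time."""
    for n in range(len(path)):
        if path[n] == target:     # the event {tau = n} depends only on X_0..X_n
            return n
    return float("inf")           # tau = +infinity if the target is never hit

def last_hitting_time(path, target):
    """sup{n : X_n = target}. NOT a stopping time: deciding whether n is the
    last hit requires knowing the entire future of the path."""
    hits = [n for n in range(len(path)) if path[n] == target]
    return hits[-1] if hits else float("inf")

path = [0, 1, 0, -1, 0, 1, 2]
print(first_hitting_time(path, 1))  # 1
print(last_hitting_time(path, 1))   # 5
```

The asymmetry is the whole point of the definition: both are legitimate random times, but only the first can be evaluated “online”, without peeking ahead.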
To illustrate some examples of random times that are stopping rules and some that are not, consider a gambler playing roulette with a typical house edge, starting with $100 and betting $1 on red in each game:

Playing exactly five games corresponds to the stopping time τ = 5, and is a stopping rule.

Playing until he either runs out of money or has played 500 games is a stopping rule.

Playing until he doubles his money (borrowing if necessary) is not a stopping rule, as there is a positive probability that he will never double his money.

Playing until he is the maximum amount ahead he will ever be is not a stopping rule, since it requires information about the future as well as the present and past.

Playing until he either doubles his money or runs out of money is a stopping rule, even though there is potentially no limit to the number of games he plays, since the probability that he stops in finite time is 1.
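One of the valid stopping rules above can be sketched directly in Python (a minimal simulation; the function name, seed, and horizon are chosen here for illustration, and a double-zero wheel with 18 red pockets out of 38 is assumed):

```python
import random

P_RED = 18 / 38  # chance of winning a $1 bet on red on a double-zero wheel

def play(initial=100, stake=1, max_games=500, seed=1):
    """Bet `stake` on red each game; stop when broke or after max_games.
    Both stopping conditions depend only on the history so far, so this
    random time is a stopping rule."""
    rng = random.Random(seed)
    bankroll = initial
    for game in range(1, max_games + 1):
        bankroll += stake if rng.random() < P_RED else -stake
        if bankroll == 0:           # ruined: forced to stop
            return game, bankroll
    return max_games, bankroll      # horizon reached: stop by rule

games_played, final = play()
print(games_played, final)
```

Note that the loop never consults future spins when deciding whether to stop, which is exactly the defining property of a stopping rule.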
