Asymptotic Bayesian Theory of Quickest Change Detection for Hidden Markov Models

2019 
In the 1960s, Shiryaev developed a Bayesian theory of change-point detection in the i.i.d. case, which was generalized in the early 2000s by Tartakovsky and Veeravalli and recently by Tartakovsky (2017) for general stochastic models, assuming a certain stability of the log-likelihood ratio process. Hidden Markov models represent a wide class of stochastic processes arising in a variety of applications. In this paper, we investigate the performance of the Bayesian Shiryaev change-point detection rule for hidden Markov models. We propose a set of regularity conditions under which the Shiryaev procedure is first-order asymptotically optimal in a Bayesian context, minimizing moments of the detection delay up to a certain order asymptotically as the probability of false alarm goes to zero. The developed theory for hidden Markov models is based on a Markov chain representation of the likelihood ratio and on r-quick convergence for Markov random walks. In addition, applying Markov nonlinear renewal theory, we present a higher-order asymptotic approximation for the expected delay to detection and a first-order asymptotic approximation for the probability of false alarm of the Shiryaev detection rule. We also study asymptotic properties of another popular change detection rule, the Shiryaev-Roberts rule, and provide some interesting examples.
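For orientation, the sketch below illustrates the two detection rules named in the abstract in their simplest i.i.d. form, not the hidden Markov setting studied in the paper: the Shiryaev rule updates the posterior probability that a change has already occurred (under a geometric prior with parameter rho) and raises an alarm when it crosses a threshold, while the Shiryaev-Roberts rule accumulates the statistic R_n = (1 + R_{n-1}) * Lambda_n. The Gaussian densities, the prior parameter, and the thresholds are illustrative assumptions, not values from the paper.

```python
import numpy as np

def shiryaev_stop(x, f0_loglik, f1_loglik, rho=0.01, threshold=0.99):
    """Bayesian Shiryaev rule (i.i.d. sketch): stop at the first n where the
    posterior probability that the change has already occurred exceeds
    `threshold`. `rho` is the parameter of the geometric prior on the
    change point; `f0_loglik`/`f1_loglik` are pre-/post-change log-densities."""
    p = 0.0  # posterior probability of a change before any observation
    for n, xn in enumerate(x, start=1):
        lam = np.exp(f1_loglik(xn) - f0_loglik(xn))  # likelihood ratio f1/f0
        num = (p + (1.0 - p) * rho) * lam
        p = num / (num + (1.0 - p) * (1.0 - rho))    # Bayes update of the posterior
        if p >= threshold:
            return n  # alarm time
    return None  # no alarm within the sample

def shiryaev_roberts_stop(x, f0_loglik, f1_loglik, threshold=1e3):
    """Shiryaev-Roberts rule (i.i.d. sketch): R_n = (1 + R_{n-1}) * Lambda_n,
    alarm at the first n with R_n >= threshold."""
    r = 0.0
    for n, xn in enumerate(x, start=1):
        lam = np.exp(f1_loglik(xn) - f0_loglik(xn))
        r = (1.0 + r) * lam
        if r >= threshold:
            return n
    return None

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    nu = 200  # true change point (unknown to the detector)
    x = np.concatenate([rng.normal(0.0, 1.0, nu),     # pre-change: N(0, 1)
                        rng.normal(0.5, 1.0, 300)])   # post-change: N(0.5, 1)
    f0 = lambda v: -0.5 * v ** 2            # log N(0,1) density up to a constant
    f1 = lambda v: -0.5 * (v - 0.5) ** 2    # log N(0.5,1) density up to a constant
    print("Shiryaev alarm at n =", shiryaev_stop(x, f0, f1))
    print("Shiryaev-Roberts alarm at n =", shiryaev_roberts_stop(x, f0, f1))
```

In the hidden Markov setting of the paper, the per-observation likelihood ratio in these recursions is replaced by one driven by the underlying Markov chain, which is where the Markov chain representation and Markov nonlinear renewal theory enter.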