Assessing significance in a Markov chain without mixing

2016 
We present a new statistical test to detect that a presented state of a reversible Markov chain was not chosen from a stationary distribution. In particular, given a value function for the states of the Markov chain, we would like to demonstrate rigorously that the presented state is an outlier with respect to the values, by establishing a $p$-value for observations we make about the state under the null hypothesis that it was chosen from a stationary distribution. A simple heuristic used in practice is to sample ranks of states from long random trajectories on the Markov chain and compare these to the rank of the presented state; if the presented state is a $0.1\%$-outlier compared to the sampled ranks (i.e., its rank is in the bottom $0.1\%$ of sampled ranks), then this should correspond to a $p$-value of $0.001$. This test is not rigorous, however, without good bounds on the mixing time of the Markov chain, as one must argue that the observed states on the trajectory approximate the stationary distribution. Our test is the following: given the presented state in the Markov chain, take a random walk from the presented state for any number of steps. We prove that observing that the presented state is an $\varepsilon$-outlier on the walk is significant at $p=\sqrt{2\varepsilon}$, under the null hypothesis that the state was chosen from a stationary distribution. Our result assumes nothing about the structure of the Markov chain beyond reversibility, and we construct examples to show that significance at $p\approx\sqrt{\varepsilon}$ is essentially best possible in general. We illustrate the use of our test with a potential application to the rigorous detection of gerrymandering in Congressional districtings.
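As a concrete illustration of the test described above, the following is a minimal sketch (not taken from the paper) of the $\sqrt{2\varepsilon}$ procedure in Python: run a random walk of any length from the presented state, record the value of every visited state, take $\varepsilon$ to be the fraction of observed states (including the presented state itself) whose value is at most the presented state's value, and report $p=\sqrt{2\varepsilon}$. The transition function, value function, and toy chain (a lazy walk on a cycle) are hypothetical stand-ins for whatever reversible chain and value function are actually being tested, and the fraction-based definition of "$\varepsilon$-outlier on the walk" is one reasonable reading of the abstract.

    import math
    import random

    def sqrt_epsilon_test(step, value, presented_state, n_steps, rng=None):
        """Sketch of the sqrt(2*epsilon) outlier test described in the abstract.

        step(state, rng)  -- one transition of the reversible Markov chain (hypothetical signature)
        value(state)      -- the value function on states
        presented_state   -- the state whose significance we want to assess
        n_steps           -- length of the random walk started at the presented state

        Returns (epsilon, p): epsilon is the fraction of observed states (including
        the presented state) whose value is at most the presented state's value,
        and p = sqrt(2 * epsilon) is the claimed significance level under the null
        hypothesis that the presented state was drawn from a stationary distribution.
        """
        rng = rng or random.Random()
        v0 = value(presented_state)
        observed = [v0]
        state = presented_state
        for _ in range(n_steps):
            state = step(state, rng)
            observed.append(value(state))
        epsilon = sum(1 for v in observed if v <= v0) / len(observed)
        return epsilon, math.sqrt(2 * epsilon)

    # Toy example (hypothetical chain): a lazy random walk on a cycle of length N
    # with value(i) = i, so the presented state 0 is an extreme low-value state.
    if __name__ == "__main__":
        N = 1000

        def step(i, rng):
            r = rng.random()
            if r < 0.5:
                return i                       # hold with probability 1/2 (laziness)
            return (i + 1) % N if r < 0.75 else (i - 1) % N

        eps, p = sqrt_epsilon_test(step, lambda i: i, presented_state=0, n_steps=100_000)
        print(f"epsilon = {eps:.5f}  ->  p = sqrt(2*epsilon) = {p:.4f}")

For the presented state 0 the walk reports a small $\varepsilon$ and hence a small $p$, whereas for a state drawn from this chain's stationary (uniform) distribution roughly half of the visited states would typically have smaller value, giving $\varepsilon$ near $1/2$ and an uninformative $p$.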