On the Secret Key Capacity of Sibling Hidden Markov Models

2019 
Traditional approaches to secret key establishment from common randomness rely on restrictive assumptions, such as modeling the available common randomness as independent and identically distributed (i.i.d.) repetitions of correlated random variables. Unfortunately, the i.i.d. assumption does not generally reflect the conditions of real-life scenarios. For this reason, the current paper investigates the key-establishment potential of a more pragmatic model, in which all parties have access to imperfect information about a common source modeled as a Markov chain. Each party’s information thus takes the form of a hidden Markov model and, since the different parties share the same underlying Markov chain, we call the overall model a sibling hidden Markov model (SHMM). This paper studies upper and lower bounds on the secret key capacity for various types of SHMM. The difficulty of the problem stems from its prohibitive computational cost. To address this obstacle, we represent the joint probability of the observations as the $L_{1}$ norm of a Markov random matrix product, and use its convergence to a Lyapunov exponent.
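
The matrix-product view of the hidden Markov likelihood mentioned in the abstract is standard HMM theory: the joint probability of an observation sequence is the $L_1$ norm of an initial row vector multiplied by per-observation matrices, and its normalized logarithm converges to a top Lyapunov exponent (minus the entropy rate of the observation process). The sketch below is only an illustration of that idea, not the authors' code; the two-state chain, binary observations, and all parameter values are assumed for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-state Markov chain with binary observations (illustrative values only).
A = np.array([[0.9, 0.1],
              [0.2, 0.8]])     # transition probabilities A[i, j] = P(s_t = j | s_{t-1} = i)
B = np.array([[0.95, 0.05],
              [0.10, 0.90]])   # emission probabilities B[j, y] = P(y_t = y | s_t = j)
pi = np.array([0.5, 0.5])      # distribution of the initial state s_0

def sample(n):
    """Draw a length-n observation sequence from the hidden Markov model."""
    s = rng.choice(2, p=pi)              # hidden state s_0
    ys = []
    for _ in range(n):
        s = rng.choice(2, p=A[s])        # transition to s_t
        ys.append(rng.choice(2, p=B[s])) # emit y_t from s_t
    return ys

def log_likelihood(ys):
    """log P(y_1..y_n) via the product pi^T M(y_1) ... M(y_n), where
    M(y)[i, j] = A[i, j] * B[j, y]; the vector is rescaled by its L1 norm at
    each step to avoid underflow, and the rescaling factors accumulate the log."""
    v = pi.copy()
    log_p = 0.0
    for y in ys:
        v = v @ (A * B[:, y])   # one matrix factor of the random product
        norm = v.sum()          # L1 norm of the nonnegative row vector
        log_p += np.log(norm)
        v /= norm
    return log_p

n = 200_000
ys = sample(n)
# Normalized log-likelihood: a Monte Carlo estimate of the top Lyapunov exponent,
# i.e. minus the entropy rate of the hidden Markov observation process.
print(log_likelihood(ys) / n)
```

In this sketch the Lyapunov exponent is estimated by simulation; the paper's bounds on secret key capacity concern such quantities for several observers of the same underlying chain, which the toy example does not attempt to reproduce.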