Tight Bounds on the Weighted Sum of MMSEs with Applications in Distributed Estimation

2019 
In this paper, tight upper and lower bounds are derived on the weighted sum of minimum mean-squared errors (MMSEs) for additive Gaussian noise channels. The bounds are obtained by constraining the input distribution to lie close to a Gaussian reference distribution in terms of the Kullback-Leibler divergence. The distributions that attain these bounds are shown to be Gaussian, with covariance matrices defined implicitly via systems of matrix equations. Furthermore, the estimators that attain the upper bound are shown to be minimax robust against deviations from the assumed input distribution. The lower bound provides a potentially tighter alternative to well-known inequalities such as the Cramér-Rao lower bound. Numerical examples are provided to verify the theoretical findings of the paper. The results derived in this paper can be used to obtain performance bounds, robustness guarantees, and engineering guidelines for the design of local estimators in distributed estimation problems, which commonly arise in wireless communication systems and sensor networks.
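As a point of reference, the problem the abstract describes can be sketched as follows; the notation (weights w_k, tolerance epsilon, reference distribution P_G) is assumed here for illustration and may differ from the paper's own conventions.

% Sketch of the KL-constrained weighted-sum MMSE problem described in the abstract.
% All symbols below are illustrative assumptions, not taken from the paper.
\begin{align}
  Y_k &= X_k + N_k, \qquad N_k \sim \mathcal{N}(0, \sigma_k^2), \quad k = 1, \dots, K, \\
  \mathrm{mmse}_k(P_X) &= \mathbb{E}\!\left[ \bigl( X_k - \mathbb{E}[X_k \mid Y_k] \bigr)^2 \right], \\
  L(\epsilon) &= \inf_{P_X : \, D(P_X \,\|\, P_G) \le \epsilon} \; \sum_{k=1}^{K} w_k \, \mathrm{mmse}_k(P_X), \qquad
  U(\epsilon) = \sup_{P_X : \, D(P_X \,\|\, P_G) \le \epsilon} \; \sum_{k=1}^{K} w_k \, \mathrm{mmse}_k(P_X).
\end{align}

As a sanity check on this sketch, for a scalar Gaussian input $X_k \sim \mathcal{N}(0, \sigma_{X,k}^2)$ the MMSE takes the familiar closed form $\sigma_{X,k}^2 \sigma_k^2 / (\sigma_{X,k}^2 + \sigma_k^2)$; the abstract's claim is that the extremizers of $L(\epsilon)$ and $U(\epsilon)$ remain Gaussian, with covariances characterized implicitly by systems of matrix equations.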