A Class of Lower Bounds for Bayesian Risk with a Bregman Loss
2020
A general class of Bayesian lower bounds is derived for the case where the underlying loss function is a Bregman divergence. This class can be viewed as an extension of the Weinstein–Weiss family of bounds for the mean squared error, and it relies on a variational characterization of the Bayesian risk. The approach yields a version of the Cramér–Rao bound tailored to a given Bregman divergence. The effectiveness of the new bound is evaluated in the Poisson noise setting.
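As background for the loss function considered in the abstract, a minimal sketch of the Bregman divergence is given below. This is an illustrative example, not code from the paper: for a differentiable convex generator φ, the divergence is D_φ(x, y) = φ(x) − φ(y) − φ′(y)(x − y). Choosing φ(u) = u² recovers the squared error, while φ(u) = u log u yields the generalized Kullback–Leibler divergence, a natural loss in the Poisson noise setting mentioned above.

```python
import math

def bregman(phi, dphi, x, y):
    """Bregman divergence D_phi(x, y) = phi(x) - phi(y) - phi'(y) * (x - y)
    for a differentiable convex generator phi with derivative dphi."""
    return phi(x) - phi(y) - dphi(y) * (x - y)

# phi(u) = u**2 recovers the squared error: D(x, y) = (x - y)**2
sq = bregman(lambda u: u**2, lambda u: 2 * u, 3.0, 1.0)  # = (3 - 1)**2 = 4.0

# phi(u) = u*log(u) gives the generalized KL divergence
# x*log(x/y) - x + y, natural for Poisson noise models.
kl = bregman(lambda u: u * math.log(u), lambda u: math.log(u) + 1.0, 2.0, 1.0)
```

Both examples follow directly from the definition; only the generator φ changes, which is what makes a single family of bounds applicable across these losses.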