Information Functionals in Statistics
2014
Around 1960, I. Csiszár introduced the notion of f-divergence, which generalizes the previously known distances between probability measures, including the Kullback-Leibler relative entropy. An extended Kullback-Leibler inequality leads to characterizations of various versions of sufficiency, which can all be rephrased in analytic terms within Le Cam's decision-theoretic approach to the comparison of statistical experiments.
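As a minimal sketch of the notion the abstract refers to: a Csiszár f-divergence between discrete distributions P and Q is D_f(P||Q) = Σᵢ qᵢ f(pᵢ/qᵢ) for a convex f with f(1) = 0, and the choice f(t) = t log t recovers the Kullback-Leibler relative entropy. The helper names below are illustrative, and the example assumes finite distributions with strictly positive qᵢ.

```python
import math

def f_divergence(p, q, f):
    """Csiszár f-divergence D_f(P||Q) = sum_i q_i * f(p_i / q_i).

    Assumes p and q are finite probability vectors with q_i > 0.
    """
    return sum(qi * f(pi / qi) for pi, qi in zip(p, q))

def kl_divergence(p, q):
    """Kullback-Leibler relative entropy as the f-divergence with f(t) = t log t."""
    return f_divergence(p, q, lambda t: t * math.log(t))

# Example: KL(P||Q) for two Bernoulli-like distributions.
p = [0.5, 0.5]
q = [0.9, 0.1]
print(kl_divergence(p, q))  # positive, since P != Q
print(kl_divergence(p, p))  # 0.0, since D_f(P||P) = 0 when f(1) = 0
```

Choosing other convex functions f (e.g. f(t) = |t - 1| for total variation, or f(t) = (√t - 1)² for squared Hellinger) yields the other classical distances the divergence framework unifies.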