Hellinger Entropy Concept: multidisciplinary applications
2021
The use of a metric to assess the distance between probability densities is an important practical problem in areas such as artificial intelligence and recommendation systems. The generalized α-formalisms introduced by Rényi and Tsallis underlie the well-known families of entropies and divergence models. A particular α-divergence was presented in a previous work by the co-authors; in our perspective, this divergence was already essentially defined by Hellinger. The concept of Hellinger entropy makes it possible, through a maximum-entropy syllogism, to state a bound for the Hellinger metric. The square root of this divergence is a metric, and its nonparametric estimator admits information-theoretic bounds that can be computed directly from the data. Such information-theoretic bounds for the Hellinger distance are developed in this work. The asymptotic behavior allows this metric to be used in competitive scenarios involving three or more densities, such as clustering. Because the bound can be computed directly from the data, the method is also suitable for streaming data.
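For reference, the standard squared Hellinger distance and its connection to the α = 1/2 member of the Tsallis divergence family can be written as follows. This is the textbook formulation; the notation here is ours and need not match the paper's:

```latex
% Squared Hellinger distance between densities p and q
H^2(p, q) = \frac{1}{2} \int \left( \sqrt{p(x)} - \sqrt{q(x)} \right)^2 dx
          = 1 - \int \sqrt{p(x)\, q(x)}\, dx .

% Tsallis alpha-divergence; at alpha = 1/2 it reduces to twice the
% squared Hellinger distance, so H = sqrt(D_{1/2}/2) is a metric:
D_\alpha(p \,\|\, q) = \frac{1}{\alpha - 1}
    \left( \int p(x)^{\alpha}\, q(x)^{1-\alpha}\, dx \;-\; 1 \right),
\qquad
D_{1/2}(p \,\|\, q) = 2\, H^2(p, q).
```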
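Since the abstract emphasizes that the estimator can be computed directly from the data, the following is a minimal sketch of a histogram-based plug-in estimator of the Hellinger distance between two samples. The function name, binning scheme, and parameters are our own illustrative assumptions, not necessarily the estimator analyzed in the paper:

```python
import numpy as np

def hellinger_from_samples(x, y, bins=32):
    """Plug-in Hellinger distance between two 1-D samples.

    Both samples are binned on a shared grid and H is computed from
    the empirical frequencies via the Bhattacharyya coefficient.
    Illustrative sketch only; not the paper's exact estimator.
    """
    lo = min(x.min(), y.min())
    hi = max(x.max(), y.max())
    edges = np.linspace(lo, hi, bins + 1)
    p, _ = np.histogram(x, bins=edges)
    q, _ = np.histogram(y, bins=edges)
    p = p / p.sum()  # empirical probabilities per bin
    q = q / q.sum()
    # Bhattacharyya coefficient BC = sum_i sqrt(p_i q_i);  H^2 = 1 - BC
    bc = np.sum(np.sqrt(p * q))
    return np.sqrt(max(1.0 - bc, 0.0))  # clip tiny negative rounding error

# Example: two Gaussian samples shifted by one standard deviation
rng = np.random.default_rng(0)
a = rng.normal(0.0, 1.0, 5000)
b = rng.normal(1.0, 1.0, 5000)
print(hellinger_from_samples(a, b))  # roughly 0.34 for these settings
```

Because this estimator only requires maintaining running bin counts, it can be updated incrementally as new observations arrive, which is consistent with the streaming-data setting highlighted in the abstract.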