Information characteristics of signals and noise with non-Gaussian distributions
2018
The paper considers and analyzes the information characteristics of processed signals and of the noise affecting them when both have non-Gaussian distributions. It is shown that the statistical properties of useful signals and of the influencing noise can be described using Shannon, Fisher, and Kullback information, and that the information loss caused by noise acting on a useful signal can be estimated from these characteristics. With Shannon information, the degree of uncertainty of a random process is characterized by its entropy (the Shannon information measure) with respect to the probability density function (PDF). It is shown that taking the actual shape of the PDF of a random process into account yields additional information during signal processing. The shape of the PDF can be described by the skewness and kurtosis coefficients, as well as by the entropy coefficient and the counter-kurtosis coefficient. It is shown that the Shannon information measure can be expressed through the entropy of the uniform distribution, which in turn depends solely on the range of variation of the random process. Expressions of Fisher information are given for the PDFs most widely used in practice, and the concept of quasi-Fisher information is considered. Fisher information with respect to the variance, used to describe the random amplitudes of narrowband radio signals, is considered as well. Finally, it is shown that Kullback information distances can serve as a measure of how accurately the true distribution of the observed data and of the desired signal is approximated.
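The PDF-shape descriptors named in the abstract can be estimated directly from data. Below is a minimal Python sketch; the exact definitions of the entropy coefficient, k_e = exp(H)/(2*sigma), and of the counter-kurtosis, chi = 1/sqrt(eps) with eps = mu4/sigma^4, follow conventions common in the metrology literature and are an assumption, since the abstract does not state them.

```python
import numpy as np
from scipy.stats import skew, kurtosis, differential_entropy

# Sketch: PDF-shape descriptors for a non-Gaussian sample.
# Assumed definitions (not stated in the abstract):
#   entropy coefficient  k_e = exp(H) / (2 * sigma)
#   counter-kurtosis     chi = 1 / sqrt(eps),  eps = mu4 / sigma^4

rng = np.random.default_rng(0)
x = rng.laplace(loc=0.0, scale=1.0, size=100_000)  # non-Gaussian sample

sigma = x.std(ddof=1)
g1 = skew(x)                          # skewness coefficient
eps = kurtosis(x, fisher=False)       # kurtosis (Gaussian reference: 3)
chi = 1.0 / np.sqrt(eps)              # counter-kurtosis
H = differential_entropy(x)           # sample estimate of Shannon entropy, nats
k_e = np.exp(H) / (2.0 * sigma)       # entropy coefficient

print(f"skewness={g1:.3f}  kurtosis={eps:.3f}  "
      f"counter-kurtosis={chi:.3f}  entropy coeff={k_e:.3f}")
```

For a Gaussian process the reference values are kurtosis 3, counter-kurtosis about 0.577, and entropy coefficient about 2.066; deviations from these values quantify the additional information carried by the actual PDF shape.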
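The representation of the Shannon measure through the uniform-distribution entropy can be made explicit. The following derivation is a sketch of one standard form of this decomposition, and the paper may use a different normalization. For any PDF p supported on the range [a, b]:

```latex
\begin{align*}
H(p) &= -\int_a^b p(x)\ln p(x)\,dx \\
     &= \ln(b-a) - \int_a^b p(x)\ln\bigl[p(x)\,(b-a)\bigr]\,dx \\
     &= \underbrace{\ln(b-a)}_{H(U_{[a,b]})} - D_{\mathrm{KL}}\!\left(p \,\Vert\, U_{[a,b]}\right).
\end{align*}
```

The uniform entropy term depends only on the range of variation, while the Kullback divergence term measures the information gained by knowing the true PDF rather than assuming a uniform one.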
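For illustration, the Fisher information of a location parameter for three widely used PDFs can be checked numerically against its closed forms. The closed-form values below (Gaussian, Laplace, Cauchy) are standard results and are not taken from the paper; this is a minimal Monte-Carlo sketch of I(theta) = E[(d/d theta ln f(X; theta))^2].

```python
import numpy as np

# Sketch: Monte-Carlo check of the location-parameter Fisher information.
# Standard closed forms (assumed, not quoted from the paper):
#   Gaussian(sigma): I = 1/sigma^2
#   Laplace(b):      I = 1/b^2
#   Cauchy(gamma):   I = 1/(2*gamma^2)

rng = np.random.default_rng(1)
n = 1_000_000
sigma = b = gamma = 1.5

# Gaussian: score for location is (x - theta)/sigma^2
x = rng.normal(0.0, sigma, n)
print("Gaussian:", np.mean((x / sigma**2) ** 2), "vs", 1 / sigma**2)

# Laplace: score is sign(x - theta)/b
x = rng.laplace(0.0, b, n)
print("Laplace: ", np.mean((np.sign(x) / b) ** 2), "vs", 1 / b**2)

# Cauchy: score is 2(x - theta) / (gamma^2 + (x - theta)^2)
x = gamma * rng.standard_cauchy(n)
print("Cauchy:  ", np.mean((2 * x / (gamma**2 + x**2)) ** 2),
      "vs", 1 / (2 * gamma**2))
```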
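Finally, a sketch of the Kullback distance as a measure of approximation accuracy. The symmetrized divergence J(p, q) = KL(p||q) + KL(q||p) is assumed here, with a Laplace "true" PDF and a moment-matched Gaussian approximation; the paper's exact choice of distance and of distributions may differ.

```python
import numpy as np
from scipy import stats
from scipy.integrate import quad

# Sketch: symmetrized Kullback distance between a "true" PDF p and an
# approximating PDF q with the same mean and variance.

p = stats.laplace(loc=0.0, scale=1.0)        # "true" PDF, variance 2
q = stats.norm(loc=0.0, scale=np.sqrt(2.0))  # moment-matched Gaussian

def kl(f, g):
    """KL(f||g) = integral of f(x) * log(f(x)/g(x)) dx."""
    integrand = lambda x: f.pdf(x) * (f.logpdf(x) - g.logpdf(x))
    val, _ = quad(integrand, -30, 30)
    return val

j = kl(p, q) + kl(q, p)
print(f"KL(p||q)={kl(p, q):.4f}  KL(q||p)={kl(q, p):.4f}  J={j:.4f}")
```

A smaller J indicates that the approximating PDF reproduces the true distribution of the observed data more accurately.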