Minimum Fisher information of moment-constrained distributions with application to robust blind identification

1998 
Abstract In this paper, the minimization of the Fisher information measure under constraints on higher-order moments of the distribution is treated. It is well known that when the moment-constrained Shannon information is minimized, i.e. the entropy is maximized, the solution, if it exists, is always of generalized exponential type and matches the given moments. Applying the same procedure to the Fisher information yields a set of densities that satisfy a Riccati differential equation. Although in the general case the solution can be found only by numerical means, it is shown that a closed-form approximate solution is easily obtained if the solutions are constrained to generalized exponential distributions. This class is analyzed and its limitations are identified: when applied to distributions with infinite support, the approximation holds only for moment sequences of sub-Gaussian distributions. To cover the class of super-Gaussian distributions, Gaussian mixtures are introduced as an alternative distribution class for which moment matching is straightforward. The developed theory is applied to the robust accuracy improvement of a moment-based blind identification algorithm: the moment information is used to improve the criterion function of the inverse filtering algorithm. The resulting blind identification procedure delivers more accurate parameter estimates, as verified on simulated examples.
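The abstract's remark that moment matching is straightforward for Gaussian mixtures can be illustrated with a minimal sketch (not from the paper; the function name, the zero-mean symmetric two-component scale mixture, and the fixed equal weights are illustrative assumptions). For p(x) = 0.5·N(0, v1) + 0.5·N(0, v2), the second and fourth moments are m2 = (v1 + v2)/2 and m4 = 3(v1² + v2²)/2, so matching given (m2, m4) reduces to a quadratic in the component variances; a real solution exists exactly when m4 ≥ 3·m2², i.e. for super-Gaussian (or Gaussian) moment pairs.

```python
import math

def match_moments_gauss_mixture(m2, m4):
    """Illustrative moment matching (hypothetical helper, not the paper's
    algorithm): find variances (v1, v2) of an equal-weight, zero-mean,
    two-component Gaussian scale mixture whose second and fourth moments
    equal m2 and m4. Requires m4 >= 3*m2**2 (super-Gaussian moments)."""
    s = 2.0 * m2              # v1 + v2
    q = 2.0 * m4 / 3.0        # v1**2 + v2**2
    disc = 2.0 * q - s * s    # discriminant; negative iff m4 < 3*m2**2
    if disc < 0:
        raise ValueError("sub-Gaussian moments: m4 < 3*m2**2")
    r = math.sqrt(disc)
    # v1, v2 are roots of t**2 - s*t + (s**2 - q)/2 = 0
    return (s - r) / 2.0, (s + r) / 2.0

# Example: m2 = 1, m4 = 4 (kurtosis 4 > 3, mildly super-Gaussian)
v1, v2 = match_moments_gauss_mixture(1.0, 4.0)
```

For sub-Gaussian moment sequences (m4 < 3·m2²) this construction fails, which is consistent with the abstract's use of the generalized exponential class for that regime.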