Minimization of the Kullback-Leibler Divergence for Nonlinear Estimation

2016 
A nonlinear approximate Bayesian filter, named the minimum divergence filter, is developed in which the state density is approximated by an assumed density. The parameters of the assumed density are found by minimizing the Kullback–Leibler divergence from the state density, whose evolution is defined by the Chapman–Kolmogorov equation and Bayes' rule, to the assumed density. When an assumed Gaussian density is used and the dynamical system and measurement models possess additive, Gaussian-distributed noise, the predictor of the minimum divergence filter is equivalent to the predictor used under the Kalman framework. The corrector defines the mean and covariance of the posterior Gaussian density as the first and second central moments of the posterior density given by Bayes' rule. To evaluate the efficacy of the minimum divergence filter, its corrector is first compared to that of standard Kalman-type filters. Finally, the minimum divergence filter is compared to the quadrature Kalman filter to estim...
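The moment-matching corrector described above can be sketched numerically. The following is a minimal one-dimensional illustration, not the paper's implementation: the Bayes posterior is evaluated on a grid, and the Gaussian that minimizes the KL divergence from that posterior is obtained by matching its first moment and second central moment. The function name, grid settings, and the cubic measurement model are illustrative assumptions.

```python
import numpy as np

def moment_match_corrector(prior_mean, prior_var, z, h, meas_var, grid=None):
    """Gaussian moment-matching corrector (illustrative sketch).

    Fits N(m, P) to the Bayes posterior by matching its first two
    moments; this choice of (m, P) minimizes the KL divergence from
    the posterior to the assumed Gaussian density.
    """
    if grid is None:
        s = np.sqrt(prior_var)
        grid = np.linspace(prior_mean - 8 * s, prior_mean + 8 * s, 4001)
    dx = grid[1] - grid[0]
    # Gaussian prior density and likelihood of measurement z under the
    # nonlinear model z = h(x) + v, v ~ N(0, meas_var), on the grid
    prior = np.exp(-0.5 * (grid - prior_mean) ** 2 / prior_var)
    lik = np.exp(-0.5 * (z - h(grid)) ** 2 / meas_var)
    post = prior * lik
    post /= post.sum() * dx                      # normalize (Bayes' rule)
    m = (grid * post).sum() * dx                 # first moment: posterior mean
    P = ((grid - m) ** 2 * post).sum() * dx      # second central moment
    return m, P

# Illustrative nonlinear case: cubic measurement with additive Gaussian noise
m, P = moment_match_corrector(prior_mean=0.5, prior_var=1.0,
                              z=1.2, h=lambda x: x ** 3, meas_var=0.1)
```

With a linear measurement model and Gaussian noise, this corrector reproduces the standard Kalman update, consistent with the equivalence to the Kalman framework noted in the abstract.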