Information upper bound for McKean–Vlasov stochastic differential equations
Citations: 0 · References: 13 · Related Papers: 10
Abstract:
We develop an information-theoretic framework to quantify the information upper bound for the probability distributions of the solutions to McKean–Vlasov stochastic differential equations. More precisely, we derive the information upper bound in terms of the Kullback–Leibler divergence, which characterizes the entropy of the probability distributions of the solutions to McKean–Vlasov stochastic differential equations relative to the joint distributions of mean-field particle systems. The order of the information upper bound is also established.
Keywords:
Kullback–Leibler divergence
Divergence
Differential entropy
Goodness of fit
Kernel density estimation
Citations (5)
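To fix ideas, here is a minimal sketch of the objects the abstract refers to, written in generic notation that may differ from the paper's; the constant C(t, N) and its order in N are the paper's result and are left unspecified here.

```latex
% Generic notation (an illustrative sketch, not necessarily the paper's).
% McKean--Vlasov SDE: the coefficients depend on the law of the solution itself.
\[ dX_t = b(X_t,\mu_t)\,dt + \sigma(X_t,\mu_t)\,dW_t, \qquad \mu_t = \operatorname{Law}(X_t). \]
% Mean-field particle system: N particles coupled through their empirical measure.
\[ dX_t^{i,N} = b\bigl(X_t^{i,N},\mu_t^{N}\bigr)\,dt + \sigma\bigl(X_t^{i,N},\mu_t^{N}\bigr)\,dW_t^{i},
   \qquad \mu_t^{N} = \frac{1}{N}\sum_{j=1}^{N}\delta_{X_t^{j,N}}. \]
% Information upper bound: a Kullback--Leibler divergence between the product of the
% limiting law and the joint law of the particle system, bounded by some C(t, N).
\[ D_{\mathrm{KL}}\Bigl(\mu_t^{\otimes N}\,\Big\|\,\operatorname{Law}\bigl(X_t^{1,N},\dots,X_t^{N,N}\bigr)\Bigr) \le C(t,N). \]
```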
Different information theoretic sensor management approaches are compared in a Bayesian target-tracking problem. Specifically, the performance using the expected Renyi divergence with different parameter values is compared theoretically and experimentally. Included is the special case in which the expected Renyi divergence is equal to the expected Kullback-Leibler divergence, which is also equivalent to both the mutual information and the expected change in differential entropy for this Bayesian updating problem. The example problem involves a single target moving in a circle, four bearing-only sensors, and two time-delay sensors. A particle filter based tracker is used.
Kullback–Leibler divergence
Differential entropy
Divergence
Information Theory
Citations (27)
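The special case mentioned above, where the expected Rényi divergence recovers the Kullback–Leibler divergence as its parameter tends to 1, can be checked numerically; a minimal sketch with toy discrete distributions (not the paper's tracking setup):

```python
import numpy as np

def renyi_divergence(p, q, alpha):
    """Renyi divergence D_alpha(p || q) for discrete distributions, alpha != 1."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return np.log(np.sum(p**alpha * q**(1.0 - alpha))) / (alpha - 1.0)

def kl_divergence(p, q):
    """Kullback-Leibler divergence D_KL(p || q), the alpha -> 1 limit."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return np.sum(p * np.log(p / q))

# Two toy posteriors, e.g. before and after a Bayesian measurement update.
p = np.array([0.7, 0.2, 0.1])
q = np.array([0.4, 0.4, 0.2])

for alpha in (0.5, 0.9, 0.99, 0.999):
    print(f"alpha={alpha}: D_alpha = {renyi_divergence(p, q, alpha):.6f}")
print(f"KL limit:         {kl_divergence(p, q):.6f}")
```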
The concept of Shannon Entropy for probability distributions and associated Maximum Entropy Principle are extended here to the concepts of Relative Divergence of one Grading Function from another and Maximum Relative Divergence Principle for grading functions on direct products of totally ordered chains (chain bundles). Several Operations Research applications are analyzed.
Kullback–Leibler divergence
Grading function
Divergence
Citations (0)
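For reference, the classical notions that this work generalizes, stated in their standard probabilistic form (the grading-function versions on chain bundles are not reproduced here):

```latex
% Shannon entropy of a distribution p and the Kullback--Leibler divergence of p
% from a reference distribution q (standard definitions).
\[ H(p) = -\sum_i p_i \log p_i, \qquad
   D_{\mathrm{KL}}(p \,\|\, q) = \sum_i p_i \log \frac{p_i}{q_i}. \]
% Maximum Entropy Principle: among all distributions satisfying the given
% constraints, select the one with the largest entropy.
\[ \max_{p}\; H(p) \quad \text{s.t.} \quad \sum_i p_i = 1, \quad
   \mathbb{E}_p[f_k] = c_k, \; k = 1,\dots,m. \]
```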
The asymptotic convergence of the probability density function (pdf) and the convergence of differential entropy are examined for non-stationary processes that follow the maximum entropy principle (MaxEnt) and the maximum entropy production principle (MEPP). Asymptotic convergence of the pdf provides new justification of MEPP, while convergence of differential entropy is important in the asymptotic analysis of communication systems. A set of equations describing the dynamics of the pdf under mass-conservation and energy-conservation constraints is derived. It is shown that for pdfs with compact support the limit pdf is unique and can be obtained from Jaynes’s MaxEnt principle.
Differential entropy
Entropy production
Citations (0)
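A toy illustration of this kind of convergence, assuming nothing beyond standard diffusion: a density evolving by the heat equation on a compact interval with reflecting boundaries conserves mass, its differential entropy increases, and it relaxes to the uniform pdf, which is the MaxEnt limit on a compact support. This is a generic sketch, not the model studied in the paper.

```python
import numpy as np

# Heat equation on [0, 1] with reflecting (zero-flux) boundaries: mass is conserved
# and the density relaxes to the uniform (maximum-entropy) pdf on the compact support.
n, dx, dt, D = 100, 0.01, 4e-5, 1.0          # D*dt/dx^2 = 0.4 <= 0.5, so the scheme is stable
x = (np.arange(n) + 0.5) * dx
p = np.exp(-((x - 0.3) ** 2) / (2 * 0.05 ** 2))   # initial pdf: narrow Gaussian bump
p /= p.sum() * dx                                  # normalize to a probability density

def differential_entropy(p, dx):
    q = np.clip(p, 1e-300, None)
    return -np.sum(q * np.log(q)) * dx

for step in range(50001):
    lap = np.empty_like(p)
    lap[1:-1] = p[2:] - 2 * p[1:-1] + p[:-2]
    lap[0] = p[1] - p[0]                           # reflecting boundary on the left
    lap[-1] = p[-2] - p[-1]                        # reflecting boundary on the right
    p = p + dt * D * lap / dx**2
    if step % 12500 == 0:
        print(f"t={step*dt:.2f}  mass={p.sum()*dx:.4f}  h={differential_entropy(p, dx):.4f}")

# The differential entropy increases toward 0, the entropy of the uniform pdf on [0, 1].
```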
The main purpose of this study is to provide a useful algorithm that combines the Maximum Entropy Method (MEM) with a computational method to predict the unique form of bivariate probability distributions. The new algorithm provides reasonable estimates of target distributions that have maximum entropy. MEM is a powerful tool for reconstructing distributions from many types of data. In this study, we apply the technique to estimate bivariate distributions that are widely used in industrial and engineering fields, especially in cybernetics and internet systems. To examine the effectiveness of the algorithm, several simulation studies were conducted. The method yields a unique probability distribution consistent with the given information. With its simple and accurate mathematical formulation and its ability to use presence-only data, MEM has become a well-suited method for many kinds of distribution modeling.
Probability and statistics
Citations (0)
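A minimal sketch of a moment-constrained maximum-entropy fit of a bivariate distribution on a grid; the feature functions, grid, and target moments below are illustrative assumptions, not the algorithm or data used in the paper:

```python
import numpy as np
from scipy.optimize import minimize

# Maximum-entropy fit of a discrete bivariate pmf on a grid, given target moments
# E[X], E[Y], E[XY]. The MaxEnt solution is an exponential family in the features,
# and the dual problem (log-partition minus lambda . targets) is convex.
xs = np.linspace(0, 1, 21)
ys = np.linspace(0, 1, 21)
X, Y = np.meshgrid(xs, ys, indexing="ij")
feats = np.stack([X.ravel(), Y.ravel(), (X * Y).ravel()], axis=1)  # feature functions
targets = np.array([0.4, 0.6, 0.30])                               # desired moments (toy values)

def dual(lam):
    logits = feats @ lam
    logz = np.log(np.exp(logits - logits.max()).sum()) + logits.max()
    return logz - lam @ targets

res = minimize(dual, np.zeros(3), method="BFGS")
logits = feats @ res.x
p = np.exp(logits - logits.max())
p /= p.sum()                                                       # MaxEnt joint pmf on the grid
print("fitted moments:", feats.T @ p)                              # should be close to targets
print("entropy:", -(p * np.log(p)).sum())
```

Because the dual is smooth and convex, a generic quasi-Newton solver is enough for a small example like this; larger grids or more constraints would only change the size of the optimization, not its structure.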
Entropy methods enable a convenient general approach to providing a probability distribution with partial information. The minimum cross-entropy principle selects the distribution that minimizes the Kullback–Leibler divergence subject to the given constraints. This general principle encompasses a wide variety of distributions, and generalizes other methods that have been proposed independently. There remains, however, some confusion about the breadth of entropy methods in the literature. In particular, the asymmetry of the Kullback–Leibler divergence provides two important special cases when the target distribution is uniform: the maximum entropy method and the maximum log-probability method. This paper compares the performance of both methods under a variety of conditions. We also examine a generalized maximum log-probability method as a further demonstration of the generality of the entropy approach.
Kullback–Leibler divergence
Generality
Divergence
Citations (23)
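The two special cases named in this abstract follow from writing out both directions of the divergence against the uniform distribution u_i = 1/n (standard identities, not taken from the paper):

```latex
% Uniform target u_i = 1/n over n outcomes.
% Forward direction: minimizing D_KL(p || u) subject to the constraints
% is equivalent to maximizing the entropy H(p)  (maximum entropy method).
\[ D_{\mathrm{KL}}(p \,\|\, u) = \log n - H(p), \qquad
   H(p) = -\sum_{i=1}^{n} p_i \log p_i. \]
% Reverse direction: minimizing D_KL(u || p) is equivalent to maximizing
% the average log-probability  (maximum log-probability method).
\[ D_{\mathrm{KL}}(u \,\|\, p) = -\log n - \frac{1}{n}\sum_{i=1}^{n} \log p_i. \]
```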