
Limiting density of discrete points

In information theory, the limiting density of discrete points is an adjustment to Claude Shannon's formula for differential entropy. It was formulated by Edwin Thompson Jaynes to address defects in the initial definition of differential entropy.

Shannon originally wrote down the following formula for the entropy of a continuous distribution, known as differential entropy:

h(X) = −∫ p(x) log p(x) dx

Unlike Shannon's formula for the discrete entropy, however, this is not the result of any derivation (Shannon simply replaced the summation symbol in the discrete version with an integral), and it turns out to lack many of the properties that make the discrete entropy a useful measure of uncertainty. In particular, it is not invariant under a change of variables and can even become negative. It is not even dimensionally correct: since a probability P(x) is dimensionless, the density p(x) must have units of 1/dx, which means that the argument of the logarithm is not dimensionless, as required.

Jaynes (1963, 1968) argued that the formula for the continuous entropy should instead be derived by taking the limit of increasingly dense discrete distributions. Suppose that we have a set of N discrete points {x_i}, such that in the limit N → ∞ their density approaches a function m(x), called the "invariant measure".
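The defect Jaynes identified can be seen numerically. The following sketch (an illustration, not from the article) discretizes a standard normal distribution on a fixed interval with uniform quantization, so the invariant measure m(x) is uniform on [a, b]. The discrete entropy H_N = −Σ p_i log p_i then diverges like log N as N → ∞, and the finite part H_N − log N converges to h(X) − log(b − a), where h(X) = ½ log(2πe) is the differential entropy of the standard normal; the interval [−8, 8] and the grid sizes are arbitrary choices for the demonstration.

```python
import numpy as np

# Discretize a standard normal on [a, b] into N uniform cells.
# With uniform quantization, the invariant measure m(x) = 1/(b - a),
# and the discrete entropy should satisfy
#   H_N  ≈  log N + h(X) - log(b - a),
# i.e. H_N grows without bound while H_N - log N stays finite.

a, b = -8.0, 8.0
h_true = 0.5 * np.log(2 * np.pi * np.e)  # differential entropy of N(0, 1)

for N in (1_000, 10_000, 100_000):
    x = np.linspace(a, b, N)
    dx = (b - a) / N
    p = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi) * dx  # cell probabilities
    p /= p.sum()                                      # renormalize
    H_N = -np.sum(p * np.log(p))
    # Finite part vs. its predicted limit h(X) - log(b - a):
    print(N, H_N - np.log(N), h_true - np.log(b - a))
```

Running this shows H_N − log N settling onto h(X) − log(b − a) ≈ −1.354, which is the finite quantity Jaynes's construction isolates from the divergent log N term.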

Related topics: Maximum entropy thermodynamics, Differential entropy, Entropy rate, Transfer entropy, Von Neumann entropy