Chain rule for Kolmogorov complexity

The chain rule for Kolmogorov complexity is an analogue of the chain rule for information entropy, which states:

H(X, Y) = H(X) + H(Y | X)

That is, the combined randomness of two sequences X and Y is the sum of the randomness of X plus whatever randomness is left in Y once we know X. This follows immediately from the definitions of conditional and joint entropy, and the fact from probability theory that the joint probability is the product of the marginal and conditional probability:

P(X, Y) = P(X) · P(Y | X)

The equivalent statement for Kolmogorov complexity does not hold exactly; it is true only up to a logarithmic term:

K(x, y) = K(x) + K(y | x) + O(log K(x, y))

(An exact version, KP(x, y) = KP(x) + KP(y | x*) + O(1), holds for the prefix complexity KP, where x* is a shortest program for x.) It states that a shortest program printing x and y is obtained by concatenating a shortest program printing x with a program printing y given x, plus at most a logarithmic overhead. The result implies that algorithmic mutual information, an analogue of mutual information for Kolmogorov complexity, is symmetric: I(x : y) = I(y : x) + O(log K(x, y)) for all x, y.

The ≤ direction is straightforward: we can write a program to produce x and y by concatenating a program to produce x, a program to produce y given access to x, and (whence the logarithmic term) the length of one of the two programs, so that we know where to separate the programs for x and for y given x (log K(x, y) upper-bounds this length).
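The concatenation argument for the ≤ direction can be sketched concretely. The toy code below (not an actual universal machine; the program strings and helper names are hypothetical) merges a program for x with a program for y-given-x into a single self-delimiting string by prefixing the length of the first program. That length prefix is exactly where the logarithmic overhead comes from.

```python
import math

def pack(p_x: bytes, p_y_given_x: bytes) -> bytes:
    """Concatenate two 'programs' into one string that can be split
    unambiguously again: prefix the pair with len(p_x), written in
    decimal followed by a separator byte. The prefix costs only
    O(log len(p_x)) bits -- the source of the logarithmic term in
    K(x, y) <= K(x) + K(y | x) + O(log K(x, y))."""
    header = str(len(p_x)).encode() + b":"
    return header + p_x + p_y_given_x

def unpack(packed: bytes) -> tuple[bytes, bytes]:
    """Recover the two component programs from the packed string."""
    n_str, rest = packed.split(b":", 1)
    n = int(n_str)
    return rest[:n], rest[n:]

# Hypothetical program texts, for illustration only.
p_x = b"PRINT 0101"
p_yx = b"PRINT REVERSE(x)"

packed = pack(p_x, p_yx)
assert unpack(packed) == (p_x, p_yx)

# The overhead (header) grows only logarithmically in len(p_x):
overhead_bits = 8 * (len(packed) - len(p_x) - len(p_yx))
assert overhead_bits <= 8 * (math.ceil(math.log10(len(p_x) + 1)) + 1)
```

The decimal-plus-separator header is just one convenient self-delimiting encoding; any prefix-free encoding of the length would do, and all of them need on the order of log |p_x| bits.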
