Trading information complexity for error II: the case of a large error and external information complexity.

2019 
Two problems are studied in this paper. (1) How much external or internal information cost is required to compute a Boolean-valued function with error at most $1/2-\epsilon$ for small $\epsilon$? It is shown that information cost of order $\epsilon^2$ is necessary and of order $\epsilon$ is sufficient. (2) How much external information cost can be saved when computing a function with a small error $\epsilon>0$, compared to the case when no error is allowed? It is shown that the savings are of order at least $\epsilon$ and at most $h(\sqrt{\epsilon})$, where $h$ is the binary entropy function. Except for the $O(h(\sqrt{\epsilon}))$ upper bound, the other three bounds are tight. For the distribution $\mu$ that is uniform on $\{(0,0),(1,1)\}$, it is shown that $IC^{ext}_\mu(XOR, \epsilon)=1-2\epsilon$, where XOR is the two-bit xor function. This equality appears to be the first example of exact information complexity when an error is allowed.
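To get a feel for the gap between the bounds in the abstract, the following sketch evaluates them numerically for a few error values. It assumes $h$ is the binary entropy function (the standard convention in information complexity); the comparison of $\epsilon$, $h(\sqrt{\epsilon})$, and $1-2\epsilon$ is purely illustrative.

```python
import math

def h(x):
    """Binary entropy in bits, with the convention h(0) = h(1) = 0."""
    if x in (0.0, 1.0):
        return 0.0
    return -x * math.log2(x) - (1 - x) * math.log2(1 - x)

for eps in (0.25, 0.1, 0.01, 0.001):
    lower = eps                 # savings of order at least eps (tight)
    upper = h(math.sqrt(eps))   # upper bound h(sqrt(eps)) on the savings
    xor_ic = 1 - 2 * eps        # exact IC^ext_mu(XOR, eps) for the stated mu
    print(f"eps={eps}: eps={lower:.4f}  h(sqrt(eps))={upper:.4f}  "
          f"1-2*eps={xor_ic:.4f}")
```

Note how $h(\sqrt{\epsilon})$ shrinks much more slowly than $\epsilon$ as $\epsilon \to 0$, which is why closing the gap between the $\epsilon$ lower bound and the $h(\sqrt{\epsilon})$ upper bound on the savings is left open.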