    Individual Sensing can Gain more Fitness than its Information
Citations: 0 · References: 20 · Related Papers: 10
    Abstract:
Mutual information and its causal variant, directed information, have been widely used to quantitatively characterize the performance of biological sensing and information transduction. However, once coupled with selection in response to decision-making, the sensing signal can have more or less evolutionary value than its mutual or directed information. In this work, we show that an individually sensed signal always has a greater fitness value, on average, than its mutual or directed information. The fitness gain, which satisfies fluctuation relations (FRs), is attributed to the selection of organisms in a population that obtain a better sensing signal by chance. A new quantity, similar to the coarse-grained entropy production in information thermodynamics, is introduced to quantify the total fitness gain from individual sensing; it also satisfies FRs. Using this quantity, optimizing the fitness gain from individual sensing is shown to be related to fidelity allocations for individual environmental histories. Our results are supplemented by numerical verifications of the FRs and a discussion of how this problem is linked to information encoding and decoding.
    Keywords:
    Information gain
    Information Theory
    Value of information
Sensing signal
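The abstract above contrasts the mutual (or directed) information carried by a sensed signal with its fitness value. As a point of reference for the information side of that comparison, the following sketch computes the Shannon mutual information between a two-state environment and a noisy binary sensing signal; the marginal and the sensing likelihoods are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Hypothetical two-state environment sensed through a noisy binary channel.
# p_env[e] is the environment marginal; p_sig_given_env[e, s] is the sensing likelihood.
p_env = np.array([0.7, 0.3])
p_sig_given_env = np.array([[0.9, 0.1],
                            [0.2, 0.8]])

joint = p_env[:, None] * p_sig_given_env   # p(e, s)
p_sig = joint.sum(axis=0)                  # signal marginal p(s)

# Mutual information I(E;S) = sum_{e,s} p(e,s) log2[ p(e,s) / (p(e) p(s)) ].
indep = p_env[:, None] * p_sig[None, :]
mask = joint > 0
mutual_info = np.sum(joint[mask] * np.log2(joint[mask] / indep[mask]))
print(f"I(E;S) = {mutual_info:.4f} bits")
```

In these terms, the paper's claim is that when each organism senses its own copy of the signal, the average fitness gained through selection can exceed this information value.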
Although Shannon mutual information has been widely used, its effective calculation is often difficult for many practical problems, including those in neural population coding. Asymptotic formulas based on Fisher information sometimes provide accurate approximations to the mutual information but this approach is restricted to continuous variables because the calculation of Fisher information requires derivatives with respect to the encoded variables. In this paper, we consider information-theoretic bounds and approximations of the mutual information based on Kullback–Leibler divergence and Rényi divergence. We propose several information metrics to approximate Shannon mutual information in the context of neural population coding. While our asymptotic formulas all work for discrete variables, one of them has consistent performance and high accuracy regardless of whether the encoded variables are discrete or continuous. We performed numerical simulations and confirmed that our approximation formulas were highly accurate for approximating the mutual information between the stimuli and the responses of a large neural population. These approximation formulas may potentially bring convenience to the applications of information theory to many practical and theoretical problems.
    Information Theory
    Neural coding
    Citations (5)
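The bounds and approximations discussed in the abstract above start from the standard identity that writes mutual information as an average Kullback–Leibler divergence, I(X;Y) = E_X[ D_KL(p(y|x) || p(y)) ]. The sketch below evaluates that identity for a small, randomly generated stimulus–response model; it does not reproduce the paper's asymptotic formulas, and the model sizes are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical discrete stimulus-response model: 4 stimuli, 6 response bins.
p_x = np.full(4, 0.25)                           # uniform stimulus prior
p_y_given_x = rng.dirichlet(np.ones(6), size=4)  # random response likelihoods

p_y = p_x @ p_y_given_x                          # response marginal

def kl(p, q):
    """Kullback-Leibler divergence D(p || q) in bits."""
    m = p > 0
    return np.sum(p[m] * np.log2(p[m] / q[m]))

# I(X;Y) as the prior-weighted KL divergence between p(y|x) and p(y).
mi = sum(px * kl(py_x, p_y) for px, py_x in zip(p_x, p_y_given_x))
print(f"I(X;Y) = {mi:.4f} bits")
```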
In this paper, we focus on the mutual information, which characterizes transmission ability because it captures the correlation between channel input and channel output. Shannon entropy and mutual information are the cornerstones of information theory. In addition, Chernoff information is another fundamental channel information measure; it describes the maximum achievable exponent of the error probability in hypothesis testing. Based on the alternating conditional expectation (ACE) algorithm, we decompose these two information measures. In fact, their decomposition results are similar from a big-data perspective. In this sense, these two kinds of mutual information are just different measures of the same information quantity. This paper also deduces that the channel performance depends only on the channel parameters and that the decomposition results of a newly proposed mutual information should agree with the impact of those parameters.
    Conditional mutual information
    Information Theory
    Conditional entropy
    Total correlation
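For the Chernoff information mentioned in the abstract above, the textbook definition is C(P,Q) = -min_{0<=s<=1} log sum_x P(x)^s Q(x)^(1-s), the best achievable error exponent in binary hypothesis testing. The sketch below evaluates this definition for two small discrete distributions via a one-dimensional minimization; it does not attempt the paper's ACE-based decomposition, and the example distributions are made up.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def chernoff_information(p, q):
    """C(P,Q) = -min_{0<=s<=1} log sum_x p(x)^s q(x)^(1-s), in nats."""
    log_chernoff_coeff = lambda s: np.log(np.sum(p**s * q**(1.0 - s)))
    res = minimize_scalar(log_chernoff_coeff, bounds=(0.0, 1.0), method="bounded")
    return -res.fun

# Made-up pair of discrete distributions on a three-letter alphabet.
p = np.array([0.5, 0.3, 0.2])
q = np.array([0.2, 0.3, 0.5])
print(f"C(P,Q) = {chernoff_information(p, q):.4f} nats")
```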
A fundamental relationship between information theory and estimation theory was recently unveiled for the Gaussian channel, relating the derivative of mutual information to the minimum mean-square error. This paper generalizes this fundamental link between information theory and estimation theory to arbitrary channels, and in particular encompasses the discrete memoryless channel (DMC). In addition to the intrinsic theoretical interest of such a result, it naturally leads to an efficient numerical computation of mutual information for cases in which it was previously infeasible, such as with LDPC codes.
    Information Theory
    Representation
    Citations (0)
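The Gaussian-channel relation that this paper generalizes is the I-MMSE identity dI/dsnr = mmse(snr)/2 (with I in nats) for Y = sqrt(snr)·X + N, N ~ N(0,1). The sketch below checks it numerically for a BPSK input, which is my choice of example; nothing here covers the paper's extension to arbitrary channels or its application to LDPC codes.

```python
import numpy as np
from scipy.integrate import quad

def mutual_info_bpsk(snr):
    """I(X;Y) in nats for Y = sqrt(snr)*X + N, X = +/-1 equiprobable, N ~ N(0,1)."""
    a = np.sqrt(snr)
    p_y = lambda y: 0.5 * (np.exp(-(y - a)**2 / 2) + np.exp(-(y + a)**2 / 2)) / np.sqrt(2 * np.pi)
    h_y, _ = quad(lambda y: -p_y(y) * np.log(p_y(y)), -12, 12)  # output entropy
    return h_y - 0.5 * np.log(2 * np.pi * np.e)                 # minus noise entropy

def mmse_bpsk(snr, n=200_000, seed=0):
    """Monte Carlo estimate of E[(X - E[X|Y])^2]; for BPSK, E[X|Y=y] = tanh(sqrt(snr)*y)."""
    rng = np.random.default_rng(seed)
    x = rng.choice([-1.0, 1.0], size=n)
    y = np.sqrt(snr) * x + rng.standard_normal(n)
    return np.mean((x - np.tanh(np.sqrt(snr) * y))**2)

# Finite-difference dI/dsnr should match mmse/2 up to Monte Carlo error.
snr, d = 1.0, 1e-3
didsnr = (mutual_info_bpsk(snr + d) - mutual_info_bpsk(snr - d)) / (2 * d)
print(f"dI/dsnr ~ {didsnr:.4f}   mmse/2 ~ {mmse_bpsk(snr) / 2:.4f}")
```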
    Conditional mutual information
    Information gain
    Information Theory
    Interaction information
    Conditional entropy
In this paper, we discuss the problem of feature selection for the purpose of classification and propose a solution based on the concept of mutual information. In addition, we propose a new evaluation function to measure the ability of feature subsets to distinguish between class labels. The proposed function is based on the information gain, taking into consideration how features work together. Finally, we discuss the performance of this function compared with that of other measures which evaluate features individually.
    Information gain
Feature (machine learning)
    Pointwise mutual information
    Citations (34)
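As a baseline for the abstract above, information gain in feature selection is usually the mutual information between a single feature and the class label. The sketch below ranks two hypothetical binary features this way; the paper's proposed evaluation function, which also accounts for how features work together, is not reproduced here.

```python
import numpy as np

def information_gain(feature, labels):
    """Mutual information I(feature; label) in bits for discrete arrays."""
    _, f_idx = np.unique(feature, return_inverse=True)
    _, c_idx = np.unique(labels, return_inverse=True)
    joint = np.zeros((f_idx.max() + 1, c_idx.max() + 1))
    np.add.at(joint, (f_idx, c_idx), 1.0)
    joint /= joint.sum()
    pf, pc = joint.sum(1, keepdims=True), joint.sum(0, keepdims=True)
    m = joint > 0
    return float(np.sum(joint[m] * np.log2(joint[m] / (pf @ pc)[m])))

# Hypothetical data: x0 tracks the label 90% of the time, x1 is pure noise.
rng = np.random.default_rng(1)
y = rng.integers(0, 2, size=1000)
x0 = np.where(rng.random(1000) < 0.9, y, 1 - y)
x1 = rng.integers(0, 2, size=1000)
for name, x in [("x0", x0), ("x1", x1)]:
    print(f"{name}: {information_gain(x, y):.4f} bits")
```

The informative feature scores well above the irrelevant one, which is the kind of per-feature ranking the paper compares its subset-level function against.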
Information-theoretic measures (entropies, entropy rates, mutual information) are nowadays commonly used in statistical signal processing for real-world data analysis. The present work proposes the use of Auto Mutual Information (mutual information between subsets of the same signal) and entropy rate as powerful tools to assess refined dependencies of any order in signal temporal dynamics. Notably, it is shown how two-point Auto Mutual Information and entropy rate unveil information conveyed by higher-order statistics and thus capture details of temporal dynamics that are overlooked by the (two-point) correlation function. The statistical performance of relevant estimators for Auto Mutual Information and entropy rate is studied numerically, by means of Monte Carlo simulations, as a function of sample size, dependence structure and the hyperparameters that enter their definition. Further, it is shown how Auto Mutual Information permits discrimination between several different non-Gaussian processes having exactly the same marginal distribution and covariance function. Assessing higher-order statistics via multipoint Auto Mutual Information is also shown to unveil the global dependence structure of these processes, indicating that one of the non-Gaussian processes actually has temporal dynamics that resemble those of a Gaussian process with the same covariance, while the other does not.
    Information Theory
    Total correlation
    Citations (10)
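For the two-point Auto Mutual Information used in the abstract above, a simple plug-in (histogram) estimator already illustrates the quantity, even though it is biased and far cruder than the estimators studied in the paper. The AR(1) test signal, bin count and lags below are my own choices.

```python
import numpy as np

def auto_mutual_info(x, lag, bins=16):
    """Plug-in (histogram) estimate of I(X_t; X_{t+lag}) in nats."""
    joint, _, _ = np.histogram2d(x[:-lag], x[lag:], bins=bins)
    joint /= joint.sum()
    pa, pb = joint.sum(1, keepdims=True), joint.sum(0, keepdims=True)
    m = joint > 0
    return float(np.sum(joint[m] * np.log(joint[m] / (pa @ pb)[m])))

# Hypothetical AR(1) test signal x[t] = phi * x[t-1] + noise.
rng = np.random.default_rng(2)
n, phi = 100_000, 0.8
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + rng.standard_normal()

for lag in (1, 5, 20):
    print(f"lag {lag:2d}: AMI ~ {auto_mutual_info(x, lag):.3f} nats")
```

For a Gaussian AR(1) process the exact value at lag τ is -½ ln(1 - φ^(2τ)), which gives a sanity check for each printed estimate.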
    Total correlation
    Information Theory
    Interaction information
    Conditional mutual information
    Conditional entropy
    Pointwise mutual information
    Information flow
    Information transmission
In this chapter, we discuss the extension of three concepts of classical information theory, namely conditional information, transinformation (also called mutual information), and information gain (also called the Kullback–Leibler distance), from descriptions to (reasonably large classes of) covers. This extension also carries these concepts over from discrete to continuous random variables.
    Conditional mutual information
    Information gain
    Information Theory
    Interaction information
    Pointwise mutual information
    Conditional entropy
    Information diagram
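Before their extension to covers, the three classical concepts named in the abstract above have elementary discrete forms: conditional information H(X|Y), transinformation I(X;Y), and information gain as a Kullback–Leibler distance between a posterior and a prior. The sketch below evaluates all three for a small made-up joint distribution; the chapter's cover-based generalization is not attempted.

```python
import numpy as np

# Made-up 2x3 joint distribution p(x, y) to evaluate the three classical quantities.
p_xy = np.array([[0.20, 0.15, 0.05],
                 [0.10, 0.10, 0.40]])
p_x, p_y = p_xy.sum(axis=1), p_xy.sum(axis=0)

def H(p):
    """Shannon entropy in bits."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

cond_info = H(p_xy.ravel()) - H(p_y)            # conditional information H(X|Y)
transinfo = H(p_x) + H(p_y) - H(p_xy.ravel())   # transinformation I(X;Y)
posterior = p_xy[:, 0] / p_y[0]                 # p(x | y = 0)
info_gain = np.sum(posterior * np.log2(posterior / p_x))  # KL distance to the prior

print(f"H(X|Y) = {cond_info:.4f}, I(X;Y) = {transinfo:.4f}, D(post||prior) = {info_gain:.4f} bits")
```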
Mutual information $I(X;Y)$ is a useful quantity in information theory for estimating how much information the random variable $Y$ holds about the random variable $X$. One way to define the mutual information is by comparing the joint distribution of $X$ and $Y$ with the product of the marginals through the KL-divergence. If the two distributions are close to each other, there will be almost no leakage of $X$ from $Y$, since the two variables are close to being independent. In the discrete setting, the mutual information has the nice interpretation of how many bits $Y$ reveals about $X$, and if $I(X;Y)=H(X)$ (the Shannon entropy of $X$) then $X$ is completely revealed. However, in the continuous case we do not have the same reasoning; for instance, the mutual information can be infinite there. This fact motivates trying different metrics or divergences to define the mutual information. In this paper, we evaluate different metrics and divergences, such as the Kullback-Leibler (KL) divergence, Wasserstein distance, Jensen-Shannon divergence and total variation distance, to form alternatives to the mutual information in the continuous case. We deploy different methods to estimate or bound these metrics and divergences and evaluate their performance.
    Kullback–Leibler divergence
    Information Theory
Divergence (statistics)
    Interaction information
    Total correlation
    Conditional mutual information
    Statistical distance
    Pointwise mutual information
    Citations (0)
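The alternatives proposed in the abstract above replace the KL divergence between the joint distribution and the product of marginals with other divergences. The discrete sketch below shows the idea for KL, Jensen-Shannon and total variation on a toy joint distribution; the Wasserstein variant is omitted because it additionally needs a ground metric and an optimal-transport solver, and the paper's continuous-case estimators are not reproduced.

```python
import numpy as np

# Made-up discrete joint p(x, y); the independence baseline is p(x)p(y).
p_xy = np.array([[0.25, 0.05],
                 [0.10, 0.60]])
p_prod = np.outer(p_xy.sum(axis=1), p_xy.sum(axis=0))

def kl(p, q):
    """Kullback-Leibler divergence in bits (summed where p > 0)."""
    m = p > 0
    return float(np.sum(p[m] * np.log2(p[m] / q[m])))

mi_kl = kl(p_xy, p_prod)                                   # classical I(X;Y)
mid_pt = 0.5 * (p_xy + p_prod)
mi_js = 0.5 * kl(p_xy, mid_pt) + 0.5 * kl(p_prod, mid_pt)  # Jensen-Shannon analogue
mi_tv = 0.5 * float(np.sum(np.abs(p_xy - p_prod)))         # total-variation analogue

print(f"KL: {mi_kl:.4f} bits   JS: {mi_js:.4f} bits   TV: {mi_tv:.4f}")
```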