S.21.04 The mechanism of action of deep brain stimulation: preclinical and clinical evidence
Citations: 0 | References: 0 | Related Papers: 10
Keywords: Total correlation, Information theory, Interaction information, Conditional mutual information, Conditional entropy, Pointwise mutual information, Information flow, Information transmission, Relevance, Information diagram
Citations (163)
Among all measures of independence between random variables, mutual information is the only one based on information theory. Mutual information takes into account all kinds of dependencies between variables, i.e., both linear and non-linear dependencies. In this paper we classify some well-known bivariate distributions into two classes based on their mutual information; the distributions within each class have the same mutual information. These distributions have been used extensively as survival distributions of two-component systems in reliability theory.
Keywords: Pointwise mutual information, Independence, Total correlation, Conditional mutual information, Interaction information, Information theory
Citations (0)
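To make concrete the claim above that mutual information captures dependence of any form, not just linear correlation, here is a minimal sketch (not taken from the paper) that evaluates I(X;Y) = Σ p(x,y) log₂ [p(x,y) / (p(x)p(y))] for two small hand-built joint distributions:

```python
import numpy as np

def mutual_information(joint):
    """I(X;Y) in bits for a 2-D array of probabilities p(x, y) summing to 1."""
    joint = np.asarray(joint, dtype=float)
    px = joint.sum(axis=1, keepdims=True)   # marginal p(x), shape (nx, 1)
    py = joint.sum(axis=0, keepdims=True)   # marginal p(y), shape (1, ny)
    nz = joint > 0                          # skip zero cells to avoid log(0)
    return float((joint[nz] * np.log2(joint[nz] / (px * py)[nz])).sum())

# Independent variables: every cell equals p(x) * p(y), so I(X;Y) = 0.
print(mutual_information([[0.25, 0.25],
                          [0.25, 0.25]]))   # 0.0

# Y determined by X (here Y = X): I(X;Y) = H(X) = 1 bit, the maximum possible.
print(mutual_information([[0.5, 0.0],
                          [0.0, 0.5]]))     # 1.0
```

Any deterministic relation between X and Y, however non-linear, yields this maximal value, which is the property the abstract appeals to.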
Mutual information I(X;Y) is a useful quantity in information theory that measures how much information the random variable Y holds about the random variable X. One way to define mutual information is to compare the joint distribution of X and Y with the product of the marginals through the Kullback-Leibler (KL) divergence. If the two distributions are close to each other, there is almost no leakage of X from Y, since the two variables are close to being independent. In the discrete setting, mutual information has the nice interpretation of how many bits Y reveals about X; in the continuous case, however, the same reasoning does not apply. This fact motivates trying different metrics or divergences to define the mutual information. In this paper, we evaluate different metrics and divergences as alternatives to mutual information in the continuous case. We deploy different methods to estimate or bound these metrics and divergences and evaluate their performance.
Keywords: Interaction information, Kullback–Leibler divergence, Conditional mutual information, Information theory, Divergence, Pointwise mutual information, Total correlation
Citations (3)
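The abstract above treats mutual information as one instance of a general template, a divergence between the joint distribution and the product of the marginals, and then swaps in other divergences. The sketch below illustrates that template in the discrete case only (the paper's interest is the continuous case); the total-variation alternative is an assumed example for illustration, not necessarily one of the divergences studied in the paper.

```python
import numpy as np

def product_of_marginals(joint):
    joint = np.asarray(joint, dtype=float)
    return np.outer(joint.sum(axis=1), joint.sum(axis=0))

def kl_dependence(joint):
    """KL(P_XY || P_X P_Y) in nats; this is exactly mutual information."""
    joint = np.asarray(joint, dtype=float)
    prod = product_of_marginals(joint)
    nz = joint > 0
    return float((joint[nz] * np.log(joint[nz] / prod[nz])).sum())

def tv_dependence(joint):
    """Total-variation distance between P_XY and P_X P_Y: one possible alternative."""
    joint = np.asarray(joint, dtype=float)
    return 0.5 * float(np.abs(joint - product_of_marginals(joint)).sum())

pxy = np.array([[0.4, 0.1],
                [0.1, 0.4]])
print(kl_dependence(pxy))  # > 0: X and Y are dependent
print(tv_dependence(pxy))  # also > 0, but on a different scale
```

Both quantities vanish exactly when the joint equals the product of the marginals, i.e. when X and Y are independent, which is what makes them candidate dependence measures.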
Mutual information is an important information measure for feature subsets. In this paper, a hashing mechanism is proposed to calculate the mutual information of a feature subset. The redundancy-synergy coefficient, a novel measure of the redundancy and synergy among features with respect to the class, is defined in terms of mutual information. The information maximization rule is applied to derive a heuristic feature subset selection method based on mutual information and the redundancy-synergy coefficient. Our experimental results show the good performance of the new feature selection method.
Keywords: Pointwise mutual information, Interaction information, Maximization, Feature, Conditional mutual information, Total correlation
Citations (0)
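Since the paper's hashing mechanism and the exact form of the redundancy-synergy coefficient are not reproduced here, the following hedged sketch only illustrates the general style of such methods: a greedy search that maximizes an information-based score trading relevance against redundancy. The "relevance minus redundancy" score and all names below are stand-ins, not the paper's definitions.

```python
from collections import Counter
from math import log2

def mi(xs, ys):
    """Plug-in estimate of I(X;Y) in bits from two equal-length discrete sequences."""
    n = len(xs)
    pxy, px, py = Counter(zip(xs, ys)), Counter(xs), Counter(ys)
    return sum(c / n * log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

def greedy_select(features, labels, k):
    """features: dict name -> list of discrete values; labels: class list."""
    selected = []
    while len(selected) < k:
        def score(name):
            relevance = mi(features[name], labels)
            redundancy = (sum(mi(features[name], features[s]) for s in selected)
                          / len(selected)) if selected else 0.0
            return relevance - redundancy   # stand-in criterion, not the paper's
        selected.append(max((f for f in features if f not in selected), key=score))
    return selected

labels = [0, 0, 0, 1, 1, 1]
features = {"f1": [0, 0, 1, 1, 1, 1],   # tracks the label closely
            "f2": [0, 0, 1, 1, 1, 1],   # exact duplicate of f1 (redundant)
            "f3": [1, 0, 1, 0, 1, 0]}   # weakly relevant but not redundant
print(greedy_select(features, labels, 2))  # -> ['f1', 'f3']: the duplicate f2 is skipped
```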
In light of the omission of crucial features and the incorrect selection of redundant features in existing feature selection methods for zero-crossing (ZC) features, this paper presents a feature selection method based on Dynamic Weights Conditional Mutual Information (DWCMI). In this method, the main factor of the objective function for feature evaluation is conditional mutual information, supplemented by a complementary evaluation criterion, also based on conditional mutual information, that addresses the misselection of redundant features. By introducing a dynamic weight coefficient, the importance of selected features is measured accurately through the dynamic change in their mutual information, thereby avoiding the oversight of crucial features. In the design of the algorithm, computational efficiency is improved by buffering and reusing previously calculated mutual information, which avoids repeated computation of the same quantities. The necessity, effectiveness, and efficiency of the DWCMI method are verified through simulation and experiments.
Keywords: Conditional mutual information, Interaction information, Feature, Total correlation, Pointwise mutual information
Citations (0)
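As a rough illustration of the ingredients described above (conditional mutual information as the main evaluation factor, plus buffering of already-computed quantities), here is a hedged sketch. The dynamic weight coefficient and the exact DWCMI objective are not reproduced; a CMIM-style "minimum conditional relevance" score stands in for them, and the cache dictionary mimics the buffering idea.

```python
from collections import Counter, defaultdict
from math import log2

def mi(xs, ys):
    """Plug-in estimate of I(X;Y) in bits."""
    n = len(xs)
    pxy, px, py = Counter(zip(xs, ys)), Counter(xs), Counter(ys)
    return sum(c / n * log2(c * n / (px[x] * py[y])) for (x, y), c in pxy.items())

def cmi(xs, ys, zs):
    """I(X;Y|Z) = sum_z p(z) * I(X;Y | Z = z), plug-in estimate."""
    groups = defaultdict(list)
    for x, y, z in zip(xs, ys, zs):
        groups[z].append((x, y))
    n = len(zs)
    return sum(len(g) / n * mi([x for x, _ in g], [y for _, y in g])
               for g in groups.values())

def dw_like_select(features, labels, k):
    cache = {}                                   # buffer of already-computed scores
    def cached_cmi(f, s):
        if (f, s) not in cache:
            cache[(f, s)] = cmi(features[f], labels, features[s])
        return cache[(f, s)]
    selected = []
    while len(selected) < k:
        def score(f):
            if not selected:
                return mi(features[f], labels)   # plain relevance for the first pick
            return min(cached_cmi(f, s) for s in selected)
        selected.append(max((f for f in features if f not in selected), key=score))
    return selected

labels = [0, 0, 0, 1, 1, 1]
feats = {"a": [0, 0, 1, 1, 1, 1], "b": [0, 0, 1, 1, 1, 1], "c": [0, 1, 0, 1, 0, 1]}
print(dw_like_select(feats, labels, 2))  # -> ['a', 'c']: 'b' adds nothing once 'a' is chosen
```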
Measures of relevance between features play an important role in classification and regression analysis. Mutual information has proved to be an effective measure for categorical features. However, there is a limitation in computing relevance between numerical features with mutual information. In this work, we generalize Shannon's information entropy to neighborhood information entropy and propose a measure of neighborhood mutual information. It is shown that the new measure is a natural extension of classical mutual information that reduces to the classical one if features are discrete; thus the new measure can also be used to compute the relevance between discrete variables. In experiments, we show that neighborhood mutual information produces nearly the same outputs as mutual information. However, unlike mutual information, no discretization is required in computing relevance when the proposed algorithm is used.
Keywords: Information diagram, Pointwise mutual information, Relevance, Information theory, Conditional mutual information, Categorical variable, Interaction information, Total correlation
Citations (11)
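A small sketch may help fix ideas: below, each sample's δ-neighborhood plays the role of the probability of its value, giving a neighborhood entropy and a neighborhood mutual information that operate directly on numerical features without discretization. The specific formulas used here (log of the neighborhood fraction, δ as an absolute distance threshold) are an assumed formulation for illustration and should be checked against the paper's definitions.

```python
import numpy as np

def neighborhood_counts(x, delta):
    """|{j : |x_i - x_j| <= delta}| for every sample i of a 1-D numerical feature."""
    x = np.asarray(x, dtype=float).reshape(-1, 1)
    return (np.abs(x - x.T) <= delta).sum(axis=1)

def neighborhood_entropy(x, delta):
    n = len(x)
    return float(-np.mean(np.log2(neighborhood_counts(x, delta) / n)))

def joint_neighborhood_entropy(x, y, delta):
    n = len(x)
    in_x = np.abs(np.subtract.outer(np.asarray(x, float), np.asarray(x, float))) <= delta
    in_y = np.abs(np.subtract.outer(np.asarray(y, float), np.asarray(y, float))) <= delta
    joint = (in_x & in_y).sum(axis=1)          # neighbours shared by both features
    return float(-np.mean(np.log2(joint / n)))

def neighborhood_mi(x, y, delta=0.1):
    return (neighborhood_entropy(x, delta) + neighborhood_entropy(y, delta)
            - joint_neighborhood_entropy(x, y, delta))

rng = np.random.default_rng(0)
x = rng.uniform(size=200)
print(neighborhood_mi(x, x ** 2, delta=0.05))                 # large: strong nonlinear dependence
print(neighborhood_mi(x, rng.uniform(size=200), delta=0.05))  # small (not exactly zero, finite-sample bias)
```

If the features are discrete and δ is smaller than the gap between distinct values, each neighborhood collapses to the samples sharing the same value, which is the sense in which such a measure reduces to classical mutual information.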