    Data-driven identification model for associated fault propagation path
    An information-theoretic approach is described for detecting damage-induced nonlinearities in structures. Both the time-delayed mutual information and time-delayed transfer entropy are presented as methods for computing the amount of information transported between points on a structure. By comparing these measures to "linearized" surrogate data sets, the presence and degree of nonlinearity in a system may be deduced. For a linear, five-degree-of-freedom system both mutual information and transfer entropy are derived. An algorithm is then described for computing both quantities from time-series data and is shown to be in agreement with theory. The approach successfully deduces the amount of damage to the structure even in the presence of simulated temperature fluctuations. We then demonstrate the approach to be effective in detecting varying levels of impact damage in a thick composite plate structure.
    Information Theory
    Information Transfer
    Citations (0)
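The time-delayed transfer entropy described above can be sketched with a simple histogram (plug-in) estimator. This is an illustrative assumption, not the paper's implementation: the bin count, the coupled test signals, and the function name are all chosen here for demonstration.

```python
import numpy as np

def transfer_entropy(x, y, delay=1, bins=8):
    """Plug-in (histogram) estimate of the time-delayed transfer entropy
    T(X -> Y) in bits:
        sum p(y_f, y_p, x_p) * log2[ p(y_f | y_p, x_p) / p(y_f | y_p) ]
    where y_f is the future of Y and y_p, x_p are the present values."""
    x, y = np.asarray(x), np.asarray(y)
    y_f, y_p, x_p = y[delay:], y[:-delay], x[:-delay]
    # Joint histogram over (y_future, y_present, x_present).
    p, _ = np.histogramdd((y_f, y_p, x_p), bins=bins)
    p /= p.sum()
    p_yy = p.sum(axis=2)        # p(y_f, y_p)
    p_yx = p.sum(axis=0)        # p(y_p, x_p)
    p_y = p.sum(axis=(0, 2))    # p(y_p)
    te = 0.0
    for i, j, k in np.argwhere(p > 0):
        te += p[i, j, k] * np.log2(p[i, j, k] * p_y[j]
                                   / (p_yy[i, j] * p_yx[j, k]))
    return te

# Toy coupled pair: x drives y with a one-step delay (assumed test signals).
rng = np.random.default_rng(0)
x = rng.normal(size=5000)
y = np.empty_like(x)
y[0] = 0.0
y[1:] = x[:-1] + 0.3 * rng.normal(size=4999)

te_xy = transfer_entropy(x, y)   # driver -> response: clearly positive
te_yx = transfer_entropy(y, x)   # reverse direction: near zero
```

The asymmetry `te_xy >> te_yx` is what lets the measure identify the direction of information transport between points on the structure.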
Entropy is a measure of information content or complexity. Information-theoretic modeling has been used successfully in a variety of biological data analyses, including functional magnetic resonance imaging (fMRI). Several studies have tested and evaluated entropy measures on simulated datasets and real fMRI data, and the efficiency of entropy algorithms has been compared with that of classical methods based on the linear model. Here we explain and summarize the entropy algorithms that have been used in fMRI analysis, their advantages over classical methods, and their potential use in event-related and block-design fMRI.
    Information Theory
    Citations (4)
We present a methodology for detecting effective connections between simultaneously recorded neurons, using an information transmission measure to identify the presence and direction of information flow from one neuron to another. Using simulated and experimentally measured data, we evaluate the performance of our proposed method and compare it to the traditional transfer entropy approach. In simulations, our measure of information transmission outperforms transfer entropy in identifying the effective connectivity structure of a neuron ensemble. For experimentally recorded data, where ground truth is unavailable, the proposed method also yields a more plausible effective connectivity structure than transfer entropy.
    Information Transfer
    Information transmission
    Ground truth
    Information Theory
    Information flow
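The paper's information-transmission measure is not reproduced here; as a rough stand-in, even a lagged mutual information can recover the presence and direction of a simulated driving connection. The `lagged_mi` and `effective_connectivity` names, the bin count, and the 0.05-bit threshold are assumptions for illustration only.

```python
import numpy as np

def lagged_mi(x, y, lag=1, bins=6):
    """Plug-in estimate of I(X_t ; Y_{t+lag}) in bits."""
    a, b = x[:-lag], y[lag:]
    p, _, _ = np.histogram2d(a, b, bins=bins)
    p /= p.sum()
    # Outer product of the marginals, p(x) * p(y).
    outer = p.sum(axis=1, keepdims=True) @ p.sum(axis=0, keepdims=True)
    nz = p > 0
    return float((p[nz] * np.log2(p[nz] / outer[nz])).sum())

def effective_connectivity(series, lag=1, thresh=0.05):
    """Directed adjacency matrix: A[i, j] is True when the past of
    unit i carries information about the future of unit j."""
    n = len(series)
    A = np.zeros((n, n), dtype=bool)
    for i in range(n):
        for j in range(n):
            if i != j:
                A[i, j] = lagged_mi(series[i], series[j], lag) > thresh
    return A

# Toy ensemble: unit 0 drives unit 1 with a one-step lag; unit 2 is independent.
rng = np.random.default_rng(2)
s0 = rng.normal(size=5000)
s1 = np.empty_like(s0)
s1[0] = 0.0
s1[1:] = s0[:-1] + 0.2 * rng.normal(size=4999)
s2 = rng.normal(size=5000)
A = effective_connectivity([s0, s1, s2])
```

Only the true driving edge 0 → 1 survives the threshold; all other entries of `A` stay False, which is the sense in which such measures recover effective connectivity.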
One challenging problem in the study of complex networks is the quantification of relationships between time series recorded across the network. Two information-theoretic measures, transfer entropy and directed information, have been extensively studied as ways to capture the causal relationship between subsystems of a network. However, the relationship between these two measures has not been fully investigated to date. In this paper, we derive a formula relating the two measures, show in particular that transfer entropy equals the upper bound of the directed information rate, and verify this relationship through simulations.
    Information Transfer
    Information diagram
    Causality
    Information Theory
    Citations (20)
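For reference, the two measures compared above are commonly defined as follows; this is a standard formulation for stationary processes, not necessarily the paper's exact notation.

```latex
T_{X \to Y} \;=\; H\!\left(Y_n \mid Y^{n-1}\right)
             - H\!\left(Y_n \mid Y^{n-1},\, X^{n-1}\right)

I\!\left(X^n \to Y^n\right) \;=\; \sum_{i=1}^{n} I\!\left(X^i ;\, Y_i \mid Y^{i-1}\right),
\qquad
\bar{I}(X \to Y) \;=\; \lim_{n \to \infty} \frac{1}{n}\, I\!\left(X^n \to Y^n\right)
```

The paper's claim is then that the transfer entropy $T_{X \to Y}$ coincides with the upper bound of the directed information rate $\bar{I}(X \to Y)$.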
    The explosion in social media adoption has opened up new opportunities to understand human interaction and information flow at an unprecedented scale. Influence between people represented as nodes of a social graph is best characterized in terms of the direction, the volume and the delay associated with the information flow. In this work we investigate the relatively new information-theoretic measure called transfer entropy as a measure of directed causal influence in online social interactions. The classical definition of transfer entropy is extended to a form applicable to activity on social graphs characterized by causal influence through delayed responses. For fixed but arbitrary interaction delays, we show that the swept delayed transfer entropy (DTE) profile peaks at the true delay. By extending the results to discrete and continuous distributions of interaction delays, the efficacy of DTE in recovering the interaction delay distributions between two causally related signals is demonstrated. An information theoretic annotation of social graphs that captures the volume and velocity of information transfer is presented based on the swept DTE.
    Information flow
    Information Transfer
    Information Theory
    Citations (7)
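The peak-at-true-delay behaviour of the swept DTE profile can be illustrated by sweeping the lag of an ordinary histogram transfer-entropy estimator. This is a simplified stand-in for the paper's DTE definition; the signal model, bin count, and delay range are assumptions.

```python
import numpy as np

def te(x, y, d, bins=6):
    """Histogram estimate of transfer entropy T(X -> Y) at lag d, in bits."""
    y_f, y_p, x_p = y[d:], y[:-d], x[:-d]
    p, _ = np.histogramdd((y_f, y_p, x_p), bins=bins)
    p /= p.sum()
    p_yy, p_yx, p_y = p.sum(axis=2), p.sum(axis=0), p.sum(axis=(0, 2))
    return sum(p[i, j, k] * np.log2(p[i, j, k] * p_y[j]
                                    / (p_yy[i, j] * p_yx[j, k]))
               for i, j, k in np.argwhere(p > 0))

# Source signal x influences y through a fixed (here, 3-step) response delay.
rng = np.random.default_rng(1)
true_delay = 3
x = rng.normal(size=8000)
y = np.empty_like(x)
y[:true_delay] = rng.normal(size=true_delay)
y[true_delay:] = x[:-true_delay] + 0.2 * rng.normal(size=len(x) - true_delay)

# Sweep candidate delays; the profile peaks at the true interaction delay.
delays = list(range(1, 7))
profile = [te(x, y, d) for d in delays]
recovered = delays[int(np.argmax(profile))]
```

For causally related signals the off-delay values of the profile fall back to estimator bias, so the peak location recovers the interaction delay.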
The objective of this study was to characterize heterogeneous flow patterns using information theory. The information content of heterogeneous flow was measured with Shannon information entropy and the mean information gain, and the flow complexity was measured with the effective measure complexity and the fluctuation complexity. The mean information gain, effective measure complexity, and fluctuation complexity all increased with the information entropy of the flow sequence: as more heterogeneity information was included, the flow system became more complex and uncertain. The information measures thus appear to be a versatile tool for describing heterogeneous flow patterns.
    Information flow
    Information Theory
    Information gain
    Information diagram
    Citations (0)
The spread and evolution of hot web events is driven by interactions between users, and user behaviour is particularly important on microblogs. Information spreads through user behaviour, and information exchange among users drives event evolution, so user behaviour plays an important role in both the spread of information and the evolution of hot web events. Behaviour analysis is therefore a key method for determining the direction and trend of hot web events. However, quantifying user behaviour, and in particular predicting the direction and trend of hot web events from it, remains a challenge. This paper proposes a way to build an information-spread network from user behaviour and to collect statistics on the degrees of the nodes in that network. We then use information entropy over the node degrees to measure the stability of the behaviour network, and predict the direction and trend of hot web events from this entropy. Experimental results show that information entropy efficiently measures network stability: popular events have high entropy and unstable network structures, while normal events have low entropy and stable structures. Events with high entropy are therefore more likely to become popular and may evolve in different directions.
    Microblogging
    Citations (1)
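One concrete reading of "information entropy over node degrees" is the entropy of the degree-value distribution of the spread network. That definition is an assumption chosen here for illustration, since the paper's exact formula is not given in the abstract.

```python
import math
from collections import Counter

def degree_entropy(edges):
    """Information entropy (bits) of the node-degree distribution of an
    undirected information-spread network.  A single dominant degree value
    gives low entropy (stable structure); many distinct degree values give
    high entropy (unstable structure)."""
    deg = Counter()
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    hist = Counter(deg.values())   # how many nodes have each degree value
    n = sum(hist.values())
    return -sum((c / n) * math.log2(c / n) for c in hist.values())

# Ring: every node has degree 2 -> entropy 0 (stable, "normal event" shape).
ring = [(0, 1), (1, 2), (2, 3), (3, 0)]
# Hub-and-spokes plus a chain: several distinct degrees -> higher entropy,
# resembling the unstable structure of a popular, bursty event.
bursty = [(0, 1), (0, 2), (0, 3), (0, 4), (4, 5)]
```

Under this definition, homogeneous networks score near zero while heterogeneous ones score higher, mirroring the paper's low-entropy-stable / high-entropy-popular distinction.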
First, the shortcomings of traditional statistical analysis of network flow data are discussed, and it is pointed out that entropy analysis can reveal more latent information, uncovering network anomalies that traditional statistical analysis cannot find. Second, the difference between flow entropy and count entropy is discussed, and it is proposed that the two should be used together rather than relying on only one of them, as existing studies do. Finally, the features of the two kinds of entropy are studied by mutual information analysis. Simulations show that the features are partly redundant; after the redundant features are eliminated, detection efficiency increases significantly while detection accuracy is maintained.
    Statistical Analysis
    Conditional entropy
    Citations (0)
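The flow-entropy versus count-entropy distinction can be illustrated on hypothetical (destination, bytes) records. Both definitions, the record format, and the addresses below are assumptions for illustration, not the paper's exact features.

```python
import math
from collections import Counter

def entropy(weights):
    """Shannon entropy (bits) of a normalised weight vector."""
    total = sum(weights)
    return -sum((w / total) * math.log2(w / total) for w in weights if w > 0)

def count_entropy(records):
    """Entropy of the number of flows seen per destination key."""
    return entropy(Counter(dst for dst, _ in records).values())

def flow_entropy(records):
    """Entropy of the traffic volume (bytes) carried per destination key."""
    vol = Counter()
    for dst, size in records:
        vol[dst] += size
    return entropy(vol.values())

# Hypothetical records (destination, bytes): flow counts are uniform, but one
# destination carries almost all of the volume, so the two entropies disagree.
records = [("10.0.0.1", 9000), ("10.0.0.2", 100),
           ("10.0.0.3", 100), ("10.0.0.4", 100)]
```

Because the two measures respond to different aspects of the traffic, each can expose anomalies the other misses, which is why the paper proposes using them cooperatively rather than in isolation.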