    A Hierarchical Fusion SAR Image Change-Detection Method Based on HF-CRF Model
    Abstract:
The mainstream methods for change detection in synthetic-aperture radar (SAR) images use difference images to define the initial change regions. However, these methods can suffer from semantic collapse, which makes it difficult to determine semantic information about the changes. In this paper, we propose a hierarchical fusion SAR image change-detection model based on a hierarchical fusion conditional random field (HF-CRF). The model introduces multimodal difference images and constructs a fusion energy potential function using dynamic convolutional neural networks and sliding-window entropy information. Through an iterative convergence process, the proposed method accurately detects the changed regions. We designed a dynamic region convolutional semantic segmentation network with a two-branch structure (D-DRUNet) to accomplish feature fusion and the segmentation of multimodal difference images. The proposed network adopts a dual-encoder, single-decoder structure whose baseline is a UNet that uses dynamic convolution kernels. D-DRUNet extracts multimodal difference features and completes semantic-level fusion. The Sobel operator is introduced to strengthen the boundary information of the multimodal difference images and to construct a dynamic fusion pairwise potential function based on local boundary entropy. The change result is then stabilized by iterative convergence of the CRF energy potential function. Experimental results demonstrate that the proposed method outperforms existing methods in terms of the overall number of detection errors and reduces the occurrence of false positives.
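The abstract does not spell out the form of the dynamic fusion pairwise potential, so the following is only a minimal sketch of the two ingredients it names: a Sobel boundary map and a sliding-window (local) entropy map, folded into an illustrative pairwise weight. The window size, bin count, and exponential combination are assumptions rather than the authors' formulation.

```python
import numpy as np
from scipy import ndimage

def sobel_magnitude(img):
    """Gradient magnitude via the Sobel operator (boundary strength)."""
    gx = ndimage.sobel(img, axis=0, mode="reflect")
    gy = ndimage.sobel(img, axis=1, mode="reflect")
    return np.hypot(gx, gy)

def local_entropy(img, win=7, bins=16):
    """Shannon entropy of the intensity histogram in a sliding window."""
    q = np.digitize(img, np.linspace(img.min(), img.max(), bins + 1)[1:-1])
    out = np.zeros_like(img, dtype=float)
    pad = win // 2
    qp = np.pad(q, pad, mode="reflect")
    h, w = img.shape
    for i in range(h):
        for j in range(w):
            patch = qp[i:i + win, j:j + win]
            p = np.bincount(patch.ravel(), minlength=bins) / patch.size
            p = p[p > 0]
            out[i, j] = -(p * np.log2(p)).sum()
    return out

def pairwise_weight(diff_img, alpha=1.0, beta=1.0):
    """Illustrative pairwise weight: strong boundaries and high local entropy
    reduce the smoothing strength between neighbouring pixels (assumed form)."""
    edges = sobel_magnitude(diff_img)
    ent = local_entropy(diff_img)
    return np.exp(-alpha * edges / (edges.max() + 1e-9)
                  - beta * ent / (ent.max() + 1e-9))
```

In the full model, a weight of this kind would modulate the CRF pairwise term between neighbouring pixel labels during the iterative energy minimization.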
Related papers:
A novel method for high-dimensional mutual information registration is proposed. The method first calculates a high-dimensional mutual information matrix and then calculates the entropy of that matrix; the maximal entropy corresponds to the optimal registration solution. The method was qualitatively and quantitatively evaluated on simulated and real brain images. The results show that the proposed method can improve registration accuracy and decrease registration time.
    Image registration
    Co-occurrence matrix
    Citations (7)
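The abstract above does not give the construction of the high-dimensional mutual information matrix. As background for this and the following registration papers, the sketch below shows plain Shannon mutual information computed from a joint intensity histogram (co-occurrence matrix) of two images, the standard building block that such criteria extend; the bin count and function name are arbitrary choices.

```python
import numpy as np

def mutual_information(img_a, img_b, bins=32):
    """Shannon MI from the joint intensity histogram (co-occurrence matrix)."""
    joint, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    pxy = joint / joint.sum()                   # joint distribution p(a, b)
    px = pxy.sum(axis=1, keepdims=True)         # marginal p(a)
    py = pxy.sum(axis=0, keepdims=True)         # marginal p(b)
    nz = pxy > 0
    return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())
```

A registration loop evaluates this value for each candidate transform of the moving image and keeps the transform with the largest score.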
Previous image registration schemes based on mutual information use Shannon's entropy measure, and they have been successfully applied to mono- and multimodality registration. There are cases, however, where maximization of mutual information does not lead to the correct spatial alignment of a pair of images. Some failures are due to the presence of local or spurious global maxima. In this paper we explore whether normalizing mutual information with a weight based on the size of the region of overlap improves the rate of successful alignments by reducing the presence of suboptimal extrema. In addition, we examine the utility of a deterministic entropy measure. The results of the present study indicate that: (1) the normalized mutual information provides a larger capture range and is more robust, with respect to optimization parameters, than the non-normalized mutual information, and (2) the optimization of mutual information with the deterministic entropy measure takes, on average, fewer iterations than when using Shannon's entropy measure. We conclude that the normalized mutual information using the deterministic entropy measure is a faster and more robust registration function than the traditional mutual information.
    Maximization
    Information Theory
    Image registration
    Pointwise mutual information
    Spurious relationship
    Maxima and minima
    Entropy maximization
    Citations (33)
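The exact overlap-size weight used in the paper above is not given in the abstract; the sketch below shows a widely used overlap-robust normalization, (H(A) + H(B)) / H(A, B), which illustrates the same idea of discounting the influence of the overlap region, and should be read as an assumption rather than the paper's specific weight.

```python
import numpy as np

def normalized_mutual_information(img_a, img_b, bins=32):
    """Overlap-robust normalized MI: (H(A) + H(B)) / H(A, B)."""
    joint, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    entropy = lambda p: -(p[p > 0] * np.log2(p[p > 0])).sum()
    ha, hb = entropy(pxy.sum(axis=1)), entropy(pxy.sum(axis=0))
    return (ha + hb) / entropy(pxy.ravel())
```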
    Mutual information has been widely used as a similarity metric for biomedical image registration. Although usually based on the Shannon definition of entropy, mutual information may be computed from other entropy definitions. Mutual information similarity metrics computed from fractional order Renyi entropy and entropy kind t are presented as novel similarity metrics for ultrasound/MRI registration. These metrics are shown to be more accurate than Shannon mutual information in many cases, and frequently facilitate faster convergence to the optimum. They are particularly effective for local optimization, but some measures may potentially be exploited for global searches.
    Image registration
    Information Theory
    Similarity (geometry)
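Neither the fractional Rényi order nor the exact similarity form is given in the abstract above; the sketch below uses the standard Rényi entropy of order alpha and an additive MI-style combination, both of which are assumptions (the order 0.7 is arbitrary).

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Renyi entropy of order alpha (alpha > 0, alpha != 1) of a distribution p."""
    p = p[p > 0]
    return np.log2((p ** alpha).sum()) / (1.0 - alpha)

def renyi_similarity(img_a, img_b, alpha=0.7, bins=32):
    """MI-style similarity H_a(A) + H_a(B) - H_a(A, B) built from Renyi entropies."""
    joint, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    return (renyi_entropy(pxy.sum(axis=1), alpha)
            + renyi_entropy(pxy.sum(axis=0), alpha)
            - renyi_entropy(pxy.ravel(), alpha))
```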
Owing to the unique all-weather, day-and-night imaging capability of Synthetic Aperture Radar (SAR), the modality is advantageous for detecting anthropogenic activity. In this paper we present a fully unsupervised change-detection framework that operates on Very High Resolution (VHR) SAR image pairs to produce a binary change map, without any need for per-image parameter setting. The framework is demonstrated on a pair of Capella-2 VHR X-band images acquired over San Diego, USA.
    Modality (human–computer interaction)
Measures of relevance between features play an important role in classification and regression analysis. Mutual information has proved to be an effective measure for categorical features. However, mutual information is limited when computing relevance between numerical features. In this work, we generalize Shannon's information entropy to neighborhood information entropy and propose a measure of neighborhood mutual information. It is shown that the new measure is a natural extension of classical mutual information, reducing to the classical one when the features are discrete; thus the new measure can also be used to compute the relevance between discrete variables. In experiments, we show that neighborhood mutual information produces nearly the same outputs as mutual information. However, unlike mutual information, no discretization is required when computing relevance with the proposed measure.
    Information diagram
    Pointwise mutual information
    Relevance
    Information Theory
    Conditional mutual information
    Categorical variable
    Interaction information
    Total correlation
    Citations (11)
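The abstract above does not reproduce the definition, so the sketch below follows the usual form of neighborhood entropy, NH_delta(X) = -(1/n) * sum_i log(|delta(x_i)| / n), with the joint term computed under the Chebyshev metric so that the joint neighborhood is the intersection of the marginal ones; the radius delta and the O(n^2) distance computation are simplifying assumptions.

```python
import numpy as np

def neighborhood_entropy(X, delta):
    """NH_delta(X) = -(1/n) * sum_i log(|delta(x_i)| / n), Chebyshev neighborhoods."""
    d = np.abs(X[:, None, :] - X[None, :, :]).max(axis=-1)
    counts = (d <= delta).sum(axis=1)              # includes the sample itself
    return -np.log(counts / X.shape[0]).mean()

def neighborhood_mi(x, y, delta=0.2):
    """NMI(x; y) = NH(x) + NH(y) - NH(x, y); with the Chebyshev metric the joint
    neighborhood is the intersection of the two marginal neighborhoods."""
    x, y = x.reshape(-1, 1), y.reshape(-1, 1)
    return (neighborhood_entropy(x, delta) + neighborhood_entropy(y, delta)
            - neighborhood_entropy(np.hstack([x, y]), delta))
```

For a discrete feature, with delta smaller than the gap between distinct values, the counts reduce to value frequencies and the expression reduces to classical Shannon entropy, matching the abstract's claim that the measure extends the classical one.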
This paper investigated the influence of the entropy order of Rényi and Tsallis mutual information on the convergence rate of alignment-parameter estimates when designing stochastic image-alignment algorithms based on these types of mutual information, including under additive noise. It is shown that the optimal entropy order can be found a priori, before the alignment algorithm is designed, from an analysis of the slope of the mutual information. The results are compared with those obtained with the Shannon mutual-information-based estimation procedure.
    Tsallis entropy
    Information Theory
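As an illustration of what the entropy order controls, the sketch below builds a Tsallis-entropy similarity from a joint histogram and sweeps the order q, printing how steeply the criterion changes around alignment. The additive combination, the synthetic test image, and the particular q values are assumptions, not the paper's experimental setup.

```python
import numpy as np

def tsallis_entropy(p, q):
    """Tsallis entropy S_q = (1 - sum_i p_i^q) / (q - 1); q -> 1 recovers Shannon."""
    p = p[p > 0]
    return (1.0 - (p ** q).sum()) / (q - 1.0)

def tsallis_similarity(a, b, q, bins=32):
    """MI-style similarity S_q(A) + S_q(B) - S_q(A, B) (additive form assumed)."""
    pxy = np.histogram2d(a.ravel(), b.ravel(), bins=bins)[0]
    pxy /= pxy.sum()
    return (tsallis_entropy(pxy.sum(axis=1), q)
            + tsallis_entropy(pxy.sum(axis=0), q)
            - tsallis_entropy(pxy.ravel(), q))

# Sweep the order q and inspect how steeply the criterion rises toward the aligned
# position -- the kind of slope analysis the abstract uses to pick q a priori.
rng = np.random.default_rng(0)
ref = rng.random((64, 64))
for q in (0.5, 0.8, 1.5, 2.0):
    curve = [tsallis_similarity(ref,
                                np.roll(ref, s, axis=1) + 0.05 * rng.standard_normal(ref.shape),
                                q)
             for s in (-2, -1, 0, 1, 2)]
    print(f"q={q:.1f}  peak={curve[2]:.3f}  slope~{(curve[2] - curve[1]):.3f}")
```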
In this paper, an embedded entropy-based image registration scheme is proposed. Tsallis and Rényi entropies are embedded to form a new entropic measure. This parametrized entropy is used to compute a weighted mutual information (MI) for CT and MR brain images, and the embedded mutual information is maximized to obtain the registration. The notion of embedded mutual information is also validated in feature-space registration. The mutual information as a function of the registration parameter is found to be a nonlinear curve. Feature-space registration resulted in higher mutual information values and hence a smoother registration process. We use the Simulated Annealing algorithm to find the maximum of the embedded mutual information and thereby register the images.
    Image registration
    Tsallis entropy
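The embedded Tsallis-Rényi measure itself is not specified in the abstract above, so the sketch below uses plain Shannon MI as the objective and focuses on the simulated-annealing search over a 2-D translation; swapping in a different entropy only changes the objective function. The use of scipy's dual_annealing, the translation-only transform, and the search bounds are assumptions.

```python
import numpy as np
from scipy import ndimage, optimize

def shannon_mi(a, b, bins=32):
    """Shannon mutual information from a joint intensity histogram."""
    pxy = np.histogram2d(a.ravel(), b.ravel(), bins=bins)[0]
    pxy /= pxy.sum()
    px, py = pxy.sum(axis=1, keepdims=True), pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return (pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum()

def register_by_annealing(fixed, moving, max_shift=10.0):
    """Search a 2-D translation that maximizes MI via simulated annealing."""
    def cost(t):                                   # annealing minimizes, so negate MI
        moved = ndimage.shift(moving, t, order=1, mode="nearest")
        return -shannon_mi(fixed, moved)
    result = optimize.dual_annealing(cost, bounds=[(-max_shift, max_shift)] * 2,
                                     maxiter=200)
    return result.x, -result.fun                   # estimated shift, achieved MI
```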
Multimodal imaging systems demand sophisticated registration routines. Owing to computation time and non-convergence issues, traditional mutual information (MI)-based registration is impractical for real-time use. We propose a sampling-optimization technique with selective high-entropy MI computation as a rapid and robust image registration method for real-time applications.
    Image registration
Mutual information is a widely used evaluation function for feature selection, but research has shown that it can lead to low classification accuracy. To overcome the tendency of mutual information to select low-frequency words, this paper proposes a new feature evaluation function, TFMIIE, which combines information entropy with an improved mutual information measure. The improved mutual information avoids selecting low-frequency, unfamiliar words, and the feature entropy favors removing feature words with unclear class properties. Experimental results show that selecting features with TFMIIE, and then representing texts and building classifiers, improves the precision and recall of text classification by about 40%, validating the proposed text feature selection method based on the improved mutual information and information entropy.
    Information gain
    Interaction information
    Feature (linguistics)
    Information Theory
    Pointwise mutual information
    Citations (2)
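The abstract above names the ingredients of TFMIIE (term frequency, an improved mutual information, and feature entropy) but not the exact formula, so the sketch below is one plausible reading only: it scores each term by presence/class mutual information, damped by the normalized class entropy of the documents containing the term and weighted by corpus-level term frequency. Every detail of the combination is an assumption, not the paper's TFMIIE.

```python
import numpy as np

def term_scores(doc_term, labels):
    """Illustrative TFMIIE-style score: tf * MI(term presence; class)
    * (1 - normalized class entropy of the term). Not the paper's exact formula."""
    classes = np.unique(labels)
    assert len(classes) > 1, "needs at least two classes"
    present = doc_term > 0                              # does a document contain the term?
    tf = doc_term.sum(axis=0) / doc_term.sum()          # corpus-level term frequency
    scores = np.zeros(doc_term.shape[1])
    for j in range(doc_term.shape[1]):
        t = present[:, j]
        if t.all() or not t.any():
            continue                                    # uninformative term
        mi = 0.0
        for c in classes:
            in_c = labels == c
            for mask in (t, ~t):                        # term present / absent
                p_joint = (mask & in_c).mean()
                if p_joint > 0:
                    mi += p_joint * np.log2(p_joint / (mask.mean() * in_c.mean()))
        p_c_t = np.array([(t & (labels == c)).sum() for c in classes], dtype=float)
        p_c_t /= p_c_t.sum()                            # class distribution of the term
        nz = p_c_t > 0
        h_norm = -(p_c_t[nz] * np.log2(p_c_t[nz])).sum() / np.log2(len(classes))
        scores[j] = tf[j] * mi * (1.0 - h_norm)         # low class entropy => keep
    return scores
```

Terms would then be ranked by this score and the top-k kept as the feature set for text representation.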
    An upper bound is derived for the mutual information between a fixed image and a deformable template containing a fixed number of gray-levels. The bound can be calculated by maximizing the entropy of the template under the constraint that the conditional entropy of the template, given the fixed image, be zero. This bound provides useful insight into the properties of mutual information as a similarity metric for deformable image registration. Specifically, it indicates that maximizing mutual information may not necessarily produce an optimal solution when the deformable transform is too flexible.
    Conditional entropy
    Image registration
    Information Theory
    Citations (43)
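The constraint H(template | fixed image) = 0 means the template must be a deterministic function of the fixed image, so the bound equals the largest entropy obtainable by relabeling the fixed image with the template's number of gray levels. The sketch below approximates that maximization by equal-mass quantization; the exact optimization in the paper may differ, and the function name and binning strategy are assumptions.

```python
import numpy as np

def mi_upper_bound(fixed_img, n_gray_levels):
    """Approximate upper bound on MI(fixed, template): with H(template|fixed) = 0,
    MI = H(template), which is maximized by mapping the fixed image onto
    n_gray_levels labels with (near) equal mass; it can never exceed
    log2(n_gray_levels)."""
    edges = np.quantile(fixed_img, np.linspace(0, 1, n_gray_levels + 1)[1:-1])
    labels = np.digitize(fixed_img, edges)              # equal-mass quantization
    p = np.bincount(labels.ravel(), minlength=n_gray_levels) / labels.size
    p = p[p > 0]
    return -(p * np.log2(p)).sum()
```

This is the saturation value the abstract warns about: a sufficiently flexible deformable transform can drive mutual information toward this bound without producing a meaningful alignment.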