Grouped SMOTE With Noise Filtering Mechanism for Classifying Imbalanced Data

2019 
SMOTE (Synthetic Minority Oversampling TEchnique) is one of the most popular and well-known sampling algorithms for addressing the class imbalance learning problem. Compared with random oversampling, its main merit is that it largely alleviates the problem of overfitting. However, SMOTE has two drawbacks: 1) it tends to propagate noisy information during oversampling; and 2) it assigns a single global neighborhood parameter $K$ and neglects the local distribution characteristics of the data. To address these two problems simultaneously, a grouped SMOTE algorithm with a noise filtering mechanism (GSMOTE-NFM) is presented in this article. The algorithm first adopts a Gaussian Mixture Model (GMM) to estimate the real distributions of the majority and minority classes, respectively. Then, most noisy instances can be removed by comparing the probability densities of the same instance under the two class models. Next, two new GMMs are constructed on the remaining majority and minority class instances, respectively. Furthermore, all minority class instances are divided into three groups: safety, boundary, and outlier, based on the corresponding probability density information. Finally, an individual parameter $K$ is assigned to the instances of each group to generate new instances. We tested the GSMOTE-NFM algorithm on 24 benchmark binary-class data sets with three popular classification models and compared it with several state-of-the-art oversampling algorithms. The results indicate that our algorithm is significantly superior to the original SMOTE algorithm and several SMOTE-based variants.
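The following Python sketch illustrates the pipeline described in the abstract (GMM-based noise filtering, density-based grouping of minority instances, and group-specific SMOTE interpolation). It is a minimal illustration, not the authors' implementation: the number of GMM components, the quantile thresholds used to form the safety/boundary/outlier groups, and the per-group $K$ values are all assumptions chosen for demonstration.

```python
# Hedged sketch of a GSMOTE-NFM-style pipeline. Grouping thresholds and the
# per-group K values are illustrative assumptions, not the paper's settings.
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.neighbors import NearestNeighbors

def gsmote_nfm(X_maj, X_min, n_components=3, k_per_group=None, random_state=0):
    rng = np.random.default_rng(random_state)
    # Assumed per-group neighborhood sizes (not from the paper).
    k_per_group = k_per_group or {"safety": 5, "boundary": 3, "outlier": 1}

    # 1) Fit a GMM to each class to approximate its distribution.
    gmm_maj = GaussianMixture(n_components, random_state=random_state).fit(X_maj)
    gmm_min = GaussianMixture(n_components, random_state=random_state).fit(X_min)

    # 2) Noise filtering: drop instances whose density is higher under the
    #    opposite class's GMM than under their own class's GMM.
    keep_maj = gmm_maj.score_samples(X_maj) >= gmm_min.score_samples(X_maj)
    keep_min = gmm_min.score_samples(X_min) >= gmm_maj.score_samples(X_min)
    X_maj, X_min = X_maj[keep_maj], X_min[keep_min]

    # 3) Refit the GMMs on the cleaned majority and minority sets.
    gmm_min = GaussianMixture(n_components, random_state=random_state).fit(X_min)

    # 4) Group minority instances by their density under the minority GMM
    #    (quantile thresholds are an assumption for illustration).
    dens = gmm_min.score_samples(X_min)
    lo, hi = np.quantile(dens, [0.25, 0.75])
    groups = np.where(dens >= hi, "safety",
                      np.where(dens >= lo, "boundary", "outlier"))

    # 5) SMOTE-style interpolation with a group-specific neighborhood size K.
    max_k = min(max(k_per_group.values()), len(X_min) - 1)
    nn = NearestNeighbors(n_neighbors=max_k + 1).fit(X_min)
    _, nbr_idx = nn.kneighbors(X_min)          # column 0 is the point itself

    n_new = max(len(X_maj) - len(X_min), 0)    # oversample until classes balance
    synthetic = []
    for _ in range(n_new):
        i = rng.integers(len(X_min))
        k = min(k_per_group[str(groups[i])], max_k)
        j = rng.choice(nbr_idx[i, 1:k + 1])    # random neighbor within the group's K
        gap = rng.random()
        synthetic.append(X_min[i] + gap * (X_min[j] - X_min[i]))

    X_min_os = np.vstack([X_min] + synthetic) if synthetic else X_min
    return X_maj, X_min_os
```

The sketch oversamples the minority class until the two classes are balanced; in practice the oversampling ratio, the grouping rule, and the per-group $K$ values would follow the settings reported in the paper.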