Simultaneous feature selection and discretization based on mutual information

2019 
Abstract Mutual information based feature selection criteria have recently gained popularity for their superior performance in various applications of pattern recognition and machine learning. However, these methods do not apply a bias correction when computing mutual information from finite samples. Moreover, finding an appropriate discretization of features is often a necessary step prior to feature selection, yet existing research rarely addresses discretization and feature selection simultaneously. To address these issues, this paper first proposes Joint Bias corrected Mutual Information (JBMI) for feature selection. Second, a framework named modified discretization and feature selection based on mutual information is proposed, which incorporates JBMI based feature selection and dynamic discretization, both of which use a χ²-based search method. Experimental results on thirty benchmark datasets show that the proposed methods outperform state-of-the-art methods in most cases.
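The abstract does not give the JBMI formula, but the general idea of a finite-sample bias correction for mutual information can be illustrated with the well-known Miller-Madow correction, which adds (K − 1)/(2N) to the plug-in entropy estimate (K being the number of occupied bins, N the sample size). The sketch below, which is an illustrative assumption rather than the paper's actual JBMI criterion, scores discretized features against a class label by bias-corrected mutual information:

```python
import numpy as np

def entropy_mm(counts):
    """Plug-in entropy with the Miller-Madow finite-sample bias correction."""
    counts = np.asarray(counts, dtype=float)
    n = counts.sum()
    p = counts[counts > 0] / n
    h_ml = -np.sum(p * np.log(p))          # maximum-likelihood (plug-in) entropy
    k = np.count_nonzero(counts)           # number of occupied bins
    return h_ml + (k - 1) / (2.0 * n)      # Miller-Madow bias correction

def mi_mm(x, y):
    """Bias-corrected mutual information between two discrete variables,
    via MI(X;Y) = H(X) + H(Y) - H(X,Y) with corrected entropies."""
    xs, xi = np.unique(x, return_inverse=True)
    ys, yi = np.unique(y, return_inverse=True)
    joint = np.zeros((len(xs), len(ys)))
    np.add.at(joint, (xi, yi), 1)          # contingency table of (x, y) pairs
    return (entropy_mm(joint.sum(axis=1))
            + entropy_mm(joint.sum(axis=0))
            - entropy_mm(joint.ravel()))

# Rank two toy discretized features by corrected MI with the class label;
# feature 0 predicts y perfectly, feature 1 is only weakly informative.
X = np.array([[0, 1], [1, 1], [0, 0], [1, 0], [0, 1], [1, 0]])
y = np.array([0, 1, 0, 1, 0, 1])
scores = [mi_mm(X[:, j], y) for j in range(X.shape[1])]
```

A full pipeline in the spirit of the paper would alternate this scoring with re-discretization of the continuous features (e.g., merging adjacent intervals via a χ² statistic) until the selected subset stabilizes.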