Ensemble Feature Selection Based on Normalized Mutual Information and Diversity

2013 
Generating base classifiers with high diversity is a key problem in ensemble learning. We therefore propose an iterative algorithm: in each round, a base classifier is trained on an optimal feature subset selected by maximizing normalized mutual information, and the resulting classifier is evaluated with a diversity measure based on the set of misclassified samples. The algorithm stops once the stopping criterion is satisfied; otherwise it continues to iterate. Finally, the base classifiers' recognition results are fused by weighted voting. To validate the method, we ran experiments on UCI data sets with a support vector machine as the base classifier and compared it against Single-SVM, Bagging-SVM, and AB-SVM. The experimental results suggest that the proposed algorithm achieves higher classification accuracy.
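The loop described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it uses scikit-learn's `mutual_info_classif` as a stand-in for the paper's normalized mutual information score, bootstrap resampling to vary the feature ranking between rounds, a Jaccard overlap on misclassified-sample sets as the diversity check, and training accuracy as the voting weight; the threshold `0.9` and the subset size `k` are assumed parameters.

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.feature_selection import mutual_info_classif
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

def train_ensemble(X, y, n_rounds=5, k=2, max_overlap=0.9, seed=0):
    """Iteratively train SVM base classifiers on feature subsets ranked by
    mutual information, keeping only classifiers whose misclassified-sample
    sets are sufficiently different from those already in the ensemble."""
    rng = np.random.default_rng(seed)
    ensemble, error_sets = [], []
    for _ in range(n_rounds):
        # Bootstrap resample so the MI ranking can differ between rounds.
        idx = rng.choice(len(X), size=len(X), replace=True)
        Xb, yb = X[idx], y[idx]
        mi = mutual_info_classif(Xb, yb, random_state=0)
        feats = np.argsort(mi)[::-1][:k]          # top-k features by MI score
        clf = SVC(kernel="rbf").fit(Xb[:, feats], yb)
        errors = frozenset(np.nonzero(clf.predict(X[:, feats]) != y)[0])
        # Diversity check: reject a classifier whose error set overlaps an
        # existing member's too heavily (Jaccard similarity above threshold).
        if any(len(errors & e) / max(len(errors | e), 1) > max_overlap
               for e in error_sets):
            continue
        acc = 1.0 - len(errors) / len(X)          # training accuracy = weight
        ensemble.append((clf, feats, acc))
        error_sets.append(errors)
    return ensemble

def predict(ensemble, X, classes):
    """Fuse the base classifiers' outputs by accuracy-weighted voting."""
    votes = np.zeros((len(X), len(classes)))
    for clf, feats, weight in ensemble:
        pred = clf.predict(X[:, feats])
        for c_i, c in enumerate(classes):
            votes[pred == c, c_i] += weight
    return classes[np.argmax(votes, axis=1)]
```

A typical run splits a UCI-style data set (here iris, as a stand-in), calls `train_ensemble` on the training part, and scores `predict` on the held-out part.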