Investigation of a Joint Splitting Criteria for Decision Tree Classifier: Use of Information Gain and Gini Index

2018 
The decision tree is a well-established supervised classifier in machine learning. It recursively splits the given data points on a feature with respect to a threshold value. In general, a single predefined splitting criterion is used, which may lead to poor performance. To this end, in this paper we investigate a joint splitting criterion that combines two of the most widely used criteria, i.e., Information Gain and the Gini index. We propose to split the data points where Information Gain is maximum and the Gini index is minimum. The proposed approach is rigorously tested and compared by constructing random forests from the resulting decision trees. All experiments are performed on UCI machine learning datasets.
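To make the joint criterion concrete, the following Python sketch illustrates one plausible reading of the idea; it is not the authors' implementation. The helper names (entropy, gini, best_joint_split), the NumPy dependency, and the way the two objectives are reconciled (scoring each candidate split by Information Gain minus the weighted child Gini, so high-gain, low-impurity splits win) are illustrative assumptions.

# Sketch of a joint splitting rule: maximize Information Gain while
# minimizing the weighted Gini impurity of the children.
# The combined score (IG - weighted Gini) is an assumed tie-breaking rule,
# not taken from the paper.

import numpy as np

def entropy(y):
    """Shannon entropy of a label vector."""
    _, counts = np.unique(y, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def gini(y):
    """Gini impurity of a label vector."""
    _, counts = np.unique(y, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

def split_scores(y, mask):
    """Information Gain and weighted child Gini for a boolean split mask."""
    y_left, y_right = y[mask], y[~mask]
    if len(y_left) == 0 or len(y_right) == 0:
        return -np.inf, np.inf  # degenerate split: one empty child
    w_left = len(y_left) / len(y)
    w_right = 1.0 - w_left
    info_gain = entropy(y) - (w_left * entropy(y_left) + w_right * entropy(y_right))
    weighted_gini = w_left * gini(y_left) + w_right * gini(y_right)
    return info_gain, weighted_gini

def best_joint_split(X, y):
    """Return (feature index, threshold, score) for the split that jointly
    maximizes Information Gain and minimizes the weighted Gini index,
    scored here as IG minus weighted Gini (an illustrative choice)."""
    best = (None, None, -np.inf)
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            ig, g = split_scores(y, X[:, j] <= t)
            score = ig - g
            if score > best[2]:
                best = (j, t, score)
    return best

# Toy usage on a small synthetic dataset
X = np.array([[2.0, 1.0], [3.0, 1.5], [10.0, 0.5], [11.0, 0.7]])
y = np.array([0, 0, 1, 1])
print(best_joint_split(X, y))  # -> (0, 3.0, 1.0): feature 0 at 3.0 separates the classes

In a full decision tree, this selection would be applied recursively at every node, and an ensemble of such trees would form the random forests used for the comparison described above.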