Anticipative Hybrid Extreme Rotation Forest

2016 
This paper introduces an improvement on the recently published Hybrid Extreme Rotation Forest (HERF), consisting of the anticipative determination of the fraction of each classifier architecture included in the ensemble. We call it AHERF. Both HERF and AHERF are heterogeneous classifier ensembles, which aim to profit from the diverse problem-domain specificities of each classifier architecture in order to achieve improved generalization over a larger spectrum of problem domains. In this paper, AHERF is built from a pool of Decision Trees (DT), Extreme Learning Machines (ELM), Support Vector Machines (SVM), k-Nearest Neighbors (k-NN), Adaboost, Random Forests (RF), and Gaussian Naive Bayes (GNB) classifiers. Given a problem dataset, the process of anticipative determination of the ensemble composition is as follows: First, we estimate the performance of each classifier architecture by independent pilot cross-validation experiments on a small subsample of the data. Next, classifier architectures are ranked according to their accuracy results. A probability distribution of classifier architectures appearing in the ensemble is built from this ranking. Finally, the type of each individual classifier is decided by sampling this probability distribution. Computational experiments on a collection of benchmark classification problems show improvement over the original HERF and other state-of-the-art approaches.
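The anticipative composition procedure described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the pilot accuracies are placeholder inputs, and the linear rank-to-probability weighting is an assumed scheme standing in for whatever distribution the paper derives from the ranking.

```python
import random
from collections import Counter

def ensemble_composition(pilot_accuracies, ensemble_size, seed=0):
    """Turn pilot cross-validation accuracies into an ensemble
    composition by rank-weighted random sampling (hypothetical sketch)."""
    rng = random.Random(seed)
    # Step 1-2: rank classifier architectures by pilot accuracy, best first.
    ranked = sorted(pilot_accuracies, key=pilot_accuracies.get, reverse=True)
    n = len(ranked)
    # Step 3: build a probability distribution from the ranking.
    # Assumed weighting: linearly decreasing with rank position.
    weights = [n - i for i in range(n)]
    total = sum(weights)
    probs = {arch: w / total for arch, w in zip(ranked, weights)}
    # Step 4: decide each ensemble member's type by sampling the distribution.
    members = rng.choices(ranked, weights=[probs[a] for a in ranked],
                          k=ensemble_size)
    return Counter(members), probs

# Example with placeholder pilot accuracies for three architectures:
counts, probs = ensemble_composition(
    {"DT": 0.80, "SVM": 0.90, "k-NN": 0.70}, ensemble_size=100, seed=1)
```

Here `counts` gives how many ensemble slots each architecture received, so better-performing architectures on the pilot subsample tend to dominate the final ensemble while weaker ones still contribute some diversity.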