A Fixed-Size Pruning Approach for Optimum-Path Forest

2019 
Optimum-Path Forest (OPF) is a graph-based classifier that has achieved remarkable results in various applications. OPF has many advantages over other supervised classifiers: it is parameter-free, achieves zero classification errors on the training set without overfitting, handles multiple classes without modifications or extensions, and makes no assumptions about the shape or separability of the classes. Despite these advantages, it still suffers from the high computational cost of its classification process, which grows proportionally to the size of the training set. To overcome this drawback, we propose a new approach based on genetic algorithms that prunes irrelevant training samples while preserving OPF classification accuracy. In our proposal, named FSGAP-OPF, the standard reproduction and mutation operators are modified so as to keep the number of pruned patterns fixed. To evaluate the performance of our method, we tested its generalization capabilities on datasets obtained from the UCI repository. On the basis of our experiments, we can say that FSGAP-OPF is a good alternative for classification tasks and can also be used in problems where memory consumption is critical.
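The abstract does not detail how the genetic operators are modified; the sketch below only illustrates the general idea of operators that preserve a fixed pruning budget over a binary chromosome, where 1 marks a training sample selected for pruning. The function names and the specific swap/intersection scheme are assumptions for illustration, not the paper's actual FSGAP-OPF operators.

```python
import random

def fixed_size_mutation(chromosome, swaps=1):
    """Swap a pruned index with a kept index so the number of 1s (pruned
    samples) in the chromosome stays constant after mutation."""
    pruned = [i for i, g in enumerate(chromosome) if g == 1]
    kept = [i for i, g in enumerate(chromosome) if g == 0]
    random.shuffle(pruned)
    random.shuffle(kept)
    child = list(chromosome)
    for _ in range(min(swaps, len(pruned), len(kept))):
        i, j = pruned.pop(), kept.pop()
        child[i], child[j] = 0, 1
    return child

def fixed_size_crossover(parent_a, parent_b):
    """Combine two parents with the same pruning budget k: keep the indices
    pruned by both, then fill the remaining slots from indices pruned by
    exactly one parent, so the child also prunes exactly k samples."""
    k = sum(parent_a)  # fixed number of pruned samples (assumed equal in both parents)
    common = [i for i, (a, b) in enumerate(zip(parent_a, parent_b)) if a == 1 and b == 1]
    only_one = [i for i, (a, b) in enumerate(zip(parent_a, parent_b)) if a != b]
    random.shuffle(only_one)
    chosen = set(common + only_one[: k - len(common)])
    return [1 if i in chosen else 0 for i in range(len(parent_a))]

# Toy demo: 10 training samples, prune exactly 4 of them.
parent_a = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]
parent_b = [0, 0, 1, 1, 1, 1, 0, 0, 0, 0]
child = fixed_size_mutation(fixed_size_crossover(parent_a, parent_b))
assert sum(child) == sum(parent_a)  # pruned-set size is preserved
```

A fitness function would then train OPF on the samples not marked for pruning and score the resulting classifier on a validation split, trading training-set size against accuracy; that part is omitted here since the abstract does not specify it.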