Cost-Constrained Classifier Cascade using Sparse Probit Models: Invited Presentation

2019 
Feature selection is a common problem in pattern recognition. Though often motivated by the curse of dimensionality, feature selection has the added benefit of reducing the cost of extracting features from test data. In this work, sparse probit models are modified to incorporate feature costs. A single-classifier approach, Cost-Constrained Feature Optimization (CCFO), is compared to a new ensemble method referred to as the Cost-Constrained Classifier Cascade (C4). The C4 method utilizes a boosting framework that accommodates per-sample feature selection. Experimental results compare C4, CCFO, and baseline sparse kernel classification on two data sets with asymmetric feature costs, illustrating that C4 can yield similar or better accuracy and more economical use of expensive features.
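To illustrate the cost-weighting idea described in the abstract, the sketch below trains a sparse probit classifier whose L1 penalty is scaled by per-feature extraction costs, so expensive features are selected only when strongly informative. This is a minimal, assumption-laden reconstruction rather than the paper's CCFO or C4 implementation; the function name, the proximal-gradient training loop, and all hyperparameters are hypothetical.

```python
import numpy as np
from scipy.stats import norm

# Minimal sketch (not the authors' implementation): a sparse probit
# classifier whose L1 penalty is weighted by per-feature extraction
# costs, so costly features must "earn" their inclusion. The training
# loop and all parameter choices here are illustrative assumptions.

def fit_cost_constrained_probit(X, y, costs, lam=0.1, lr=0.01, n_iter=500):
    """Minimize the negative probit log-likelihood plus lam * sum(costs * |w|).

    X: (n, d) features; y: (n,) labels in {-1, +1}; costs: (d,) feature costs.
    """
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_iter):
        z = y * (X @ w)
        # Gradient of the negative probit log-likelihood:
        # d/dz [-log Phi(z)] = -phi(z) / Phi(z)
        ratio = norm.pdf(z) / np.clip(norm.cdf(z), 1e-12, None)
        grad = -(X * (y * ratio)[:, None]).mean(axis=0)
        w -= lr * grad
        # Proximal step: soft-threshold each weight by its cost-scaled
        # penalty, driving expensive features to exactly zero unless
        # they are strongly informative.
        thresh = lr * lam * costs
        w = np.sign(w) * np.maximum(np.abs(w) - thresh, 0.0)
    return w

# Toy usage: two cheap informative features plus one expensive redundant one.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = np.sign(X[:, 0] + 0.5 * X[:, 1] + 0.1 * rng.normal(size=200))
costs = np.array([1.0, 1.0, 10.0])  # third feature is 10x as costly
w = fit_cost_constrained_probit(X, y, costs)
print("learned weights:", w)  # the costly feature's weight is pushed toward zero
```

In this toy run the cost-scaled threshold zeroes out the expensive third feature while retaining the cheap informative ones, which mirrors the paper's goal of more economical use of expensive features.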