Distributed Boosting Classifiers over Noisy Channels

2020
We present a principled framework for resource allocation when boosting algorithms are realized on substrates with communication noise. Boosting classifiers (e.g., AdaBoost) make a final decision via a weighted vote over the local decisions of many base classifiers (weak classifiers). Suppose the base classifiers' outputs are communicated over noisy channels; the resulting noisy outputs degrade the final classification accuracy. We show that this degradation can be effectively reduced by allocating more system resources to the more important base classifiers. We formulate resource optimization problems in terms of importance metrics for boosting. Moreover, we show that the optimized noisy boosting classifiers can be more robust to noise during inference (the test stage) than bagging. We provide numerical evidence to demonstrate the benefits of our approach.
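The abstract's core idea can be illustrated with a small Monte-Carlo sketch (not the paper's method; all weights, accuracies, and flip probabilities below are hypothetical). Each base classifier emits a ±1 decision that passes through a binary symmetric channel before the weighted vote; under a fixed total noise budget, giving the cleaner channels to the high-weight classifiers should yield higher final accuracy than a uniform allocation.

```python
import random

def noisy_boost_accuracy(alphas, accs, flips, trials=20000, seed=0):
    """Monte-Carlo accuracy of a weighted-vote (boosting-style) ensemble
    whose +/-1 base decisions are sent over binary symmetric channels.

    alphas: vote weight of each base classifier
    accs:   probability each base classifier is correct
    flips:  bit-flip probability of each classifier's channel
    """
    rng = random.Random(seed)
    correct = 0
    for _ in range(trials):
        y = rng.choice([-1, 1])                    # true label
        s = 0.0
        for a, p, e in zip(alphas, accs, flips):
            h = y if rng.random() < p else -y      # base decision
            r = -h if rng.random() < e else h      # channel may flip it
            s += a * r                             # weighted vote
        if (1 if s >= 0 else -1) == y:
            correct += 1
    return correct / trials

# Hypothetical setup: 5 weak classifiers; higher-weight ones matter more.
alphas = [2.0, 1.5, 1.0, 0.5, 0.5]
accs   = [0.80, 0.75, 0.70, 0.65, 0.60]

# Same total noise budget (flip probs sum to 0.5), allocated two ways.
uniform    = [0.10] * 5
importance = [0.02, 0.05, 0.10, 0.15, 0.18]  # cleaner channels for big weights

acc_u = noisy_boost_accuracy(alphas, accs, uniform)
acc_i = noisy_boost_accuracy(alphas, accs, importance)
```

In this toy setting the importance-aware allocation consistently beats the uniform one, mirroring the abstract's claim that protecting the more important base classifiers reduces the accuracy degradation.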