A CMA-ES-Based Adversarial Attack Against Black-Box Object Detectors

2021 
Object detection is one of the essential tasks of computer vision. Object detectors based on deep neural networks are increasingly deployed in safety-sensitive applications such as face recognition, video surveillance, and autonomous driving. It has been shown that object detectors are vulnerable to adversarial attacks. We propose a novel black-box attack method that can successfully attack both regression-based and region-based object detectors. Using the Covariance Matrix Adaptation Evolution Strategy (CMA-ES) as the primary method for generating adversarial examples, we introduce techniques that reduce the dimension of the search space and thereby the number of queries. Our method adds adversarial perturbations only inside the object's bounding box, achieving a precise attack. The proposed attack can hide a specified object with an attack success rate of 86% and an average of 5,124 queries, and hide all objects with a success rate of 74% and an average of 6,154 queries. Our work illustrates the effectiveness of CMA-ES for generating adversarial examples and demonstrates the vulnerability of object detectors to adversarial attacks.
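
Below is a minimal sketch, not the authors' code, of what a CMA-ES-based black-box attack of this kind might look like. It assumes a query oracle `detector_score(image)` that returns the detector's confidence for the target object (lower is better for the attacker), a known bounding box, and the open-source `cma` package; the coarse perturbation grid stands in for the paper's dimension-reduction step, and the grid size, perturbation budget `eps`, and step size `sigma0` are illustrative choices, not values from the paper.

```python
import numpy as np
import cma  # pip install cma

def attack(image, box, detector_score,
           grid=(8, 8), eps=0.06, sigma0=0.3, max_queries=6000):
    """Search for an in-box perturbation that suppresses the detection.

    image: float array in [0, 1], shape (H, W, C).
    box: (x0, y0, x1, y1) bounding box of the target object.
    detector_score: black-box oracle; one detector query per call.
    """
    x0, y0, x1, y1 = box
    h, w, c = y1 - y0, x1 - x0, image.shape[2]
    gh, gw = grid
    dim = gh * gw * c  # optimize a coarse grid to shrink the search space

    # Nearest-neighbour upsampling indices from the coarse grid to the box.
    rows = np.arange(h) * gh // h
    cols = np.arange(w) * gw // w

    def perturb(z):
        # Map the flat CMA-ES sample to a bounded, in-box perturbation.
        coarse = eps * np.tanh(z).reshape(gh, gw, c)
        delta = coarse[rows][:, cols]  # upsample to the box size
        adv = image.copy()
        adv[y0:y1, x0:x1] = np.clip(adv[y0:y1, x0:x1] + delta, 0.0, 1.0)
        return adv

    es = cma.CMAEvolutionStrategy(np.zeros(dim), sigma0)
    queries = 0
    while not es.stop() and queries < max_queries:
        solutions = es.ask()
        es.tell(solutions, [detector_score(perturb(z)) for z in solutions])
        queries += len(solutions)  # each fitness evaluation costs one query
    return perturb(es.result.xbest), queries
```

Confining the perturbation to the bounding box and optimizing a low-resolution grid both serve the same purpose the abstract describes: a smaller search dimension lets CMA-ES converge with fewer detector queries.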