Adaptive Queue Management in Embedded Edge Devices for Object Detection with Low Latency

2020 
With advances in artificial intelligence (AI) and microchips, AI is increasingly used to detect objects in video frames captured by Internet of Things (IoT) devices (e.g., IP cameras). From the video-capturing sources to the Cloud, the packet forwarding delay can reach several hundred milliseconds, making Cloud-based object detection impractical when the latency requirement for object detection from video frames is stringent. Embedded edge devices, which are situated at the network access segment, are a promising alternative because they perform object detection locally and impose negligible packet forwarding delay. Nevertheless, owing to the limited computation resources of such embedded edge devices, the processing delay is likely to grow as the frame rate of incoming images increases. To solve this issue, we design an adaptive queue management-based object detection (QMOD) scheme that achieves low-latency AI services with limited computation resources. We introduce a target delay requirement for AI services and map this requirement to the optimal size of the queue that holds frames prior to object detection. The optimal queue size, which represents the desired backlog at the embedded device, is dynamically determined from the target delay requirement and the weighted average of the processing time. The maximum queue size is thus adaptively controlled, and redundant frames are automatically dropped when the queue overflows. We demonstrate through experiments that the proposed QMOD scheme reduces the combined queuing and object detection latency by 80%-90% on commercial edge devices such as the NVIDIA Xavier, Jetson TX2, and Nano, compared to benchmark schemes with static queue sizes.
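The abstract does not specify the exact update rule, but the interaction between the target delay requirement, the weighted average of processing time, and the adaptive maximum queue size can be illustrated with a minimal sketch. The Python code below is an assumed illustration only: the class name QMODQueue, the exponential weighted moving average (EWMA) update, and the parameters target_delay_s and alpha are hypothetical and are not taken from the paper.

```python
import collections
import time


class QMODQueue:
    """Sketch of adaptive queue sizing: the maximum queue length is derived
    from a target delay budget and a weighted average of the per-frame
    processing time; frames arriving at a full queue are dropped."""

    def __init__(self, target_delay_s, alpha=0.2, init_proc_time_s=0.05):
        self.target_delay_s = target_delay_s      # delay budget for queuing + detection
        self.alpha = alpha                        # weight for the moving average (assumed EWMA)
        self.avg_proc_time_s = init_proc_time_s   # running estimate of detection time per frame
        self.frames = collections.deque()

    def _max_queue_size(self):
        # Number of frames that can wait without exceeding the delay budget.
        return max(1, int(self.target_delay_s / self.avg_proc_time_s))

    def enqueue(self, frame):
        # Drop the frame if accepting it would exceed the adaptive limit.
        if len(self.frames) >= self._max_queue_size():
            return False
        self.frames.append(frame)
        return True

    def process_next(self, detector):
        # Run the detector callable on the oldest queued frame and update
        # the weighted average of the observed processing time.
        if not self.frames:
            return None
        frame = self.frames.popleft()
        start = time.monotonic()
        result = detector(frame)
        elapsed = time.monotonic() - start
        self.avg_proc_time_s = (1 - self.alpha) * self.avg_proc_time_s + self.alpha * elapsed
        return result
```

In this sketch, a longer measured detection time shrinks the maximum queue size, so fewer frames are allowed to wait and the end-to-end (queuing plus detection) latency stays near the target; conversely, faster detection allows a deeper queue and fewer dropped frames.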