Learning Accurate, Speedy, Lightweight CNNs via Instance-Specific Multi-Teacher Knowledge Distillation for Distracted Driver Posture Identification

2022 
A model deployed on an embedded processor for distracted driver classification must deliver high accuracy and real-time inference within limited storage and computing budgets. Conventional deep CNN models such as VGG, ResNet, and DenseNet aim primarily for high accuracy, which makes them too heavy for embedded systems with constrained memory and compute resources. In contrast, lightweight models are heavily compressed but sacrifice substantial accuracy. To bridge this gap, we propose an instance-specific multi-teacher knowledge distillation model (IsMt-KD) to learn more accurate, speedy, and lightweight CNNs for distracted driver posture classification. In multi-teacher knowledge distillation, most current approaches either randomly select a teacher model and use that teacher's prediction as the soft label, or assign every teacher an equal weight and average all teacher predictions into the soft label. We observe that, for the same instance, the outputs of different teachers vary greatly: some teachers predict it correctly whereas others assign high probabilities to irrelevant classes. It is therefore inappropriate to give all teachers fixed or identical weights. To this end, we design a simple yet effective instance-specific teacher grading module that dynamically assigns weights to teacher models based on individual instances. In this way, knowledge is distilled from multiple teachers by considering both instance-specific high-level and instance-specific intermediate-level information. Extensive experiments on the AUC and StateFarm datasets, together with implementations on edge hardware platforms including the HUAWEI MediaPad c5 and the Nvidia Jetson TX2, verify the effectiveness and feasibility of our approach.
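
To make the per-instance weighting idea concrete, the sketch below shows one way instance-specific weights over teacher soft labels could enter a distillation loss. It is a minimal, hypothetical PyTorch example, not the authors' implementation: the scoring rule (each teacher's probability on the ground-truth class), the function name instance_specific_kd_loss, and the hyperparameters temperature and alpha are all assumptions standing in for the paper's teacher grading module, and it covers only the high-level (logit) term, not the intermediate-level features.

```python
import torch
import torch.nn.functional as F

def instance_specific_kd_loss(student_logits, teacher_logits_list, labels,
                              temperature=4.0, alpha=0.7):
    """Hypothetical sketch of instance-specific multi-teacher distillation.

    student_logits:      [batch, num_classes]
    teacher_logits_list: list of [batch, num_classes] tensors, one per teacher
    labels:              [batch] ground-truth class indices
    """
    # Stack teacher logits: [num_teachers, batch, num_classes].
    teacher_logits = torch.stack(teacher_logits_list, dim=0)

    # Score each teacher on each instance by the probability it assigns to
    # the true class (an assumed proxy for the paper's grading module).
    teacher_probs = F.softmax(teacher_logits, dim=-1)                 # [T, B, C]
    true_class_probs = teacher_probs.gather(
        -1, labels.view(1, -1, 1).expand(teacher_probs.size(0), -1, 1)
    ).squeeze(-1)                                                     # [T, B]

    # Normalize scores across teachers so every instance gets its own weights.
    weights = F.softmax(true_class_probs / 0.5, dim=0)                # [T, B]

    # Instance-specific soft label: weighted average of teacher distributions.
    soft_targets = F.softmax(teacher_logits / temperature, dim=-1)    # [T, B, C]
    soft_label = (weights.unsqueeze(-1) * soft_targets).sum(dim=0)    # [B, C]

    # KL distillation term plus standard cross-entropy on hard labels.
    log_student = F.log_softmax(student_logits / temperature, dim=-1)
    kd = F.kl_div(log_student, soft_label, reduction="batchmean") * temperature ** 2
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1 - alpha) * ce
```

In this sketch a teacher that misclassifies an instance receives a small weight for that instance only, so its noisy soft label contributes little, while it can still dominate the soft label for instances it handles well; this is the behavior the abstract motivates when arguing against fixed or equal teacher weights.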