On Merging MobileNets for Efficient Multitask Inference

2019 
When deploying two or more well-trained deep-learning models on a system, we would hope to unify them into a single deep model for execution in the inference stage, so that the computation time can be reduced and the energy consumption can be saved. This paper presents an effective method to build a single deep neural network that can execute multiple tasks. Our approach can merge two well-trained feed-forward neural networks of the same architecture into a single one, where the required on-line storage is reduced and the inference speed is enhanced. We evaluate our approach using MobileNets and show that it improves both compression ratio and speed-up. The experimental results demonstrate satisfactory performance and verify the feasibility of the method.
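To make the idea of merging two same-architecture networks concrete, the sketch below shows one simple way such a merged model could be organized: a single shared MobileNetV2 backbone serving two task-specific classifier heads, so one forward pass produces outputs for both tasks. This is an illustrative sketch only, not the paper's actual merging algorithm; the class `SharedBackboneMultitaskNet`, the helper `merge_mobilenets`, and the naive weight-averaging step are assumptions made for the example.

```python
# Illustrative sketch (not the paper's method): merge two trained MobileNetV2
# models with identical architectures into one model by sharing a backbone and
# keeping each model's classifier as a task-specific head.
import torch
import torch.nn as nn
from torchvision.models import mobilenet_v2


class SharedBackboneMultitaskNet(nn.Module):
    """One shared feature extractor, two task-specific heads."""

    def __init__(self, backbone, head_a, head_b):
        super().__init__()
        self.backbone = backbone
        self.head_a = head_a
        self.head_b = head_b

    def forward(self, x):
        feats = self.backbone(x)           # single pass over the shared layers
        feats = feats.mean(dim=[2, 3])     # global average pooling -> [N, 1280]
        return self.head_a(feats), self.head_b(feats)


def merge_mobilenets(model_a, model_b):
    """Naively merge two trained MobileNetV2 models of the same architecture
    by averaging their backbone weights (a stand-in for a real alignment or
    weight-sharing step) and reusing each model's classifier as its head."""
    merged_backbone = mobilenet_v2().features
    state_a = model_a.features.state_dict()
    state_b = model_b.features.state_dict()
    avg_state = {
        k: ((state_a[k].float() + state_b[k].float()) / 2).to(state_a[k].dtype)
        for k in state_a
    }
    merged_backbone.load_state_dict(avg_state)
    return SharedBackboneMultitaskNet(merged_backbone,
                                      model_a.classifier,
                                      model_b.classifier)


if __name__ == "__main__":
    model_a = mobilenet_v2(num_classes=10)    # stand-in for a model trained on task A
    model_b = mobilenet_v2(num_classes=100)   # stand-in for a model trained on task B
    merged = merge_mobilenets(model_a, model_b)
    out_a, out_b = merged(torch.randn(1, 3, 224, 224))
    print(out_a.shape, out_b.shape)           # [1, 10] and [1, 100]
```

The storage and speed benefits described in the abstract come from the sharing itself: the two original backbones collapse into one set of weights, and the shared layers are computed once per input instead of once per task.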