Everything Leaves Footprints: Hardware Accelerated Intermittent Deep Inference
2020
Current peripheral execution approaches for intermittently powered systems either require full access to the internal hardware state for checkpointing or rely on application-level energy estimation for task partitioning to make correct forward progress. Both requirements present significant practical challenges for energy-harvesting, intelligent edge Internet-of-Things devices that perform hardware-accelerated deep neural network (DNN) inference. Sophisticated compute peripherals may have an inaccessible internal state, and the complexity of DNN models makes it difficult for programmers to partition the application into suitably sized tasks that fit within an estimated energy budget. This article presents the concept of inference footprinting for intermittent DNN inference, whereby accelerator progress is accumulatively preserved across power cycles. Our middleware stack, HAWAII, tracks and restores inference footprints efficiently and transparently to make inference forward progress, without requiring access to the accelerator's internal state or application-level energy estimation. Evaluations were carried out on a Texas Instruments device under varied energy budgets and network workloads. Compared to a variety of task-based intermittent approaches, HAWAII improves inference throughput by 5.7%–95.7%, with particularly large gains on heavily accelerated DNNs.
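The core idea can be illustrated with a short sketch. The C fragment below is a minimal illustration of footprint-style progress preservation, not the HAWAII implementation: it assumes an MSP430-class device whose FRAM keeps a small footprint counter durable across power failures, and all identifiers (`fram_footprint`, `accel_run_job`, `NUM_JOBS`, `run_inference`) are hypothetical.

```c
/*
 * Minimal sketch of inference footprinting (illustrative only, not the
 * HAWAII API). Assumes an MSP430-class MCU where variables placed in
 * FRAM survive power loss; the section name and all identifiers are
 * hypothetical.
 */
#include <stdint.h>

#define NUM_JOBS 64u  /* accelerator jobs per inference (assumed granularity) */

/* The footprint lives in non-volatile FRAM so it survives power
 * failures. Persistent placement is toolchain-specific; a named
 * section is used here purely for illustration. */
__attribute__((section(".persistent")))
static volatile uint16_t fram_footprint = 0;

/* Stand-in for one hardware-accelerated job (e.g., a layer tile) whose
 * outputs are written to non-volatile buffers before it returns. */
static void accel_run_job(uint16_t job_id)
{
    (void)job_id;  /* placeholder: start the accelerator, wait, persist outputs */
}

void run_inference(void)
{
    /* On reboot, resume from the preserved footprint: jobs in
     * [0, fram_footprint) completed in earlier power cycles and are
     * never re-executed. */
    for (uint16_t job = (uint16_t)fram_footprint; job < NUM_JOBS; job++) {
        accel_run_job(job);

        /* Advance the footprint only after the job's outputs are
         * durable; a power failure mid-job merely replays that job. */
        fram_footprint = job + 1u;
    }

    fram_footprint = 0;  /* inference done; reset for the next input */
}
```

Because the footprint advances only after a job's outputs are durable, a power failure at any point costs at most one replayed job, and progress accumulates across power cycles without partitioning the DNN into energy-sized tasks or reading accelerator-internal registers.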