Deep Reinforcement Learning based Energy Scheduling for Edge Computing

2020 
Edge computing is viewed as a new paradigm that brings computing, storage, and caching capabilities close to end devices, whose reliability and availability depend critically on a stable electrical power supply. Although the smart grid is known to be highly reliable in urban and rural areas, it can still suffer power outages due to disasters (e.g., fire, earthquake, hurricane) or human factors (e.g., malicious destruction). To prevent service interruptions, edge nodes are equipped with lithium-ion battery groups as backup power because of their high rated voltage and high power endurance. In this paper, we propose an energy scheduling scheme for jobs at edge nodes suffering from a power outage. The goal is to fully exploit the capacities of the battery groups to maximize the number of finished jobs. Considering the dynamic arrival of jobs, a Deep Reinforcement Learning (DRL)-based method is employed to address the online scenario. Furthermore, simulation experiments are conducted to evaluate the proposed scheme, and the results show that the DRL strategies are more efficient than general heuristics.
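The setting described above can be illustrated with a toy sketch: an edge node running on backup battery power must decide, for each arriving job, whether to run it (spending energy) or drop it, so as to maximize the number of finished jobs. The paper employs deep reinforcement learning; the sketch below substitutes tabular Q-learning as a simplified stand-in, and all quantities (battery discretization, job costs, episode length) are illustrative assumptions, not values from the paper.

```python
import random

CAPACITY = 10          # assumed battery levels 0..CAPACITY
JOB_COSTS = [1, 2, 3]  # assumed per-job energy costs
ACTIONS = [0, 1]       # 0 = drop the job, 1 = run the job
EPISODE_LEN = 12       # assumed number of job arrivals per outage episode

def step(battery, cost, action):
    """Return (next_battery, reward); reward 1 for each finished job."""
    if action == 1 and battery >= cost:
        return battery - cost, 1.0   # job finished, energy consumed
    return battery, 0.0              # job dropped (or infeasible)

def train(episodes=3000, alpha=0.1, gamma=0.95, eps=0.1, seed=0):
    rng = random.Random(seed)
    # State = (remaining battery level, energy cost of the arriving job).
    Q = {(b, c): [0.0, 0.0] for b in range(CAPACITY + 1) for c in JOB_COSTS}
    for _ in range(episodes):
        battery = CAPACITY
        for _ in range(EPISODE_LEN):
            cost = rng.choice(JOB_COSTS)
            s = (battery, cost)
            # Epsilon-greedy action selection.
            if rng.random() < eps:
                a = rng.choice(ACTIONS)
            else:
                a = max(ACTIONS, key=lambda x: Q[s][x])
            battery, r = step(battery, cost, a)
            # Approximate the next state by sampling the next job's cost.
            s2 = (battery, rng.choice(JOB_COSTS))
            Q[s][a] += alpha * (r + gamma * max(Q[s2]) - Q[s][a])
    return Q
```

A deep RL variant would replace the table `Q` with a neural network over the (battery, job) state, which is what makes the approach scale when the state space is too large to enumerate.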