Offloading Method Based on Reinforcement Learning in Mobile Edge Computing

2021 
Mobile Edge Computing (MEC) has the potential to enable computation-intensive applications in 5G networks. MEC extends the computational capacity at the edge of a wireless network by offloading computation-intensive tasks to the MEC server. This paper considers a multi-mobile-equipment (Mobile Equipment, ME) MEC system, in which multiple MEs can offload computation over a wireless channel to a MEC server. To reduce the total cost of the offloading process, a reinforcement-learning-based algorithm, Pre-Sort Q, is proposed. First, the transmission delay and computation delay that computation jobs may experience, and the transmission energy and computation energy that the system consumes, are modeled. Then, the offloading decision is determined by minimizing the weighted sum of delay and energy consumption, with a preprocessing step used to sort candidate decisions. Simulation results show that Pre-Sort Q reduces the weighted sum of delay and energy consumption compared with three benchmarks and one existing method.
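The cost model described above (a weighted sum of delay and energy for local execution versus offloading) can be sketched as follows. This is a minimal illustration under assumed parameter values; the rate, power, CPU-frequency, and weight constants are placeholders, not the paper's actual settings, and the standard kappa*f^2 energy model is a common assumption rather than one confirmed by the abstract.

```python
# Hedged sketch of a weighted delay/energy offloading cost model.
# All constants below are illustrative assumptions, not the paper's values.
RATE = 2e6       # uplink transmission rate (bit/s), assumed fixed
P_TX = 0.5       # ME transmit power (W), assumed
F_LOCAL = 1e9    # ME local CPU frequency (cycles/s), assumed
F_MEC = 5e9      # MEC server CPU frequency (cycles/s), assumed
KAPPA = 1e-27    # effective switched capacitance (common energy model)
BETA = 0.5       # weight trading off delay against energy, assumed

def local_cost(data_bits: float, cycles: float) -> float:
    """Weighted cost of executing the job on the ME itself."""
    delay = cycles / F_LOCAL                      # local computation delay
    energy = KAPPA * cycles * F_LOCAL ** 2        # local computation energy
    return BETA * delay + (1 - BETA) * energy

def offload_cost(data_bits: float, cycles: float) -> float:
    """Weighted cost of offloading the job to the MEC server."""
    t_tx = data_bits / RATE                       # transmission delay
    t_exec = cycles / F_MEC                       # server computation delay
    energy = P_TX * t_tx                          # ME only spends energy transmitting
    return BETA * (t_tx + t_exec) + (1 - BETA) * energy

def offload_decision(data_bits: float, cycles: float) -> str:
    """Greedy per-job decision: pick whichever side minimizes the weighted cost."""
    return "offload" if offload_cost(data_bits, cycles) < local_cost(data_bits, cycles) else "local"
```

For a compute-heavy job (e.g. 1 Mbit of data, 10^9 CPU cycles), the faster MEC server wins under these parameters, so the greedy rule offloads. In the paper's setting this per-job cost feeds a Q-learning agent, with the preprocessing ("pre-sort") step narrowing the decision space before learning.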