Low load DIDS task scheduling based on Q-learning in edge computing environment

2021 
Abstract Edge computing, as a new computing model, faces new network security challenges while developing rapidly. Due to the limited performance of edge nodes, the distributed intrusion detection system (DIDS), which relies on high-performance devices in cloud computing, needs to be adapted to a low-load design so that packets can be detected near the network edge. This paper proposes a low-load DIDS task scheduling method based on the Q-learning algorithm in reinforcement learning, which can dynamically adjust scheduling strategies according to network changes in the edge computing environment, keeping the overall load of the DIDS at a low level while maintaining a balance between the two conflicting indicators of load and packet loss rate. Simulation experiments show that the proposed method achieves better low-load performance than other scheduling methods, while indicators such as the malicious feature detection rate are not significantly reduced.
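To make the scheduling idea concrete, the sketch below shows a generic Q-learning loop for assigning detection tasks to edge nodes. It is a minimal illustration only: the state/action encoding, the epsilon-greedy policy, and the reward that penalizes node load and packet loss are assumptions for illustration, not the paper's actual formulation or parameters.

```python
import random
from collections import defaultdict

# Hyperparameters (illustrative values, not taken from the paper).
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1  # learning rate, discount factor, exploration rate

Q = defaultdict(float)  # Q[(state, node)] -> estimated value of sending a task to that node


def choose_node(state, nodes):
    """Epsilon-greedy choice of the edge node that receives the next detection task."""
    if random.random() < EPSILON:
        return random.choice(nodes)
    return max(nodes, key=lambda n: Q[(state, n)])


def update(state, node, reward, next_state, nodes):
    """Standard Q-learning update: Q(s,a) += alpha * (r + gamma * max_a' Q(s',a') - Q(s,a))."""
    best_next = max(Q[(next_state, n)] for n in nodes)
    Q[(state, node)] += ALPHA * (reward + GAMMA * best_next - Q[(state, node)])


def reward(node_load, packet_loss_rate, w_load=0.5, w_loss=0.5):
    """Hypothetical reward balancing the two conflicting goals: low load and low packet loss."""
    return -(w_load * node_load + w_loss * packet_loss_rate)
```

In such a setup, the state would summarize observable network conditions (e.g. current node loads and traffic volume), and the agent learns, episode by episode, which node assignments keep both the overall load and the packet loss rate low.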