Can Stochastic Dispatching Rules Evolved by Genetic Programming Hyper-heuristics Help in Dynamic Flexible Job Shop Scheduling?

2019 
Dynamic flexible job shop scheduling (DFJSS) considers making machine assignment and operation sequencing decisions simultaneously under dynamic events. Genetic programming hyper-heuristics (GPHH) have been successfully applied to evolving dispatching rules for DFJSS. However, existing studies mainly focus on evolving deterministic dispatching rules, which calculate priority values for the candidate machines or jobs and select the one with the best priority. Inspired by the effectiveness of training stochastic policies in reinforcement learning, and by the fact that a dispatching rule in DFJSS is similar to a policy in reinforcement learning, we investigate the effectiveness of evolving stochastic dispatching rules for DFJSS in this paper. Instead of using the "winner-takes-all" mechanism, we define a range of probability distributions based on the priority values of the candidates to be used by the stochastic dispatching rules. These distributions introduce varying degrees of randomness. We empirically compare the effectiveness of GPHH in evolving stochastic dispatching rules with the different probability distributions against evolving deterministic dispatching rules. The results show that the evolved deterministic rules perform the best. We argue that this is because, unlike traditional reinforcement learning methods, the current GPHH does not store the quality (value function) of any particular state and action during the simulation, and thus cannot fully take advantage of the feedback given by the simulation. In the future, we will investigate ways to make better use of the information gathered during the simulation in GPHH to further improve its effectiveness.
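The contrast between the two kinds of rules can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes higher priority values are better and uses a softmax with a temperature parameter as one example of a priority-based probability distribution (the paper compares a range of such distributions).

```python
import math
import random

def deterministic_dispatch(priorities):
    """'Winner-takes-all': always pick the candidate with the best priority."""
    return max(range(len(priorities)), key=lambda i: priorities[i])

def stochastic_dispatch(priorities, temperature=1.0, rng=random):
    """Sample a candidate with probability increasing in its priority.

    A softmax over priority values: better candidates are more likely to be
    chosen, but any candidate may be. Lower temperature means less randomness
    (approaching the deterministic rule); higher temperature means more.
    """
    m = max(priorities)  # subtract the max for numerical stability
    weights = [math.exp((p - m) / temperature) for p in priorities]
    total = sum(weights)
    probs = [w / total for w in weights]
    # Roulette-wheel selection over the resulting distribution.
    r = rng.random()
    cumulative = 0.0
    for i, prob in enumerate(probs):
        cumulative += prob
        if r < cumulative:
            return i
    return len(probs) - 1
```

With priorities `[1.0, 3.0, 2.0]`, the deterministic rule always returns index 1, while the stochastic rule returns index 1 most often but occasionally picks the others, with the temperature controlling how often.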