Dynamic Jobshop Scheduling Algorithm Based on Deep Q Network

2021 
Jobshop scheduling is a classic problem in the field of production scheduling. Solving and optimizing the jobshop scheduling problem can greatly reduce workshop production costs and improve processing efficiency, thereby strengthening the market competitiveness of manufacturing enterprises. To make decisions in the complex dynamic scheduling process more accurately and to simplify the solution process, the jobshop scheduling problem can be transformed into a reinforcement learning problem based on the Markov decision process. The performance of the adaptive scheduling algorithm in a dynamic manufacturing environment is improved by means of a Deep Q Network (DQN). In the proposed scheduling algorithm, five continuous-valued state features are designed as inputs to a Deep Neural Network (DNN), and ten well-known heuristic dispatching rules are selected as the action set of the DQN. A target network and a prediction network are used to train the parameters. An action selection strategy based on the "softmax" function is designed for the DQN: it selects the dispatching rule with the largest action value as the execution action with the highest probability, thereby alleviating the problem that a suboptimal action's Q value exceeds the optimal action's Q value in the early learning stage, as well as the problem that non-optimal actions are still selected with too high a probability in the later learning stage. Ten benchmark jobshop test instances from the "LA" set are used as simulation objects and run in a simulation environment implemented in Python. The simulation results confirm that the proposed DQN-based scheduling algorithm has better performance and generality than any single dispatching rule or the traditional Q-learning algorithm.
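The following is a minimal, illustrative sketch (not the authors' code) of the setup the abstract describes: a DQN whose 5-dimensional state vector feeds a small DNN, whose 10 discrete actions correspond to heuristic dispatching rules, with separate prediction and target networks and a softmax-based action-selection strategy. The layer sizes, temperature value, and example state are assumptions made for illustration.

```python
# Illustrative sketch only; layer sizes, temperature, and state values are assumed.
import copy
import numpy as np
import torch
import torch.nn as nn

N_STATE_FEATURES = 5     # continuous-valued state features (as in the abstract)
N_DISPATCH_RULES = 10    # actions = heuristic dispatching rules (e.g. SPT, FIFO, ...)

class QNetwork(nn.Module):
    """Small DNN mapping state features to one Q value per dispatching rule."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(N_STATE_FEATURES, 64), nn.ReLU(),
            nn.Linear(64, 64), nn.ReLU(),
            nn.Linear(64, N_DISPATCH_RULES),
        )

    def forward(self, state):
        return self.net(state)

# Prediction network is trained; target network is a periodically synced copy.
prediction_net = QNetwork()
target_net = copy.deepcopy(prediction_net)

def select_rule(state, temperature=1.0):
    """Softmax (Boltzmann) action selection over the predicted Q values.

    A higher temperature early in training keeps exploration broad, so an action
    whose Q value is temporarily over-estimated does not dominate; annealing the
    temperature later concentrates probability on the best-valued rule.
    """
    with torch.no_grad():
        q_values = prediction_net(torch.as_tensor(state, dtype=torch.float32))
    logits = q_values.numpy() / temperature
    logits -= logits.max()                       # numerical stability
    probs = np.exp(logits) / np.exp(logits).sum()
    return np.random.choice(N_DISPATCH_RULES, p=probs)

# Example: pick a dispatching rule for one (made-up) normalized state vector.
rule_index = select_rule(np.array([0.3, 0.7, 0.1, 0.5, 0.9]), temperature=0.5)
```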