Logistics-involved QoS-aware service composition in cloud manufacturing with deep reinforcement learning

2021 
Abstract Cloud manufacturing is a new manufacturing model that aims to provide on-demand manufacturing services to consumers over the Internet. Service composition is an essential issue and an important technique in cloud manufacturing (CMfg): it supports the construction of larger-granularity, value-added services by combining a number of smaller-granularity services to satisfy consumers' complex requirements. Meta-heuristic algorithms such as the genetic algorithm, particle swarm optimization, and the ant colony algorithm are frequently employed to address service composition issues in cloud manufacturing. These algorithms, however, require complex design flows and painstaking parameter tuning, and lack adaptability to dynamic environments. Deep reinforcement learning (DRL) provides an alternative approach for solving cloud manufacturing service composition (CMfg-SC) problems. DRL, a class of model-free artificial intelligence methods, enables a system to learn optimal service composition solutions through training and can therefore circumvent the aforementioned problems of meta-heuristic algorithms. This paper is dedicated to exploring possible applications of DRL in CMfg-SC. A logistics-involved, QoS-aware, DRL-based approach to CMfg-SC is proposed. A dueling Deep Q-Network (DQN) with prioritized replay, named PD-DQN, is designed as the DRL algorithm. The effectiveness, robustness, adaptability, and scalability of PD-DQN are investigated and compared with those of the basic DQN and Q-learning. Experimental results indicate that PD-DQN is able to effectively address the CMfg-SC problem.
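The abstract names two ingredients of PD-DQN: a dueling Q-network head and prioritized experience replay. The sketch below is not the authors' implementation; it is a minimal illustration of those two components under assumed network sizes, hyperparameters, and a generic state/action encoding, since the paper's actual CMfg-SC state and action design is not reproduced here.

```python
import random
import torch
import torch.nn as nn

class DuelingQNetwork(nn.Module):
    """Dueling head: Q(s, a) = V(s) + A(s, a) - mean_a A(s, a)."""
    def __init__(self, state_dim: int, n_actions: int, hidden: int = 128):
        super().__init__()
        self.feature = nn.Sequential(nn.Linear(state_dim, hidden), nn.ReLU())
        self.value = nn.Linear(hidden, 1)              # state-value stream V(s)
        self.advantage = nn.Linear(hidden, n_actions)  # advantage stream A(s, a)

    def forward(self, state: torch.Tensor) -> torch.Tensor:
        h = self.feature(state)
        v = self.value(h)
        a = self.advantage(h)
        return v + a - a.mean(dim=1, keepdim=True)

class PrioritizedReplay:
    """Proportional prioritized replay: transitions are sampled with probability proportional to priority^alpha."""
    def __init__(self, capacity: int = 10000, alpha: float = 0.6):
        self.capacity, self.alpha = capacity, alpha
        self.buffer, self.priorities = [], []

    def push(self, transition, priority: float = 1.0):
        if len(self.buffer) >= self.capacity:   # drop the oldest transition when full
            self.buffer.pop(0)
            self.priorities.pop(0)
        self.buffer.append(transition)
        self.priorities.append(priority ** self.alpha)

    def sample(self, batch_size: int):
        idx = random.choices(range(len(self.buffer)), weights=self.priorities, k=batch_size)
        return [self.buffer[i] for i in idx], idx

    def update_priorities(self, idx, td_errors):
        # priorities are refreshed from the latest TD errors after each learning step
        for i, err in zip(idx, td_errors):
            self.priorities[i] = (abs(err) + 1e-6) ** self.alpha
```

In a QoS-aware composition setting, the state would typically encode the current subtask and candidate-service QoS attributes (including logistics factors), and each action would select one candidate service; those design choices belong to the paper itself and are only assumed here.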