Markov Decision Process to Optimise Long-term Asset Maintenance and Technologies Investment in Chemical Industry

2021 
Abstract Decisions on synthesising a process network are often made to optimise the payback period based on investment cost. In addition to the core investment and the cost of consumed resources, the long-term reliable operation of the process is also crucial. Given the available states and technologies of the assets, this study aims to identify the long-term optimal asset planning policy. A Markov Decision Process (MDP) is a promising tool for identifying the optimal policy under different states of the assets or equipment. The failure probability of each unit is modelled with the 'bathtub' model, and each of the condition states is incorporated in the MDP. Decisions to implement redundant units in the process, with a variety of technologies, are allowed. This paper reformulates the MDP as an equivalent Mixed Integer Non-linear Programming (MINLP) problem to solve for the optimal long-term asset decisions and maintenance policy. The applicability of the method is tested on a real case study from a Sinopec Petrochemical Plant. The capital cost and the expected operational cost, accounting for equipment maintenance over an infinite time horizon, are determined.
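To illustrate how an MDP yields a maintenance policy over condition states, the following minimal sketch runs value iteration on a small, hypothetical asset model. It is not the paper's MINLP formulation: the state names, action names, transition probabilities, and costs are all assumed for illustration only, with degradation made more likely in worn states in the spirit of a bathtub-shaped failure rate.

import numpy as np

# Hypothetical condition states and maintenance actions (illustrative only)
states = ["new", "worn", "failed"]
actions = ["do_nothing", "maintain", "replace"]

# P[a][s, s']: assumed transition probabilities under each action
P = {
    "do_nothing": np.array([[0.90, 0.08, 0.02],
                            [0.00, 0.80, 0.20],   # worn units fail more often
                            [0.00, 0.00, 1.00]]),
    "maintain":   np.array([[0.95, 0.04, 0.01],
                            [0.70, 0.28, 0.02],
                            [0.00, 0.00, 1.00]]),
    "replace":    np.array([[1.00, 0.00, 0.00],
                            [1.00, 0.00, 0.00],
                            [1.00, 0.00, 0.00]]),
}

# C[a][s]: assumed per-period cost of taking action a in state s
# (maintenance, capital and lost-production costs rolled into single numbers)
C = {
    "do_nothing": np.array([0.0, 2.0, 50.0]),
    "maintain":   np.array([1.0, 5.0, 50.0]),
    "replace":    np.array([20.0, 20.0, 30.0]),
}

gamma = 0.95                      # discount factor for the infinite horizon
V = np.zeros(len(states))         # expected discounted cost-to-go per state

# Value iteration: repeatedly apply the Bellman optimality operator
for _ in range(1000):
    Q = np.array([C[a] + gamma * P[a] @ V for a in actions])
    V_new = Q.min(axis=0)
    if np.max(np.abs(V_new - V)) < 1e-8:
        V = V_new
        break
    V = V_new

# The greedy action in each state is the (stationary) optimal policy
policy = [actions[i] for i in Q.argmin(axis=0)]
print(dict(zip(states, policy)), V.round(2))

Under these assumed numbers the policy recommends leaving a new unit alone, maintaining a worn unit, and replacing a failed one; the value vector is the expected discounted cost of following that policy from each condition state, which corresponds to the infinite-horizon operational cost the paper reports.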