Recurrent Neural Network Model with Self-Attention Mechanism for Fault Detection and Diagnosis

2019 
Fault detection and diagnosis (FDD) plays an important role in production safety and efficiency. Recurrent neural network (RNN) based FDD methods can automatically extract features from input sequences to accomplish end-to-end FDD. An RNN-based FDD method can be regarded as an encoder-decoder framework: an encoder reads the input sequence and generates features with an RNN, and a decoder uses the features to recognize the fault. In previous methods, a common approach is to use the final hidden state of the RNN as the feature. In this paper, we apply the self-attention (SA) mechanism to the gated recurrent unit (GRU), a kind of RNN, and propose a GRU-SA based FDD method. The method is illustrated on the Tennessee Eastman process, and experimental results show that the GRU-SA method improves FDD performance.
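
A minimal sketch of the kind of model the abstract describes, not the authors' implementation: a GRU encoder whose hidden states are pooled with a simple additive self-attention instead of taking only the final hidden state, followed by a linear classifier as the decoder. The attention formulation, layer sizes, and class count here are illustrative assumptions; the paper's exact architecture may differ.

```python
import torch
import torch.nn as nn

class GRUSelfAttentionFDD(nn.Module):
    """GRU encoder + self-attention pooling + fault classifier (illustrative)."""
    def __init__(self, n_vars, hidden_size, n_classes):
        super().__init__()
        self.gru = nn.GRU(n_vars, hidden_size, batch_first=True)
        # Additive self-attention: score each timestep's hidden state.
        self.attn_proj = nn.Linear(hidden_size, hidden_size)
        self.attn_vec = nn.Linear(hidden_size, 1, bias=False)
        self.classifier = nn.Linear(hidden_size, n_classes)

    def forward(self, x):                                   # x: (batch, time, n_vars)
        h, _ = self.gru(x)                                   # h: (batch, time, hidden)
        scores = self.attn_vec(torch.tanh(self.attn_proj(h)))  # (batch, time, 1)
        weights = torch.softmax(scores, dim=1)               # attention over timesteps
        feature = (weights * h).sum(dim=1)                   # weighted sum, not just final state
        return self.classifier(feature)                      # fault logits

# Example usage with sizes loosely modeled on the Tennessee Eastman process
# (52 process variables; 21 classes = normal + 20 faults), which are assumptions here.
model = GRUSelfAttentionFDD(n_vars=52, hidden_size=64, n_classes=21)
logits = model(torch.randn(8, 50, 52))   # batch of 8 windows, 50 timesteps each
print(logits.shape)                       # torch.Size([8, 21])
```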