Unsupervised deep representation learning for motor fault diagnosis by mutual information maximization
2020
Data-driven deep learning has achieved considerable success in motor fault diagnosis and prognostics. However, previous studies are typically limited to fault data that share a similar distribution under a single, stable working condition. This assumption rarely holds in real-world scenarios, where working conditions are complex and change constantly, so the deep representation learning methods of earlier studies often fail to extract effective representations for fault diagnosis in practice. To tackle this issue, inspired by f-divergence estimation, this work takes a different route and proposes an unsupervised deep representation learning approach, named Deep Mutual Information Maximization (DMIM), which uses variational divergence estimation to maximize the mutual information (MI) between the input and output of a deep neural network. Meanwhile, the representation distribution is automatically tuned by matching it to a prior distribution, in the same spirit as the Variational Autoencoder. In contrast to previous works, which learn representations mainly through supervised feedback or unsupervised reconstruction, the proposed unsupervised MI maximization framework lets representational characteristics such as independence play a larger role in capturing the most distinctive representations. To verify the effectiveness of the proposal, faulty motor data were collected from motor tests under the European driving cycle, which simulates realistic working scenarios. DMIM turns out to outperform many popular unsupervised and fully supervised learning methods, opening new avenues for unsupervised representation learning for motor fault diagnosis.
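The following is a minimal sketch of the general idea the abstract describes: maximizing a variational (here Jensen-Shannon) lower bound on the mutual information I(x; z) between a raw signal x and its learned code z, while matching the code distribution to a prior with a VAE-style KL term. The layer sizes, signal length, choice of JS estimator, and Gaussian prior are illustrative assumptions, not the paper's actual architecture or hyperparameters.

```python
# Hedged sketch of MI-maximization representation learning (not the paper's exact DMIM).
import torch
import torch.nn as nn
import torch.nn.functional as F

SIG_LEN, CODE_DIM = 1024, 64          # assumed 1-D motor-signal length / code size

class Encoder(nn.Module):
    """Maps a raw signal to the mean/log-variance of a Gaussian code."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(SIG_LEN, 256), nn.ReLU())
        self.mu = nn.Linear(256, CODE_DIM)
        self.logvar = nn.Linear(256, CODE_DIM)

    def forward(self, x):
        h = self.net(x)
        return self.mu(h), self.logvar(h)

encoder = Encoder()
critic = nn.Sequential(               # T(x, z): scores joint vs. marginal pairs
    nn.Linear(SIG_LEN + CODE_DIM, 256), nn.ReLU(),
    nn.Linear(256, 1))
opt = torch.optim.Adam(
    list(encoder.parameters()) + list(critic.parameters()), lr=1e-4)

def js_mi_bound(x, z):
    """Jensen-Shannon MI estimate: joint pairs (x, z) vs. shuffled pairs (x, z')."""
    z_marginal = z[torch.randperm(z.size(0))]
    t_joint = critic(torch.cat([x, z], dim=1))
    t_marg = critic(torch.cat([x, z_marginal], dim=1))
    return (-F.softplus(-t_joint)).mean() - F.softplus(t_marg).mean()

def training_step(x, beta=0.1):
    mu, logvar = encoder(x)
    z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)   # reparameterize
    mi = js_mi_bound(x, z)                                     # maximize I(x; z)
    # VAE-style prior matching: KL(q(z|x) || N(0, I))
    kl = -0.5 * (1 + logvar - mu.pow(2) - logvar.exp()).sum(dim=1).mean()
    loss = -mi + beta * kl
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()

# Example: one update on a random mini-batch standing in for raw motor signals
print(training_step(torch.randn(32, SIG_LEN)))
```

After training, the encoder's output codes would be fed to a downstream fault classifier or clustering step; the weighting beta between the MI and prior-matching terms is an assumed hyperparameter.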