Incremental Prediction Model of Disk Failures Based on the Density Metric of Edge Samples

2019 
Disks are the primary storage devices in data centers, so predicting disk failures is of great significance for data reliability and security. Because disk datasets contain few abnormal samples, it is difficult to satisfy the demand of supervised and semi-supervised algorithms for labeled abnormal data, while unsupervised algorithms achieve poor recall when handling local anomalies and wrapped anomalies. This paper presents an incremental-learning disk failure prediction model based on the density metric of edge samples. An isolation region is built around each training sample by searching for its nearest neighbor. For a test point that is not a global anomaly, we find its nearest training point and, in turn, that training point's nearest training point, both by Euclidean distance. The global metric of the test sample's abnormal degree is the ratio of the radii of the regions in which these two nearest training points are located. The local metric is the ratio between the shortest distance from the test point to the edge of the nearest training point's region and the radius of that region. The anomaly score of a test point is obtained by combining the two measurements. We also identify the SMART attributes that are significantly related to disk failures and increase their weights the next time the attributes are input. Experiments are carried out on synthetic and public datasets containing local anomalies and wrapped anomalies. The proposed method outperforms typical unsupervised algorithms such as iNNE, iForest, and LOF, with recall rates improved by up to 7%. Contrast tests on public disk datasets further verify that the proposed method achieves better recall.
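The scoring idea described in the abstract can be sketched roughly as follows. This is a minimal NumPy illustration, not the authors' implementation: the function name `edge_density_scores`, the exact forms of the global and local metrics, and the way the two metrics are combined are all assumptions made for illustration; only the general structure (nearest-neighbor isolation regions, radius ratios, distance to the region edge) follows the abstract.

```python
# Minimal sketch of the edge-sample density scoring described in the abstract.
# All names and exact formulas are illustrative assumptions, not the paper's code.
import numpy as np

def edge_density_scores(train, test):
    """Score test points by combining a global and a local abnormality metric.

    Each training point defines an isolation region: a hypersphere centred at
    the point whose radius is the distance to its nearest training neighbour.
    """
    train = np.asarray(train, dtype=float)
    test = np.asarray(test, dtype=float)

    # Pairwise distances among training points; the radius of each isolation
    # region is the distance to the nearest other training point.
    d_tt = np.linalg.norm(train[:, None, :] - train[None, :, :], axis=-1)
    np.fill_diagonal(d_tt, np.inf)
    nn_idx = d_tt.argmin(axis=1)   # nearest training neighbour of each training point
    radius = d_tt.min(axis=1)      # isolation-region radius of each training point

    # Distances from each test point to every training point.
    d_qt = np.linalg.norm(test[:, None, :] - train[None, :, :], axis=-1)
    p = d_qt.argmin(axis=1)        # nearest training point of each test point
    q = nn_idx[p]                  # nearest training point of that nearest point

    # Global metric (assumed form): ratio of the radii of the regions of the
    # two nearest training points; a larger ratio suggests a sparser area.
    global_metric = radius[p] / radius[q]

    # Local metric (assumed form): distance from the test point to the edge of
    # its nearest training point's region, relative to that region's radius.
    dist_to_centre = d_qt[np.arange(len(test)), p]
    local_metric = np.maximum(dist_to_centre - radius[p], 0.0) / radius[p]

    # Combine the two measurements into a single anomaly score (assumed: sum).
    return global_metric + local_metric

# Usage: higher scores indicate more abnormal test samples.
rng = np.random.default_rng(0)
normal = rng.normal(0.0, 1.0, size=(200, 5))   # stand-in for healthy-disk SMART vectors
suspect = rng.normal(4.0, 1.0, size=(5, 5))    # stand-in for failing-disk samples
print(edge_density_scores(normal, suspect))
```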