Trustworthiness study of HDFS data storage based on trustworthiness metrics and KMS encryption

2021 
Since its introduction in 2006, Hadoop has evolved dramatically and its ecosystem has flourished, now comprising more than 60 components, including HDFS and MapReduce. Hadoop's architecture makes it far superior to many other products for large-scale data processing and analysis, and it has become the preferred framework for data analysis and processing across a variety of industries. With the widespread adoption of Hadoop, concerns have grown about the trustworthiness of data in HDFS, such as the storage of data in plaintext and over-reliance on authentication mechanisms. To ensure the trustworthiness of Hadoop data while accounting for the performance requirements of a big data framework, this study focuses on the data trustworthiness problem in HDFS and applies node classification, data encryption, trustworthiness metrics, and other measures to comprehensively enhance the trustworthiness of the HDFS file system, improve the existing HDFS design, and ensure that data storage and communication take place in a trustworthy environment.
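The encryption measure referred to above builds on Hadoop's Key Management Server (KMS), which backs HDFS transparent encryption: files written into an encryption zone are encrypted at rest with keys served by the KMS, so data never sits in plaintext on DataNode disks. A minimal illustrative configuration is sketched below; the KMS host and port are placeholders, not values from the paper:

```xml
<!-- core-site.xml: point HDFS clients and the NameNode at the KMS.
     kms-host:9600 is a placeholder for an actual KMS deployment. -->
<property>
  <name>hadoop.security.key.provider.path</name>
  <value>kms://http@kms-host:9600/kms</value>
</property>
```

With the key provider configured, an administrator would typically create a key with `hadoop key create <keyname>` and then mark a directory as an encryption zone with `hdfs crypto -createZone -keyName <keyname> -path <dir>`, after which reads and writes under that path are transparently encrypted and decrypted.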