Design and Implementation of System Which Efficiently Retrieve Useful Data for Detection of Dementia Disease

2021 
This work analyzes Hadoop techniques such as MapReduce to process voluminous dementia data faster and more efficiently. For a given large dementia dataset, current solutions use data partitioning strategies that incur high communication costs and an expensive mining process, because duplicate and unnecessary transactions are transferred among computing nodes. To address these issues, the proposed algorithm uses data partitioning techniques such as Min-Hash and Locality Sensitive Hashing (LSH), which reduce processing time and improve the efficiency of the final result. We build on the MapReduce programming model of Hadoop [3] and implement the technique on a Hadoop platform. For pattern matching we use the FP-growth algorithm. Finally, we show that the proposed system requires less time to find frequent itemsets. The idea behind this research is to cope with the special requirements of the health domain related to patients.
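The abstract names Min-Hash and Locality Sensitive Hashing as the partitioning step that keeps near-duplicate transactions on the same computing node. The paper's actual implementation is not shown here, so the following is only a minimal single-machine sketch of the idea: MinHash signatures are computed per transaction, then LSH banding groups transactions whose signatures collide in at least one band. The function names, the hash-family construction, and the band/row parameters are illustrative assumptions, not the authors' code.

```python
import random
from collections import defaultdict

def make_hash_funcs(n, prime=2_147_483_647, seed=42):
    """Build n random linear hash functions h(x) = (a*x + b) mod prime
    (a common MinHash hash family; parameters are illustrative)."""
    rng = random.Random(seed)
    funcs = []
    for _ in range(n):
        a, b = rng.randrange(1, prime), rng.randrange(0, prime)
        funcs.append(lambda x, a=a, b=b: (a * x + b) % prime)
    return funcs

def minhash_signature(item_set, hash_funcs):
    """MinHash signature of one transaction (a set of item ids)."""
    return tuple(min(h(x) for x in item_set) for h in hash_funcs)

def lsh_partition(transactions, num_hashes=20, bands=5):
    """Bucket transactions whose signatures collide in at least one band,
    so highly similar (near-duplicate) transactions land together."""
    rows = num_hashes // bands
    hash_funcs = make_hash_funcs(num_hashes)
    buckets = defaultdict(set)
    for tid, items in enumerate(transactions):
        sig = minhash_signature(items, hash_funcs)
        for b in range(bands):
            band = sig[b * rows:(b + 1) * rows]
            buckets[(b, band)].add(tid)
    return buckets
```

In a MapReduce setting, the bucket key would serve as the partition key, so duplicate or near-duplicate transactions are shuffled to the same node instead of being mined redundantly on several nodes.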
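The mining step the abstract relies on is FP-growth for frequent itemsets. As a reference for the technique (not the paper's distributed implementation), here is a compact in-memory FP-growth sketch: it builds an FP-tree with a header table, then recursively mines conditional pattern bases. Class and function names are illustrative.

```python
from collections import defaultdict

class FPNode:
    def __init__(self, item, parent):
        self.item, self.parent = item, parent
        self.count = 0
        self.children = {}

def build_tree(transactions, min_support):
    """Build an FP-tree: count items, drop infrequent ones, insert each
    transaction in descending global-frequency order."""
    freq = defaultdict(int)
    for t in transactions:
        for item in t:
            freq[item] += 1
    freq = {i: c for i, c in freq.items() if c >= min_support}
    root = FPNode(None, None)
    header = defaultdict(list)  # item -> nodes holding that item
    for t in transactions:
        items = sorted((i for i in t if i in freq), key=lambda i: (-freq[i], i))
        node = root
        for item in items:
            if item not in node.children:
                child = FPNode(item, node)
                node.children[item] = child
                header[item].append(child)
            node = node.children[item]
            node.count += 1
    return root, header, freq

def fp_growth(transactions, min_support, suffix=()):
    """Return {itemset: support} for all frequent itemsets."""
    _, header, freq = build_tree(transactions, min_support)
    patterns = {}
    for item in freq:
        support = sum(n.count for n in header[item])
        patterns[tuple(sorted(suffix + (item,)))] = support
        # conditional pattern base: prefix paths leading to `item`
        cond = []
        for node in header[item]:
            path, p = [], node.parent
            while p is not None and p.item is not None:
                path.append(p.item)
                p = p.parent
            cond.extend([path] * node.count)
        patterns.update(fp_growth(cond, min_support, suffix + (item,)))
    return patterns
```

In the proposed system this mining would run per partition after the LSH step, which is where the claimed reduction in mining time comes from: each node sees fewer duplicate transactions.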