FPGA in Core Calculation for Big Datasets

2021 
The rough sets theory developed by Prof. Z. Pawlak is one of the tools used in intelligent systems for data analysis and processing. In modern systems the amount of collected data is growing quickly, so computation speed becomes the critical factor. This paper presents an FPGA- and softcore-CPU-based hardware solution for core calculation on big datasets, focusing on rough set methods. The core comprises the attributes that cannot be removed without affecting the classification power of the full set of condition attributes. The presented architectures were tested on real datasets by running the solutions inside two different FPGA chips; the datasets contained from 1 000 to 1 000 000 objects. The same operations were performed in a software implementation. Results show up to a 15.83× reduction in computation time when using hardware-supported core generation compared with the pure software implementation.
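To make the notion of the core concrete, the following is a minimal software sketch (not the paper's FPGA architecture): an attribute belongs to the core if removing it makes the decision table inconsistent, i.e., two objects become indiscernible on the remaining condition attributes yet carry different decisions. The table, attribute names, and helper functions below are illustrative assumptions, not taken from the paper.

```python
def is_consistent(objects, attrs):
    """True if objects identical on attrs always share one decision."""
    seen = {}
    for row in objects:
        key = tuple(row["cond"][a] for a in attrs)
        if key in seen and seen[key] != row["dec"]:
            return False  # indiscernible objects with different decisions
        seen[key] = row["dec"]
    return True

def core(objects, all_attrs):
    """Attributes whose removal destroys consistency (classification power)."""
    return [a for a in all_attrs
            if not is_consistent(objects, [b for b in all_attrs if b != a])]

# toy decision table: condition attributes c1, c2; decision d
table = [
    {"cond": {"c1": 0, "c2": 0}, "dec": 0},
    {"cond": {"c1": 0, "c2": 1}, "dec": 1},
    {"cond": {"c1": 1, "c2": 1}, "dec": 1},
]
print(core(table, ["c1", "c2"]))  # → ['c2']: dropping c2 merges rows with decisions 0 and 1
```

This brute-force check is quadratic in the worst case; the point of the hardware support described in the paper is precisely to accelerate such comparisons for datasets with up to a million objects.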