Near real-time analysis of big fusion data on HPC systems

2020 
We are developing the Delta framework, which aims to tackle big-data challenges specific to fusion energy sciences. Delta can be used to connect fusion experiments to remote supercomputers. Streaming measurements to distributed compute resources makes it possible to automatically perform high-dimensional data analysis on a cadence faster than the experimental schedule. Making data analysis results available before the next experiment allows scientists to make more informed decisions about the configuration of upcoming experiments. Here we describe how Delta uses database and virtualization facilities, as well as high-performance computing, at the National Energy Research Scientific Computing Center to offer vertically integrated near real-time data analysis and visualization. We also report on ongoing efforts to port the data analysis part of Delta to graphics processing units, which reduce the analysis wall-time for a benchmark workflow by about 35% compared to a serial implementation.