A Dask Distributed Radio Astronomy Reduction Framework

2021 
Interferometric coherency measurements scale quadratically with the number of stations in the interferometer. This, combined with the high spectro-temporal resolution of the data, necessitates the use of modern computing strategies such as MapReduce and cluster-computing frameworks to reduce data in tractable amounts of time. Frameworks such as Spark and Dask [1] lean towards a streaming, chunked, functional programming style with minimal shared state. Individual tasks processing chunks of data are flexibly scheduled across multiple cores and nodes. To process the quantities of data produced by contemporary radio telescopes such as MeerKAT, and future telescopes such as the SKA, radio astronomy codes must adapt to these paradigms. In what follows, we describe two Python libraries, dask-ms and codex-africanus, which enable the development of distributed, high-performance radio astronomy code.
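As a rough illustration of the chunked, functional style described above, the sketch below uses plain dask.array to reduce a synthetic visibility cube. The array shapes, chunk sizes, and variable names are illustrative assumptions rather than values or APIs taken from the paper; in the actual framework, chunks would be read from a Measurement Set via dask-ms and processed by codex-africanus kernels, but the scheduling model is the same.

```python
import dask.array as da

# Illustrative dimensions (assumptions, not values from the paper):
# rows = baselines x times, 64 channels, 4 correlations.
nrow, nchan, ncorr = 100_000, 64, 4
chunks = (10_000, nchan, ncorr)

# Synthetic complex visibilities, chunked along the row dimension so that
# each chunk becomes an independent task in the Dask graph.
vis = (da.random.standard_normal((nrow, nchan, ncorr), chunks=chunks)
       + 1j * da.random.standard_normal((nrow, nchan, ncorr), chunks=chunks))

# A simple reduction: average over the channel axis. Nothing is computed
# yet -- Dask only builds a task graph of per-chunk operations.
chan_avg = vis.mean(axis=1)

# compute() hands the graph to a scheduler (threads, processes, or a
# distributed cluster), which maps the chunk-sized tasks onto available
# cores and nodes.
result = chan_avg.compute()
print(result.shape)  # (100000, 4)
```

Because each chunk is processed by an independent, side-effect-free task, the same graph can run unchanged on a laptop or on a cluster scheduler, which is the property the abstract highlights for MeerKAT- and SKA-scale data volumes.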