Design of the data quality control system for the ALICE O$^2$

2017 
ALICE (A Large Ion Collider Experiment) is the heavy-ion detector designed to study the physics of strongly interacting matter and the quark-gluon plasma at the CERN Large Hadron Collider (LHC). A major upgrade of the experiment is planned for 2019-20. In order to cope with a data rate 100 times higher and with the continuous readout of the Time Projection Chamber (TPC), it is necessary to upgrade the Online and Offline computing to a new common system called O$^2$. The online Data Quality Monitoring (DQM) and the offline Quality Assurance (QA) are critical aspects of the data acquisition and reconstruction software chains. The former aims to provide shifters with precise and complete information so that problems can be quickly identified and overcome, while the latter aims to provide good-quality data for physics analyses. DQM and QA typically involve the gathering of data, its distributed analysis by user-defined algorithms, the merging of the resulting objects, and their visualization. This paper discusses the architecture and design of the data Quality Control (QC) system that unifies the DQM and the QA. In addition, it presents the main design requirements and early results of a working prototype. A special focus is put on the merging of monitoring objects generated by the QC tasks. Merging is a crucial and challenging step of the O$^2$ system, not only for the QC but also for the calibration. Various merging scenarios and implementations have been developed, and large-scale tests have been carried out. This paper presents the final results of this extensive work on merging. We conclude with the plan of work for the coming years that will bring the QC system to production by 2019.
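To illustrate the merging step described above, the following minimal C++ sketch combines partial ROOT histograms, a typical kind of monitoring object, produced in parallel by QC tasks into a single merged object using ROOT's standard TH1::Merge() mechanism. The function name mergePartialResults is hypothetical and assumed for illustration only; it does not reflect the actual O$^2$ QC API described in the paper.

```cpp
// Minimal sketch of merging monitoring objects (here: ROOT histograms)
// produced in parallel by several QC task instances. Illustrative only;
// the real O2 QC merging infrastructure is not shown here.
#include <cstddef>
#include <memory>
#include <vector>
#include <TH1F.h>
#include <TList.h>

// Hypothetical helper: merge partial histograms into one result.
std::unique_ptr<TH1F> mergePartialResults(const std::vector<TH1F*>& partials)
{
    if (partials.empty()) {
        return nullptr;
    }
    // Clone the first partial histogram as the merge target.
    auto merged = std::unique_ptr<TH1F>(
        static_cast<TH1F*>(partials.front()->Clone("merged")));
    merged->SetDirectory(nullptr); // keep ownership out of gDirectory

    // Collect the remaining partials and delegate to ROOT's Merge().
    TList rest;
    for (std::size_t i = 1; i < partials.size(); ++i) {
        rest.Add(partials[i]);
    }
    merged->Merge(&rest);
    return merged;
}
```

In practice, the difficulty lies less in the merge call itself than in doing this at scale and continuously, across many nodes, which is why the paper devotes large-scale tests to the various merging scenarios.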