A Generic and Highly Scalable Framework for the Automation and Execution of Scientific Data Processing and Simulation Workflows

2018 
This article presents a generic, modular, and highly scalable process operation framework for performing complex data processing and co-simulation workflows in research on data-driven energy systems. The framework consistently applies web technologies to build a microservices architecture. It automates the startup, synchronization, and management of scientific data processing and simulation tools (e.g. Python, Matlab, OpenModelica) as part of larger transdisciplinary, multi-domain data processing and co-simulation workflows, and it uses container virtualization on the underlying cluster computing environment to control and manage the individual simulation nodes. Within the framework's processing workflow, software executables can be distributed to different nodes on the cluster, access data easily, and communicate with other components via communication adapters and a high-performance messaging channel infrastructure. By integrating Apache NiFi, the framework also provides an easy-to-use web user interface that allows users to model, execute, and operate workflows for future energy system solutions. Once a complex workflow has been set up in the process operation framework, researchers can use it without any setup or configuration on their local workstations and without knowing the details of the underlying infrastructure or software environment.
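The abstract does not specify the messaging technology behind the communication adapters. As a rough illustration only, the sketch below shows how a Python-based simulation node might publish results to a shared messaging channel, assuming an AMQP broker (e.g. RabbitMQ via the pika client). The broker hostname, queue name, and payload fields are illustrative assumptions, not details taken from the paper.

    # Minimal sketch of a communication adapter for a Python simulation node.
    # Assumes an AMQP broker (e.g. RabbitMQ) reachable under the hostname "broker";
    # the queue name and payload structure are purely illustrative.
    import json
    import pika

    def publish_result(step: int, values: dict) -> None:
        """Send one simulation result to the shared messaging channel."""
        connection = pika.BlockingConnection(pika.ConnectionParameters(host="broker"))
        channel = connection.channel()
        channel.queue_declare(queue="simulation.results", durable=True)
        channel.basic_publish(
            exchange="",
            routing_key="simulation.results",
            body=json.dumps({"step": step, "values": values}),
            properties=pika.BasicProperties(delivery_mode=2),  # persist messages
        )
        connection.close()

    if __name__ == "__main__":
        publish_result(step=1, values={"node_voltage_pu": 1.02})

In such a setup, other workflow components (or an Apache NiFi processor) would consume from the same channel, which is what decouples the individual simulation tools from one another.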