Transforming the Lindblad Equation into a System of Linear Equations: Performance Optimization and Parallelization of an Algorithm.

2020 
With their constantly increasing peak performance and memory capacity, modern supercomputers offer new perspectives on numerical studies of open many-body quantum systems. These systems are often modeled with Markovian quantum master equations describing the evolution of the system density operators. In this paper we address master equations of the Lindblad form, which are a popular theoretical tool in quantum optics, cavity quantum electrodynamics, and optomechanics. By using the generalized Gell-Mann matrices as a basis, any Lindblad equation can be transformed into a system of ordinary differential equations with real coefficients. This allows us to use standard high-performance parallel algorithms to integrate the equations and thus to emulate open quantum dynamics in a computationally efficient way. Recently we presented an implementation of the transform with computational complexity scaling as $O(N^5 \log N)$ for dense Lindbladians and $O(N^3 \log N)$ for sparse ones. However, infeasible memory costs remain a serious obstacle on the way to large models. Here we present a parallel cluster-based implementation of the algorithm and demonstrate that it allows us to integrate a sparse Lindbladian model of dimension $N=2000$ and a dense random Lindbladian model of dimension $N=200$ using $25$ nodes with $64$ GB RAM per node.
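The basis expansion at the heart of the abstract can be sketched in a few lines of Python. The sketch below is our own minimal illustration, not the paper's optimized parallel implementation: it builds the generalized Gell-Mann matrices for dimension $N$, expands $\rho = I/N + \sum_i v_i F_i$, and assembles a real matrix $A$ and vector $b$ so that the Lindblad equation becomes $\dot{v} = A v + b$. All function names are ours; the naive double loop over basis elements scales far worse than the paper's $O(N^5 \log N)$ algorithm and serves only to show why the resulting coefficients are real.

```python
import numpy as np

def gell_mann_basis(N):
    """Generalized Gell-Mann matrices: N^2 - 1 traceless Hermitian
    matrices, orthonormal up to tr(F_i F_j) = 2 * delta_ij."""
    basis = []
    for j in range(N):
        for k in range(j + 1, N):
            sym = np.zeros((N, N), dtype=complex)   # symmetric off-diagonal
            sym[j, k] = sym[k, j] = 1.0
            basis.append(sym)
            asym = np.zeros((N, N), dtype=complex)  # antisymmetric off-diagonal
            asym[j, k] = -1.0j
            asym[k, j] = 1.0j
            basis.append(asym)
    for l in range(1, N):                           # diagonal matrices
        d = np.zeros((N, N), dtype=complex)
        d[:l, :l] = np.eye(l)
        d[l, l] = -l
        basis.append(np.sqrt(2.0 / (l * (l + 1))) * d)
    return basis

def lindbladian_action(H, Ls, rho):
    """Right-hand side of the Lindblad equation applied to rho."""
    out = -1.0j * (H @ rho - rho @ H)
    for L in Ls:
        LdL = L.conj().T @ L
        out += L @ rho @ L.conj().T - 0.5 * (LdL @ rho + rho @ LdL)
    return out

def real_linear_system(H, Ls):
    """Expand rho = I/N + sum_i v_i F_i in the Gell-Mann basis and
    return real (A, b) such that the Lindblad equation reads dv/dt = A v + b."""
    N = H.shape[0]
    F = gell_mann_basis(N)
    M = len(F)
    A = np.zeros((M, M))
    b = np.zeros(M)
    for j in range(M):
        img = lindbladian_action(H, Ls, F[j])
        for i in range(M):
            # The Lindbladian maps Hermitian matrices to Hermitian ones,
            # so tr(F_i * img) is real and A has real entries.
            A[i, j] = np.trace(F[i] @ img).real / 2.0
    img0 = lindbladian_action(H, Ls, np.eye(N) / N)
    for i in range(M):
        b[i] = np.trace(F[i] @ img0).real / 2.0
    return A, b

# Example: two-level amplitude damping, jump operator L = |0><1|.
# The steady state solves A v = -b, giving rho_ss = |0><0|, v = (0, 0, 1/2).
A, b = real_linear_system(np.zeros((2, 2)),
                          [np.array([[0, 1], [0, 0]], dtype=complex)])
v_ss = np.linalg.solve(A, -b)
```

Once $(A, b)$ is assembled, any standard parallel ODE or linear-algebra routine can propagate $v(t)$, which is the point of the transform: the open quantum dynamics is reduced to a real linear system of dimension $N^2 - 1$.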