Classical benchmarking of Gaussian Boson Sampling on the Titan supercomputer

2020 
Gaussian Boson Sampling (GBS) is a model of photonic quantum computing in which single-mode squeezed states are sent through a linear-optical interferometer and measured using single-photon detectors. In this work, we employ a recent exact sampling algorithm for GBS with threshold detectors to perform classical simulations on the Titan supercomputer. We determine the time and memory resources, as well as the number of compute nodes, required to produce samples for different numbers of modes and detector clicks. It is possible to simulate a system with 800 optical modes postselected on outputs with 20 detector clicks, producing a single sample in roughly two hours using 40% of the available nodes of Titan. Additionally, we benchmark the performance of GBS when applied to dense subgraph identification, even in the presence of photon loss. We perform sampling for several graphs containing as many as 200 vertices. Our findings indicate that large losses can be tolerated and that threshold detectors are preferable to photon-number-resolving detectors postselected on collision-free outputs.
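
The threshold-detector GBS setup summarized above can be sketched in a few lines of code. The snippet below is a minimal, toy-scale illustration assuming the open-source Strawberry Fields library, whose Gaussian backend performs exact threshold sampling; it is not the distributed Titan implementation benchmarked in the paper, and the mode count, squeezing parameter, and random interferometer are illustrative choices.

    # Minimal sketch of GBS with threshold detectors (assumes strawberryfields).
    # Toy size only; the paper simulates up to 800 modes on Titan.
    import strawberryfields as sf
    from strawberryfields.ops import Sgate, Interferometer, MeasureThreshold
    from strawberryfields.utils import random_interferometer

    modes = 4                          # illustrative mode count
    r = 0.5                            # illustrative squeezing parameter
    U = random_interferometer(modes)   # Haar-random linear-optical unitary

    prog = sf.Program(modes)
    with prog.context as q:
        for mode in q:
            Sgate(r) | mode            # single-mode squeezed vacuum inputs
        Interferometer(U) | q          # linear-optical interferometer
        MeasureThreshold() | q         # threshold ("click") detectors

    eng = sf.Engine("gaussian")        # Gaussian-state simulator backend
    result = eng.run(prog, shots=1)
    print(result.samples)              # e.g. [[0 1 1 0]] -- one click pattern

Each sample is a binary click pattern over the modes; postselecting on patterns with a fixed number of clicks corresponds to the postselection on 20 detector clicks described in the abstract.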