Effect of correlated decay on fault-tolerant quantum computation

2017 
We analyze noise in the circuit model of quantum computers when the qubits are coupled to a common bosonic bath and discuss the possible failure of scalability of quantum computation. Specifically, we investigate correlated (superradiant) decay between the qubit energy levels for a two- or three-dimensional array of qubits, without imposing any restrictions on the size of the sample. We first show that, regardless of how the spacing between the qubits compares with the emission wavelength, correlated decay produces errors outside the applicability of the threshold theorem. This is because the sum of the norms of the two-body interaction Hamiltonians that decohere each qubit (which can be viewed as an upper bound on the single-qubit error) scales with the total number of qubits and is unbounded. We then discuss two related results: (1) We show that the actual error (not just the upper bound) on each qubit scales with the number of qubits. As a result, in the limit of a large number of qubits in the computer, $N \rightarrow \infty$, correlated decay causes each qubit in the computer to decohere on ever shorter time scales. (2) We find the complete eigenvalue spectrum of the exchange Hamiltonian that causes correlated decay in the same limit. We show that the spread of the eigenvalue distribution grows faster with $N$ than the spectrum of the unperturbed system Hamiltonian. As a result, as $N \rightarrow \infty$, quantum evolution becomes completely dominated by the noise due to correlated decay. These results suggest that scalable quantum computing may not be possible in the circuit model in a two- or three-dimensional geometry when the qubits are coupled to a common bosonic bath.
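The unboundedness claim can be illustrated numerically. The sketch below is not taken from the paper: it assumes, for illustration only, a far-field correlated-decay coefficient with a $|\gamma_{ij}| \sim |\mathrm{sinc}(k r_{ij})|$ envelope between qubits $i$ and $j$, and sums its magnitude over all partners of a central qubit in an $L \times L$ square lattice. Because $|\mathrm{sinc}(kr)|$ falls off only as $1/r$ while the number of partners at distance $r$ grows with $r$ in two dimensions, the sum keeps growing as the array is enlarged, which mirrors the abstract's point that the per-qubit error bound scales with the total number of qubits.

```python
import numpy as np

def norm_sum(L, k=1.0, spacing=0.5):
    """Sum of |sinc(k r_ij)| over all partners j of the central qubit i
    in an L x L square lattice with the given site spacing.

    The sinc envelope is an illustrative stand-in for the magnitude of
    the correlated-decay (exchange) coefficient gamma_ij; the paper's
    exact form is not reproduced here.
    """
    xs = np.arange(L) * spacing
    X, Y = np.meshgrid(xs, xs)
    cx = cy = xs[L // 2]                       # central qubit position
    r = np.sqrt((X - cx) ** 2 + (Y - cy) ** 2).ravel()
    r = r[r > 0]                               # exclude the qubit itself
    # np.sinc(x) = sin(pi x)/(pi x), so sinc(k r) = np.sinc(k r / pi)
    return float(np.sum(np.abs(np.sinc(k * r / np.pi))))

if __name__ == "__main__":
    for L in (11, 21, 41):
        print(L, norm_sum(L))  # grows monotonically with lattice size
```

The printed sums increase with $L$ without saturating, consistent with the claim that the norm sum is unbounded as the array size grows.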