Learning quantum models from quantum or classical data

2020 
In this paper, we address the problem of how to represent a classical data distribution in a quantum system. The proposed method is to learn a quantum Hamiltonian such that its ground state approximates the given classical distribution. We review previous work on the quantum Boltzmann machine (QBM) (Kieferova M and Wiebe N 2017 Phys. Rev. A 96 062327; Amin M H et al 2018 Phys. Rev. X 8 021050) and how it can be used to infer quantum Hamiltonians from quantum statistics. We then show how the proposed quantum learning formalism can also be applied to purely classical data analysis. Representing the data as a rank-one density matrix introduces quantum statistics for classical data in addition to the classical statistics. We show that quantum learning yields results that can be significantly more accurate than the classical maximum likelihood approach, both for unsupervised learning and for classification. The data density matrix and the QBM solution show entanglement, quantified by the quantum mutual information I. The classical mutual information in the data satisfies I_c ≤ I/2 = C, where C is the maximal classical correlation obtained by choosing a suitable orthogonal measurement basis. We suggest that the remaining mutual information Q = I/2 is obtained by non-orthogonal measurements that may violate the Bell inequality. The excess mutual information I − I_c may potentially be used to improve the performance of quantum implementations of machine learning or other statistical methods.
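To make the rank-one construction and the relation I_c ≤ I/2 = C concrete, below is a minimal NumPy sketch, not taken from the paper's code. It assumes the pure-state amplitudes are the square roots of an empirical joint distribution, ψ(x, y) = √p(x, y), and the two-bit example distribution is invented for illustration. It builds the data density matrix as a pure bipartite state, computes the quantum mutual information I from the reduced density matrices, and compares it with the classical mutual information I_c of the same distribution.

```python
# Sketch: rank-one data density matrix and quantum vs classical mutual information.
# Assumptions: amplitudes psi(x, y) = sqrt(p(x, y)); p is a hypothetical two-bit
# joint distribution chosen only for illustration.
import numpy as np


def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log2 rho), in bits."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-np.sum(evals * np.log2(evals)))


def shannon_entropy(p):
    """H(p) = -sum p log2 p, in bits."""
    p = p[p > 1e-12]
    return float(-np.sum(p * np.log2(p)))


# Example joint distribution p(x, y) over two bits (rows: x, columns: y).
p = np.array([[0.4, 0.1],
              [0.1, 0.4]])

# Rank-one data density matrix eta = |psi><psi| with psi(x, y) = sqrt(p(x, y)).
psi = np.sqrt(p)            # amplitude table, shape (2, 2)

# Reduced density matrices by partial trace over the other subsystem.
rho_x = psi @ psi.T         # trace out y
rho_y = psi.T @ psi         # trace out x

# Quantum mutual information I = S(rho_x) + S(rho_y) - S(eta); eta is pure,
# so S(eta) = 0 and I = 2 S(rho_x).
I = von_neumann_entropy(rho_x) + von_neumann_entropy(rho_y)

# Classical mutual information I_c of the underlying distribution p.
I_c = (shannon_entropy(p.sum(axis=1)) + shannon_entropy(p.sum(axis=0))
       - shannon_entropy(p.ravel()))

print(f"I   (quantum mutual information)   = {I:.4f} bits")
print(f"I/2 = C (max classical correlation) = {I / 2:.4f} bits")
print(f"I_c (classical mutual information)  = {I_c:.4f} bits  (I_c <= I/2)")
```

For this example the output shows I_c ≈ 0.28 bits against I/2 ≈ 0.47 bits, consistent with the bound I_c ≤ I/2 stated in the abstract; the excess I − I_c is the additional correlation accessible only through non-orthogonal measurements.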