Practical Advantage in Microwave Quantum Illumination

2020 
Broadly speaking, in quantum illumination we can say that a proposed protocol has a "quantum advantage" if it outperforms all possible classical protocols. In the optical domain of LIDAR, this is the most useful metric, as lasers can routinely produce nearly ideal classical states of light at room temperature (RT). This is not the case in the microwave domain of RADAR, where the photon energy is much less than the 300 K thermal energy, meaning that a real RT microwave source will always be contaminated by significant thermal noise. Thus, it is not clear whether it is technologically possible to produce an ideal classical microwave signal at RT. It is therefore interesting to ask if a microwave quantum illumination protocol can have a "practical advantage" over the best technologically feasible RT microwave source. In this paper, we aim to frame this question more precisely. As a concrete example, we present experimental results showing that, contrary to recent claims in the literature [1], an entangled microwave source amplified by a cryogenic HEMT amplifier fails to obtain any performance advantage over a simply constructed RT source and, in fact, performs significantly worse. We present a simple theory which explains the experimental results and which offers guidance on how a practical advantage might be achieved.
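The scale of the thermal-noise problem in the microwave band can be illustrated with the Bose-Einstein mean occupation number, n = 1/(exp(hf/kT) − 1). The sketch below, with representative frequencies chosen for illustration (7 GHz for microwave, 200 THz for near-infrared optical; neither is taken from the paper), shows why a 300 K microwave mode carries hundreds of thermal photons while an optical mode carries essentially none:

```python
import math

H = 6.62607015e-34   # Planck constant, J*s
KB = 1.380649e-23    # Boltzmann constant, J/K

def mean_thermal_photons(freq_hz: float, temp_k: float) -> float:
    """Bose-Einstein mean occupation: n = 1 / (exp(h f / k T) - 1)."""
    x = H * freq_hz / (KB * temp_k)
    # math.expm1 keeps precision when h f << k T (x near zero)
    return 1.0 / math.expm1(x)

# Microwave mode (illustrative 7 GHz) at room temperature: h f << k T
n_microwave = mean_thermal_photons(7e9, 300.0)

# Optical mode (illustrative 200 THz) at room temperature: h f >> k T
n_optical = mean_thermal_photons(200e12, 300.0)

print(f"microwave: ~{n_microwave:.0f} thermal photons per mode")
print(f"optical:   ~{n_optical:.1e} thermal photons per mode")
```

At these parameters the microwave mode holds roughly 900 thermal photons while the optical mode holds a number so small it is effectively zero, which is why an RT laser can approximate an ideal classical state but an RT microwave source cannot.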