Cross-domain Person Re-Identification with Identity-preserving Style Transfer

2021 
Although great successes have been achieved recently in person re-identification (re-ID), two major obstacles still restrict its real-world performance: a large variety of camera styles and a limited number of samples per identity. In this paper, we propose an efficient and scalable framework for cross-domain re-ID tasks. Single-model style transfer and pairwise comparison are seamlessly integrated in our framework through adversarial training. Moreover, we propose a novel identity-preserving loss to replace the content loss in style transfer and show mathematically that its minimization guarantees the generated images have the same conditional distributions (conditioned on identity) as the real ones, which is critical for cross-domain person re-ID. Our model achieves state-of-the-art results on challenging cross-domain re-ID tasks.
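The identity-preserving loss is described here only at the level of its guarantee. As a rough illustration, the sketch below shows one common way such a constraint can be realized in PyTorch: matching the identity features of a real image and its style-transferred counterpart under a frozen identity encoder. The name id_encoder and the feature-matching form are assumptions made for illustration, not the paper's exact formulation.

    # Hypothetical sketch of an identity-preserving loss for style transfer,
    # assuming a frozen identity-feature extractor `id_encoder` (e.g. a re-ID
    # backbone trained on the source domain). Names and loss form are
    # illustrative; the paper's exact loss is not reproduced here.
    import torch
    import torch.nn.functional as F

    def identity_preserving_loss(id_encoder, real_images, generated_images):
        """Penalize drift in identity features between a real image and its
        style-transferred counterpart."""
        with torch.no_grad():
            real_feats = id_encoder(real_images)   # features of the source-domain images
        gen_feats = id_encoder(generated_images)   # features after style transfer
        # mean squared difference between normalized identity embeddings
        real_feats = F.normalize(real_feats, dim=1)
        gen_feats = F.normalize(gen_feats, dim=1)
        return F.mse_loss(gen_feats, real_feats)

In a setup like this, the term would be added to the generator's objective alongside the adversarial loss, so that style transfer is penalized whenever it alters the features that an identity classifier relies on.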