Unsupervised Domain Adaptation for Person Re-identification via Individual-preserving and Environmental-switching Cyclic Generation

2021 
Unsupervised domain adaptation for person re-identification (Re-ID) suffers from severe domain discrepancies between source and target domains. To reduce the domain shift caused by changes in context, camera style, or viewpoint, existing methods fine-tune and adapt the Re-ID model with augmented samples, either by translating source samples to the target style or by assigning pseudo-labels to target samples. The former may lose identity details while retaining redundant source background during translation, while the latter may assign noisy labels when the model encounters unseen backgrounds and person poses. To address these challenges, we mitigate the domain shift in the translation direction by decoupling environment-related and identity-related features in a cyclic manner. We propose a novel individual-preserving and environmental-switching cyclic generation network (IPES-GAN). Our network has two distinctive features: 1) decoupled features instead of fused features: each image is encoded into an individual part and an environmental part, which prove beneficial to both generation and adaptation; 2) cyclic generation instead of one-step adaptive generation: source and target environment features are swapped to generate cross-domain images that preserve identity-related features conditioned on source (target) environment features, and then swapped again to regenerate the input images, so that cyclic generation runs in a self-supervised way. Experiments on two major benchmarks, Market-1501 and DukeMTMC-reID, demonstrate state-of-the-art performance.
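The swap-and-swap-back cycle described above can be sketched as follows. This is a minimal illustration only: the paper's encoders and generators are learned networks, whereas here `encode` and `decode` are hypothetical placeholders operating on labeled feature dictionaries, chosen purely to make the two-step cyclic logic explicit.

```python
# Hypothetical sketch of the IPES-GAN cyclic swap. The real model uses
# learned CNN encoders/decoders; here features are plain values so the
# swap-then-swap-back self-supervision can be traced end to end.

def encode(image):
    """Split an 'image' into (identity, environment) features.
    Placeholder: the image is already a dict holding both parts."""
    return image["identity"], image["environment"]

def decode(identity, environment):
    """Reassemble an 'image' from decoupled features (placeholder)."""
    return {"identity": identity, "environment": environment}

def cyclic_swap(src_img, tgt_img):
    """One cycle: swap environment features to produce cross-domain
    images, then swap again to reconstruct the original inputs."""
    src_id, src_env = encode(src_img)
    tgt_id, tgt_env = encode(tgt_img)

    # Step 1: identities preserved, environments switched.
    src_in_tgt_env = decode(src_id, tgt_env)
    tgt_in_src_env = decode(tgt_id, src_env)

    # Step 2: swap back -> should reconstruct the inputs, giving
    # a self-supervised cycle-consistency signal.
    a_id, a_env = encode(src_in_tgt_env)
    b_id, b_env = encode(tgt_in_src_env)
    src_rec = decode(a_id, b_env)
    tgt_rec = decode(b_id, a_env)
    return src_in_tgt_env, tgt_in_src_env, src_rec, tgt_rec
```

With ideal (lossless) encoders, the reconstructed images equal the inputs, which is exactly the condition a cycle-consistency loss would enforce on the learned generators.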