Asynchronous Teacher Guided Bit-wise Hard Mining for Online Hashing.
2021
Online hashing for streaming data has attracted increasing attention recently. However, most existing algorithms focus on batch inputs and instance-balanced optimization, which is limited in the single-datum input case and does not match the dynamic training of online hashing. Furthermore, constantly updating the online model with newly arriving samples inevitably leads to the catastrophic forgetting problem. In this paper, we propose a novel online hashing method, termed Asynchronous Teacher-Guided Bit-wise Hard Mining for Online Hashing, to handle the above issues jointly. First, to meet the needs of datum-wise online hashing, we design a novel binary codebook that is discriminative enough to separate different classes. Second, we propose a novel semantic loss, termed bit-wise attention loss, that dynamically focuses on the hard samples of each bit during training. Last but not least, we design an asynchronous knowledge distillation scheme to alleviate catastrophic forgetting, in which the teacher model is updated with a delay so that it retains old knowledge and guides the learning of the student model. Extensive experiments on two public benchmarks demonstrate the favorable performance of our method over the state-of-the-art.
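The abstract only names the three components (a per-class binary codebook, a bit-wise attention loss for hard mining, and a delayed teacher for distillation). Below is a minimal NumPy sketch of how a datum-wise variant of these ideas could fit together; every concrete choice here (the linear hash function, the exponential per-bit weights, the L2 distillation term, the refresh period of the teacher) is an assumption made for illustration and is not taken from the paper.

```python
# Minimal sketch of datum-wise online hashing with bit-wise hard mining and an
# asynchronously (delayed) updated teacher. All specifics are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n_bits, n_feat, n_cls = 16, 32, 4

# Hypothetical per-class binary codebook: one {-1, +1} codeword per class.
codebook = np.sign(rng.standard_normal((n_cls, n_bits)))
class_means = rng.standard_normal((n_cls, n_feat))  # toy class-dependent data

W_student = rng.standard_normal((n_feat, n_bits)) * 0.01
W_teacher = W_student.copy()   # teacher starts as a copy of the student
teacher_delay = 50             # teacher is refreshed only every 50 steps

def hash_logits(W, x):
    return x @ W               # real-valued bit logits; sign() would give the code

def step(W_student, W_teacher, x, y, t, lr=0.05, lam=0.5):
    z = hash_logits(W_student, x)        # (n_bits,)
    target = codebook[y]                 # (n_bits,) in {-1, +1}

    # Bit-wise hard mining: bits whose logit disagrees with (or is close to)
    # the class codeword receive larger attention weights.
    margin = z * target                  # large positive value = easy bit
    attn = np.exp(-margin)
    attn /= attn.sum()

    # Per-bit weighted semantic gradient (hinge-like push toward the codeword).
    grad_sem = np.outer(x, -attn * target * (margin < 1.0))

    # Asynchronous distillation: pull student logits toward the stale teacher's,
    # so knowledge learned from earlier samples is not simply overwritten.
    z_t = hash_logits(W_teacher, x)
    grad_kd = np.outer(x, z - z_t) / n_bits

    W_student -= lr * (grad_sem + lam * grad_kd)

    # Delayed (asynchronous) teacher update: copy the student only periodically.
    if (t + 1) % teacher_delay == 0:
        W_teacher[...] = W_student
    return W_student, W_teacher

# Streaming (datum-wise) training loop over single samples.
for t in range(500):
    y = int(rng.integers(n_cls))
    x = class_means[y] + 0.3 * rng.standard_normal(n_feat)
    W_student, W_teacher = step(W_student, W_teacher, x, y, t)
```

In this sketch the delayed copy plays the role of the "old knowledge" the abstract refers to: because the teacher lags the student, the distillation term resists drift toward only the most recent samples.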