Dual Distance Optimized Deep Quantization With Semantics-Preserving

2022 
Recently, quantization has become an effective technique for large-scale image retrieval, encoding feature vectors into compact codes. However, improving the discriminative capability of codewords while minimizing the quantization error remains a great challenge. This letter proposes Dual Distance Optimized Deep Quantization (D²ODQ) to address this issue by minimizing the Euclidean distance between samples and codewords while maximizing the minimum cosine distance between codewords. To generate an evenly distributed codebook, we derive the general solution for the upper bound of the minimum cosine distance between codewords. Moreover, a scalar-constrained semantics-preserving loss is introduced to avoid trivial quantization boundaries and to ensure that each codeword quantizes features of only one category. Compared with state-of-the-art methods, our method achieves better performance on three benchmark datasets.
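The dual-distance idea in the abstract can be illustrated with a minimal sketch: penalize the squared Euclidean distance from each sample to its nearest codeword, and additionally penalize the largest pairwise cosine similarity between codewords (i.e., encourage a large minimum cosine distance). The weighting factor `lam` and the exact form of the penalty are illustrative assumptions, not the paper's actual formulation:

```python
import numpy as np

def dual_distance_loss(features, codebook, lam=0.1):
    """Sketch of a dual-distance objective (illustrative, not the paper's exact loss).

    features: (n, d) array of feature vectors
    codebook: (k, d) array of codewords
    """
    # Quantization term: squared Euclidean distance to the nearest codeword.
    dists = np.linalg.norm(features[:, None, :] - codebook[None, :, :], axis=2)
    nearest = dists.argmin(axis=1)
    quant_err = (dists[np.arange(len(features)), nearest] ** 2).mean()

    # Separation term: the largest cosine similarity between distinct codewords.
    # Minimizing it maximizes the minimum cosine distance between codewords.
    unit = codebook / np.linalg.norm(codebook, axis=1, keepdims=True)
    sim = unit @ unit.T
    np.fill_diagonal(sim, -1.0)  # ignore self-similarity
    max_sim = sim.max()

    return quant_err + lam * max_sim

# Usage: an orthogonal codebook that matches the features gives zero loss,
# while a nearly collinear codebook is penalized on both terms.
features = np.array([[1.0, 0.0], [0.0, 1.0]])
good = np.array([[1.0, 0.0], [0.0, 1.0]])
bad = np.array([[1.0, 0.0], [1.0, 0.01]])
print(dual_distance_loss(features, good), dual_distance_loss(features, bad))
```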