Simplified TinyBERT: Knowledge Distillation for Document Retrieval

2020 
Despite the effectiveness of utilizing BERT for document ranking, the computational cost of such approaches is non-negligible compared to other retrieval methods. To this end, this paper first empirically investigates the application of knowledge distillation models to the document ranking task. In addition, on top of the recent TinyBERT, two simplifications are proposed. Evaluation on the MS MARCO document re-ranking task confirms the effectiveness of the proposed simplifications.
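As context for the distillation approach the abstract refers to, the sketch below illustrates the standard knowledge-distillation objective: a weighted sum of a soft-target term (KL divergence between temperature-softened teacher and student distributions) and a hard-label cross-entropy term. This is a generic illustration of the technique, not the paper's specific TinyBERT simplifications; the function names, temperature, and weighting are illustrative assumptions.

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax; a higher temperature flattens the
    # distribution, exposing the teacher's "dark knowledge" about
    # relative class similarities.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, label,
                      temperature=2.0, alpha=0.5):
    """Generic KD loss: alpha-weighted soft-target KL term plus
    (1 - alpha)-weighted hard-label cross-entropy term.
    (Illustrative; hyperparameters here are assumptions.)"""
    p_teacher = softmax(teacher_logits, temperature)
    p_student = softmax(student_logits, temperature)
    # KL(teacher || student); the T^2 factor keeps gradient magnitudes
    # comparable across temperatures, as in standard KD practice.
    soft = temperature ** 2 * sum(
        pt * math.log(pt / ps)
        for pt, ps in zip(p_teacher, p_student) if pt > 0
    )
    # Ordinary cross-entropy against the ground-truth label (T = 1).
    hard = -math.log(softmax(student_logits)[label])
    return alpha * soft + (1 - alpha) * hard
```

In a re-ranking setting, the logits would be relevance scores for a query-document pair produced by the teacher (full BERT) and the student (a smaller model such as TinyBERT), and the loss would be averaged over training pairs.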