A Neural Ranker for Open-Domain Question Answering via Sentence-Level Jump Encoding

2021 
Open-domain question answering extracts the answer to a question from a large-scale corpus and typically employs a ranker to filter out irrelevant paragraphs. Compared with traditional retrieval approaches based on statistical features, neural ranking models have gradually become a popular choice for filtering because of their capacity to extract semantic features. However, a massive corpus places high demands on filtering efficiency, so accelerating textual encoding with neural models deserves attention, and advances have been made from many perspectives. Inspired by human reading habits, this paper presents a novel neural ranker with sentence-level jump encoding that skips irrelevant sentences better than existing neural models. When people browse text with a question in mind, they tend to locate the key sentence using heuristic cues from the question and what they have read so far. Our sentence-level jump encoder works in a similar way, efficiently and accurately locating the sentences that matter for the question. We evaluate our neural ranker on the long-paragraph setting of the Quasar-T dataset; our method achieves significant improvements in both ranking speed and ranking accuracy, and surpasses other question answering baselines on the end-to-end question answering results.
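To make the jump-encoding idea concrete, below is a minimal, purely illustrative PyTorch sketch: a cheap glimpse of each sentence, combined with the question encoding and a running paragraph state, feeds a gate that decides whether to fully encode the sentence or jump over it. All names and sizes here (JumpRanker, glimpse_proj, skip_threshold, the GRU dimensions) are assumptions for illustration, not the paper's actual architecture; a trainable version would also replace the hard skip decision with a policy-gradient or Gumbel-softmax gate, since the threshold test below is not differentiable.

```python
import torch
import torch.nn as nn

class JumpRanker(nn.Module):
    """Illustrative sentence-level jump ranker (not the paper's exact model).

    A gate conditioned on the question vector, the running paragraph
    state, and a cheap bag-of-words glimpse of the next sentence decides
    whether that sentence is worth a full GRU encoding pass.
    """

    def __init__(self, vocab_size, emb_dim=128, hid_dim=256, skip_threshold=0.5):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.q_enc = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.s_enc = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.glimpse_proj = nn.Linear(emb_dim, hid_dim)  # cheap sentence summary
        # Gate: [question; running state; glimpse] -> probability of keeping.
        self.gate = nn.Sequential(nn.Linear(3 * hid_dim, 1), nn.Sigmoid())
        self.scorer = nn.Linear(2 * hid_dim, 1)
        self.skip_threshold = skip_threshold

    def forward(self, question, sentences):
        # question: (1, Lq) token ids; sentences: list of (1, Ls) token ids.
        _, q_state = self.q_enc(self.embed(question))          # (1, 1, H)
        q_vec = q_state.squeeze(0)                             # (1, H)
        run_state = torch.zeros_like(q_vec)                    # running paragraph state
        kept = []
        for sent in sentences:
            emb = self.embed(sent)                             # (1, Ls, E)
            glimpse = self.glimpse_proj(emb.mean(dim=1))       # cheap look, no GRU pass
            keep_p = self.gate(torch.cat([q_vec, run_state, glimpse], dim=-1))
            if keep_p.item() < self.skip_threshold:
                continue                                       # jump: skip full encoding
            _, s_state = self.s_enc(emb, run_state.unsqueeze(0))
            run_state = s_state.squeeze(0)                     # update reading context
            kept.append(run_state)
        if not kept:                                           # degenerate case: all skipped
            kept.append(run_state)
        para_vec = torch.stack(kept).mean(dim=0)               # (1, H)
        return self.scorer(torch.cat([q_vec, para_vec], dim=-1))  # relevance score

# Usage sketch with random token ids:
ranker = JumpRanker(vocab_size=10000)
q = torch.randint(0, 10000, (1, 8))
sents = [torch.randint(0, 10000, (1, 12)) for _ in range(5)]
score = ranker(q, sents)  # higher score -> paragraph ranked as more relevant
```

The efficiency gain in this sketch comes from the gate: skipped sentences cost only an embedding lookup and one linear layer instead of a full recurrent pass, mirroring how a reader glances at a sentence before deciding whether to read it closely.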