A DBpedia-based Benchmark for Ontology-mediated Query Answering

2021 
Ontology-mediated query answering (OMQA) is a framework for querying data with a background ontology. Detailed evaluation of OMQA systems remains a challenge due to limitations in existing benchmarks. In this paper, we propose a new benchmark for OMQA based on natural language questions over DBpedia. In particular, the data are sampled from DBpedia with adjustable volumes and can easily reach a scale that is difficult for existing OMQA systems to handle. Logical rules are automatically extracted from DBpedia using a rule learner, and the queries come from real-life natural language questions over DBpedia. We evaluated two state-of-the-art systems under various settings to demonstrate the potential of our benchmark for benchmarking and analyzing the behavior of OMQA systems.
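The OMQA setting summarized above can be illustrated with a toy materialization sketch: facts are saturated under ontology rules before an atomic query is answered. The facts, rule, and query below are hypothetical examples for illustration only, not taken from the DBpedia benchmark or the evaluated systems.

```python
# Illustrative OMQA sketch via forward-chaining saturation (naive chase).
# All data and rules here are hypothetical examples.

# Data: (subject, predicate, object) triples.
facts = {
    ("Berlin", "type", "City"),
    ("Paris", "type", "City"),
}

# Ontology rule (hypothetical axiom): every City is a PopulatedPlace.
# Encoded as ((body_predicate, body_object), (head_predicate, head_object)).
rules = [
    (("type", "City"), ("type", "PopulatedPlace")),
]

def saturate(facts, rules):
    """Apply rules until no new facts are derived."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        new = set()
        for (bp, bo), (hp, ho) in rules:
            for (s, p, o) in facts:
                if p == bp and o == bo and (s, hp, ho) not in facts:
                    new.add((s, hp, ho))
        if new:
            facts |= new
            changed = True
    return facts

def answer(facts, predicate, obj):
    """Atomic query: all subjects s such that (s, predicate, obj) holds."""
    return sorted(s for (s, p, o) in facts if p == predicate and o == obj)

closed = saturate(facts, rules)
print(answer(closed, "type", "PopulatedPlace"))  # ['Berlin', 'Paris']
```

Real OMQA systems instead use techniques such as query rewriting or combined approaches, and scale is exactly where they diverge, which is what the benchmark's adjustable data volumes are designed to probe.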