Distributed Stream Consistency Checking

2018 
Dealing with noisy data is one of the major issues in stream processing. While noise has been widely studied in settings where streams have simple schemas, e.g., time series, few solutions have focused on streams characterized by complex data structures. This paper studies how to check consistency over large amounts of complex streams. Our proposed methods exploit reasoning to assess whether portions of the streams comply with a reference conceptual model. To achieve scalability, our methods run on state-of-the-art distributed stream processing platforms, e.g., Apache Storm or Twitter Heron. Our first method computes the closure of Negative Inclusions (NIs) for DL-Lite ontologies and registers the NIs as queries. The second method compiles the ontology into a processing pipeline so as to evenly distribute the workload. Experiments compare the two methods and show that the second one improves throughput by up to 139% with the LUBM ontology and 330% with the NPD ontology.
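
The first method rests on a standard DL-Lite property: a dataset is consistent iff no NI in the closure of the ontology is violated, so each closed NI can be registered as a standing query over the stream. The Java sketch below is illustrative only, not the authors' implementation: it assumes atomic concepts represented as strings, positive inclusions indexed by superclass, and symmetric NIs as unordered pairs; roles and the platform-specific query registration are omitted, and all identifiers (NiClosure, Ni, close) are hypothetical.

import java.util.*;

/** Illustrative NI-closure for DL-Lite restricted to atomic concepts.
 *  PIs are "sub subClassOf sup"; an NI "B disjointWith C" models B ⊑ ¬C. */
public class NiClosure {

    /** Unordered pair of concept names (B ⊑ ¬C is equivalent to C ⊑ ¬B). */
    record Ni(String a, String b) {
        static Ni of(String x, String y) {
            return x.compareTo(y) <= 0 ? new Ni(x, y) : new Ni(y, x);
        }
    }

    /** Fixpoint: if sub ⊑ A (PI) and A ⊑ ¬C (NI), derive sub ⊑ ¬C. */
    static Set<Ni> close(Map<String, Set<String>> subsOf, Set<Ni> nis) {
        Set<Ni> closed = new HashSet<>(nis);
        boolean changed = true;
        while (changed) {
            changed = false;
            // iterate over a snapshot so we can grow the set safely
            for (Ni ni : new ArrayList<>(closed)) {
                for (String sub : subsOf.getOrDefault(ni.a(), Set.of()))
                    changed |= closed.add(Ni.of(sub, ni.b()));
                for (String sub : subsOf.getOrDefault(ni.b(), Set.of()))
                    changed |= closed.add(Ni.of(ni.a(), sub));
            }
        }
        return closed;
    }

    public static void main(String[] args) {
        // PI: GraduateStudent ⊑ Student (indexed by the superclass)
        Map<String, Set<String>> subsOf =
            Map.of("Student", Set.of("GraduateStudent"));
        // NI: Student ⊑ ¬Professor
        Set<Ni> nis = Set.of(Ni.of("Professor", "Student"));
        // The closure also yields GraduateStudent ⊑ ¬Professor; each closed
        // NI would then be registered as a consistency query on the stream.
        close(subsOf, nis).forEach(System.out::println);
    }
}

Each NI in the closure translates into a simple membership test per stream element (does an individual instantiate both disjoint concepts?), which is what makes the query-registration approach compatible with distributed engines such as Storm or Heron.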