A linear filtering theory-based approach for load shedding
2013
A Data Stream Management System (DSMS) allows applications to query data streams by specifying continuous queries (CQs). Unlike a traditional query in a Database Management System (DBMS), each CQ in a DSMS has to fulfill Quality of Service (QoS) requirements, such as tuple latency. For a CQ to meet this quality parameter when the system is overloaded, it is necessary to discard some tuples, i.e., to perform load shedding. However, this is not an easy task since, as reported in the literature, it is essential to know when and how to adjust the quality of CQs at runtime and how many tuples must be dropped. Any dynamic system is subject to internal and external conditions that modify its operation and control, which requires the system to be observable and controllable. In this paper we present a modern control theory-based approach that deals with some of these load-shedding issues in DSMSs. The results are based on a state-space representation, described by a discrete stochastic estimator with noise characterization, having linear complexity.
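The abstract does not include code; the following is a minimal sketch, under stated assumptions, of the kind of discrete stochastic (linear filtering) estimator it describes: a scalar Kalman-style filter tracks the noisy tuple arrival rate of a CQ, and the estimate drives a simple shed-fraction decision. The names `RateEstimator`, `shed_fraction`, `service_capacity`, and the noise variances are illustrative assumptions, not the authors' model.

```python
# Illustrative sketch (not the paper's implementation): a scalar discrete
# Kalman filter estimates a CQ's tuple arrival rate from noisy per-interval
# counts; the estimate then drives a simple load-shedding decision.

class RateEstimator:
    def __init__(self, q=1.0, r=25.0, x0=0.0, p0=1e3):
        self.q = q      # process-noise variance (rate drift between intervals), assumed
        self.r = r      # measurement-noise variance (counting noise), assumed
        self.x = x0     # estimated arrival rate (tuples per interval)
        self.p = p0     # estimate covariance

    def update(self, measured_rate):
        # Predict step: random-walk state model, x_k = x_{k-1} + w_k
        p_pred = self.p + self.q
        # Correct step: measurement z_k = x_k + v_k
        k_gain = p_pred / (p_pred + self.r)
        self.x = self.x + k_gain * (measured_rate - self.x)
        self.p = (1.0 - k_gain) * p_pred
        return self.x


def shed_fraction(estimated_rate, service_capacity):
    """Fraction of tuples to drop so the estimated load fits the capacity."""
    if estimated_rate <= service_capacity:
        return 0.0
    return 1.0 - service_capacity / estimated_rate


if __name__ == "__main__":
    est = RateEstimator()
    service_capacity = 800.0                   # tuples processable per interval (assumed)
    for count in [700, 950, 1200, 1100, 600]:  # noisy per-interval tuple counts
        rate = est.update(count)
        print(f"rate~{rate:7.1f}  drop {shed_fraction(rate, service_capacity):.0%}")
```

Each filter update costs a constant number of arithmetic operations, which is consistent with the linear complexity claimed in the abstract.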