Incremental Calibration of Architectural Performance Models with Parametric Dependencies

2020 
Architecture-based Performance Prediction (AbPP) allows evaluating the performance of systems and answering what-if questions without measuring all alternatives. A difficulty when creating models is that Performance Model Parameters (PMPs, such as resource demands, loop iteration numbers, and branch probabilities) depend on various influencing factors like input data, the hardware used, and the applied workload. To enable a broad range of what-if questions, Performance Models (PMs) need predictive power beyond what has been measured to calibrate them. Thus, PMPs need to be parametrized over the influencing factors that may vary. Existing approaches estimate the parametrized PMPs by measuring the complete system; they are therefore too costly to be applied frequently, e.g., after each code change. Moreover, they do not preserve manual changes to the model when recalibrating. In this work, we present the Continuous Integration of Performance Models (CIPM), which incrementally extracts and calibrates the performance model, including parametric dependencies. CIPM responds to source code changes by updating the PM and adaptively instrumenting the changed parts. To enable AbPP, CIPM estimates the parametrized PMPs using measurements (generated by performance tests or by executing the system in production) and statistical analysis, e.g., regression analysis and decision trees. Additionally, our approach responds to production changes (e.g., load or deployment changes) and calibrates the usage and deployment parts of the PM accordingly. For the evaluation, we used two case studies. The results show that we were able to calibrate the PM incrementally and accurately.
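As a minimal illustrative sketch (not the authors' implementation), a parametric dependency such as the CPU demand of a service action as a function of an input-data characteristic could be learned from monitoring measurements with a decision-tree regressor, as mentioned in the abstract; the data, variable names, and model settings below are hypothetical assumptions for illustration only.

```python
# Sketch: learning a parametrized Performance Model Parameter (PMP) from
# measurements, here CPU demand as a function of input size (hypothetical data).
import numpy as np
from sklearn.tree import DecisionTreeRegressor

# Hypothetical monitoring data: input size (influencing factor) vs. measured
# resource demand of one service action, in CPU-milliseconds.
input_size = np.array([[10], [50], [100], [200], [400], [800]])
cpu_demand_ms = np.array([2.1, 9.8, 20.5, 41.0, 83.2, 160.4])

# Fit a decision tree that captures how the PMP depends on the factor.
model = DecisionTreeRegressor(max_depth=3).fit(input_size, cpu_demand_ms)

# The fitted estimator can then parametrize the PMP in the performance model,
# supporting predictions for input sizes not measured directly.
print(model.predict([[300]]))
```

Simple regression models (e.g., linear regression) could equally be fitted on the same data when the dependency is expected to extrapolate beyond the measured range.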