In this paper we address the issue of change after deployment in safety-critical embedded system applications. Our goal is to substitute lab-based verification with in-field formal analysis to determine whether an update may be safely applied. This is challenging because it requires an automated process capable of handling multiple viewpoints, such as functional correctness and timing. For this purpose, we propose an original methodology for contract-based negotiation of software updates. The use of contracts allows us to cleanly split the verification effort between the lab and the field. In addition, we show how to rely on existing viewpoint-specific methods for update negotiation. We illustrate our approach on a concrete example inspired by the automotive domain.
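To make the contract-based negotiation idea concrete, the following minimal sketch models contracts as assume/guarantee pairs and accepts an update only if, per viewpoint, its contract refines the deployed one. All names (Contract, refines, the example properties) are illustrative assumptions, not the paper's actual formalism or API.

```python
# Minimal sketch of assume/guarantee contracts for in-field update negotiation.
# Names and the set-based refinement check are illustrative assumptions.
from dataclasses import dataclass


@dataclass(frozen=True)
class Contract:
    assumptions: frozenset  # properties the component relies on
    guarantees: frozenset   # properties the component promises


def refines(new: Contract, old: Contract) -> bool:
    """A new contract refines the old one if it assumes no more
    and guarantees no less (a common set-based approximation)."""
    return new.assumptions <= old.assumptions and new.guarantees >= old.guarantees


# Per-viewpoint contracts of the deployed software and of a candidate update.
deployed = {
    "functional": Contract(frozenset({"valid_input"}), frozenset({"correct_output"})),
    "timing":     Contract(frozenset({"period_10ms"}), frozenset({"wcet_2ms"})),
}
update = {
    "functional": Contract(frozenset({"valid_input"}), frozenset({"correct_output"})),
    "timing":     Contract(frozenset({"period_10ms"}), frozenset({"wcet_2ms", "jitter_bound"})),
}

# Accept the update only if every viewpoint-specific contract is refined.
accepted = all(refines(update[v], deployed[v]) for v in deployed)
print("update accepted:", accepted)  # True: no stronger assumptions, no weaker guarantees
```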
Manufacturing systems and their control software exhibit a large number of variants, which evolve over time to meet changing functional and non-functional requirements. To handle the resulting complexity, we propose a multi-perspective modeling approach with different viewpoints on workflow, architecture, and component behavior. We combine it with delta modeling to seamlessly capture variability and evolution with the same means across all viewpoints. We show how the separation into different viewpoints enables early performance analysis as well as code generation. The approach is illustrated using a case study.
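The following toy sketch illustrates the delta-modeling principle of deriving both variants and evolution steps by applying delta operations to a core model; the model structure and operation names are invented for illustration and do not reflect the approach's concrete metamodel.

```python
# Toy sketch of delta modeling: a variant (or a later evolution step) is
# derived by applying delta operations to a core model. Names are assumptions.
core = {"components": {"conveyor", "drill"}, "connections": {("conveyor", "drill")}}


def apply_delta(model, delta):
    """Apply add/remove operations of a delta to a copy of the model."""
    m = {"components": set(model["components"]), "connections": set(model["connections"])}
    m["components"] |= delta.get("add_components", set())
    m["components"] -= delta.get("remove_components", set())
    m["connections"] |= delta.get("add_connections", set())
    m["connections"] -= delta.get("remove_connections", set())
    return m


# The same mechanism captures variability (a product variant) ...
variant_delta = {"add_components": {"laser"}, "add_connections": {("drill", "laser")}}
# ... and evolution (a later revision that retires the drill).
evolution_delta = {
    "remove_components": {"drill"},
    "remove_connections": {("conveyor", "drill"), ("drill", "laser")},
}

variant = apply_delta(core, variant_delta)
revised = apply_delta(variant, evolution_delta)
print(sorted(revised["components"]))  # ['conveyor', 'laser']
```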
Software variants of a Software Product Line (SPL) consist of a set of artifacts specified by features. Variability models document the valid relationships between features and their mapping to artifacts. However, research has shown inconsistencies between the variability expressed in features and the variability realized in artifacts, with negative effects on system safety and development effort. To analyze this mismatch in variability, the causal relationships between features, artifacts, and variants must be uncovered, which has only been addressed to a limited extent. In this paper, we propose taxonomy graphs as novel variability models that reflect the composition of variants from artifacts and features, making mismatches in variability explicit. Our evaluation with two SPL case studies demonstrates the usefulness of our variability model and shows that mismatches in variability can vary significantly in detail and severity.
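As a simplified illustration of the kind of feature/artifact mismatch the taxonomy graphs make explicit, the sketch below flags artifacts that occur in a variant although no selected feature maps to them. The relation names and the mismatch criterion are simplifying assumptions, not the paper's taxonomy-graph construction.

```python
# Simplified check for feature/artifact variability mismatches.
# Data layout and criterion are illustrative assumptions.
variants = {
    "V1": {"features": {"Base", "Cooling"}, "artifacts": {"core.c", "cool.c"}},
    "V2": {"features": {"Base"},            "artifacts": {"core.c", "cool.c"}},  # cool.c without Cooling
}
feature_to_artifacts = {"Base": {"core.c"}, "Cooling": {"cool.c"}}


def mismatches(variants, mapping):
    """Report artifacts present in a variant although no selected feature maps to them."""
    issues = []
    for name, v in variants.items():
        expected = set().union(*(mapping[f] for f in v["features"]))
        for unexpected in v["artifacts"] - expected:
            issues.append((name, unexpected))
    return issues


print(mismatches(variants, feature_to_artifacts))  # [('V2', 'cool.c')]
```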
The increasing demand for highly customizable manufacturing systems leads to an extreme number of possible machine variants. Feature models are often used to manage this system diversity. However, the development and maintenance of feature models are error-prone and time-consuming tasks, especially for industrial-size models with thousands of features. In many cases, engineers might want to focus only on a few features relevant to their own domain. Additionally, each change may introduce anomalies into the feature model. In this paper, we present an approach that provides engineering support through user-friendly explanations of hidden dependencies and anomalies in feature models.
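To illustrate one typical anomaly such an explanation facility must point out, the sketch below detects a dead feature in a deliberately over-constrained toy model by enumerating its configurations; an industrial tool would use a SAT or SMT solver, and the model and constraints here are invented examples.

```python
# Toy detection of a feature-model anomaly (a dead feature) by enumeration.
# The feature model and its constraints are invented for illustration.
from itertools import product

features = ["Drill", "HighSpeed", "Cooling"]


def valid(cfg):
    # Cross-tree constraints of the toy model (over-constrained on purpose):
    if cfg["HighSpeed"] and not cfg["Cooling"]:  # HighSpeed requires Cooling
        return False
    if cfg["Cooling"] and cfg["Drill"]:          # Cooling excludes Drill
        return False
    if cfg["HighSpeed"] and not cfg["Drill"]:    # HighSpeed is a child of Drill
        return False
    return True


configs = [dict(zip(features, bits)) for bits in product([False, True], repeat=len(features))]
valid_configs = [c for c in configs if valid(c)]

# A feature is "dead" if it is selected in no valid configuration.
dead = [f for f in features if not any(c[f] for c in valid_configs)]
print("dead features:", dead)  # ['HighSpeed']: it needs Drill and Cooling, which exclude each other
```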
Automated production systems (aPS) are variant-rich, design-to-order systems, and an increasing proportion of their functionality is implemented in control software. In control software development, reuse is still commonly performed via clone-and-own despite many drawbacks, e.g., copying errors. This unplanned reuse leads to a large number of historically grown software variants, which contain valuable domain expertise. Therefore, to enable planned reuse of existing control software solutions, an analysis of legacy software, including documentation of the identified variability, is required. While so-called Software Product Lines enable the documentation of variability, they lack variability visualizations tailored to the needs of aPS stakeholders such as application or module developers. To address this gap, this paper introduces a variability visualization concept tailored to the needs of aPS stakeholders with the aim of supporting them in their daily tasks. The concept was evaluated successfully in a master's course using a prototypical implementation of the visualization concept.
Model-based languages such as MATLAB/Simulink are crucial for the development of embedded software systems. To adapt to changing requirements, engineers commonly copy and modify existing systems to create new variants. Commonly referred to as clone-and-own, this reuse strategy is easy to apply and beneficial in the short term, but it entails severe maintenance and consistency issues in the long term, leading to a large number of redundant and similar assets. Moreover, a later transition towards structured reuse, such as with software product lines, inevitably requires the comparison of all existing variants prior to the actual migration. However, current work mostly revolves around the comparison of only two systems, and although approaches that can cope with more have been proposed, they are not applicable to embedded software systems such as MATLAB/Simulink. In this paper, we bridge this gap and propose Static Connectivity Matrix Analysis (SCMA), a novel comparison procedure that allows for the evaluation of multiple MATLAB/Simulink model variants at once. In particular, we transform models into a matrix form which is used to compare all models and to identify all similar structures between them, even when model parts have been completely relocated during clone-and-own. We allow engineers to tailor the results and to focus on any arbitrary variant subset, enabling individual reasoning prior to migration. We provide a feasibility study from the automotive domain, showing our matrix representation to be suitable and our technique to be fast and precise.
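The sketch below conveys the underlying idea of comparing several variants at once via connectivity matrices: each model becomes a block-by-block matrix of signal connections, and structure shared by all variants is read off position-independently. It is a toy stand-in with invented block names, not the SCMA procedure itself.

```python
# Toy illustration of comparing model variants via connectivity matrices.
# Block names and the intersection criterion are illustrative assumptions.
def connectivity(connections, blocks):
    """Dict-of-dicts connectivity matrix: matrix[src][dst] = 1 if a signal runs src -> dst."""
    m = {s: {d: 0 for d in blocks} for s in blocks}
    for src, dst in connections:
        m[src][dst] = 1
    return m


# Three clone-and-own variants of the same controller, described by their
# block-to-block signal connections rather than by diagram layout.
variant_a = [("Inport", "Gain"), ("Gain", "Sum"), ("Sum", "Outport")]
variant_b = [("Inport", "Gain"), ("Gain", "Sum"), ("Saturation", "Sum"), ("Sum", "Outport")]
variant_c = [("Inport", "Gain"), ("Gain", "Sum"), ("Sum", "Outport"), ("Sum", "Scope")]

blocks = {"Inport", "Gain", "Saturation", "Sum", "Scope", "Outport"}
matrices = [connectivity(v, blocks) for v in (variant_a, variant_b, variant_c)]

# Element-wise intersection of the matrices yields the structure shared by all
# selected variants, regardless of where the blocks sit in each diagram.
common = {(s, d) for s in blocks for d in blocks if all(m[s][d] for m in matrices)}
print(sorted(common))
# [('Gain', 'Sum'), ('Inport', 'Gain'), ('Sum', 'Outport')]: a candidate reusable core
```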