According to population biologist Richard Levins, every discipline has a “strategy of model building,” which involves implicit assumptions about epistemic goals and about the types of abstractions and modeling approaches used. We offer suggestions about how to model complex systems based on a strategy that focuses on independence in modeling. While there are many possible and desirable modeling strategies, we contrast a model-independence-focused strategy with the more common strategy of adding ever greater levels of detail to a single model. Levins calls the latter a ‘brute force’ strategy of modeling, which can encounter problems as it accumulates detail in pursuit of predictive precision. In contrast, a model-independence-focused strategy, which we call a ‘pluralistic strategy,’ draws on Levins’s use of an assemblage of multiple, simple, and, critically, independent models of ecological systems to carry out predictive and explanatory analysis. We use the example of model analysis of levee failure during Hurricane Katrina to show what a pluralistic strategy looks like in engineering. Depending on one’s strategy, one can deliberately engineer the set of available models so as to have more independent and complementary models that are more likely to be accurate. We offer advice on ways of making models independent, as well as a set of epistemic goals for model development that different models can emphasize.
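As a rough numerical illustration of why independence matters, consider the Python sketch below (our construction, not from the paper; the model count, error scales, and correlation weight are arbitrary assumptions). Averaging several models whose errors are independent yields a noticeably smaller ensemble error than averaging equally accurate models whose errors share a common component.

import random

TRUTH = 10.0                     # quantity the models try to predict
N_MODELS, N_TRIALS = 5, 20000

def mean_abs_error(rho):
    """Ensemble error when each model's unit-variance error shares a common
    component with weight rho (rho=0: independent; rho near 1: shared)."""
    total = 0.0
    for _ in range(N_TRIALS):
        shared = random.gauss(0, 1)                  # error common to all models
        preds = [TRUTH + rho * shared
                 + (1 - rho ** 2) ** 0.5 * random.gauss(0, 1)
                 for _ in range(N_MODELS)]
        total += abs(sum(preds) / N_MODELS - TRUTH)  # error of the ensemble average
    return total / N_TRIALS

print("independent models:", round(mean_abs_error(0.0), 3))   # roughly 0.36
print("correlated models: ", round(mean_abs_error(0.9), 3))   # roughly 0.73

Each individual model has the same error variance in both cases; only the independence of the errors differs, which is the point of the pluralistic strategy's emphasis on engineering independence into the model set.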
Abstract

Teaching the Fundamentals of Systems Engineering through Various Interactive Group Activities
Madeleine Brannon, Dr. Thomas Mazzuchi, Dr. Zoe Szajnfarber
George Washington University, Washington, DC
Engineering Management and Systems Engineering Department
mbrannon@gwu.edu

The concepts and tools taught in an introductory Systems Engineering course involve a mindset that is not familiar to freshman undergraduate students. Teaching Systems Engineering at the freshman level is challenging because students do not have work experience to draw from to solidify the tools they are learning. We aim to overcome this barrier by using immersive group activities to provide a simulated context in which students can apply and learn about the benefits of Systems Engineering. Our Introduction to Systems Analysis course is structured around three group projects, which collectively provide an overview of the fundamental lessons of the field. The projects are an egg drop challenge, which teaches the value of upfront systems engineering and rapid prototyping; a LEGO Mindstorms competition, which teaches the importance of testing and validation, as well as design under operational uncertainty; and a Lean Simulation game, which teaches user needs and enterprise value.

While it is well established in the general pedagogical literature that group projects and active learning are effective teaching tools, they are not widely used in Systems Engineering for a variety of reasons: creating realistic and accessible Systems Engineering problems is difficult in a classroom setting, and coordinating effective group projects can be complex and costly. In this paper we document our attempt to overcome these challenges and explore how they impact the students’ learning experience. First, we compare the content of our Introduction to Systems Analysis course to similar undergraduate introductory systems engineering classes at peer institutions to identify core differences in our approach. Second, we measure learning progress through class observations and feedback from the students. The class observations include our perceptions of how students’ questions evolve over the semester and the extent of their engagement. The feedback portion provides the results and analysis of a survey in which students rate the projects in the course, exploring which projects successfully tied our learning objectives to their perceived knowledge of Systems Engineering.
Abstract The crowdsourcing literature has shown that domain experts are not always the best solvers for complex system design problems. Under certain conditions, novices and specialists in adjacent domains can provide novel solutions at lower cost. Additionally, the best types of solvers for different problems depend on the architecture of the complex system. The joint consideration of solver assignment and system decomposition, referred to as solver-aware system architecting (SASA), expands traditional system architecting practice by considering solver characteristics and contractual incentive mechanisms in the design process, and aims to improve complex system design and innovation by leveraging the strengths of domain experts, crowds, and specialists for different parts of the problem. Jointly considering problem decomposition and solver assignment decisions renders the SASA design space exponentially more complex, so new computationally efficient and mathematically rigorous methods are needed to explore this high-dimensional space and extract reliable heuristics. To address this need, this paper presents a computational approach that combines a Markov decision process (MDP) formulation, Q-learning, and Gaussian mixture models. Together, these techniques explore the large space of possible solver–module assignments: the MDP formulation models the sequential nature of solver assignment decisions, Q-learning captures the resulting temporal dependencies and enables optimization for long-term expected reward, and Gaussian mixture models analyze the reward distributions. The approach identifies heuristics for solver assignment based on the designer’s cost-performance preference, expressed through a parameterized reward function. The approach is demonstrated on a simple, idealized golf problem that shares key characteristics with design problems: it decomposes into interdependent modules, and it can be solved by different solvers whose distinct strengths interact with the module type. The results show that the proposed approach effectively elicits a rich set of heuristics applicable in various contexts for the golf problem, and that it can be extended to more complex systems design problems.
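To make the proposed formulation concrete, the sketch below is a minimal, hypothetical rendering of the idea: a chain MDP in which each module of a decomposed problem is assigned a solver in sequence, trained with tabular Q-learning. The module count, solver types, performance and cost tables, and the cost-performance weight LAM are illustrative assumptions, not values from the paper.

import random
from collections import defaultdict

MODULES = [0, 1, 2]                          # decomposed problem: three modules
SOLVERS = ["expert", "specialist", "crowd"]  # candidate solver types

# Assumed per-module performance of each solver type, and per-use costs.
PERF = {"expert": [0.9, 0.8, 0.7],
        "specialist": [0.7, 0.9, 0.6],
        "crowd": [0.5, 0.6, 0.8]}
COST = {"expert": 1.0, "specialist": 0.6, "crowd": 0.2}
LAM = 0.3                                    # designer's cost-performance weight

def reward(module, solver):
    """Parameterized reward trading performance off against cost."""
    return PERF[solver][module] - LAM * COST[solver]

Q = defaultdict(float)                       # tabular Q[(module, solver)]
ALPHA, GAMMA, EPS = 0.1, 0.95, 0.2

for _ in range(5000):                        # episodes of sequential assignment
    for m in MODULES:
        if random.random() < EPS:            # epsilon-greedy exploration
            a = random.choice(SOLVERS)
        else:
            a = max(SOLVERS, key=lambda s: Q[(m, s)])
        nxt = 0.0 if m == MODULES[-1] else max(Q[(m + 1, s)] for s in SOLVERS)
        Q[(m, a)] += ALPHA * (reward(m, a) + GAMMA * nxt - Q[(m, a)])

# The greedy policy read off Q is a candidate solver-assignment heuristic.
print({m: max(SOLVERS, key=lambda s: Q[(m, s)]) for m in MODULES})

With these assumed values the run converges to the expert on module 0, the specialist on module 1, and the crowd on module 2; raising LAM shifts assignments toward cheaper solvers, which is the kind of cost-performance-dependent heuristic the paper aims to elicit.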
ABSTRACT Decomposition is a critical enabler of complex system development: it allows both task specialization and efficiency through parallel work. Decomposing a system involves partitioning its parameters into tightly coupled modules and managing any cross-module coupling by designing passive interfaces or through active coordination. A rich literature has developed algorithms and tools to support this process. However, we contend that this view places too much emphasis on module selection and not enough on its interaction with interface design, a perspective with significant implications for lifecycle cost and development time. To that end, this study explores how earlier consideration of interface design can create more valuable options for navigating performance, cost, and schedule tradeoffs. Specifically, through an abstract simulation experiment, we demonstrate that (1) a sequential approach that first selects modules and then designs interfaces to support those modules yields lower performance than an integrated approach that considers modules and supporting interfaces simultaneously; and (2) this result is even stronger when schedule and cost are considered as part of the evaluation. In other words, an integrated approach provides more options for project managers seeking to navigate the performance-cost-schedule tradeoff known as the golden triangle. These results emphasize the need for a decomposition aid that adopts a holistic view of the optimization problem, accounting for interface creation, intra-organization collaboration, and the valuation of nonperformance measures of effectiveness.
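The contrast between the two approaches can be illustrated with a deliberately small toy in Python (our construction; the paper's simulation experiment is more elaborate, and the option names and scores here are invented). When the module partition that looks best in isolation pairs poorly with every available interface, a sequential module-then-interface process locks in a worse design than a joint search.

from itertools import product

MODULE_OPTIONS = ["A", "B"]                  # candidate module partitions
INTERFACE_OPTIONS = ["passive", "active"]    # candidate interface designs

# Assumed full-system scores for each (modules, interface) combination.
SCORE = {("A", "passive"): 5, ("A", "active"): 6,
         ("B", "passive"): 8, ("B", "active"): 4}
# Score visible before any interface design has been done.
MODULE_ONLY_SCORE = {"A": 7, "B": 6}

# Sequential: commit to the best-looking modules, then design interfaces.
m_seq = max(MODULE_OPTIONS, key=MODULE_ONLY_SCORE.get)
i_seq = max(INTERFACE_OPTIONS, key=lambda i: SCORE[(m_seq, i)])

# Integrated: search modules and supporting interfaces simultaneously.
m_int, i_int = max(product(MODULE_OPTIONS, INTERFACE_OPTIONS), key=SCORE.get)

print("sequential:", (m_seq, i_seq), "->", SCORE[(m_seq, i_seq)])  # ('A', 'active') -> 6
print("integrated:", (m_int, i_int), "->", SCORE[(m_int, i_int)])  # ('B', 'passive') -> 8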
As engineering systems grow more complex, technology transition increasingly involves deploying an upgraded subsystem across a legacy network. This mode of upgrade presents new challenges for system architects concerned with maintaining value over multiple infused technical changes. This paper explores the dynamics of technology transition in path-dependent infrastructure systems. It uses a model-based case study of the envisioned military airborne tactical network upgrade as a basis for developing guidelines for effective transition path design. Based on the natural diffusion dynamics of the system, we identified an inherent tradeoff between upgrade cycle timing and sustained capability levels: assuming even weakly exponential growth in demand, the timing of an infusion governs the longevity of its benefit. As a result, a less capable upgrade deployed expediently can do more good than a more sophisticated upgrade that can only be integrated in the next block upgrade. In addition, by conceptualizing the transition “path” as a design lever, two dimensions of problem decomposition can be exploited to mitigate transition barriers: (1) self-contained subnetworks can serve as proving grounds for full-system future benefits, mitigating stakeholder resistance; and (2) the technical system can be designed for evolvability, making it possible to stage deployment in the technical dimension as well.
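The timing-versus-longevity tradeoff lends itself to a back-of-the-envelope sketch (our construction; the growth rate and capacity figures are assumptions for illustration, not parameters from the case study). Under exponential demand growth D(t) = D0 * exp(g*t), an upgrade of capacity C deployed at time T stays useful until demand reaches C, i.e. for ln(C/D0)/g - T years, so every year of delay eats directly into the longevity of benefit.

import math

GROWTH = 0.15        # assumed annual demand growth rate g
D0 = 1.0             # normalized demand at time zero

def useful_life(capacity, deploy_time):
    """Years from deployment until demand exceeds the upgrade's capacity."""
    saturation = math.log(capacity / D0) / GROWTH   # solve D0*exp(g*t) == capacity
    return max(0.0, saturation - deploy_time)

# A modest upgrade available now vs. a better one in the next block upgrade.
print(useful_life(capacity=3.0, deploy_time=0.0))   # ~7.3 years of headroom
print(useful_life(capacity=5.0, deploy_time=6.0))   # ~4.7 years of headroom

Under these assumed numbers, the less capable but expedient upgrade delivers more years of useful headroom than the more capable upgrade held for the next block, matching the guideline above.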
Effective space R&D portfolio management is an increasingly complex undertaking, and understanding the implications of today’s R&D decisions for the effectiveness of future space missions remains largely unexplored. Although multiple portfolio methods exist and are in use, none completely accounts for the complicated, overlapping, multilayered issues that characterize the NASA innovation landscape. The purpose of this paper is to propose a framework that captures empirical dynamics observed through years of detailed study: mission sequencing processes; uncertain technical progress across multiple, changing funding streams for both R&D and project-related goals; and workforce expertise and prioritization. Greater understanding of these issues enables investigation of a range of levers for delivering better technology research and development, mission planning and performance, and workforce time allocation. In establishing this framework, this paper presents an approach for capturing many of the important dynamics governing this system, and initial results are explored.