This paper presents an efficient multi-objective evolutionary algorithm for solving computationally expensive optimization problems. To support a high degree of parallelism, the algorithm is based on a steady-state design. For improved efficiency, the algorithm utilizes a surrogate model to identify promising candidate solutions and filter out poor ones. To handle the uncertainty associated with approximate surrogate evaluations, a new method for multi-objective optimization is described that is applicable to any surrogate technique: the surrogate objective values assigned to offspring are adjusted to account for the surrogate's estimated error. The algorithm is evaluated on the ZDT benchmark functions and on a real-world problem of manufacturing optimization. To assess the performance of the algorithm, a new performance metric is suggested that combines convergence and diversity into a single measure. Results from both the benchmark experiments and the real-world test case indicate the potential of the proposed algorithm.
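The core idea of adjusting surrogate objective values to account for prediction error can be sketched as follows. This is a minimal illustration, not the paper's actual algorithm: all function names are hypothetical, objectives are assumed to be minimized, and the surrogate is assumed to return both a prediction and an error estimate (as, for example, a Gaussian-process model would).

```python
import numpy as np

def adjust_surrogate_objectives(predicted, error_std, k=1.0):
    """Pessimistically inflate surrogate-predicted objective values.

    For minimization, adding a multiple of the surrogate's estimated
    error makes the filter less likely to discard a genuinely good
    candidate on the strength of an optimistic prediction. The factor
    `k` controls how conservative the adjustment is (hypothetical knob).
    """
    return predicted + k * error_std

def filter_candidates(candidates, surrogate, reference_front):
    """Keep candidates whose adjusted objective vectors are not
    dominated by any point on a reference front (all objectives
    minimized); dominated candidates are filtered out before any
    expensive exact evaluation."""
    kept = []
    for x in candidates:
        mean, std = surrogate(x)  # prediction and error estimate
        adj = adjust_surrogate_objectives(mean, std)
        dominated = any(
            np.all(f <= adj) and np.any(f < adj) for f in reference_front
        )
        if not dominated:
            kept.append(x)
    return kept
```

In a steady-state loop, only the candidates surviving this filter would be passed on to the expensive simulation, which is where the efficiency gain comes from.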
Traditionally, a simulation-based optimization (SO) system is designed as a black box in which the internal details of the optimization process are hidden from the user and only the final solutions are presented. As the complexity of SO systems and of the optimization problems to be solved increases, instrumentation, a technique for monitoring and controlling SO processes, is becoming more important. This paper proposes a white-box approach by advocating the use of instrumentation components in SO systems, based on a component-based architecture. The paper argues that an on-line instrumentation approach brings a number of advantages, including improved efficiency, insight gained from optimization trajectories, and greater controllability of the SO process. This argument is supported by the illustration of an instrumentation component developed for an SO system designed for solving real-world multi-objective operation scheduling problems.
Digitalization through Industry 4.0 technologies is one of the essential steps towards the complete collaboration, communication, and integration of heterogeneous resources in a manufacturing organization, with the aim of improving manufacturing performance. One such step is to measure the effective utilization of critical resources, also known as bottlenecks. Finding such critical resources in a manufacturing system has been a significant focus of manufacturing research for several decades. However, finding a bottleneck in a complex manufacturing system is difficult due to the interdependencies and interactions of many resources. In this work, a digital twin framework is developed to detect, diagnose, and improve bottleneck resources using utilization-based bottleneck analysis, process mining, and diagnostic analytics. Unlike existing bottleneck detection methods, this novel approach is capable of directly utilizing enterprise data from multiple levels, namely production planning, process execution, and asset monitoring, to generate an event log that can be fed into a digital twin. This enables not only the detection and diagnosis of bottleneck resources, but also the validation of various what-if improvement scenarios. The digital twin itself is generated through process mining techniques, which can extract the main process map from a complex system. The results show that utilization-based analysis can detect both sole and shifting bottlenecks in a complex manufacturing system. Diagnosing and managing bottleneck resources through the proposed approach yielded a minimum throughput improvement of 10% in a real factory setting. The concept of a custom digital twin for a specific context and goal opens many new possibilities for studying the strong interaction of multi-source data and decision-making in a manufacturing system. This methodology also has the potential to be exploited for multi-objective optimization of bottleneck resources.
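The utilization-based detection step described above can be sketched in a few lines. This is a simplified illustration under stated assumptions, not the paper's implementation: event records are assumed to be triples of (resource, start time, end time), and real event logs assembled from planning, execution, and monitoring systems would first need cleaning and case correlation.

```python
from collections import defaultdict

def utilization_bottleneck(event_log, horizon):
    """Identify the bottleneck as the resource with the highest
    utilization over the observed horizon, where utilization is the
    fraction of the horizon the resource spends busy. Event records
    are (resource, start, end) tuples; this format is an assumption
    made for the sketch."""
    busy = defaultdict(float)
    for resource, start, end in event_log:
        busy[resource] += end - start
    utilization = {r: t / horizon for r, t in busy.items()}
    bottleneck = max(utilization, key=utilization.get)
    return bottleneck, utilization
```

For example, a resource busy 9 hours out of a 10-hour horizon (utilization 0.9) would be flagged ahead of one busy 5 hours (0.5), and the resulting ranking can then seed the diagnostic and what-if stages in the digital twin.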
This paper describes a decision support system (DSS) built on knowledge extraction using simulation-based optimization and data mining. The paper starts with a requirements analysis based on a survey conducted with a number of industrial companies about their practices of using simulation for decision support. Based on this analysis, a new, interactive DSS that can fulfill the industrial requirements is proposed. The design of the cloud-based system architecture of the DSS is then described. To show the functionality and potential of the proposed DSS, an application study has been performed for the optimal design of a hypothetical but realistic flexible production cell. The results of this application study reveal how important knowledge, reflecting different preferences of the decision maker, can be generated as rules using the new Flexible Pattern Mining algorithm provided in the DSS.
Proceedings of the 18th International Conference on Flexible Automation and Intelligent Manufacturing (FAIM 2008), June 30th – July 2nd, 2008, University of Skövde, Sweden
The performance of a production system is primarily evaluated by its throughput, which is constrained by throughput bottlenecks. Thus, bottleneck analysis (BA), encompassing bottleneck identification, diagnosis, prediction, and prescription, is a crucial analytical process contributing to the success of manufacturing industries. Nevertheless, BA requires a substantial quantity of information from the manufacturing system, making it a data-intensive task. Given the dynamic nature of bottlenecks, the optimal strategy for BA is to make well-informed decisions in real time and execute the necessary modifications accordingly. The efficient implementation of BA requires gathering, storing, analyzing, and visualizing data from the shop floor. Utilizing Industry 4.0 technologies, such as cyber-physical systems and cloud technology, facilitates the execution of these data-intensive operations for the successful management of BA in real-world settings. The main objective of this study is to establish a framework for BA through the utilization of Cloud-Based Cyber-Physical Systems (CB-CPSs). First, a literature review was conducted to identify relevant research and current applications of CB-CPSs in BA. Using the results of the review, a CB-CPS framework for BA was then introduced. The application of the framework was assessed via simulation at a real-world manufacturer of marine engines. The findings indicate that the implementation of CB-CPSs can contribute significantly to throughput improvement.
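Because bottlenecks are dynamic, a real-time BA pipeline needs to re-evaluate the bottleneck over rolling time windows rather than once over the whole horizon. The sketch below illustrates that idea only; it is not the framework from the study. The (resource, start, end) record format is an assumption, and in a CB-CPS the events would be streamed from shop-floor equipment rather than held in a list.

```python
def windowed_bottlenecks(event_log, horizon, window):
    """Detect the busiest resource per time window, revealing shifting
    bottlenecks: a different resource may dominate in different windows.
    Event records are (resource, start, end) tuples (an assumption of
    this sketch)."""
    results = []
    t = 0.0
    while t < horizon:
        busy = {}
        for resource, start, end in event_log:
            # Busy time of this event clipped to the current window.
            overlap = max(0.0, min(end, t + window) - max(start, t))
            busy[resource] = busy.get(resource, 0.0) + overlap
        results.append(max(busy, key=busy.get) if busy else None)
        t += window
    return results
```

A sequence like `["M1", "M1", "M2"]` would indicate a bottleneck shifting from machine M1 to M2, which is the kind of real-time signal a cloud-based analysis layer could act on.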