The high-end equipment intelligent manufacturing (HEIM) industry is of strategic importance to national and economic security. Engineering management (EM) for HEIM is a complex, innovative process that integrates natural science, technology, management science, social science, and the human spirit. New-generation information technology (IT), including the internet, cloud computing, big data, and artificial intelligence, has had a remarkable influence on HEIM and its engineering management activities, such as product system construction, product life cycle management, manufacturing resource organization, manufacturing model innovation, and reconstruction of the enterprise ecosystem. Engineering management for HEIM is a key topic at the frontier of international academic research. This study systematically reviews current research on issues pertaining to engineering management for HEIM in the new-generation IT environment. These issues include cross-lifecycle management, network collaboration management, task integration management of innovative development, operation optimization of smart factories, quality and reliability management, information management, and intelligent decision making. The challenges presented by these issues and potential research opportunities are also summarized and discussed.
The multi-wave algorithm (Glover, 2016) integrates tabu search and strategic oscillation, using repeated waves (nested iterations) of constructive search or neighborhood search. We propose a simple multi-wave algorithm for the Uncapacitated Facility Location Problem (UFLP): choosing which facilities to open and assigning each customer to an open facility so as to meet the customers’ demands while minimizing the overall cost, i.e., the sum of facility opening costs and allocation costs. Experimental tests on a standard set of benchmarks for this widely studied class of problems show that our algorithm outperforms all previous methods.
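The wave structure can be sketched as follows. This is a minimal illustration only, not the authors' implementation: it uses deterministic round-robin seeding and a best single open/close flip as the neighborhood wave, and omits tabu memory and strategic oscillation. All function names and the tiny instance are illustrative assumptions.

```python
def uflp_cost(open_fac, fixed, assign_cost):
    # Total cost: fixed costs of the open facilities plus each customer's
    # cheapest assignment to an open facility (infinite if none is open).
    if not open_fac:
        return float("inf")
    n_cust = len(assign_cost[0])
    total = sum(fixed[f] for f in open_fac)
    total += sum(min(assign_cost[f][c] for f in open_fac) for c in range(n_cust))
    return total

def multi_wave_uflp(fixed, assign_cost, waves=10):
    # Each wave: a constructive phase seeds a solution, then a neighborhood
    # phase repeatedly applies the best single open/close flip until no
    # flip improves the cost; the best solution over all waves is kept.
    n_fac = len(fixed)
    best_set, best_cost = None, float("inf")
    for w in range(waves):
        sol = {w % n_fac}  # constructive seed (round-robin over facilities)
        while True:
            cur = uflp_cost(sol, fixed, assign_cost)
            moves = [(uflp_cost(sol ^ {f}, fixed, assign_cost), f)
                     for f in range(n_fac)]
            move_cost, f = min(moves)
            if move_cost >= cur:
                break
            sol ^= {f}  # apply the improving flip
        cost = uflp_cost(sol, fixed, assign_cost)
        if cost < best_cost:
            best_set, best_cost = set(sol), cost
    return best_set, best_cost
```

On a three-facility, three-customer toy instance, the sketch recovers the optimum of opening only the middle facility.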
Optimization of large-scale supply chain planning models requires decomposition strategies to reduce the computational expense. Two major options are spatial and temporal Lagrangean decomposition. In this paper, a novel decomposition scheme by products is presented to further reduce the computational expense. The decomposition is based on a reformulation of the knapsack constraints in the problem. The new approach allows simultaneous decomposition by products and by time periods, generating a large number of subproblems that can be solved with parallel computing. The case study shows that the proposed product decomposition performs similarly to temporal decomposition, and that choosing different product orderings and aggregating the linking constraints can improve the efficiency of the algorithm.
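The basic mechanism, relaxing a linking constraint so that the problem separates into one subproblem per product, can be illustrated on a toy two-product model with a single shared capacity. This is a hedged sketch under assumed data, not the paper's knapsack reformulation or its parallel scheme.

```python
def lagrangean_dual(profits, ubounds, capacity, iters=60):
    # Relax the linking constraint sum(x_i) <= capacity with a multiplier
    # lam >= 0. The Lagrangean then separates into one trivial subproblem
    # per product:  max (p_i - lam) * x_i,  0 <= x_i <= u_i,
    # solved by x_i = u_i if p_i > lam else 0.
    lam, best_bound = 0.0, float("inf")
    for k in range(1, iters + 1):
        x = [u if p > lam else 0.0 for p, u in zip(profits, ubounds)]
        dual = lam * capacity + sum((p - lam) * xi
                                    for p, xi in zip(profits, x))
        best_bound = min(best_bound, dual)  # each dual value bounds the optimum
        g = sum(x) - capacity               # subgradient of the dual at lam
        lam = max(0.0, lam + g / k)         # diminishing-step subgradient update
    return best_bound
```

For two products with profits (3, 2), unit upper bounds, and capacity 1, the dual bound converges to the primal optimum of 3 (no duality gap in this linear toy).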
Acquisition and analysis of customer requirements are essential steps in high-end equipment design. Given that Internet and big data technologies are now integrated into the manufacturing industry, we propose a method for analyzing customer requirements based on open-source data. First, online data are collected with focused crawlers and preprocessed to filter out noise and duplicates. Then, user opinions are extracted with predefined templates, and user sentiments are analyzed. From the relationship between user sentiments and attribute parameters, the parameter range that satisfies customers can be obtained. The proposed method is evaluated on an example of a new energy vehicle to verify its applicability and feasibility.
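The two analysis steps, template-based opinion extraction and mapping sentiment back to a satisfying parameter range, can be sketched as below. The template pattern, the tiny sentiment lexicon, and the sample reviews are all hypothetical placeholders, not the paper's actual templates or data.

```python
import re

# Hypothetical sentiment lexicon and opinion template (both are assumptions).
LEXICON = {"great": 1.0, "good": 0.5, "poor": -0.5, "bad": -1.0}
TEMPLATE = re.compile(r"(\w+(?: \w+)?) is (great|good|poor|bad)")

def extract_opinions(reviews):
    # Template-based extraction of (attribute, sentiment score) pairs
    # from raw review sentences.
    return [(attr, LEXICON[word])
            for review in reviews
            for attr, word in TEMPLATE.findall(review.lower())]

def satisfying_range(samples, threshold=0.0):
    # samples: (parameter value, sentiment score) pairs.  Returns the
    # (min, max) range of parameter values whose sentiment meets the
    # threshold, i.e. the range that satisfies customers; None if empty.
    good = sorted(v for v, s in samples if s >= threshold)
    return (good[0], good[-1]) if good else None
```

For instance, driving-range samples with positive sentiment only between 400 and 600 km yield the satisfying range (400, 600).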
This study investigates an energy-aware flow shop scheduling problem with a time-dependent learning effect. The relationship between the traditional problem and the proposed problem is established, and the objective is to determine a job sequence that minimizes total energy consumption. To provide an efficient solution framework, composite lower bounds are derived and embedded in a solution approach called Bounds-Based Nested Partition (BBNP). A worst-case analysis of the shortest processing time (SPT) heuristic is conducted as a theoretical benchmark. Computational experiments on randomly generated test instances evaluate the proposed algorithms. Results show that BBNP outperforms conventional heuristics and provides a considerable computational advantage.
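The interplay of the learning effect and the energy objective can be sketched on a single-machine simplification. The learning model used here, an actual processing time of p_j * (1 + t)^a with a < 0 where t is the processing time already elapsed, is one common time-dependent form assumed for illustration; it is not necessarily the paper's exact model, and BBNP itself is not reproduced.

```python
from itertools import permutations

def total_energy(seq, p, power=1.0, a=-0.3):
    # Energy = power * sum of actual processing times.  A job's actual
    # time shrinks with the time already elapsed (time-dependent
    # learning effect: p_j * (1 + t) ** a, with a < 0).
    t = energy = 0.0
    for j in seq:
        actual = p[j] * (1.0 + t) ** a
        energy += power * actual
        t += actual
    return energy

def spt_sequence(p):
    # Shortest processing time (SPT) order of the job indices.
    return sorted(range(len(p)), key=lambda j: p[j])

def brute_force_best(p):
    # Exhaustive baseline: the energy-minimal sequence (small n only).
    return min(permutations(range(len(p))), key=lambda s: total_energy(s, p))
```

Comparing the SPT sequence with the exhaustive optimum on a small instance gives a feel for how far the heuristic can be from optimal, in the spirit of the worst-case analysis.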
This work is devoted to the problem of planning freight railway transportation. We define a special conflict graph on the basis of a set of acceptable train routes. The investigation reduces the planning task to the classical combinatorial optimization problem of finding a maximum independent set of vertices in an undirected graph. The level representation of the graph and its tree are introduced. Using these constructions, lower and upper bounds on the number of vertices in a maximum independent set are obtained.
The NP-hard scheduling problems of semiconductor manufacturing systems (SMSs) are further complicated by stochastic uncertainties. Reactive scheduling is a common dynamic scheduling approach in which the scheduling scheme is refreshed in response to real-time uncertainties. However, the scheme produced by conventional reactive scheduling is overly sensitive to emerging uncertainties because performance optimization (such as minimum makespan) and system robustness cannot be achieved simultaneously. To improve the robustness of the scheduling scheme, we propose a novel slack-based robust scheduling rule (SR) based on an analysis of robustness measurement for SMSs with uncertain processing times. Decisions under the SR are made in real time with the robustness measure taken into account. The proposed SR is verified under different scenarios, and the results are compared with existing heuristic rules. Simulation results show that the proposed SR effectively improves the robustness of the scheduling scheme at only a slight cost in performance.
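The core trade-off, spending a little planned capacity as slack to make realized schedules less sensitive to processing-time overruns, can be sketched on a single machine. The slack-insertion rule and the deviation measure below are illustrative assumptions, not the paper's SR or its robustness metric.

```python
def schedule_with_slack(jobs, slack_ratio=0.1):
    # Planned start times with slack inserted after each job,
    # proportional to its processing time (a slack-based robustness
    # surrogate; slack_ratio = 0 gives a tight, non-robust plan).
    t, starts = 0.0, []
    for p in jobs:
        starts.append(t)
        t += p * (1.0 + slack_ratio)
    return starts

def realized_deviation(starts, actual):
    # Total deviation of realized from planned start times when the
    # realized durations 'actual' replace the planned ones; inserted
    # slack absorbs overruns (smaller deviation = more robust plan).
    t, dev = 0.0, 0.0
    for s, a in zip(starts, actual):
        real = max(t, s)    # wait for the planned slot or the machine
        dev += real - s     # real >= s, so this is the (nonnegative) delay
        t = real + a
    return dev
```

With 10% slack, a one-unit overrun on the first of two 10-unit jobs is fully absorbed; with no slack, the full unit propagates to the second job's start.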
Non-convex optimization problems arise in many smart manufacturing systems. This paper presents a short review of global optimization (GO) methods. We examine decomposition techniques and classify GO problems on the basis of objective function representation and decomposition techniques. We then explain Kolmogorov's superposition theorem and its application in GO. Finally, we conclude by discussing the importance of objective function representation in integrated artificial intelligence, optimization, and decision support systems for smart manufacturing and Industry 4.0.
Residual life estimation is essential in reliability engineering. Traditional methods may struggle to estimate the residual life of products characterized by high reliability, long life, and small sample sizes. The Bayes model provides a feasible solution and is a useful tool for fusing multisource information. In this study, a Bayes model is proposed to estimate the residual life of products by fusing expert knowledge, degradation data, and lifetime data. The linear Wiener process is used to model degradation data, whereas lifetime data are described via the inverse Gaussian distribution; the joint maximum likelihood (ML) function is therefore obtained by combining lifetime and degradation data. Expert knowledge is used through the maximum entropy method to determine the prior distributions of the parameters, which distinguishes this work from existing studies that use non-informative priors. The discussion and analysis of different types of expert knowledge also set our research apart: expert knowledge is classified into three categories according to practical engineering, and methods for determining the prior distribution from each of the three types are presented. Because the joint ML function and the posterior distribution of the parameters are complex, Markov chain Monte Carlo is applied to draw samples of the parameters and estimate the residual life of products. Finally, a numerical example is presented. The effectiveness and practicability of the proposed method are validated by comparison with residual life estimation under non-informative priors, and its accuracy and correctness are verified via simulation experiments.
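The non-Bayesian building blocks, ML estimation for a linear Wiener degradation path and the inverse-Gaussian link to residual life, can be sketched as follows. This covers only the likelihood side under a single sampled path; the expert-knowledge priors, maximum-entropy step, and MCMC fusion are not reproduced.

```python
def wiener_mle(times, x):
    # ML estimates of drift mu and diffusion sigma^2 for a linear Wiener
    # degradation model X(t) = mu*t + sigma*B(t), from one sampled path:
    # increments dX over dt are independent N(mu*dt, sigma^2*dt).
    dts = [t1 - t0 for t0, t1 in zip(times, times[1:])]
    dxs = [x1 - x0 for x0, x1 in zip(x, x[1:])]
    mu = sum(dxs) / sum(dts)
    sigma2 = sum((dx - mu * dt) ** 2 / dt
                 for dx, dt in zip(dxs, dts)) / len(dxs)
    return mu, sigma2

def mean_residual_life(mu, x_now, threshold):
    # The first passage time of a Wiener process with positive drift mu
    # to a failure threshold D follows an inverse Gaussian distribution
    # with mean (D - x_now) / mu, giving the mean residual life.
    return (threshold - x_now) / mu
```

For a noiseless path degrading at 2 units per time step, the estimated drift is 2, and a unit 8 units into a 20-unit failure threshold has a mean residual life of 6.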