Obsolescence of integrated systems that contain hardware and software is a problem that affects multiple industries and can occur for many reasons, including technological, economic, organizational, and social factors. It is especially acute in products and systems with long life cycles, where a high rate of technological innovation in the subcomponents results in a mismatch between the life cycles of the components and those of the systems. While several approaches for obsolescence forecasting exist, they often require data that may not be available. This paper describes an approach using non-probabilistic scenarios coupled with decision analysis to investigate how particular scenarios influence priority setting for products and systems. Scenarios are generated from a list of emergent and future conditions related to obsolescence. The key result is an identification of the scenarios most and least disruptive to the decision maker's priorities. An example is presented related to the selection of technologies for energy islanding, which demonstrates the methodology using six obsolescence scenarios. The paper should be of broad interest to scholars and practitioners engaged with enterprise risk management and similar challenges of large-scale systems.
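As an illustration of the scenario-disruption idea, the following minimal sketch scores hypothetical alternatives under baseline priorities and under scenario-adjusted weights, and measures disruption as total rank movement; the alternatives, criteria, weights, and scenario names are invented for illustration and are not drawn from the paper's case study.

```python
# Minimal sketch of scenario-based disruption analysis (hypothetical data and
# weights; the paper's actual criteria, scenarios, and scoring are not shown here).
import numpy as np

alternatives = ["Tech A", "Tech B", "Tech C"]
criteria = ["cost", "maturity", "upgradability"]

# Scores of each alternative on each criterion (rows: alternatives).
scores = np.array([
    [0.7, 0.8, 0.4],
    [0.5, 0.6, 0.9],
    [0.9, 0.3, 0.6],
])

baseline_w = np.array([0.5, 0.3, 0.2])                 # baseline priorities
scenarios = {                                          # scenario-adjusted weights
    "vendor exit":          np.array([0.3, 0.2, 0.5]),
    "rapid standard churn": np.array([0.2, 0.3, 0.5]),
}

def ranking(weights):
    # Higher weighted score -> better rank (0 = best).
    totals = scores @ weights
    return np.argsort(np.argsort(-totals))

base_rank = ranking(baseline_w)
for name, w in scenarios.items():
    disruption = int(np.abs(ranking(w) - base_rank).sum())  # total rank movement
    print(f"{name}: disruption = {disruption}")
```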
This study aims to determine the relationship between several governance factors and the level of risk in 10 Tunisian banks over an eight-year analysis period. We pose an important empirical question and examine the internal governance mechanisms aimed at reducing financial risks. The estimation is based on a single-equation model that relates governance and credit-risk variables in order to determine their impact on the banks' financial risk. Results demonstrate that the internal governance mechanisms have diverging effects on the financial risk of the Tunisian banks in our case study (i.e., credit risk). Moreover, assembling a practical process and model for banking risk is important. This model can be applied in any bank, and the results can be used to make decisions in real time.
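A minimal sketch of such a single-equation estimation is shown below, using synthetic data; the regressors (board_size, state_ownership, duality) and the credit-risk proxy are hypothetical placeholders, not the study's actual variables.

```python
# Illustrative single-equation estimation on synthetic data; variable names such
# as board_size and state_ownership are hypothetical, not the study's measures.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 80  # e.g., 10 banks x 8 years
df = pd.DataFrame({
    "board_size": rng.integers(6, 15, n),
    "state_ownership": rng.uniform(0, 1, n),
    "duality": rng.integers(0, 2, n),          # CEO also chairs the board
})
# Synthetic credit-risk proxy (e.g., a non-performing-loan ratio).
df["credit_risk"] = (0.02 * df["board_size"] - 0.05 * df["duality"]
                     + 0.1 * df["state_ownership"] + rng.normal(0, 0.05, n))

X = sm.add_constant(df[["board_size", "state_ownership", "duality"]])
model = sm.OLS(df["credit_risk"], X).fit()
print(model.summary())
```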
Quality function deployment (QFD) is an effective method that helps companies analyze customer requirements (CRs). These CRs are then turned into product or service characteristics, which are in turn translated into further attributes. With the QFD method, companies can design or improve the quality of products or services so that they closely match CRs. To increase the effectiveness of QFD, we propose an improved method based on Pythagorean fuzzy sets (PFSs). We apply an extended method to obtain the group consensus evaluation matrix. We then use a combined weight-determining method to integrate the previously obtained weights with objective weights derived from the evaluation matrix. To determine the exact score of each PFS in the evaluation matrix, we develop an improved score function. Lastly, we apply the proposed method to a case study on assembly robot design evaluation.
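For context, a minimal sketch of a Pythagorean fuzzy number and the conventional score and accuracy functions (s = μ² − ν², a = μ² + ν²) follows; the paper's improved score function is not reproduced here, and the example values are illustrative.

```python
# Sketch of a Pythagorean fuzzy set (PFS) element with the conventional
# score (mu^2 - nu^2) and accuracy (mu^2 + nu^2) functions; the paper's
# improved score function is not reproduced here.
from dataclasses import dataclass

@dataclass
class PFS:
    mu: float   # membership degree
    nu: float   # non-membership degree

    def __post_init__(self):
        # Pythagorean condition: mu^2 + nu^2 <= 1
        assert self.mu ** 2 + self.nu ** 2 <= 1.0 + 1e-9

    def score(self) -> float:
        return self.mu ** 2 - self.nu ** 2

    def accuracy(self) -> float:
        return self.mu ** 2 + self.nu ** 2

# Example evaluation cell: strong support, weak opposition.
p = PFS(mu=0.8, nu=0.4)
print(p.score(), p.accuracy())   # 0.48 0.80
```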
Real-time decision making reflects the convergence of several digital technologies, including those concerned with the promulgation of artificial intelligence and other advanced technologies that underpin real-time actions. More specifically, real-time decision making can be depicted in terms of three converging dimensions: Internet of Things, decision making, and real-time. The Internet of Things includes tangible goods, intangible services, ServGoods, and connected ServGoods. Decision making includes model-based analytics (since before 1990), information-based Big Data (since 1990), and training-based artificial intelligence (since 2000), and it is bolstered by the evolving real-time technologies of sensing (i.e., capturing streaming data), processing (i.e., applying real-time analytics), reacting (i.e., making decisions in real time), and learning (i.e., employing deep neural networks). Real-time includes mobile networks, autonomous vehicles, and artificial general intelligence. Central to decision making, especially real-time decision making, is the ServGood concept, which the author introduced in an earlier paper (2012). It is a physical product or good encased by a services layer that renders the good more adaptable and smarter for a specific purpose or use. Adding a further layer of communicating sensors could enhance its smartness and adaptability. Such connected ServGoods constitute a solid foundation for the advanced products of tomorrow, which can further display their growing intelligence through real-time decisions.
The finance-based scheduling problem (FBSP) is about scheduling project activities without exceeding a credit-line financing limit. The FBSP has been extended to consider different execution modes, resulting in the multi-mode FBSP (MMFBSP). Unfortunately, researchers have abandoned the development of exact models to solve the FBSP and its extensions. Instead, they have relied heavily on heuristics and meta-heuristics, which do not guarantee solution optimality. No exact models are available for contractors who seek optimal solutions to the multi-objective MMFBSP. The computation time of CPLEX, an exact solver, has decreased significantly, and its current version, CPLEX 12.9, solves multi-objective optimization problems. This study presents a mixed-integer linear programming model for the multi-objective MMFBSP. Using CPLEX 12.9, we discuss several techniques that researchers can use to optimize a multi-objective MMFBSP. We test our model by solving several problems from the literature. We also show how to solve multi-objective optimization problems using CPLEX 12.9 and how computation time increases with problem size. The small increase in computation time compared with the possible cost savings makes exact models a must for practitioners. Moreover, the linear-programming relaxation of the model, which takes seconds, can provide an excellent lower bound.
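As a rough illustration of multi-objective mode selection, the toy sketch below performs lexicographic optimization with docplex by re-solving with a bound on the first objective; it assumes CPLEX and docplex are installed, uses invented activity/mode data, and omits the precedence and credit-line constraints of the actual MMFBSP (CPLEX 12.9 also offers built-in multi-objective support).

```python
# Toy lexicographic multi-mode selection sketch with docplex (assumes CPLEX and
# docplex are installed); illustrative data only, not the paper's MILP model.
from docplex.mp.model import Model

modes = {            # (activity, mode): (duration, cost)
    ("A", 1): (4, 100), ("A", 2): (2, 180),
    ("B", 1): (3, 120), ("B", 2): (1, 260),
}

m = Model(name="mmfbsp_sketch")
x = m.binary_var_dict(modes, name="x")

# Exactly one mode per activity.
for act in {a for a, _ in modes}:
    m.add_constraint(m.sum(x[key] for key in modes if key[0] == act) == 1)

duration = m.sum(modes[key][0] * x[key] for key in modes)
cost = m.sum(modes[key][1] * x[key] for key in modes)

# Stage 1: minimize total duration.
m.minimize(duration)
m.solve()
best_duration = m.objective_value

# Stage 2: among duration-optimal plans, minimize total cost.
m.add_constraint(duration <= best_duration)
m.minimize(cost)
m.solve()
print(best_duration, m.objective_value)
```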
The increasing importance of technology foresight has simultaneously raised the significance of methods that identify crucial areas and technologies. However, both qualitative and quantitative methods have shortcomings: the former involve high costs and many limitations, while the latter lack expert experience. Intelligent knowledge management emphasizes human–machine integration, which combines the advantages of expert experience and data mining. Thus, we propose a new technology foresight method based on intelligent knowledge management. This method constructs an online technology platform to increase the number of participating experts. Secondary mining is performed on the results of patent analysis and bibliometrics. Forward-looking, innovative, and disruptive areas, as well as relevant experts, can then be discovered through the following comprehensive process: topic acquisition → topic delivery → topic monitoring → topic guidance → topic reclamation → topic sorting → topic evolution → topic conforming → expert recommendation.
During a financial crisis, companies constantly need free cash flows to react efficiently to any uncertainty, thus ensuring solvency. Working capital requirement (WCR) has been recognized as a key factor for releasing cash tied up in companies. However, in the lot-sizing literature, WCR has only been studied in the single-level supply chain context. In this paper, we first adapt the WCR model to a multi-level setting. A two-level (supplier–customer) model is established on the basis of the classic multi-level lot-sizing model integrated with WCR financing cost. To tackle this problem, we propose sequential and centralized approaches to solve the two-level case with a serial chain structure. The ZIO (Zero Inventory Ordering) property is shown to remain valid in both cases. This property allows us to establish a dynamic-programming-based algorithm that solves the problem in O(T^4). Finally, numerical tests show the differences between the optimal plans obtained by the two approaches and the influence of varying payment delays on the WCR of both actors.
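To illustrate how the ZIO property supports a dynamic-programming recursion, the following minimal sketch solves a classic single-level uncapacitated lot-sizing instance in which an order is placed only when inventory reaches zero; the data are invented, and the paper's two-level O(T^4) algorithm with WCR financing costs is considerably richer.

```python
# Minimal single-level dynamic lot-sizing sketch exploiting the ZIO property
# (order only when inventory is zero); illustrative data, not the paper's
# two-level model with WCR financing costs.
def lot_sizing(demand, setup_cost, hold_cost):
    T = len(demand)
    # best[t] = min cost to satisfy demand for periods t..T-1
    best = [0.0] * (T + 1)
    for t in range(T - 1, -1, -1):
        best[t] = float("inf")
        for s in range(t, T):  # an order at t covers demand of periods t..s
            holding = sum(hold_cost * (p - t) * demand[p] for p in range(t, s + 1))
            best[t] = min(best[t], setup_cost + holding + best[s + 1])
    return best[0]

print(lot_sizing(demand=[20, 30, 0, 40], setup_cost=100, hold_cost=1))  # 230
```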
Unreasonable allocation of shared resources reduces system efficiency and constitutes a considerable operational risk. Sub-processes that receive an insufficient share of the shared resources cannot help accomplish complicated tasks, while sub-processes assigned redundant shared resources suffer from overstaffing and idle resources. Such unfair distribution may cause internal contradictions among sub-processes and even lead to the collapse of the entire system. This study proposes a data-driven, mixed two-stage network data envelopment analysis model. The method aims to reasonably allocate shared extra intermediate resources among several nonhomogeneous subsystems and to measure the overall system performance. A data set of 58 international hotels is used to test the features of the proposed model.
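For orientation, a basic single-stage, input-oriented CCR DEA efficiency computation is sketched below with scipy; the inputs, outputs, and decision-making units are invented, and the paper's mixed two-stage network model with shared intermediate resources goes well beyond this simple formulation.

```python
# Basic input-oriented CCR DEA sketch using scipy.optimize.linprog
# (illustrative data; not the paper's two-stage network DEA model).
import numpy as np
from scipy.optimize import linprog

# Rows = DMUs (e.g., hotels); columns = inputs / outputs.
X = np.array([[4.0, 3.0], [7.0, 5.0], [8.0, 2.0]])   # inputs
Y = np.array([[2.0], [3.0], [1.5]])                   # outputs
n, m, s = X.shape[0], X.shape[1], Y.shape[1]

def efficiency(j0):
    # Decision variables: [theta, lambda_1, ..., lambda_n]; minimize theta.
    c = np.zeros(n + 1)
    c[0] = 1.0
    A_ub = np.zeros((m + s, n + 1))
    b_ub = np.zeros(m + s)
    A_ub[:m, 0] = -X[j0]       # sum_j lam_j * x_ij <= theta * x_ij0
    A_ub[:m, 1:] = X.T
    A_ub[m:, 1:] = -Y.T        # sum_j lam_j * y_rj >= y_rj0
    b_ub[m:] = -Y[j0]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(0, None)] * (n + 1), method="highs")
    return res.fun

for j in range(n):
    print(f"DMU {j}: efficiency = {efficiency(j):.3f}")
```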
Hedge funds have recently become popular because of their low correlation with traditional investments and their ability to generate positive returns with relatively low volatility. However, a close look at those high-performing hedge funds raises the questions of whether their performance is truly superior and whether the high management fees are justified. Because they incur no alpha costs, passive hedge fund replication strategies raise the question of whether they can perform similarly while improving efficiency at reduced cost. Therefore, this study investigates two model approaches for the equity long/short strategy, in which weighted segmented linear regression models are employed and combined with two-state Markov switching models. The main finding reveals a short put option structure, i.e., short equity market volatility, with the put structure present in all market states. We also find evidence that hedge fund managers decrease their short-volatility profile during turbulent markets.
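A minimal sketch of a two-state Markov switching regression, fitted on synthetic fund and market returns with statsmodels, is shown below; the paper additionally uses weighted segmented linear regressions and real hedge fund index data, neither of which is reproduced here.

```python
# Two-state Markov switching regression of synthetic fund returns on a market
# factor (statsmodels); a sketch only, not the paper's full model combination.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 500
market = rng.normal(0, 0.02, n)
state = (np.arange(n) >= n // 2).astype(int)        # crude regime path
beta = np.where(state == 0, 0.6, -0.3)              # state-dependent exposure
fund = beta * market + rng.normal(0, 0.005, n)

mod = sm.tsa.MarkovRegression(fund, k_regimes=2, exog=market,
                              switching_variance=True)
res = mod.fit()
print(res.summary())   # regime-dependent intercept, beta, and variance
```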
Energy sustainability is a complex problem that needs to be tackled holistically by equally addressing other aspects, such as socio-economic factors, to meet strict CO2 emission targets. This paper builds upon our previous work on the effect of household transitions on residential energy consumption, in which we developed a 3D urban energy prediction system (EvoEnergy) using the old UK panel data survey, namely the British Household Panel Survey (BHPS). In particular, the aim of the present study is to examine the validity and reliability of EvoEnergy under the new UK Household Longitudinal Study (UKHLS) launched in 2009. To achieve this aim, the household transition and energy prediction modules of EvoEnergy have been tested under both data sets using various statistical techniques, such as the Chow test. The analysis of the results indicated that EvoEnergy remains a reliable prediction system with good prediction accuracy (MAPE 5%) when compared with actual energy performance certificate data. From this premise, we recommend that researchers working on data-driven energy consumption forecasting consider merging the BHPS and UKHLS data sets. This will, in turn, enable them to capture the bigger picture of different energy phenomena, such as fuel poverty, and consequently to anticipate policy problems prior to their occurrence. Finally, the paper concludes by discussing two scenarios of EvoEnergy's development in relation to energy policy and decision making.
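As a pointer to the kind of structural-stability check involved, the sketch below implements a textbook Chow test on synthetic samples standing in for BHPS- and UKHLS-derived data; the regressors and coefficients are invented and do not reflect EvoEnergy's actual household-transition or energy models.

```python
# Textbook Chow test for a structural break between two samples
# (synthetic data standing in for BHPS- and UKHLS-derived samples).
import numpy as np
from scipy import stats

def rss(X, y):
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return float(resid @ resid)

def chow_test(X1, y1, X2, y2):
    k = X1.shape[1]
    n1, n2 = len(y1), len(y2)
    rss_pooled = rss(np.vstack([X1, X2]), np.concatenate([y1, y2]))
    rss_split = rss(X1, y1) + rss(X2, y2)
    F = ((rss_pooled - rss_split) / k) / (rss_split / (n1 + n2 - 2 * k))
    p = 1 - stats.f.cdf(F, k, n1 + n2 - 2 * k)
    return F, p

rng = np.random.default_rng(2)
X1 = np.column_stack([np.ones(100), rng.normal(size=100)])   # BHPS-like sample
X2 = np.column_stack([np.ones(100), rng.normal(size=100)])   # UKHLS-like sample
y1 = X1 @ np.array([1.0, 2.0]) + rng.normal(0, 0.5, 100)
y2 = X2 @ np.array([1.0, 2.0]) + rng.normal(0, 0.5, 100)     # same coefficients
print(chow_test(X1, y1, X2, y2))                             # large p => no break
```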