Lei Wang1,2,*, Yuxin Qi1,2
1 School of Event and Economic Management, Shanghai Institute of Tourism, Shanghai, 201418, China
2 School of Tourism, Shanghai Normal University, Shanghai, 201418, China
* Corresponding Author: Lei Wang. Email:
(This article belongs to this Special Issue: Artificial Intelligence in Renewable Energy and Storage Systems)
Computer Modeling in Engineering & Sciences 2023, 135(1), 325-339. https://doi.org/10.32604/cmes.2022.019730
Received 11 October 2021; Accepted 28 March 2022; Issue published 29 September 2022
Currently, the energy crisis has become one of the most prominent and urgent environmental issues [1,2]. In recent years, excessive energy resources have been consumed, causing both resource depletion and environmental pollution. Consequently, energy conservation has attracted considerable attention from governments and the public [3,4]. An official report projects that energy demand will grow by 56% between 2010 and 2040 (EIA 2013). The manufacturing industry consumes a large amount of energy and produces a great deal of greenhouse gas, which results in serious environmental pollution. As a result, manufacturers are encouraged to make energy-efficient decisions involving process reformation, product design and production scheduling when managing and organizing their production [5–7]. Parallel machine scheduling is an important scheduling problem in practical manufacturing systems. In past years, many studies have been devoted to this subject, such as Saricicek et al., Prot et al., Vallada et al., Bozorgirad et al., Zhong et al. and Li. In an actual process, it is important to take the actual environment into account when studying parallel machine scheduling problems (PMSPs). In the workplace, the parameters of a scheduling problem are influenced by many existing factors [14–16]. Stec et al. considered that the processing time of jobs obeys a specified normal distribution in a PMSP. Al-Khamis et al. focused on uncertain due dates when solving a PMSP. Zhang used random processing times and an adjustable production rate in a PMSP.
Previous work on parallel machine scheduling tends to set the processing time as a fixed constant, which obviously deviates from the actual situation. In real life, the processing time of a job often changes with its start time. For example, the steel industry prefers to process jobs under high temperature. However, if an unprocessed job starts very late, the temperature will have dropped accordingly. Therefore, due to the later start time, the processing time of this job will be prolonged, i.e., the later a job starts, the longer its processing time is. In general, this is called the deteriorating effect. Kang et al. aimed at minimizing the maximum completion time in an identical PMSP with deteriorating jobs. Cheng et al. developed an unrelated PMSP considering deteriorating maintenance activities. Jiang, Huang et al., Zhao et al. and Na et al. proposed related PMSPs with respect to deteriorating effects. Mazdeh et al. discussed a PMSP considering both machine and job deterioration effects. Besides deteriorating effects, learning effects usually occur as well. In the real world, production facilities (machines, workers) continuously improve their production capacity and skills over time. Therefore, when a job is processed late, its processing time can be shortened. Mosheiov summarized the classical scheduling problems with consideration of learning effects. Yeh et al. concentrated on both learning effects and fuzzy processing times in a PMSP. To solve it, they used two heuristic algorithms, i.e., a simulated annealing method and a genetic approach. Hidri et al. solved a PMSP with learning effects, employing several heuristic algorithms. Eren et al. focused on learning effects in a PMSP regarding setup and removal times. Toksari et al. presented a PMSP with the consideration of due dates, deteriorating and learning effects, and sequence-dependent setups.
In practical scheduling problems, not only a single objective but multiple objectives need to be optimized. Prior research has coped with multi-objective problems while considering deteriorating and learning effects. Lu et al. aimed at optimizing completion time, total load and resource cost in a multi-objective PMSP considering deteriorating and learning effects. Amini et al. studied an identical PMSP where the setup and removal times of jobs are influenced by both deteriorating and learning effects. Arik et al. considered a multi-objective PMSP under a fuzzy environment, in which the jobs are affected by deteriorating and learning effects. Rostami et al. also took deteriorating and learning effects into account in solving a multi-objective non-identical PMSP.
Through the analysis of existing studies, we find that an energy-aware PMSP with the consideration of deteriorating and learning effects and uncertainty has not been adequately considered. In an actual process, energy consumption optimization needs to be taken into account, and thus a multi-objective model should be addressed to find a balance between energy-saving and time-related criteria. Additionally, because the details of machines and jobs usually cannot be foreseen, jobs' processing times are uncertain. Besides, to depict a real-life manufacturing system, it is necessary to consider the deteriorating and learning effects. Compared with previous studies, our contributions are twofold:
1) We propose a stochastic energy-aware parallel machine scheduling problem with deteriorating and learning effects. Its goal is to minimize the makespan and energy consumption. Furthermore, a stochastic multi-objective model is formulated to describe it mathematically.
2) We design a multi-objective multi-verse optimization incorporating a stochastic simulation approach to solve the proposed problem. In it, the multi-objective multi-verse optimization is applied to search for favorable solutions over the entire solution domain, and the stochastic simulation approach is employed to evaluate the performance of the obtained solutions. By conducting experiments on a set of test problems and comparing the designed approach with its peers, i.e., the nondominated sorting genetic algorithm II and the multi-objective evolutionary algorithm based on decomposition, we demonstrate its effectiveness in solving the considered problem.
In this paper, the investigated problem is described as follows: There are n jobs in a manufacturing system, and each of them needs to choose one out of m machines to perform its production operations. The deteriorating and learning effects of jobs occur during the manufacturing process. The real processing time of a job is calculated using its normal processing time, start processing time, processing position and speed in a schedule. In the investigated problem, we aim at minimizing the makespan and energy consumption by determining the job assignment among machines, the job sequence on machines and the job processing speeds. We introduce the following notations:
J: set of jobs, J = {0, 1, ..., n}, in which the index n indicates the number of jobs, and 0 is a virtual job.
M: set of machines, M = {0, 1, ..., m}, in which the index m indicates the number of machines, and 0 is a virtual machine.
S: set of speeds, S = {1, 2, ..., d}, in which the index d denotes the number of processing speeds.
j: job index, j ∈ J.
i: machine index, i ∈ M.
k: position index, k ∈ {1, 2, ..., n}.
l: selected processing speed index, l ∈ S.
p_{ji}: normal processing time of job j on machine i.
a_{ji}: learning rate of job j on machine i.
b_{ji}: deteriorating rate of job j on machine i.
v_{il}: the l-th processing speed of machine i.
s_{ji}: setup time of job j on machine i.
e_{ji}: energy consumption per unit time in the processing phase if job j is processed on machine i.
f_{ji}: energy consumption per unit time in the setup phase if job j is processed on machine i.
P_{jki}: actual processing time of job j at the k-th position on machine i.
t_{jki}: start processing time of job j at the k-th position on machine i.
x_{jki}: x_{jki} = 1 if job j is processed at the k-th position on machine i; otherwise, x_{jki} = 0.
y_{jl}: y_{jl} = 1 if job j chooses the l-th processing speed; otherwise, y_{jl} = 0.
A viable scheduling scheme must obey the following constraints: 1) Each job chooses exactly one machine to perform its processing operations; 2) each machine can process only one job at a time; 3) no job can be interrupted, i.e., preemption is not allowed. Note that the actual processing time of jobs is calculated as shown in Eq. (1).
According to the above notations and definitions, the considered problem can be established as follows:
where the objective functions (2) and (3) minimize the makespan and the total energy consumption, respectively. Note that E(·) denotes the expected value of a random variable. Eq. (4) guarantees that at most one job can be processed at each position of a machine. Eq. (5) indicates that each job is allowed to choose only one position on one machine for processing. Eq. (6) shows that each job is processed with only one speed. Eq. (7) gives the actual processing time of jobs. Eq. (8) shows the energy consumption per unit time of jobs. Eq. (9) indicates that the start processing time of a subsequent job on each machine should be no smaller than the end time of the previous job. Eq. (10) means that the end time of each machine is no larger than the makespan. Eq. (11) restricts each decision variable to 0 or 1.
The multi-verse optimizer (MVO) is a global optimization algorithm recently put forward by Mirjalili et al. It primarily builds on three important cosmology concepts, i.e., the white hole, the black hole, and the wormhole. Based on these concepts, three mathematical models are employed to drive exploration and exploitation, respectively. More detailed descriptions of MVO can be found in the original publication.
Due to its advantages, such as easy implementation and powerful exploration and exploitation abilities, MVO has been successfully employed to solve many optimization problems [39–41]. Thus, it is a good candidate for dealing with the considered problem. Although the basic MVO was initially introduced for continuous, single-objective optimization problems, its variants have been applied to various discrete and multi-objective optimization problems and have shown excellent performance. Consequently, to make it suitable for handling the considered problem, a multi-objective MVO (MOMVO) incorporating a stochastic simulation approach is designed according to the problem's characteristics. The detailed descriptions of MOMVO are given below.
For solving the considered PMSP, three decisions must be made: job assignment among machines, job sequence on machines and job processing speed. In this work, we use a double-chain structure to represent a solution. In MOMVO, a solution is expressed by two integer strings O and V. O represents a job sequence in which each integer is a job index, and V is a speed vector in which V(q), q = 1, 2, ..., n, denotes the selected processing speed of the job at position q. Taking 20 jobs and 2 machines as an example, where each job has 5 available processing speeds, an individual can be expressed as: O = (7, 18, 16, 20, 5, 19, 14, 1, 13, 15, 8, 4, 11, 2, 12, 3, 6, 10, 9, 17) and V = (1, 4, 2, 3, 5, 3, 5, 2, 1, 1, 3, 1, 2, 2, 3, 1, 5, 3, 3, 1). It can be seen that V(1) = V(9) = V(10) = V(12) = V(16) = V(20) = 1, which represents that the corresponding jobs, i.e., jobs 7, 13, 15, 4, 3 and 17, are processed at speed 1.
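A minimal Python sketch of this double-chain encoding may help make it concrete. The helper names below are ours, not from the paper; the example vectors O and V are the ones given in the text, and the last line recomputes which jobs receive speed level 1.

```python
import random

def random_solution(n_jobs, n_speeds, rng=random.Random(0)):
    """Build a random double-chain solution: a job permutation O and a speed vector V."""
    order = list(range(1, n_jobs + 1))
    rng.shuffle(order)                                  # job sequence O
    speeds = [rng.randint(1, n_speeds) for _ in range(n_jobs)]  # speed vector V
    return order, speeds

# The worked example from the text: 20 jobs, 5 speed levels.
O = [7, 18, 16, 20, 5, 19, 14, 1, 13, 15, 8, 4, 11, 2, 12, 3, 6, 10, 9, 17]
V = [1, 4, 2, 3, 5, 3, 5, 2, 1, 1, 3, 1, 2, 2, 3, 1, 5, 3, 3, 1]

# Jobs whose selected speed level is 1.
jobs_at_speed_1 = [O[q] for q in range(len(O)) if V[q] == 1]
print(jobs_at_speed_1)  # → [7, 13, 15, 4, 3, 17]
```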
The decoding rule transforms a solution into a feasible schedule as follows: each job is allocated to the machine that becomes available earliest, and its actual processing time is calculated by Eq. (1). Note that the objective function values of a solution cannot be evaluated directly due to the jobs' random processing times. Hence, to estimate the solutions, we employ the well-known Monte Carlo simulation approach.
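The earliest-available-machine rule can be sketched as a standard list-scheduling decode. This is a simplification under stated assumptions: the function names are ours, and since Eq. (1) is not reproduced here, the actual processing time is approximated as the base time divided by the chosen speed, omitting the deteriorating/learning terms.

```python
def decode(order, speeds, base_time, speed_levels, n_machines):
    """List-scheduling decode: assign each job, in sequence order, to the
    machine that becomes available earliest. Processing time is simplified to
    base_time / speed (the paper's Eq. (1) would be used instead)."""
    free_at = [0.0] * n_machines
    schedule = []
    for q, job in enumerate(order):
        m = min(range(n_machines), key=lambda i: free_at[i])  # earliest machine
        p = base_time[job] / speed_levels[speeds[q] - 1]
        start = free_at[m]
        free_at[m] = start + p
        schedule.append((job, m, start, start + p))
    return schedule, max(free_at)  # schedule and makespan
```

For example, decoding order (1, 2, 3) on two machines with base times 3, 2, 1 and unit speed yields a makespan of 3.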
In MOMVO, the initial population consists of 100 randomly generated feasible solutions. Besides, to store the non-dominated solutions obtained during the search process, an external archive is maintained, and a Pareto dominance rule is employed to update it.
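A Pareto-dominance archive update of the kind described can be sketched as follows (function names are ours; the capacity-truncation policy when the archive overflows is simplified, whereas the paper would use crowding-based pruning):

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def update_archive(archive, candidate, capacity=100):
    """Insert candidate unless it is dominated; remove archived points it dominates."""
    if any(dominates(p, candidate) for p in archive):
        return archive                      # candidate is dominated: reject
    archive = [p for p in archive if not dominates(candidate, p)]
    archive.append(candidate)
    return archive[:capacity]               # simplified capacity handling
```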
Notice that the objective functions are expressed as expected values, which cannot be evaluated directly. Monte Carlo simulation is one of the most effective stochastic simulation methods and has been widely employed to deal with various stochastic problems. Therefore, it is adopted in this work to estimate the solutions. The objective function value is estimated by averaging over a number of replications. In each replication, regarded as one sample, the jobs' normal processing times are generated at random from their corresponding distributions. In theory, the true objective value would be achieved if the number of replications tended to infinity. However, computational resources are limited in practice. Thus, the number of replications is set to 10.
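This estimation step can be sketched as follows. The function and parameter names are ours; cv stands for the standard-deviation coefficient used in the experiment section, and evaluate is any deterministic objective (e.g., the makespan of the decoded schedule) computed on one sampled set of processing times.

```python
import random

def estimate_expected_objective(mean_times, cv, evaluate, n_reps=10,
                                rng=random.Random(1)):
    """Monte Carlo estimate of an expected objective: sample each job's
    processing time from Normal(mean, cv * mean), evaluate the objective on
    the sample, and average over n_reps replications."""
    total = 0.0
    for _ in range(n_reps):
        sample = {j: max(0.0, rng.gauss(mu, cv * mu))
                  for j, mu in mean_times.items()}
        total += evaluate(sample)
    return total / n_reps
```

With cv = 0 the sampling is degenerate and the estimate equals the deterministic objective value, which is a convenient sanity check.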
In the proposed MOMVO, we randomly select an individual in the current population as a black hole, and a binary selection approach, regarded as the wormhole mechanism, is employed to choose an individual as a white hole. Fig. 1 gives the pseudo-code of the selection approach, in which one candidate is a solution chosen from the external archive and the other is randomly chosen from the population. The two selected individuals, i.e., the black hole and the white hole, are regarded as parent individuals. This work uses genetic operations, including crossover and mutation, to generate new individuals. Let C1 and C2 be the two newly generated individuals. The crossover operation is performed as follows:
1) Select two parent individuals P1 and P2 by using the binary selection method.
2) Perform an order-based crossover to acquire the job permutations of the child individuals C1 and C2. First, randomly select two different points in P1. Then copy the partial permutation between the two points into C1 at the same positions. The remaining jobs are then appended in the order of their appearance in P2. The same procedure is performed on P2 to obtain the job permutation of C2.
3) Generate two binary strings B1 and B2 randomly. For each position of B1, if its value is 1, the corresponding speed value of P1 is inherited by C1; if its value is 0, the corresponding speed value of P2 is inherited by C1. Similarly, for each position of B2, if its value is 1, the corresponding speed value of P2 is inherited by C2; otherwise, the corresponding speed value of P1 is inherited by C2. In this way, the speed vectors V of the two newly generated individuals are produced.
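The two crossover steps above can be sketched as follows (function names are ours; cut points and the binary mask are passed in explicitly so the behavior is deterministic for illustration):

```python
def order_crossover(p1, p2, cut1, cut2):
    """Order-based crossover: copy the segment p1[cut1:cut2] into the child at
    the same positions, then fill the remaining slots with the other jobs in
    the order they appear in p2."""
    n = len(p1)
    child = [None] * n
    child[cut1:cut2] = p1[cut1:cut2]
    kept = set(child[cut1:cut2])
    fill = iter(j for j in p2 if j not in kept)
    for q in range(n):
        if child[q] is None:
            child[q] = next(fill)
    return child

def speed_crossover(v1, v2, mask):
    """Binary-mask crossover for speed vectors: take v1 where mask is 1, else v2."""
    return [a if b == 1 else c for a, b, c in zip(v1, mask, v2)]
```

For instance, crossing (1, 2, 3, 4, 5) with (5, 4, 3, 2, 1) at cut points 1 and 3 keeps the segment (2, 3) and fills the rest as 5, 4, 1, giving (5, 2, 3, 4, 1).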
The following mutation operations are performed, with a given probability, on each newly generated offspring individual. Different mutation operations are adopted for the O and V parts of an individual. For part O: (1) select two different points in the O string randomly; (2) exchange the genes at the two positions. For part V: (1) select one point in the V string randomly; (2) replace the gene at this point with a randomly generated number between 1 and 5.
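These two mutation moves can be sketched as one function (names are ours; the swap preserves the job permutation, and the speed reset stays within the available levels):

```python
import random

def mutate(order, speeds, n_speeds, prob, rng=random.Random(42)):
    """With probability prob, swap two jobs in O and reset one speed gene in V."""
    order, speeds = order[:], speeds[:]          # copy; do not modify parents
    if rng.random() < prob:
        a, b = rng.sample(range(len(order)), 2)  # two distinct positions in O
        order[a], order[b] = order[b], order[a]
        q = rng.randrange(len(speeds))           # one position in V
        speeds[q] = rng.randint(1, n_speeds)
    return order, speeds
```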
For an algorithm designed to tackle multi-objective problems, finding a solution set with outstanding approximation and distribution is desired. The rank and crowding distance methods can effectively balance approximation and distribution. Therefore, MOMVO combines the current population with the new individuals, and the top 100 individuals are selected as the next population by employing the aforementioned methods.
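The crowding-distance part of this selection can be sketched as below (a standard NSGA-II-style computation; the full selection also needs nondominated ranking, which is omitted here for brevity):

```python
def crowding_distance(points):
    """Crowding distance for a list of objective vectors in one front.
    Boundary points get infinite distance; interior points accumulate the
    normalized gap between their neighbors along each objective."""
    n = len(points)
    if n == 0:
        return []
    m = len(points[0])
    dist = [0.0] * n
    for obj in range(m):
        idx = sorted(range(n), key=lambda i: points[i][obj])
        lo, hi = points[idx[0]][obj], points[idx[-1]][obj]
        dist[idx[0]] = dist[idx[-1]] = float("inf")
        if hi == lo:
            continue
        for r in range(1, n - 1):
            dist[idx[r]] += (points[idx[r + 1]][obj] - points[idx[r - 1]][obj]) / (hi - lo)
    return dist
```

Individuals with larger crowding distance sit in sparser regions of the front and are preferred when truncating the combined population.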
In addition, MOMVO uses the newly obtained population to update the external archive, subject to its maximum capacity, by employing a Pareto dominance rule. The framework of the basic MOMVO is described by the pseudo-code in Fig. 2.
To test MOMVO’s effectiveness in coping with the problem studied, we choose two classical and popular multi-objective optimization methods, i.e., Nondominated sorting genetic algorithm II (NSGA-II)  and Multi-objective evolutionary algorithm based on decomposition (MOEA/D) , as peer algorithms to perform comparison experiments.
To validate the performance of MOMVO in solving the problem under consideration, we randomly generate 30 test problems as combinations of different numbers of jobs and machines. The normal processing times of jobs follow normal distributions: their mean processing time is a random number from 1 to 100, and their standard deviation is obtained by multiplying the mean processing time by a given coefficient. The setup times of jobs are randomly generated from 1 to 10. Each machine has five processing speed levels. The deteriorating and learning rates of jobs on machines are randomly generated from their respective ranges, the learning rate from [−0.1, 0]. The energy consumption per unit time of standby machines is set to a fixed value.
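An instance generator following this scheme might look as below. This is a sketch under stated assumptions: the function name, the default coefficient cv, the deteriorating-rate range and the concrete speed levels are ours, filling in values that the text does not reproduce.

```python
import random

def make_instance(n_jobs, n_machines, cv=0.2, rng=random.Random(7)):
    """Generate one test instance following the scheme in the text.
    cv, the deteriorating-rate range and the speed levels are assumptions."""
    return {
        "mean_time": [rng.uniform(1, 100) for _ in range(n_jobs)],       # means in [1, 100]
        "setup": [rng.uniform(1, 10) for _ in range(n_jobs)],            # setup in [1, 10]
        "learn": [[rng.uniform(-0.1, 0.0) for _ in range(n_machines)]    # learning in [-0.1, 0]
                  for _ in range(n_jobs)],
        "cv": cv,                                                        # std = cv * mean
        "speeds": [1.0, 1.2, 1.4, 1.6, 1.8],                             # assumed five levels
    }
```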
MOMVO, NSGA-II and MOEA/D share two parameters: the population size and the mutation probability. For the three algorithms, the population size is set to 100, and the mutation probabilities are 0.3, 0.3 and 1.0, respectively. The maximum capacity of the external archive is set to 100, and the number of fitness evaluations, 100mn, is chosen as the stopping condition, in which m and n are the machine number and job number, respectively.
Approximation and distribution are two key indicators for evaluating the multi-objective optimization algorithms’ performance. Therefore, this work employs C-metric by Deb et al.  and IGD-metric by Fu et al.  as performance metrics.
1) The C-metric is employed to measure the dominated percentage between two sets acquired by two optimizers. C(X, Y) represents the percentage of solutions in Y that are dominated by at least one solution in X, and is calculated as:

C(X, Y) = |{y ∈ Y | ∃ x ∈ X : x dominates y}| / |Y|,

where |Y| denotes the number of solutions in Y. Evidently, a larger C(X, Y) value means that X is better.
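The C-metric is straightforward to compute directly from its definition; a short sketch (function names are ours, objectives are minimized):

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def c_metric(X, Y):
    """C(X, Y): fraction of solutions in Y dominated by at least one solution in X."""
    return sum(1 for y in Y if any(dominates(x, y) for x in X)) / len(Y)
```

Note that C(X, Y) and C(Y, X) are not complementary in general, which is why both directions are reported for each pair of algorithms.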
2) The IGD-metric is adopted to assess both convergence and diversity of a solution set by measuring the average distance between an optimal solution set P* and an approximated solution set P. It is calculated as:

IGD(P*, P) = (1 / |P*|) Σ_{v ∈ P*} d(v, P),

where d(v, P) denotes the minimum Euclidean distance between a solution v in P* and the solutions in P. In fact, P* cannot be attained for a practical problem. Thus, all solutions acquired by MOMVO and its peers are combined, and the nondominated solutions among them are chosen as P*. Besides, all objective function values are normalized into [0, 1] before the IGD-metric is calculated. Obviously, the smaller the IGD value, the better the solution set.
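The IGD formula translates directly into a few lines of Python (function names are ours; normalization of objectives is assumed to have been done beforehand):

```python
import math

def igd(ref_set, approx_set):
    """Inverted generational distance: mean, over the reference set, of the
    Euclidean distance to the nearest point in the approximation set."""
    def nearest(v):
        return min(math.dist(v, u) for u in approx_set)
    return sum(nearest(v) for v in ref_set) / len(ref_set)
```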
In this part, MOMVO's performance in tackling the considered problem is validated. MOMVO and its peers are run 10 times on each problem, and the mean and variance values of the C-metric and IGD-metric are calculated.
Table 1 exhibits the experimental results of the C-metric, in which MOMVO, NSGA-II and MOEA/D are denoted by the indices "X", "Y" and "Z", respectively. It is worth mentioning that a larger mean value represents a better performance. Looking at the results, it is clear that MOMVO outperforms its peers, since its mean values are larger on most of the test problems. In addition, averaging over all test problems, the averages of C(X, Y), C(Y, X), C(X, Z) and C(Z, X) regarding the mean value are 0.6345, 0.1990, 0.7139 and 0.2082, and those regarding the variance are 0.1696, 0.0984, 0.1241 and 0.0813, respectively. Hence, on average, MOMVO dominates a much larger share of its peers' solutions than vice versa, although its variance values are slightly worse. From the above results, we can conclude that MOMVO has an obvious advantage in solving the considered problem.
Table 2 gives the experimental results of the IGD-metric. It is worth mentioning that a smaller mean value denotes a better approximation performance. We can observe that MOMVO has obvious advantages over its peers on most test cases. Averaging over all instances, the mean values of MOMVO, NSGA-II and MOEA/D are 0.3573, 0.4800 and 0.7456, while their variance values are 0.0383, 0.0482 and 0.0683, respectively. Hence, MOMVO outperforms NSGA-II and MOEA/D on average. As can be seen from the above analysis, MOMVO is superior to its peers in obtaining a better set of nondominated solutions.
To visually analyze the experimental results, we draw boxplot graphs of several test problems over 10 runs in Fig. 3. From them, we can find that the results obtained by MOMVO are better and more stable than those of its peers. Therefore, we can conclude that MOMVO is well-converged and well-diversified in solving the considered problem.
This work addresses a stochastic multi-objective parallel machine scheduling problem with deteriorating and learning effects to minimize the makespan and energy consumption. By fully considering the characteristics of the proposed problem, we design a multi-objective multi-verse optimization incorporating a stochastic simulation approach to deal with it. The experimental results show that it is an effective optimization algorithm for the considered problem. The results can assist decision-makers in making sound decisions under stochastic environments when handling a parallel machine scheduling problem with multiple optimization objectives and deteriorating and learning effects. The studied problem can help the manufacturing industry improve production efficiency and decrease energy consumption, thereby conserving resources and reducing environmental pollution.
In future work, we plan: 1) To propose a rescheduling approach based on the formulated model; 2) To study an effective approach to solve parallel machine scheduling problems according to the formulated model.
Funding Statement: The authors received no specific funding for this study.
Conflicts of Interest: The authors declare that they have no conflicts of interest to report regarding the present study.