Metaheuristics for Two-stage No-Wait Flexible Flow Shop Scheduling ...

4 downloads 0 Views 529KB Size Report
Abstract - The flexible flow-shop scheduling problem. (FFSSP) is an important branch of production scheduling, the flexible flow shop is a combination of two ...
Proceedings of the 2015 International Conference on Industrial Engineering and Operations Management Dubai, UAE, March 3 – 5, 2015

Metaheuristics for Two-stage No-Wait Flexible Flow Shop Scheduling Problem Mageed A. Ghaleba and Umar S. Suryahatmajab

Ibrahim M. Alharkan

Department of Industrial Engineering King Saud University Riyadh, 12372, KSA a [email protected] b [email protected]

Department of Industrial Engineering King Saud University Riyadh, 12372, KSA

Abstract - The flexible flow-shop scheduling problem (FFSSP) is an important branch of production scheduling, the flexible flow shop is a combination of two well-known machine environments, which are flow shops and parallel machines, and it is well known that this problem is NP-hard. The no-wait requirement is a phenomenon that may occur in flow shops, and it’s prevent the jobs to wait between two successive machines (or stages), jobs must be processed from the start to finish, without any interruption on machines (or stages) or between them. Many different approaches have been applied to FFSSP with no-wait. In this research, two-stage no-wait flexible flow shop scheduling problem (NWFFSSP) has been solved using two meta-heuristics, which are Tabu Search (TS) and Particle Swarm Optimization (PSO). We solved the NWFFSSP with minimum makespan as a performance measure. The performance of the proposed algorithms are studied and compared with previous research results using the same problem data. The results of the study proposes the effective algorithm. Index Terms - Two-stage Flexible flow shop, no-wait, Makespan, Tabu Search, Particle Swarm Optimization.

I. INTRODUCTION Generally, the flexible flow shop (FFS) scheduling problem is to optimize a certain objective function or functions of a set of n jobs and k stages in series (more than one machine in at least one stage). The FFS scheduling problem is quite common in practice, an especial case of hybrid flowshops is the no-wait requirement, this phenomenon not allowed the jobs to wait between any two successive stages [1]. The no-wait requirement implies that the starting time of a job at any stage has to be delayed to ensure that the job can go through the next stage without having to wait. In some industries, due to the temperature or other characteristics of the material it is required that each operation follow the previous one immediately. Such situations appear in the chemical processing [2], food processing [3], concrete ware production [4], pharmaceutical processing [5] and production of steel, plastics, and aluminum products [6]. An example of such environment is a steel rolling mill in which a slab of steel is not allowed to wait, as it would cool off during a wait [1]. If any waiting happened, the slab of steel will be either a rework or a scrap product. Thus, such requirement took a highly importance in the flexible flow shop machine environments.

978-1-4799-6065-1/15/$31.00 ©2015 IEEE

[email protected] Flexible flow shop scheduling problem (FFSSP) has attracted recently the attention of several researches, which resulted in several attempts to solve this problem. However, it was pointed out in the surveys by Goyal and Sriskandarajah [7] and Hall and Sriskandarajah [3] that the no-wait FFSSP has received less attention from both theoretical and practical aspects than classic FFSSP. The parallel machine environment as well as the flow shop are special cases of the FFS environment, and the parallel machine environment is already NP-hard [1]. The flexible flow shop scheduling problems (FFSSP) are, in most cases, NP-hard. For instance, FFSSP restricted to two processing stages, even in the case when one stage contains two machines and the other one a single machine, is NP-hard based on the results obtained by Gupta [8]. And according to the research work by Rock [9], the nowait FFSSP with more than two machines is strongly NP-hard. According to the previous research, it has been found that most of the attempts to solve the No-Wait FFSSP have used heuristic algorithms or specific heuristics and metaheuristics. Many research works were done using specific heuristic to solve the NWFFSSP, Ribas Vila [10] modeled the scheduling problem that comes from an ice cream manufacturing company as a three-stage no-wait hybrid flow shop with batch dependent setup costs. They formulated the problem as a mixed integer program and developed two competitive heuristic one of them proposed to be schedule in the ice cream factory. Xie et al. [11] developed a new heuristic algorithm (the minimum deviation algorithm - MDA) to solve the twostage no-wait flowshops with minimizing the makespan. The performance of the new heuristic has been evaluated and the results showed that the MDA is effective in most of the studied instances. Wang et al. [12] developed a heuristic with a tight error bound for solving the two-stage NWFFSSP with no idle time between two consecutive processed jobs on machines of the second stage. The objective was to minimize the makespan. Liu et al. [13] designed a greedy heuristic named least deviation algorithm to solve the two-stage no-wait hybrid flowshop scheduling problem with minimizing the makespan. The results showed that the least deviation algorithm outperforms other algorithms used to solve such problem in most cases tested in this study. Metaheuristics have been used in solving the NWFFSSP in many research papers, Jolai et al. [14] considered a no-wait flexible flow shop-scheduling problem with sequence

dependent setup times to minimize the makespan. They proposed and used three metaheuristic algorithms to solve this problem, namely population based simulated annealing (PBSA), adapted imperialist competitive algorithm (AICA) and hybridization of adapted imperialist competitive algorithm and population based simulated annealing (AICA&PBSA). Then they evaluated the proposed algorithms and the computational evaluations showed that the hybrid algorithm (AICA&PBSA) was superior to other algorithms. Rabiee et al. [15] solved a no wait two-stage flexible flow shop-scheduling problem with a minimum makespan as a performance measure using SA and GA algorithms. The performance of the used metaheuristics were compared with the performance of MDA (Minimum Deviation Algorithm). The result showed that SA proposed to be more efficient than the other algorithms (MDA and GA) in term of minimizing makespan. Shafaei et al. [16] discussed the no-wait two-stage flexible flow shop with minimizing mean flow time as a performance measure. The performance of six developed metaheuristics algorithms (Hybrid Imperialist Competitive Algorithm (HICA), Stochastic Imperialist Competitive Algorithm (SICA), Hybrid Ant Colony Optimization (HACO), Stochastic Ant Colony Optimization (SACO), Hybrid Particle Swarm Optimization (HPSO) and Stochastic Particle swarm optimization (SPSO)) was evaluated. The results showed that HICA significantly outperforms other algorithms while SPSO yielded the lowest CPU time. However, most of the published research about TS is for classic FFSSP, while a few number of research can be found for no-wait FFSSP. Moreover, most of the published research about PSO is for continuous optimization problems, while little research can be found for combinatorial optimization problems, especially on scheduling problems. Obviously, it is a challenge to employ the algorithms in different areas of problems other than those areas that the inventors originally focused on. To the best of our knowledge, there is no published work for dealing with the two-stage no-wait FFSSP with makespan using TS and PSO. In this paper, we will propose the TS and different approaches of the PSO for the no-wait FFSSP with the makespan criterion, in which the purpose is to find an efficient meta-heuristic that solve the problem. The proposed PSO approaches are the discrete particle swarm optimization (DPSO) [17], similar particle swarm optimization (SPSO) [18] and geometric particle swarm optimization (GPSO) [19]. The remaining content of this paper are organized as follows. In section 2, the no-wait FFSSP is introduced. In section 3, the TS and PSO are proposed after presenting the starting solution chosen and the constructive heuristic. In section 4, experimental study is presented with the chosen instances, parameter tuning, comparison process and the results. Finally, in section 5, we end the paper with some conclusions and future work propositions. II. THE TWO-STAGE NO-WAIT FLEXIBLE FLOW SHOP The flexible flow shop is a machine environment with identical stages in series; at stage , 1, … , , there are

machines in parallel. There is a no-wait restriction between any two successive stages. Job , 1, … , , has to be processed at each stage on one machine, any one will do (At any moment of time, each machine can processed at most one job and similarly, each operation of a job can only be processed on one machine). The processing times of job at the various stages are , ,… , . In this research, we will work on the two-stage FFSSP only. The following figure (Fig. 1) explains the two-stage FFSSP.

Fig. 1 Flexible flow shop environment description.

To fulfill the restrictions a constructive heuristic is developed to make sure that the completion time of a job at any stage must be equal to the starting time of that job on the next stage. The problem is to find a sequence that the maximum completion time, i.e., makespan , is minimized. To compute the makespan and to ensure the feasibility of the generated solutions (Schedules) a constructive heuristic is used. The problem can be described using Graham notation [20] | | | | as follows: , where k is the number of stages. The parallel machine environment as well as the flow shop are special cases of the FFS environment, and the parallel machine environment is already NP-hard [1]. The flexible flow shop scheduling problems (FFSSP) are, in most cases, NP-hard. For instance, FFSSP restricted to two processing stages, even in the case when one stage contains two machines and the other one a single machine, is NP-hard based on the results obtained by Gupta [8]. And according to the research work by Rock [9], the no-wait FFSSP with more than two machines is strongly NP-hard. III. SOLUTION METHODS The proposed solution methods are based on TS and PSO approaches. They are categorized based on the algorithms applied for job sequencing and machine assignment. With reference to the algorithms presented in this paper, the job sequences are generated using the stochastic search process of TS and PSO. The jobs are then allocated to the machines using a constructive heuristic algorithm, which allocates the job with the first priority to the machine with the earliest available time. In the rest of this section, the starting solution and the constructive heuristic are explained. Then the structure of the Tabu Search and the different approaches of the PSO are presented. This is followed by a summary of the algorithms

parameters that will be used in the tuning process in the following section. A. Starting Solution and Constructive Heuristic Since the proposed algorithms are improvement type algorithms, a starting solution is necessary to be the start out complete schedule for the used metaheuristics. Then an iterative improvement approach will be applied to the starting solution until a high quality feasible solution is obtained. The constructive heuristic was developed to make sure that the completion time of a job on the first stage must be equal to the starting time of that job on the second stage, which is the fulfillment of the restrictions.

ƒ •

A.2 The constructive heuristic To construct a solution for such type of environment for the given constraint, which is the no-wait constraint, the most important points that should be considered in constructing a solution are: • The jobs are assigned according to the availability of the machines in each stage. • If the machine in the following stage will be busy, the job will not be processed until that machine will be idle (which means that the processing of the job in the assigned machine will not start until we assure that there is an idle machine in the following stage). • The time that the job will take in the machine waiting for the machine in the next stage to be idle (for satisfying the no-wait constraint), will be added to its processing time in that stage. The steps of the constructive heuristic for solving such type of environment for the given constraint (the no-wait constraint) are as follows: •

Step 1: Initialization o Select a starting

1, 2, … ,

sequence,

and

set

.

to zero, Set machines availability time where 1, … , , is the no. stages. ƒ Always choose the minimum available time and its machine at each stage for each job. o For stage 1: , . , . o For stage 2: will be, at stage o Starting time for each job 1 and at stage 2. Step 2: o For each stage 1 and 2, find and . o Starting with the first job in the initial sequence: o



,

and add the job

to

Step 3: as the starting time o For each job , set as the machine that processing that job. and o For the first stage: The completion . time o For stage 2: ƒ ƒ o

A.1 Starting solution The starting sequence in this study was generated using two dispatching rules shortest processing time (SPT) and longest processing time (LPT), and the schedule has been done using the developed constructive heuristic. The sequence with the minimum value has been chosen as our starting solution.

Set the .



. And for the machine that process this job but at stage 1: ƒ The for all jobs that follow this job and has been processed by the same machine will be updated by this value . Repeat this step until then go to Step 4.

o Step 4: o If then stop and compute the value of . o Otherwise, go to Step 2.

B. Tabu Search (TS) Tabu search algorithm was proposed by Glover [21]. In 1986, he pointed out the controlled randomization in SA to escape from local optima and proposed a deterministic algorithm [21]. In a parallel work, a similar approach named “steepest ascent/mildest descent” has been proposed by Hansen [22]. In the 1990s, the tabu search algorithm became very popular in solving optimization problems in an approximate manner. Nowadays, it is one of the most widespread S-metaheuristics. The use of memory, which stores information related to the search process, represents the particular feature of tabu search. Usually, the whole neighbourhood is explored in a deterministic manner. To avoid cycles, TS manages a memory of the solutions or moves recently applied, which is called the tabu list. This tabu list constitutes the short-term memory. At each iteration of TS, the short-term memory is updated. The algorithm in Table I (Algorithm 1) explains how TS works. TABLE I ALGORITHM 1 TS ALGORITHM Algorithm 1: Template of Tabu Search algorithm ; %Initial solution Initialize the tabu list, medium-term and long-term memories; Repeat Find best admissible neighbour ; %non tabu or aspiration criterion holds ; Update tabu list, aspiration conditions, medium and long term memories; If _ holds Then intensification; If _ holds Then diversification; Until Stopping criteria satisfied Output: Best solution found.

According to Laporte et al. [23], TS-baseed methods have been the most successful metaheuristics for a wide variety of problems, especially for solving the vehicle routing problem. In general, Reinelt [24] stated that the basic TS difficulties include the tabu list design, the mecchanism of list management, and the non-prohibited move sellection.

C. Particle Swarm Optimization Particle swarm optimization is an algoritthm proposed by Kennedy and Eberhart [25] in 1995, motivateed by such social behaviours of organisms, as the flocking toggether of birds or the schooling together of fish. Particle swarm m optimization as an optimization tool provides a populatioon-based search procedure in which individuals (here callled “particles”) change their positions (states) with time. Inn a PSO system, particles move around in a multidimensionnal search space. During its move, each particle adjusts its position according to its own experience and the experience of a neighbouring particle, making use of the best position encoountered by itself and its neighbour [26]. For example, it can bee observed that a fish school is able to avoid a predator in the foollowing manner: initially it is divided into two groups, then thhe original school is reformed (see Fig. 2), while maintaininng the cohesion among the school [27].

In the basic model, a swarm co onsists of particles flying around in a dimensional search space. Each particle is a candidate solution to the problem, and is represented by the vector in the decision space. A particle p has its own position and velocity, which means the flyin ng direction and step of the particle (Fig. 3). Optimization takes advantage of the cooperation between the particlees. The success of some particles will influence the behav viour of their peers. Each particle successively adjusts its po osition toward the global optimum according to the follow wing two factors: the best position visited by itself denoted as , ,..., and the best position visited by the whole swarm (or , the best position for a given subset , ,..., . The vector of the swarm) denoted as represents the differen nce between the current position of the particle and the best position of its neighbourhood [28]. The algorithm in Table II (Algorrithm 2) explains how the particles updates process works in PSO. P TABLE II ALGORITHM 2 BASIC PSO O ALGORITHM Algorithm 2: Template of Particle Swarm Optimization O algorithm Random initialization of the whole swarm Repeat Evaluate For all particles Update velocities: 1

1

1

Fig. 2 a schematic of a fish school avoiding a predator. ((a) The school forms only one group, (b) the individuals avoid the predator by forming a “fountain” like structure, and (c) the school is reformeed [27].

Thus, a PSO system combines local searrch methods with global search methods, attempting to balancee exploration and exploitation. Originally, PSO has been succeessfully designed for continuous optimization problems. Eachh particle in the swarm occupies a position and has a velocity or rate of change in position (Fig. 3). Each particle knows its current position and velocity as well as the position and veloocity of the other particles [28].

Fig. 3 Particle swarm with their associated positions andd velocities. At each iteration, a particle moves from one position to another inn the decision space. PSO uses no gradient information during the seearch [28].

C.1 Basic PSO (for continuous optimization)

Move to the position: If , If , Update , End For Until Stopping criteria Output: Best solution found.

1

C.2 PSO for No-Wait FFSSP Unlike discrete optimization algorithms, a PSO algorithms are applied traditionally to continuo ous optimization problems. Some adaptations must be made for discrete optimization problems. They differ from contin nuous models in “mapping between particle positions and discrete solutions” and “velocity models” [28]. Many discrete representations such as binary encodings [29] and permutations can be ussed for mapping between particle positions and discrete so olutions particle position. Velocity models for discrete optimiization problems have been generally inspired from mutation and a crossover operators of evolutionary algorithms (EAs). A key k issue in designing PSO for the combinatorial optimization (i.e., scheduling problems) is to establish a suitable way to o encode a solution (i.e., schedule) to PSO particle. PSO has been applied to o different combinatorial optimization problems. For examp ple, the traveling salesman problem [30, 31], machine sch heduling [32, 33], train scheduling [34], and vehicle routin ng [35]. Many researchers suggested different ways to encodee a schedule (or generally a permutation) to PSO particle. For example, discrete particle swarm optimization (DPSO) [17], similar particle swarm

optimization (SPSO) [18], and geometric particle swarm optimization (GPSO) [19]. In this research, DPSO, SPSO, GPSO are coded and applied to the NWFFSSP. D. TS and PSO Design Parameters In TS, in addition to the search components of local search, such as the representation, neighbourhood, initial solution, we have to define the following concepts that compose the search memory of TS: tabu list size, inner step size. In the same way for PSO, in addition to the swarm size the crossover operators, crossover ratio, and mutation operators were defined. Table III summarizes the main parameters of TS and PSO algorithms. The used crossover operators are PMX, CX, and PTL. The used mutation operators are Swap and Insertion. For both TS and PSO, the number of iterations were set to 100, 200 or 300 iterations. TABLE III PARAMETERS TUNING FOR THE PROPOSED ALGORITHMS Algorithm Parameters Values Tabu list 5, 10, 15, 20, 25 Inner step (Neighborhood size)* 2, 5, 10, 20, 50 TS Neighborhood generation Swap method Number of Iterations 100, 200, 300 Swarm size 20, 50, 70, 100 Crossover probability 0.3, 0.5, 0.8, 0.9 Crossover operators PMX, CX, PTL Mutation operators Swap, Insertion DPSO Number of Iterations 100, 200, 300 w 0.1, 0.3, 0.5, 0.8, 1 C1 0.1, 0.3, 0.5, 0.8, 1 C2 0.1, 0.3, 0.5, 0.8, 1 PSO Swarm size 20, 50, 70, 100 Crossover probability 0.3, 0.5, 0.8, 0.9 SPSO Crossover operators PMX, CX, PTL Mutation operators Swap, Insertion Number of Iterations 100, 200, 300 Swarm size 20, 50, 70, 100 Crossover operators 3PMBCX GPSO Mutation operators Swap, Insertion Number of Iterations 100, 200, 300 *Neighborhood size is 2, swarm size in PSO.

, where n = number of jobs in TS and

According to the chosen parameters values a simulation runs have been established to find the best parameters setting for each algorithm. Factorial designs have been used (a design of experiments approaches), to develop a set of runs for each algorithm (full factorial design in TS and GPSO, and fractional in DPSP and SPSO). Solving 18 randomly chosen instances will result in a number of 1350 runs for TS, 1800 runs for DPSO, 1800 runs for SPSO, and 432 runs for GPSO. Table IV summarize the best-found setting of parameters for each algorithm. IV. EXPERIMENTAL STUDY In this section, we studied the performance of the proposed algorithms according to the best-found setting resulted from the tuning process. The performance of the proposed algorithms studied by solving a set of instances that have been solved before in the research work of [11] and [15].

The set of instances classified into large and small-scale problems, for each problem (small or large-scale) there is three different instances according to the number of machines at each stage. For example, for the small-scale problem of 10 jobs, there is 10a, 10b and 10c, 10a is the instance of 10 jobs with ‘3’ machines in the first stage and ‘4’ machines in the second stage. Table V list all the instances used in the performance evaluation of the proposed algorithms. TABLE IV PARAMETERS SETTINGS FOR THE PROPOSED ALGORITHMS Values for Values for Algorithm Parameters Small-Scale Large-Scale Problems Problems Tabu list 10 15 Inner step (Neighborhood 20 50 size)* TS Neighborhood generation Swap Swap method Number of Iterations 500 700 Swarm size 50 70 Crossover probability 0.8 0.8 Crossover operators CX CX Mutation operators Insertion Insertion DPSO Number of Iterations 200 300 w 1 1 C1 0.8 0.8 C2 0.8 0.8 PSO Swarm size 50 70 Crossover probability 0.8 0.8 SPSO Crossover operators PTL CX Mutation operators Insertion Swap Number of Iterations 200 300 Swarm size 50 70 Crossover operators 3PMBCX 3PMBCX GPSO Mutation operators Swap Swap Number of Iterations 200 300

A. The Evaluation Process In order to evaluate the performance of the proposed algorithms, comparisons with different algorithms, used to solve the same problem with the same set of data, have been made. TABLE V THE USED INSTANCES Small-Scale Problems Large-Scale Problems No. M/Cs each No. M/Cs each No. Jobs No. Jobs stage stage j S1 S2 j S1 S2 8a 3 4 72a 8 10 8

10

14

16 20

8b

2

2

8c 10a

3 3

2 4

10b

2

2

10c

3

2

14a

3

4

14b

2

2

14c 16a

3 3

2 4

16b

2

2

16c

3

2

20a

3

4

72

80

88

108 120

72b

10

10

72c 80a

12 8

10 10

80b

10

10

80c

12

10

88a

8

10

88b

10

10

88c 108a

12 8

10 10

108b

10

10

108c

12

10

120a

8

10

24

20b

2

2

120b

10

10

20c 24a 24b 24c

3 3 2 3

2 4 2 2

120c 132a 132b 132c

12 8 10 12

10 10 10 10

132

The compared algorithms are the MDA (Minimum Deviation Algorithm), which is considered as an efficient algorithm to solve the two-stage no-wait FFSSP and developed by [11], simulated annealing SA and genetic algorithm GA which have been proposed by [15]. TABLE VI MAKESPAN VALUES FOR SMALL-SCALE INSTANCES No. M/Cs Literature No. Jobs Proposed Algorithms each Algorithms stage PSO j S1 S2 MDA SA GA TS

8

DPSO

SPSO

8a

3

4

100

78

78

76

76

76

GPSO

76

8b

2

2

121

116

116

116

116

116

116

10

3

2

126

115

115

128

115 104

115

4

115 104

115

3

101

101

101

101

10b

2

2

165

150

153

150

150

150

150

10c 14a

3 3

2 4

154 145

144 108

143 109

143 107

143 107

143 107

143 107

14b

2

2

188

157

160

156

156

156

156

14c

3

2

137

133

133

132

132

132

131

TABLE VII MAKESPAN VALUES FOR LARGE-SCALE INSTANCES No. M/Cs Literature No. Jobs Proposed Algorithms each Algorithms stage PSO S S MD j SA GA TS DPS SPS GPS 1 2 A O O O 1 210 182 185 178 72a 8 1835 1815 1803 0 6 2 6 8 72

14

8c 10a

is the solution by the proposed algorithm Where and is the best solution from the literature. If the POI value is positive that means we have an improvement, if zero means no improvement, and if negative that means we have worse solution. B. Computational Results Review of literature reveals that three algorithms solved the same problem data that we have. These algorithms are the minimum deviation algorithm (MDA) which developed by [11] and SA and GA which proposed by [15]. Thus, in our study the performance of the proposed algorithms were compared with each other and with the algorithms from the literature for 36 different problems. Simulation results are listed in the Tables VI and VII for the both small and largescale problems.

72b

1 0

1 0

178 2

150 0

160 2

148 4

72c

1 2

1 0 1 0

157 1 226 1

139 5 195 9

146 8 201 4

138 7 193 7

1569

1530

1493

1428

1420

1395

1977

1961

1958

1665

1664

1646

1575

1545

1525

2225

2209

2193

1860

1851

1820

1746

1723

1699

2486

2468

2455

4

171

117

118

117

122

122

117

2

2

218

175

179

175

183

182

175

16c 20a

3 3

2 4

170 214

156 174

156 177

156 173

165

165 174

156 173

80a

8

173

20b

2

2

300

261

269

259

263

262

259

80b

20c

3

2

247

234

236

236

236

236

236

1 0

1 0

190 1

163 4

172 4

160 3

24a 24b 24c

3 2 3

4 2 2

216 305 273

194 289 254

196 304 256

194 287 254

194 295 254

194 296 255

194 290 254

80c

1 2

88a

8

1 0 1 0

171 9 254 3

154 6 221 4

160 9 226 8

151 0 218 1

The simulation model for the proposed algorithms was developed using MATLAB R2012a on a laptop with a Core i5 2.5 GHz Processor and 6 GB RAM. The performance of the proposed algorithms was testified by solving 36 different instances (18 small-scale and 18 large-scale instances). Regarding the performance measure, two measures have been used in the evaluation process, the first is the Relative Percentage Deviation (RPD) of Makespan over the best solution, which calculated as follows: | | 100%

88b

1 0

1 0

215 3

184 0

195 0

179 2

88c

1 2

108 a

8

1 0 1 0

185 4 265 5

169 1 245 4

179 9 255 0

168 9 244 4

108 b

1 0

1 0

230 3

207 4

223 1

203 0

2102

2076

2057

108 c 120 a

1 2

1 0 1 0

212 8 300 5

196 6 278 4

207 5 294 9

195 2

1996

1970

1948

278 0

2831

2806

2787

120 b

1 0

1 0

258 2

245 0

260 9

236 2

2428

2432

2386

120 c 132 a 132

1 2

1 0 1 0 1

247 0 349 6 303

232 1 315 3 265

243 9 328 2 285

231 1 314 3 259

2345

2341

2318

3185

3166

3153

2687

2649

2630

20 24

108 120 132

100%

88

is the solution by the proposed algorithm Where and is the best value among all the algorithms (literature and proposed). The second is the Percent of Improvement (POI) of Makespan over the compared algorithms from the literature, which calculated as follows:

80

3

16

16a 16b

8

8 1

b 132 c

0 1 2

0 1 0

1 271 5

6 249 3

0 261 1

3 246 5

25539

2486

2476

After reviewing the results in Table VI, it is shown that TS outperforms SA in 9 out of 18 innstances, GSPO outperforms it in 8 out of 18, while both outpeerforms GA in 13 instances, and MDA in all instances. Which iindicates that the TS and GPSO algorithms reached better resuults in about 50% of the solved instances. In addition, RPD performance meassure with %95 confidence interval was calculated as shownn in Fig. 4 and it indicates that the results obtained by the algoorithms has nonoverlapped ranges and TS and GPSO havee the lowest and approximately equally confidence interval whhich confirms the superiority of the TS and GPSO compareed to the other algorithms. Moreover, POI performance meeasure with %95 confidence interval was calculated as shownn in Fig. 5 and it gives the same indications as the RPD perforrmance measure, the TS and GPSO have the highest and approoximately equally confidence interval which also confirms the ssuperiority of the TS and GPSO compared to the other algorithm ms. Interval Plot of MDA, SA, GA, TS, DPSO, SPSO, G GPSO

Fig. 5 Means and interval plots for small-sscale instances in terms of POI.

V. CONCLU USIONS In this research, a flexible flow shop-scheduling problem introduced and a special case of su uch environment, which is the No-Wait flexible flow shop scheduling problems, is introduced as our aim of this researrch. The research work has been done on only a two-stage no-wait FFSSP, in which TS and PSO with three versions DPSO, SPSO and GPSO have been applied to solve the problem.

95% CI for the Mean

25

Interval Plot of MDA, SA, GA, TS,, DPSO, SPSO, GPSO 95% CI for the Me ean 20

16 14 12

10

10 Data

Data

15

5

8 6

0

4 MDA

SA

GA

TS

DPSO

SP SO

GPSO

Fig. 4 Means and interval plots for small-scale instancees in terms of RPD.

2 0 MDA

After reviewing the results in Table VII,, it is shown that TS outperforms SA in 17 out of 18 innstances, GSPO outperforms it in 12 out of 18, while both outpperforms GA and MDA in all instances. Which indicates that thhe TS and GPSO algorithms reached better results in more thann 90% in TS and 60% in GPSO for the solved instances. Also in term of RPD and POI performannce measures and with %95 confidence interval the range of thee results obtained is not overlapped, and with the lower confiddence interval in term of RPD and the highest confidence intterval in term of POI for the TS, this confirms the superioority of the TS compared to the other algorithms (Fig. 6 and F Fig. 7). Thus, as a result, the proposed TS can be considered as an effective and efficient algorithm for solving tthe two-stage nowait FFSSP with minimum makespan. Alsoo, GPSO can be considered as the best PSO approach ffor solving the scheduling problems in general and NWFFS SSP in particular with not to forget to mention that GPSO com mes second after TS compared to the other algorithms.

SA

GA

TS

DPSO

SPSO

GPSO

Fig. 6 Means and interval plots for large-sscale instances in terms of RPD.

From the computational resultts, it is shown that TS and GPSO algorithms reached better reesults in more than 50% of the solved instances in small-scale problems. But, more than 90% and 60 for TS and GPSO respectively r for large-scale problems. From the shown figures, some overlay is seen between the algorithms but they sho ow that both TS and GPSO have the lowest variation as their ranges are the narrowest one. This indicates that the proposeed TS can be considered as an effective and efficient algorithm m for solving the two-stage no-wait FFSSP with minimum mak kespan. Also GPSO can be considered as the best PSO approach a for solving the scheduling problems in general and NWFFSSP in particular, with not to forget to mention that GPSO comes second after TS compared to the other algorithm ms, which are MDA, SA and GA from the literature, DPSO and d SPSO from the proposed PSO approaches.

[11]

[12]

[13] Fig. 6 Means and interval plots for large-scale instancees in terms of POI.

Thus, for the future work, the perfoormance of the proposed TS and GPSO is very promising annd encouraging to investigate these solution methods for largerr problems (more than two stages) and utilize other metaheuuristics (i.e., Ant Colony Optimization or Firefly Algorithm m) to solve the NWFFSSP.

[14]

[15]

REFERENCES [1] [2]

[3]

[4] [5]

[6]

[7]

[8] [9] [10]

M. L. Pinedo, Scheduling - Theory, Algorithms, and Systems vol. 4. New York, USA: S Springer Science Business Media, LLC, 2012. C. Rajendran, "A no-wait flowsshop scheduling heuristic to minimize makespan," Journal of the Operational Research Society, vol. 45, pp. 472-478, 1994. N. G. Hall and C. Sriskandarajahh, "A survey of machine scheduling problems with bblocking and nowait in process," Operations researrch, vol. 44, pp. 510-525, 1996. J. Grabowski and J. Pempera, "Sequencing of jobs in some production system," Europeean Journal of Operational Research, vol. 125, pp. 5535-550, 2000. W. Raaymakers and J. Hoogeveeen, "Scheduling multipurpose batch process industriies with no-wait restrictions by simulated annealling," European Journal of Operational Research, vool. 126, pp. 131151, 2000. T. Aldowaisan and A. Allahverdi, "New heuristics for< i> m-machine no-wait flowsshop to minimize total completion time," Omega, vol. 32, pp. 345-352, 2004. C. Sriskandarajah and S. P. Setthi, "Scheduling algorithms for flexible flowshops: w worst and average case performance," European Journaal of Operational Research, vol. 43, pp. 143-160, 19899. J. N. Gupta, "Two-stage, hybrid flow wshop scheduling problem," Journal of the Operational Research Society, pp. 359-364, 1988. H. Röck, "The three-machine no-w wait flow shop is NP-complete," Journal of the ACM ((JACM), vol. 31, pp. 336-345, 1984. I. Ribas Vila and R. Companys Pascual, "A hybrid flow shop model for an ice crream production scheduling problem," Journal of Industrial

[16]

[17]

[18]

[19]

[20] [21] [22]

[23]

Engineering and Manageement, vol. 2, pp. 60-89, 2009. J. Xie, W. Xing, Z. Liu,, and J. Dong, "Minimum deviation algorithm for tw wo-stageno-wait flowshops with parallel machines," Computers C & Mathematics with Applications, vol. 47, pp. 1857-1863, 2004. Z. Wang, W. Xing, and F. Bai, "No-wait flexible flowshop scheduling with w no-idle machines," Operations Research Lettters, vol. 33, pp. 609-614, 2005. Z. Liu, J. Xie, J. Li, and J. Dong, "A heuristic for two-stage no-wait hybrid flowshop f scheduling with a single machine in either staage," Tsinghua Science and Technology, vol. 8, pp. 43--48, 2003. F. Jolai, M. Rabiee, and H. H Asefi, "A novel hybrid meta-heuristic algorithm for f a no-wait flexible flow shop scheduling problem with sequence dependent setup times," Internation nal journal of production research, vol. 50, pp. 7447 7-7466, 2012. M. Rabiee, P. Ramezan ni, and R. Shafaei, "An efficient simulated anneaaling algorithm for a NoWait Two Stage Flexiblle Flow Shop Scheduling Problem," International Journal of Advanced Information Technology (IJAIT), ( vol. 1, pp. 13-28, 2011. R. Shafaei, N. Moradiinasab, and M. Rabiee, "Efficient meta heuristicc algorithms to minimize mean flow time in no-waitt two stage flow shops with parallel and identical machines," International Society of Management Science and Engineering Management, vol. 6, pp. 42 21-430, 2011. M. Fatih Tasgetiren, Q.-K K. Pan, P. Suganthan, and Y.-C. Liang, "A discreete differential evolution algorithm for the no-w wait flowshop scheduling problem with total flowtime criterion," in Computational Intelligence in Scheduling, 2007. m on, 2007, pp. 251-258. SCIS'07. IEEE Symposium Z. Lian, X. Gu, and B. Jiao, "A similar particle swarm optimization alg gorithm for permutation flowshop scheduling to o minimize makespan," Applied Mathematics and Computation, vol. 175, pp. 773-785, 2006. A. Moraglio, C. Di Chio,, J. Togelius, and R. Poli, "Geometric particle swarm m optimization," Journal of Artificial Evolution and Applications, A vol. 2008, p. 11, 2008. P. Brucker, Scheduling Alg gorithms: Springer, 2007. F. Glover, "Tabu search-p part I," ORSA Journal on computing, vol. 1, pp. 190--206, 1989. P. Hansen, "The steepesst ascent mildest descent heuristic for combinatorial programming," in Congress on numerical methods in combinatorial optimization, Capri, Italy, 1986, pp. 70-145. G. Laporte, M. Gendreau, J. Y. Potvin, and F. Semet, "Classical and modern heuristics h for the vehicle routing problem," Interrnational transactions in operational research, vol. 7, pp. 285-300, 2000.

[24] [25]

[26] [27] [28] [29]

[30]

[31]

[32]

[33]

[34]

[35]

G. Reinelt, The traveling salesman: computational solutions for TSP applications: Springer-Verlag, 1994. J. Kennedy and R. C. Eberhart, "Particle swarm optimization," presented at the IEEE International Conference on Neural Networks, Perth, Australia, 1995. R. J. Moraga, G. W. DePuy, and G. E. Whitehouse, "Metaheuristics- A Solution Methodology for Optimization Problems." J. Dreo, A. Petrowski, and E. Taillard, "Metaheuristics for Hard Optimization," 2006. E.-G. Talbi, Metaheuristics: from design to implementation vol. 74: John Wiley & Sons, 2009. J. Kennedy and R. C. Eberhart, "A discrete binary version of the particle swarm algorithm," in Systems, Man, and Cybernetics, 1997. Computational Cybernetics and Simulation., 1997 IEEE International Conference on, 1997, pp. 4104-4108. G. Onwubolu and M. Clerc, "Optimal path for automated drilling operations by a new heuristic approach using particle swarm optimization," International journal of production research, vol. 42, pp. 473-491, 2004. K.-P. Wang, L. Huang, C.-G. Zhou, and W. Pang, "Particle swarm optimization for traveling salesman problem," in Machine Learning and Cybernetics, 2003 International Conference on, 2003, pp. 15831585. M. F. Tasgetiren, M. Sevkli, Y.-C. Liang, and G. Gencyilmaz, "Particle swarm optimization algorithm for permutation flowshop sequencing problem," in Ant Colony Optimization and Swarm Intelligence, ed: Springer, 2004, pp. 382-389. W. Xia, Z. Wu, W. Zhang, and G. and Yang, "Applying particle swarm optimization to job-shop scheduling problem," Chinese J. Mech. Eng. (English Edition), vol. 17, pp. 437–441, 2004. C. S. Chang and C. M. Kwan, "Evaluation of evolutionary algorithms for multi-objective train schedule optimization," in AI 2004: Advances in Artificial Intelligence, ed: Springer, 2005, pp. 803815. Y. W. Zhao, B. Wu, W. Wang, Y. L. Ma, W. Wang, and H. Sun, "Particle swarm optimization for vehicle routing problem with time windows," in Materials Science Forum, 2004, pp. 801-805. BIOGRAPHY

Ibrahim Alharkan is an Associate Professor and Chairman of deanship of admission at King Saud University, Research interests are in the areas of production planning and control, modeling and simulation of industrial and service systems, and applied operations research. These areas of interest include production planning and control, inventory control, production sequencing, scheduling and lot sizing; expert system; simulated annealing algorithms; genetic algorithms; Tabu

search; scatter search algorithms, and total quality management, quality control, maintenance planning and scheduling, project scheduling. Mageed A. Ghaleb is a researcher and a master student at King Saud University (KSU), in industrial engineering department. His area of expertise is industrial and manufacturing systems. He received the BSc in Industrial Engineering from Taiz University, Taiz, Yemen. Umar S. Suryahatmaja is a researcher and a master student at King Saud University (KSU), in industrial engineering department. His area of expertise is industrial and manufacturing systems.