
Stochastic Scheduling for a Network of MEMS Job Shops

Amrusha Varadarajan

Dissertation submitted to the faculty of the Virginia Polytechnic Institute and State University

in partial fulfillment of the requirements for the degree of

Doctor of Philosophy In

Industrial and Systems Engineering

Dr. Subhash C. Sarin, Chair Dr. Michael P. Deisenroth

Dr. Barbara Fraticelli Dr. Robert W. Hendricks

Dr. Robert H. Sturges

November 29, 2006 Blacksburg, Virginia

Keywords: Stochastic Programming, L-shaped Method, Feasibility Cuts, Reentrant Flow, Deadlock prevention, Heuristic, Dynamic Scheduling

© 2006, Amrusha Varadarajan


Stochastic Scheduling for a Network of MEMS Job Shops

Amrusha Varadarajan

ABSTRACT

This work is motivated by the pressing need for operational control in the fabrication of

Microelectromechanical systems or MEMS. MEMS are miniature three-dimensional

integrated electromechanical systems with the ability to absorb information from the

environment, process this information and suitably react to it. These devices offer

tremendous advantages owing to their small size, low power consumption, low mass and

high functionality, which makes them very attractive in applications with stringent

demands on weight, functionality and cost. While the system’s ‘brain’ (device electronics)

is fabricated using traditional IC technology, the micromechanical components

necessitate very intricate and sophisticated processing of silicon or other suitable

substrates. A dearth of fabrication facilities with micromachining capabilities and a

lengthy gestation period from design to mass fabrication and commercial acceptance of

the product in the market are factors most often implicated in hampering the growth of

MEMS. These devices are highly application specific with low production volumes and

the few fabs that do possess micromachining capabilities are unable to offer a complete

array of fabrication processes in order to be able to cater to the needs of the MEMS

R&D community. A distributed fabrication network has, therefore, emerged to serve the

evolving needs of this high investment, low volume MEMS industry. Under this

environment, a central facility coordinates between a network of fabrication centers

(Network of MEMS job shops – NMJS) containing micromachining capabilities. These

fabrication centers include commercial, academic and government fabs, which make their

services available to the ordinary customer. Wafers are shipped from one facility to

another until all processing requirements are met. The lengthy and intricate process


sequences that need to be performed over a network of capital intensive facilities are

complicated by dynamic job arrivals, stochastic processing times, sequence-dependent set

ups and travel between fabs. Unless the production of these novel devices is carefully

optimized, the benefits of distributed fabrication could be completely overshadowed by

lengthy lead times, chaotic routings and costly processing. Our goal, therefore, is to

develop and validate an approach for optimal routing (assignment) and sequencing of

MEMS devices in a network of stochastic job shops with the objective of minimizing the

sum of completion times and the cost incurred, given a set of fabs, machines and an

expected product mix.

In view of our goal, we begin by modeling the stochastic NMJS problem as a two-stage

stochastic program with recourse where the first-stage variables are binary and the

second-stage variables are continuous. The key decision variables are binary and pertain

to the assignment of jobs to machines and their sequencing for processing on the

machines. The assignment variables essentially fix the route of a job as it travels through

the network because these variables specify the machine on which each job-operation

must be performed out of several candidate machines. Once the assignment is decided

upon, sequencing of job-operations on each machine follows. The assignment and

sequencing must be such that they offer the best solution (in terms of the objective)

possible in light of all the processing time scenarios that can be realized. We present two

approaches for solving the stochastic NMJS problem. The first approach is based on the

L-shaped method (credited to van Slyke and Wets, 1969). Since the NMJS problem lacks

relatively complete recourse, the first-stage solution can be infeasible to the second-stage

problem in that the first stage solution may either violate the reentrant flow conditions or

it may create a deadlock. In order to alleviate these infeasibilities, we develop feasibility

cuts which when appended to the master problem eliminate the infeasible solution.

Alternatively, we also develop constraints to explicitly address these infeasibilities directly

within the master problem. We show how a deadlock involving 2 or 3 machines arises if

and only if a certain relationship between operations and a certain sequence amongst

them exists. We generalize this argument to the case of m machines, which forms the

basis for our deadlock prevention constraints. Computational results at the end of


Chapter 3 compare the relative merits of a model which relies solely on feasibility cuts

with models that incorporate reentrant flow and deadlock prevention constraints within

the master problem. Experimental evidence reveals that the latter offers appreciable time

savings over the former. Moreover, in a majority of instances we see that models that

carry deadlock prevention constraints in addition to the reentrant flow constraints

provide at par or better performance than those that solely carry reentrant flow

constraints.

We, next, develop an optimality cut which when appended to the master problem helps

in eliminating the suboptimal master solution. We also present alternative optimality and

feasibility cuts obtained by modifying the disjunctive constraints in the subproblem so as

to eliminate the big H terms in it. Although any large positive number can be used as the

value of H, a conservative estimate may improve computational performance. In light of

this, we develop a conservative upper bound for operation completion times and use it as

the value of H. Test instances have been generated using a problem generator written in

JAVA. We present computational results to evaluate the impact of a conservative

estimate for big H on run time, analyze the effect of the different optimality cuts and

demonstrate the performance of the multicut method (Wets, 1981) which differs from

the L-shaped method in that the number of optimality cuts it appends is equal to the

number of scenarios in each iteration. Experimentation indicates that Model 2, which

uses the standard optimality cut in conjunction with the conservative estimate for big H,

almost always outperforms Model 1, which also uses the standard optimality cut but uses

a fixed value of 1000 for big H. Model 3, which employs the alternative optimality cut

with the conservative estimate for big H, requires the fewest number of iterations to

converge to the optimum but it also incurs the maximum premium in terms of

computational time. This is because the alternative optimality cut adds to the complexity

of the problem in that it appends additional variables and constraints to the master as

well as the subproblems. In the case of Model 4 (multicut method), the segregated

optimality cuts accurately reflect the shape of the recourse function resulting in fewer

overall iterations but the large number of these cuts accumulate over the iterations

making the master problem sluggish and so this model exhibits a variable performance


for the various datasets. These experiments reveal that a compact master problem and a

conservative estimate for big H positively impact the run time performance of a model.

Finally, we develop a framework for a branch-and-bound scheme within which the L-

shaped method, as applied to the NMJS problem, can be incorporated so as to further

enhance its performance.

Our second approach for solving the stochastic NMJS problem relies on the tight LP

relaxation observed for the deterministic equivalent of the model. We, first, solve the LP

relaxation of the deterministic equivalent problem, and then, fix certain binary

assignment variables that take on a value of either a 0 or a 1 in the relaxation. Based on

this fixing of certain assignment variables, additional logical constraints have been

developed that lead to the fixing of some of the sequencing variables too. Experimental

results, comparing the performance of the above LP heuristic procedure with CPLEX

over the generated test instances, illustrate the effectiveness of the heuristic procedure.

For the largest problems (5 jobs, 10 operations/job, 12 machines, 7 workcenters, 7

scenarios) solved in this experiment, an average savings of as much as 4154 seconds and

1188 seconds was recorded in a comparison with Models 1 and 2, respectively. Both of

these models solve the deterministic equivalent of the stochastic NMJS problem but

differ in that Model 1 uses a big H value of 1000 whereas Model 2 uses the conservative

upper bound for big H developed in this work. The maximum optimality gap observed

for the LP heuristic over all the data instances solved was 1.35%. The LP heuristic,

therefore, offers a powerful alternative to solving these problems to near-optimality with

a very low computational burden. We also present results pertaining to the value of the

stochastic solution for various data instances. The observed savings of up to 8.8% over

the mean value approach underscores the importance of using a solution that is robust

over all scenarios versus a solution that approximates the randomness through expected

values.

We, next, present a dynamic stochastic scheduling approach (DSSP) for the NMJS

problem. The premise behind this undertaking is that in a real-life implementation that is

faithful to the two-stage procedure, assignment (routing) and sequencing decisions will


be made for all the operations of all the jobs at the outset and these will be followed

through regardless of the actual processing times realized for individual operations.

However, it may be possible to refine this procedure if information on actual processing

time realizations for completed operations could be utilized so that assignment and

sequencing decisions for impending operations are adjusted based on the evolving

scenario (which may be very different from the scenarios modeled) while still hedging

against future uncertainty. In the DSSP approach, the stochastic programming model for

the NMJS problem is solved at each decision point using the LP heuristic in a rolling

horizon fashion while incorporating constraints that model existing conditions on the

shop floor and the actual processing times realized for the operations that have been

completed.

The implementation of the DSSP algorithm is illustrated through an example problem.

The results of the DSSP approach as applied to two large problem instances are

presented. The performance of the DSSP approach is evaluated on three fronts: first, by

using the LP heuristic at each decision point, second, by using an optimal algorithm at

each decision point, and third, against the two-stage stochastic programming approach.

Results from the experimentation indicate that the DSSP approach using the LP heuristic

at each decision point generates better assignment and sequencing decisions than the

two-stage stochastic programming approach and provides solutions that are near-optimal

with a very low computational burden. For the first instance involving 40 operations, 12

machines and 3 processing time scenarios, the DSSP approach using the LP heuristic

yields the same solution as the optimal algorithm with a total time savings of 71.4% and

also improves upon the two-stage stochastic programming solution by 1.7%. In the

second instance, the DSSP approach using the LP heuristic yields a solution with an

optimality gap of 1.77% and a total time savings of 98% over the optimal algorithm. In

this case, the DSSP approach with the LP heuristic improves upon the two-stage

stochastic programming solution by 6.38%. We conclude by presenting a framework for

the DSSP approach that extends the basic DSSP algorithm to accommodate jobs whose

arrival times may not be known in advance.


DEDICATION

To my Parents and in loving memory of my Grandparents


ACKNOWLEDGEMENTS

This dissertation was an undertaking born out of the confluence of extremely talented

and dedicated teachers, an immensely patient and wonderfully supportive advisor, a

tenacious problem and one curious mind that appeared fascinated with the world of

MEMS. Were it not for my advisor Dr. Subhash C. Sarin and the memorable MEMS

Conference at Johns Hopkins University that he insisted I attend back in late 2003, this

work would have perhaps not seen the light of day. I would like to gratefully

acknowledge and thank him for his keen supervision, foresight and reassuring guidance

during my years at Virginia Tech.

With deep gratitude I thank my committee: Dr. Michael P. Deisenroth, Dr. Robert H.

Sturges, Dr. Robert W. Hendricks and Dr. Barbara P. Fraticelli for being supportive of

my efforts and for their immense help during all stages of this work. I have greatly

benefited and truly enjoyed taking courses spanning the spectrum from Industrial

Automation, Semiconductor Devices fabrication to Graph theory and Network Flows

under them and I would be remiss if I did not acknowledge the level of commitment,

discipline and enthusiasm that they bring to their work everyday. I would like to take this

opportunity to thank Dr. Hanif D. Sherali for his illuminating lectures and penetrating

insights on the theory and techniques of optimization and for instilling in me a

sound scientific basis to undertake serious and rigorous research. My humble thanks and

appreciation to Dr. Peter E. Haskell for teaching me the art of writing proofs.

A special thanks to Ms. Lovedia Cole for her unflinching assistance at all times and for

meticulously attending to all my administrative needs. As far as I can remember, I have

placed rush orders at her desk only to discover that she 'schedules' things far better and

faster than my optimization solver is capable of! Kim Ooms, Nicole Lafon and Dot

Cupp deserve a special mention for cheerfully handling conference registrations, travel

bookings and reimbursements that seem to ever so often punctuate graduate student life.


I would like to thank my lab-mates Seon Ki Kim, Yuqiang Wang, Liming Yao, Lixin

Wang and Ming Chen in the EMR lab for the light and serious moments we have shared,

stimulating discussions and our musings on the vagaries of life which always seemed to

find their way to the whiteboard; it has been a pleasure working with you all on projects

and proposals over the years. Our lab has been an amalgamation of distinct cultures,

identities and personalities and this diversity has enriched my experience at Virginia Tech

making it immensely valuable. Thanks are also due to Hitesh Attri for his help with the

JAVA problem generator. I wish you all Godspeed in all your endeavors.

This has been a remarkable professional and personal journey for me, where times of

frustration and self-doubt always culminated in brilliant moments of discovery and

gratification. Sitting in my lab, grimly hoping that my algorithms would run a tad faster, I

have often sought solace in Piet Hein's sagacious observation "A problem worthy of

attack, proves its worth by fighting back".

On a personal note, I shall forever remain indebted to my parents Dr. Varadarajan S.

Iyengar and Dr. Kanta Varadarajan for their indefatigable support and selfless sacrifice

over the years. Their deep, abiding respect for education combined with a strong sense of

discipline and ethical values unarguably make their shoes too big for me to fill.

Overwhelming thanks to my siblings Dr. Srinidhi Varadarajan and Archya Singhal, my

brother-in-law Ankur Singhal and my sister-in-law Shfawn Varadarajan for all their help,

comradeship, precious memories and lighthearted moments in the past and the promise

of equally good times in the future. Finally I am sincerely thankful to all those who,

beyond the scope of this scientific pursuit showed an interest in my work and to my

friends Suhas Rao, Rakesh Pathak, Gaurav Singh and Ravi Kant who provided

unwavering support in many simple ways when times were tough.


CONTENTS

List of Tables .......... xii
List of Figures .......... xiv

1 Introduction
   1.1 Background .......... 1
   1.2 Problem Definition .......... 5

2 Literature Review
   2.1 Flexible job shop scheduling .......... 11
      2.1.1 Mathematical Programming and Heuristic Procedures .......... 11
      2.1.2 Other Approaches .......... 18
   2.2 Multisite Planning and Scheduling .......... 25
      2.2.1 Mathematical Programming and Heuristic Procedures .......... 25
      2.2.2 Other Approaches .......... 31
   2.3 Stochastic Scheduling .......... 32
      2.3.1 Stochastic Flow Shop Scheduling .......... 32
      2.3.2 Stochastic Job Shop Scheduling .......... 41
   2.4 MEMS Scheduling .......... 45
   2.5 Stochastic Programming Models .......... 47
      2.5.1 Applications of Two-Stage Stochastic Linear Programs .......... 49
   2.6 The L-shaped Method of Van Slyke and Wets .......... 53
   2.7 The Multicut Algorithm .......... 58

3 The Stochastic Model for a network of MEMS job shops (NMJS)
   3.1 Background .......... 61
   3.2 The Two-stage Stochastic Program with Recourse for the NMJS problem .......... 63
   3.3 Approach 1: The L-shaped Method for the NMJS problem .......... 70
      3.3.1 Subproblem/Stage-II (Recourse) Problem Augmented with Artificial Variables – Phase I .......... 74
      3.3.2 Development of Feasibility Cuts .......... 75
      3.3.3 Eliminating Infeasibilities due to Reentrant Flow .......... 77
      3.3.4 Eliminating Infeasibilities due to Cyclic Scheduling .......... 81
      3.3.5 Analysis of Deadlock Occurrence and Prevention .......... 84
      3.3.6 Optimality Cut .......... 121
      3.3.7 An Alternative Optimality Cut .......... 124
      3.3.8 Determining the Magnitude of Big 'H' .......... 135
      3.3.9 An Alternative Stage-I (Master) Objective Function .......... 137
      3.3.10 Generating Test Instances .......... 138
      3.3.11 Optimality Cut for the Multicut Algorithm .......... 143
      3.3.12 Experimentation – I .......... 144
         3.3.12.1 Experimentation on the Impact of Reentrant Flow and deadlock prevention constraints on performance .......... 146


         3.3.12.2 Run Time Performance of Different Models with a Time Based Objective Function .......... 156
         3.3.12.3 Run Time Performance of Different Models with a Cost Based Objective Function .......... 163
      3.3.13 The Integer L-shaped algorithm for the NMJS problem .......... 170
   3.4 The Deterministic Equivalent for the Stochastic NMJS Problem .......... 172
      3.4.1 Experimentation – II .......... 174
   3.5 Experimentation on the Value of Stochastic Solution .......... 187
   3.6 Experimentation on the Impact of Customer Priorities for Completion Time and Budget Surplus .......... 191

4 A Dynamic Stochastic Scheduling Approach
   4.1 Introduction .......... 197
   4.2 The DSSP Approach .......... 197
   4.3 An Illustrative Example .......... 202
   4.4 The DSSP Approach Applied to a Large Problem Instance .......... 216
   4.5 Flowchart for the generalized DSSP approach .......... 230

5 Conclusions and Future Research
   5.1 Conclusions .......... 233
   5.2 Future Research .......... 236

References .......... 238


LIST OF TABLES

2.1 Summary of contributions in stochastic flow shop scheduling .......... 33
2.2 Summary of contributions in stochastic job shop scheduling .......... 42
3.1 Impact of reentrant flow and deadlock prevention constraints on the CPU time required to obtain the optimal solution .......... 147
3.2 Difference in CPU time between models .......... 151
3.3 Performance difference between the models .......... 153
3.4 Impact of reentrant flow and deadlock prevention constraints on number of feasibility and optimality cuts generated .......... 154
3.5 Description of models tested .......... 156
3.6 Run time performance of models with a time based objective for dataset 2/6/8/5/3 .......... 157
3.7 Run time performance of models with a time based objective for dataset 3/4/8/5/3 .......... 158
3.8 Run time performance of models with a time based objective for dataset 4/4/8/5/3 .......... 159
3.9 Run time performance of models with a time based objective for dataset 3/6/8/5/3 .......... 160
3.10 Run time performance of models with a time based objective for dataset 3/6/8/5/5 .......... 161
3.11 Description of models tested .......... 163
3.12 Run time performance of models with a cost based objective for dataset 2/6/8/5/3 .......... 164
3.13 Run time performance of models with a time based objective for dataset 3/4/8/5/3 .......... 165
3.14 Run time performance of models with a time based objective for dataset 4/4/8/5/3 .......... 166
3.15 Run time performance of models with a time based objective for dataset 3/6/8/5/3 .......... 167
3.16 Run time performance of models with a time based objective for dataset 3/6/8/5/5 .......... 168
3.17 Performance of Models 1, 2 and the LP Heuristic for dataset 3/6/8/5/3 .......... 174
3.18 Performance of Models 1, 2 and the LP Heuristic for dataset 3/10/8/5/3 .......... 175
3.19 Performance of Models 1, 2 and the LP Heuristic for dataset 4/10/8/5/3 .......... 176
3.20 Performance of Models 1, 2 and the LP Heuristic for dataset 3/6/8/5/5 .......... 177
3.21 Performance of Models 1, 2 and the LP Heuristic for dataset 3/10/8/5/5 .......... 178
3.22 Performance of Models 1, 2 and the LP Heuristic for dataset 4/10/8/5/5 .......... 179
3.23 Performance of Models 1, 2 and the LP Heuristic for dataset 3/6/8/5/7 .......... 180
3.24 Performance of Models 1, 2 and the LP Heuristic for dataset 3/10/8/5/7 .......... 181
3.25 Performance of Models 1, 2 and the LP Heuristic for dataset 4/10/8/5/7 .......... 182
3.26 Performance of Models 1, 2 and the LP Heuristic for dataset 4/10/12/5/7 .......... 183
3.27 Performance of Models 1, 2 and the LP Heuristic for dataset 5/10/12/7/3 .......... 184
3.28 Performance of Models 1, 2 and the LP Heuristic for dataset 5/10/12/7/7 .......... 185
3.29 Value of stochastic solution .......... 188
3.30 Impact of penalty factors on the completion time and budget surplus values for Job J1 .......... 192
3.31 Impact of penalty factors on the completion time and budget surplus values for Job J2 .......... 193
3.32 Impact of penalty factors on the completion time and budget surplus values for Job J3 .......... 194
3.33 Impact of penalty factors on the completion time and budget surplus values for Job J4 .......... 195
3.34 Impact of penalty factors on the completion time and budget surplus values for Job J5 .......... 196
4.1 A comparison of solutions obtained by the DSSP and the ATSSP approaches .......... 212
4.2 A comparison of solutions obtained by the DSSP, the optimal and the ATSSP approaches for a large problem instance (Instance 1) .......... 216
4.3 A comparison of time and solutions obtained by the LP heuristic and CPLEX at each decision point for a large problem instance .......... 217
4.4 A comparison of solutions obtained by the DSSP, the optimal and the ATSSP approaches for a large problem instance (Instance 2) .......... 221
4.5 Times and solutions obtained by the LP heuristic at each decision point for a large problem instance .......... 222
4.6 Time and solutions obtained by CPLEX at each decision point for a large problem instance .......... 226


LIST OF FIGURES

1.1 2005 MEMS market by product .......... 2
1.2 Worldwide MEMS market estimate .......... 2
1.3 A network of MEMS job shops .......... 6
2.1 Disjunctive graph depicting alternative machines .......... 20
2.2 A feasible selection .......... 20
2.3 Underlying theme of the integrated approach .......... 21
2.4 Flow chart depicting the L-shaped method of van Slyke and Wets .......... 59
3.1 A deadlock configuration involving two machines .......... 73
3.2 Cyclic and acyclic schedules for the case of two machines .......... 81
3.3 Other forms of deadlock .......... 83
3.4 A deadlock free configuration .......... 84
3.5 Operations related by R2 and sequenced according to S2 .......... 85
3.6 Network representation of a two-machine deadlock .......... 86
3.7 Acyclic network configurations .......... 88
3.8 Operations related by R2 but not sequenced according to S2 .......... 90
3.9 Network representation of the configurations obtained by negating clauses 1 or 2 or both under R2 and S2 .......... 93
3.10 A deadlock involving three machines .......... 96
3.11 Network representation of a three machine deadlock .......... 97
3.12 Acyclic network configurations .......... 100
3.13 Resulting configuration obtained by preserving clauses 1, 2 and 3 under R3 but negating clause 4 .......... 101
3.14 A two-machine deadlock .......... 102
3.15 Operations related by R3 but not sequenced according to S3 .......... 104
3.16 A deadlock configuration involving m machines where operations are related by Rm and sequenced according to Sm .......... 107
3.17 Network representation of a m machine deadlock .......... 109
3.18 n and p operations .......... 110
3.19 Network representation of a configuration belonging to Σ but where clauses 1* and 2* under Rm are negated .......... 113
3.20 Deadlock configuration – machines belonging to category C2 .......... 115
3.21 An equivalent configuration after re-numbering the machines .......... 115
3.22 Network representation of Figure 3.21 .......... 116
3.23 The L-shaped method as applied to the NMJS problem .......... 123
4.1 AS chart at decision point 1 .......... 204
4.2 AS chart at decision point 2 .......... 206
4.3 AS chart – ATSSP approach .......... 213
4.4 AS chart for case 1 – DSSP approach .......... 213
4.5 AS chart for case 2 – DSSP approach .......... 214
4.6 AS chart for case 3 – DSSP approach .......... 214
4.7 AS chart for case 4 – DSSP approach .......... 214
4.8 AS chart for case 5 – DSSP approach .......... 214
4.9 AS chart for case 6 – DSSP approach .......... 215
4.10 AS chart for case 7 – DSSP approach .......... 215
4.11 AS chart for case 8 – DSSP approach .......... 215
4.12 AS chart for case 9 – DSSP approach .......... 215
4.13 A plot comparing the solutions of the LP heuristic versus an optimal algorithm at each decision point for instance 1 .......... 219
4.14 A plot comparing the CPU times of the LP heuristic versus that required by CPLEX at each decision point for instance 1 .......... 219
4.15 AS chart determined by the ATSSP approach for instance 1 .......... 220
4.16 Final AS chart determined by the DSSP approach for instance 1 .......... 220
4.17 A plot depicting the LP heuristic solution value at each decision point for instance 2 .......... 224
4.18 A plot of the CPU times required by the LP heuristic at each decision point for instance 2 .......... 224
4.19 AS chart determined by the ATSSP approach for instance 2 .......... 225
4.20 AS chart determined by the DSSP approach for instance 2 .......... 225
4.21 A plot depicting the optimal solution at each decision point for instance 2 .......... 228
4.22 A plot of the CPU times required by the optimal algorithm at each decision point for instance 2 .......... 228
4.23 AS chart determined by the optimal algorithm for instance 2 .......... 229


C h a p t e r 1

INTRODUCTION

1.1 Background

This work is motivated by the pressing need for operational control in the fabrication of

Microelectromechanical systems or MEMS. MEMS are miniature three-dimensional

integrated electromechanical systems with the ability to absorb information from the

environment, process this information and suitably react to it. In this capacity, these

devices can simultaneously “sense”, “control” and “actuate” so as to completely redefine

the way man and machine perceive and interact with the physical world. These devices

offer tremendous advantages owing to their small size, low power consumption, low

mass and high functionality, which makes them very attractive in applications with

stringent demands on weight, functionality and cost. While the system's 'brain' (device

electronics) is fabricated using traditional IC technology, the micromechanical

components necessitate very intricate and sophisticated processing of silicon or other

suitable substrates. Yole Development (2005), a research firm that follows the MEMS market, predicts in its Ultimate MEMS Market Analysis report that MEMS revenues will swell to $7 billion in 2007 (see Figure 1.2). Also, see Figure 1.1 for the MEMS market segmentation by product.

The often quoted factors hindering the growth of MEMS have been the lengthy

gestation period required from design concepts to mass fabrication and commercial

acceptance of the product in the market and the near absence of fabrication facilities with

micromachining capabilities. The few fabs that possess these capabilities do not usually

cater to the MEMS R&D community, and when they do, they are unable to offer a

complete array of fabrication processes that make each device so unique. Since MEMS

devices are highly application specific, their production volumes are low. Coupled with

the fact that each device design calls for a unique processing sequence, this has resulted


Figure 1.1: 2005 MEMS Market Segmentation by Product (Statistics from Yole Development). Total sales = $5.4 billion, divided among inkjet heads, pressure sensors, micromirrors, accelerometers, gyroscopes and other products.

Figure 1.2: MEMS Worldwide Market Projection (Statistics from Yole Development). Estimates in billions of dollars: 3.85 (2003), 4.5 (2004), 5.4 (2005), 6.2 (2006) and 7 (2007).


in little to no standardization of technologies in this field. Some stand-alone fabs that

cater to the MEMS R&D community offer process modules with rather limited

fabrication capabilities in an effort to create standardized process sequences similar to the

wafer fabrication industry.

A distributed fabrication network has, therefore, emerged to serve the evolving needs of

this high investment, low volume MEMS industry. Under this environment, a central

facility coordinates between a network of fabrication centers containing micromachining capabilities. These fabrication centers include commercial, academic and government fabs, which make their services available to the ordinary customer. Wafers are shipped

from one facility to another until all processing requirements are met. One of the major

advantages to such an approach lies in the fact that the facilities can now collectively

offer a wide variety of fabrication processes to a MEMS designer who is now not

constrained by the technologies available at a single facility, but instead, has the added

flexibility to choose between multiple facilities capable of performing the same operation.

Such flexibility also provides the designer with pricing and quality options. Moreover, a

centralized fab would be slower to react to evolving needs than a network of fabs, each

operating independently of the others, some of which are capable of offering highly

specialized processes. To add to this, the cost of setting up a state-of-the-art facility

outweighs the cost of coordinating between preexisting fabs that offer select processes.

Operating in this fashion would also enable underutilized fabs in the country to be more

visible and efficient. The distributed fabrication network (henceforth, referred to as the

Network of MEMS job shops (NMJS)), thus, offers a novel solution to the problem of

MEMS manufacturing.

A similar concept is being examined in Europe in a study called the ‘Flying Wafer

Project’ (2006) that aims at connecting European R&D institutes, industrial 300 mm

research sites and pilot lines to facilitate a virtual 300 mm CMOS R&D line. The

objectives of this study include developing specifications for: 1) wafer/carrier handling,

transport and logistics, 2) wafer/carrier tracking and monitoring, 3) standardization of

data interfaces and secure data access, 4) input/output procedures and optimized


contamination control, and 5) virtual process flow planning, so as to ensure economical

wafer transport with minimal risk between the various sites, and to provide a secure

means of data transfer and management through the internet.

The NMJS, on the other hand, primarily caters to the MEMS R&D community, and

therefore, handles a wide variety of jobs. Each of these jobs needs to be custom built,

and therefore, follows a distinct processing sequence. There is no sequencing flexibility as

all operations in the processing sequence follow a strict precedence. The total number of

individual processing steps to be performed on each job varies with the type of the job or

the device being built. Individual processing steps across jobs are also expected to vary

widely owing to unique parameter settings for each operation. Operation processing

times are stochastic due to several reasons. Firstly, since the machinery in the fabs is

expected to process a broad spectrum of devices typically in low volumes, the use of

dedicated machinery for individual parameter settings of a specific process is not

economically justified. As parameter settings vary at a particular step, so does the time

taken to accomplish the step. Secondly, as compared to commercial fabs, academic fabs

tend to view industry deadlines less stringently and therefore time and commitment are

issues that prevent them from functioning as efficiently as commercial fabs. Finally, in a

manufacturing environment characterized by MEMS R&D jobs, engineering design

changes are often required at process steps in order for the device to be manufacturable.

Set up time is incurred whenever a processor begins a new job and these set ups are

expected to be sequence dependent. As the NMJS caters to low volume, highly

application specific needs of its customers, it is believed that lot sizes will be small and

the nature of job arrivals dynamic. Also, reentrant flow is expected to be the norm.

In summary, the lengthy and intricate process sequences that need to be performed over

a network of capital intensive facilities are further complicated by the high variability in

job arrivals, process durations, processing sequence lengths, as well as their composition.

Unless the production of these novel devices is carefully optimized, the benefits of

distributed fabrication could be completely overshadowed by lengthy lead times, chaotic

routings and costly processing. Our work is, therefore, motivated by the pressing need for


operational control in the network so as to permit economical (in terms of cost incurred) and efficient (in

terms of lead time realized) manufacturing of these devices. In particular, our goal is to develop and

validate an approach for optimal routing (assignment) and sequencing of MEMS devices in a network of

stochastic job shops with the objective of minimizing the sum of completion times and the cost incurred,

given a set of fabs, machines and an expected product mix. By reducing completion time (also

called production lead time), one can hope to decrease product development costs owing

to more prototype iterations that now become possible per unit time. Also, inventory

levels are reduced, and consequently, engineering changes in a processing sequence are

likely to affect a smaller percentage of work than they would otherwise.

This work, although undertaken for the very first time for the MEMS industry, will have a

broad impact on the manufacturing and operational control of all future application

specific technologies that rely on specialized manufacturing processes available only at

choice locations and where it is economically infeasible to build a central state-of-the-art

facility. Since facilities involved in the manufacture of nascent technologies will not have

highly reliable estimates of processing time durations, job arrival patterns etc, a significant

amount of variability is expected to be present in the manufacturing system during

product infancy. Contributions from our research will offer insights and fundamental

ideas for efficient production control in such stochastic environments. Additionally, our

work may also find applications in existing manufacturing environments where products

are required to visit several factories in prespecified sequences in order to have all of their

processing requirements met. However, at a given factory, there could exist flexibility in

the form of parallel machines for the processing of jobs. Lastly, this undertaking will help

highlight the issues associated with a network of stochastic MEMS job shops as being

rather distinct from those associated with traditional job shop concerns, and it will

form building blocks of an interesting research area in its own right.

1.2 Problem Definition

Figure 1.3 depicts a simplified network of MEMS job shops. Each customer order

(interchangeably called a job) comprises a requirement for a certain number of devices

that are built on silicon wafers. Orders from different customers arrive at different points

of time. An order undergoes processing at various stages on specific machines that are


capable of performing the requisite operations on the job. Orders differ in the number

as well as the type of processing stages that they must undergo in order to complete all

processing requirements. At each stage, however, there exists flexibility in terms of

alternative machines to choose from for processing. These alternative machines could

either be located in the same fab (or facility) or in different fabs. The processing time and

the cost of processing a job in accordance with each of these alternatives may be

different. The choice of a machine for processing a given operation on a job, therefore,

rests entirely on the performance measure to be optimized and other hard constraints

that may need to be satisfied. Since some fabs may not possess the machining capabilities

to handle all the requirements of a job, the job may be transferred to other fabs to fulfill

its processing requirements. In doing so, transfer times and transfer costs are incurred.

Figure 1.3: A network of MEMS job shops (a central facility coordinating wafer flow among Fab 1, Fab 2, Fab 3 and Fab 4, and delivery to the customer)


On the other hand, if a job merely travels within the same fab for successive operations,

then the travel time and cost incurred are assumed to be negligible.

Once jobs queue up at a machine for processing, the sequence in which to process their

respective tasks would need to be determined. Jobs incur set up times on machines

before their processing. This set up is sequence dependent, which means that the time

required to set up a job on a machine depends on the previous job that just finished its

processing on that machine. This sequence of tasks, once again, is an outcome of the

overall performance objective that one wishes to achieve. Job arrival times, number of

processing stages for each job, machines capable of processing each stage, travel times

and travel costs between fabs, sequence-dependent set up times and customer budget are

assumed to be known and are, therefore, deterministic. But, since exact processing time

values for the machines may not be available and/or are expected to vary, these

parameters are assumed to be stochastic.
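To make these data concrete, the sketch below shows one possible in-memory representation of a job and its operations. It is written in Java only because the test-instance generator developed in this work is; the class and field names here are illustrative assumptions and do not correspond to the actual generator's code.

import java.util.List;
import java.util.Map;

// Illustrative sketch only: names are assumptions, not the classes used by the
// JAVA problem generator developed in this work.
final class Operation {
    int position;                          // position in the job's fixed processing sequence
    List<Integer> candidateMachines;       // machines (possibly in different fabs) able to perform it
    Map<Integer, double[]> processingTime; // machine id -> processing time under each scenario (stochastic)
    Map<Integer, Double> processingCost;   // machine id -> processing cost (deterministic)
}

final class Job {
    int id;
    double arrivalTime;          // known in advance (deterministic)
    double budget;               // customer budget (deterministic)
    List<Operation> operations;  // strict precedence: operation j must finish before operation j + 1 starts
}

// Transfer times and costs between fabs and the sequence-dependent set-up times
// would be held at the instance level as matrices indexed by (fab, fab) and
// (preceding job, following job, machine), respectively.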

Operational control issues in a network of MEMS job shops involve assigning to each

operation of a job, a fab and a machine that it can be processed on, and also, to

determine its start time. In doing so, the interactions amongst jobs and the capacities of

the machines would need to be duly considered.

In view of the above, the problem that we address in this dissertation can be stated as

follows:

“Given a set of jobs and a network of MEMS job shops, determine an allocation of job

operations to facilities and a sequence in which to process the operations on each

machine in the facility, where operation processing times are uncertain and the sequence in which to process the operations of each job is known a priori, so as to minimize a

function of the completion times and the transfer and processing cost incurred”.

In allocating job operations among machines in multiple facilities, one must

consider the processing time and the cost for performing this operation on each


processor since these may vary, as well as the transfer costs between machines

because transfers are expensive in terms of money and time, and may even pose a

risk, as certain fabs might not possess the required degree of familiarity with a

certain product. The allocation essentially fixes a route for each job as it flows

through the network. Once the operations of all the jobs have been allocated (i.e.

the routing/assignment problem is solved) to the different processors in the

various facilities, the question then is ‘In what sequence should these be

processed at the individual machines in a facility?’ The sequence chosen must

take into account the sequence-dependent nature of the set ups while optimizing

the performance measure of choice.
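Chapter 3 develops the exact formulation; the display below is only a schematic sketch, with generic placeholder symbols rather than the notation used later, of the two-stage stochastic program with recourse structure to which these decisions give rise:

\[
\min_{x \in \{0,1\}^n} \; c^{\top}x \;+\; \sum_{s \in S} p_s\, Q(x,s),
\qquad
Q(x,s) \;=\; \min_{y \ge 0} \left\{\, q_s^{\top} y \;:\; W y \ge h_s - T_s x \,\right\},
\]

where the binary first-stage vector $x$ collects the assignment (routing) and sequencing decisions taken before processing times are known, and, for each processing-time scenario $s$ occurring with probability $p_s$, the continuous second-stage (recourse) variables $y$ determine the operation start and completion times consistent with $x$, so that $Q(x,s)$ captures the completion-time and cost consequences of $x$ under that scenario.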


C h a p t e r 2

LITERATURE REVIEW

This chapter presents a study of literature pertinent to the NMJS problem. Although this

problem has not been directly addressed in the literature so far, the nature of the problem

bears a certain resemblance to flexible job shops since each job operation can be

performed on multiple machines. Moreover, the areas of multi-site and stochastic

scheduling definitely merit attention because the NMJS problem involves scheduling of

operations at multiple sites on machines with stochastic processing times. Literature on

MEMS scheduling, although sparse, is very relevant to this problem as it may throw light

on existing approaches for scheduling these complicated devices. Finally, a study of

stochastic programming models is also presented because this area deals with models and

approaches for decision making under uncertainty and has been very successfully used in

areas as diverse as production planning, asset liability management, traffic management,

freight scheduling, among others.

Scheduling can be defined as the systematic assignment of jobs to processors and

determination of their start times to ensure completion of these tasks in reasonable time

(Jones and Rabelo, 1998). Scheduling problems give rise to allocation and sequencing

decisions. Graves (1981) introduced a classification scheme for scheduling problems. The

criteria for classification are

1. Requirement Generation

2. Processing Complexity

3. Scheduling Criteria

4. Parameter Variability

5. Scheduling Environment


There exists a particular subclass of problems that fall under the category of ‘Processing

complexity’. These problems are referred to as flexible job shop problems (FJSP) or

Multi-processor job shop scheduling problems. The FJSP is a generalization of the

classical job shop scheduling problem Jm||Cmax (a job shop with m machines and

makespan as the objective; this naming convention was introduced by Graham et al.

(1979) and consists of three fields α|β|γ where α describes the machine environment, β

specifies the job characteristics and γ provides the objective function) which has been

addressed by many researchers. In a traditional job shop, a set of jobs is processed on a

collection of machines and each job has a set of operations that it undergoes

consecutively. The jobs take different routes through the shop but every operation can

be performed by only one machine and preemption is not permitted. In contrast,

operations in a FJSP can be processed by multiple machines, or in other words, to every

operation is designated a set of machines capable of performing it. This problem is very

relevant in flexible manufacturing systems that have the ability to perform multiple

operations and in shops with multiple parallel machines dedicated to performing a

specific task. A similar situation exists in a distributed fabrication environment such as

the NMJS. This is because the entire fabrication of a MEMS device is performed over a
network of facilities, each of which possesses some unique processing capabilities as well as
some capabilities in common with its sister facilities. Also, since every MEMS device calls for a

unique processing sequence, the existing environment is that of a pure job shop. In

summary, we have a job shop with multiple machines capable of processing each

operation, though of course, the shop here stands to mean a ‘network of facilities’.

Numerous approaches have been presented in the literature for tackling the job-shop

problem. These include heuristic or dispatching rules, which assign a priority to jobs
waiting in queue for a machine; schedule permutation techniques, such as those seen in
tabu search, genetic algorithms and simulated annealing, in which a feasible initial
schedule is systematically permuted and the best result is returned subject to a
performance measure; Artificial Intelligence techniques that employ
neural networks, fuzzy logic and search methods; and finally, analytical and semi-
analytical methods, which use mathematical models (Subramaniam et al., 2000). While


job-shop problems have attracted a lot of attention, research in the area of FJSP has been

sparse. Next, we briefly review the work reported in the area of flexible job shops.

2.1 Flexible Job Shop Scheduling:

2.1.1 Mathematical Programming and Heuristic Procedures:

Nasr and Elsayed (1990) investigate the problem of minimizing the flow time in a general

job shop with alternative machine tool routings.

Notation:

i        job number (i = 1, 2, ..., n)
k        machine k (k = 1, 2, ..., m)
J_i      number of operations required to complete job i
O_ij     operation number j of job i
r_ij     time at which operation j of job i becomes ready for scheduling
d_ij     set of machines that can process O_ij
S_ij     starting time of operation O_ij
C_ij     completion time of operation O_ij
X_ijk    = 1, if operation j of job i is processed on machine k; 0, otherwise
O_ijk    operation j of job i if processed on machine k
t_ijk    machining time of operation O_ij on machine k
H        a very large positive number
y_ijpqk  = 1, if O_ij precedes O_pq on machine k (k in d_ij and d_pq); 0, otherwise


They provide a mixed-integer program, and view the scheduling process as a decision
making process over a set of discrete event times given by T = {t_1, t_2, ..., t_q}. Each
element of T marks the finish time of one or more operations. The problem is
decomposed into q sub-problems where in each sub-problem, a set of operations
available for scheduling is considered with at most one operation from each job such that
r_ij <= t_e. In each of the subproblems, the objective is to minimize the mean flow time
subject to machine assignment and sequencing constraints.

The subproblem is given by:

M1: Minimize  Σ_{i=1}^{n} C_ij

subject to

Σ_{k=1}^{m} X_ijk = 1,  ∀ 1 ≤ i ≤ n,   (2.1)

C_pq − C_ij + H(1 − X_ijk) + H(1 − X_pqk) + H(1 − y_ijpqk) ≥ t_pqk,  ∀ 1 ≤ i, p ≤ n, 1 ≤ k ≤ m,   (2.2)

C_ij − C_pq + H(1 − X_ijk) + H(1 − X_pqk) + H y_ijpqk ≥ t_ijk,  ∀ 1 ≤ i, p ≤ n, 1 ≤ k ≤ m,   (2.3)

X_ijk, y_ijpqk = 0 or 1,  t_ijk, C_ij ≥ 0.   (2.4)

The objective of M1 is to minimize the mean flow time of jobs. Constraint (2.1) permits

only one machine to be assigned to each operation. Constraints (2.2) and (2.3) are

disjunctive constraints that ensure that two different jobs cannot be processed on a

machine at the same time.
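To illustrate how such a big-M disjunctive pair resolves the order of two operations that compete for one machine, the following sketch builds a two-operation toy model with the PuLP library (assuming it is installed); the processing times and names are invented, and the model mirrors the idea of constraints (2.2) and (2.3) rather than reproducing M1 in full.

# Minimal sketch (not the authors' code): two operations competing for one
# machine, sequenced with a big-M disjunctive pair as in constraints (2.2)-(2.3).
from pulp import LpProblem, LpMinimize, LpVariable, LpBinary, lpSum

t = {"O1": 3.0, "O2": 5.0}      # hypothetical processing times on the machine
H = 1000.0                      # big-M constant

prob = LpProblem("disjunctive_pair", LpMinimize)
C = {o: LpVariable(f"C_{o}", lowBound=0) for o in t}    # completion times
y = LpVariable("y_O1_before_O2", cat=LpBinary)          # ordering variable

# Each operation finishes no earlier than its own processing time.
for o in t:
    prob += C[o] >= t[o]

# Either O1 precedes O2 (y = 1) or O2 precedes O1 (y = 0), never both at once.
prob += C["O2"] - C["O1"] + H * (1 - y) >= t["O2"]
prob += C["O1"] - C["O2"] + H * y >= t["O1"]

prob += lpSum(C.values())       # minimize total (mean) flow time
prob.solve()
print({o: C[o].value() for o in t})   # expect O1 first: C_O1 = 3, C_O2 = 8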

The subproblems are further simplified by forcing each machine to process only one job.

The reduced sub-problem described below resembles the assignment problem.

M2: Minimize  z = Σ_{i=1}^{n} Σ_{k=1}^{m} C_ik X_ik

subject to

Σ_{k=1}^{m} X_ik = 1,  ∀ i = 1, ..., n,   (2.5)

Σ_{i=1}^{n} X_ik = 1,  ∀ k = 1, ..., m,   (2.6)

C_ij ≥ 0,   (2.7)

X_ik = 1, if job i is processed on machine k; 0, otherwise.   (2.8)

The objective of M2 ensures that the assignment of jobs to machines results in the

minimum flow time. Constraints (2.5) and (2.6) are assignment constraints which force

every operation to be assigned to only one machine and each machine to be assigned to

only one operation. A scheduling algorithm is provided that begins with the first

operations of all jobs, and using the Hungarian algorithm in conjunction with a conflict

solving procedure, all operations of all jobs are assigned to machines. The elements in the

decision matrix of the Hungarian algorithm are lower bounds on the flow time of an

operation O_ij if it were to be processed on machine k. A conflict arises when two

operations require the same machine and a decision must be made as to which operation

is to be assigned first and whether to keep the second operation waiting or to assign it to

one of the alternative machines capable of processing it. They also generalize the SPT

rule for a single machine case to the case of multiple machines by a simple replacement

of the entries in the decision matrix of the Hungarian algorithm.
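The assignment step can be illustrated with SciPy's linear_sum_assignment routine (assuming SciPy is available), which solves the same problem as the Hungarian algorithm; the flow-time lower bounds below are invented, and the conflict-resolution logic of the original procedure is omitted.

# Illustrative sketch only: assign the currently available operations (rows)
# to machines (columns) so that the sum of flow-time lower bounds is minimal,
# as in the Hungarian-algorithm step of Nasr and Elsayed's procedure.
import numpy as np
from scipy.optimize import linear_sum_assignment

BIG = 1e6   # large cost standing in for "machine cannot process this operation"

# Hypothetical decision matrix: entry [i, k] is a lower bound on the flow time
# of the available operation of job i if it were processed on machine k.
bounds = np.array([
    [7.0, 9.0, BIG],
    [6.0, BIG, 8.0],
    [BIG, 5.0, 4.0],
])

rows, cols = linear_sum_assignment(bounds)
for op, mach in zip(rows, cols):
    print(f"operation {op} -> machine {mach} (bound {bounds[op, mach]})")
# Optimal assignment here: op0->m1, op1->m0, op2->m2 with total 9 + 6 + 4 = 19.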

Kim (1990) considers a job shop in which alternative routings may exist for each job or

part and the order size for each part is greater than one. An order is defined by its

quantity, a due date and a part. The processing of a part is called a job. The problem

results in special precedence relationships. For instance, if there exists only one machine

that can process the first operations for parts of an order, then the first operation of the

first part must necessarily precede the first operation of the second part. If there are two

machines available, then the above condition need not hold true. Both operations can

begin simultaneously. Heuristic algorithms called ‘list scheduling algorithms’ are used that

generate the order of the operations using priority rules. Eight different dispatching rules

are used and four methods are prescribed to calculate parameters specific to these rules.

In these four methods, they consider alternative machines and vary the amount of


consideration given to precedence relationships between operations of different parts in

an order.

Hutchison et al. (1991) address the problem of scheduling FMS in a random job-shop

environment. They consider a static environment and provide a mathematical program

that considers both loading (assignment to one machine out of several alternatives) and

the scheduling problems simultaneously. The objective is to minimize the makespan. In

light of the previously defined notation, we define the following new notation:

o(k)              for each operation there exists a one-to-one correspondence between option (o) and machine (k); an option is associated with one machine only
C_ij^o(k)         completion time of the o-th option of operation j on machine k of job i
C_iJ^o(k)         completion time of the o-th option of the last operation J on machine k of job i
t_ij^o(k)         machining time of the o-th option of operation j on machine k of job i
y_ipjq^{o(k)l(k)} = 1, if the o-th option of operation j on machine k of job i precedes the l-th option of operation q on machine k of job p; 0, otherwise
X_ij^o(k)         = 1, if the o-th option of operation j on machine k of job i is used; 0, otherwise
d_ij              the number of alternative machine options for operation j of job i
Q_i               the number of operations of job i
T_max             the largest completion time for the last operation of all jobs


M3: Minimize  T_max

subject to

C_{i(j+1)}^o(k) − C_ij^l(k') + H(1 − X_{i(j+1)}^o(k)) ≥ t_{i(j+1)}^o(k),  ∀ i = 1, ..., n; j = 1, ..., Q_i − 1; o = 1, ..., d_{i(j+1)},   (2.9)

C_pq^l(k) − C_ij^o(k) + H y_ipjq^{o(k)l(k)} ≥ t_pq^l(k) X_pq^l(k),  ∀ i, p = 1, ..., n (i ≠ p); j = 1, ..., Q_i; q = 1, ..., Q_p; k = 1, ..., m,   (2.10)

C_ij^o(k) − C_pq^l(k) + H(1 − y_ipjq^{o(k)l(k)}) ≥ t_ij^o(k) X_ij^o(k),  ∀ i, p = 1, ..., n (i ≠ p); j = 1, ..., Q_i; q = 1, ..., Q_p; k = 1, ..., m,   (2.11)

T_max ≥ C_iJ^o(k),  ∀ i = 1, ..., n; o = 1, ..., d_iJ,   (2.12)

C_i1^o(k) ≥ t_i1^o(k) X_i1^o(k),  ∀ i = 1, ..., n; o = 1, ..., d_i1,   (2.13)

C_ij^o(k) ≤ H X_ij^o(k),  ∀ i = 1, ..., n; j = 1, ..., Q_i; o = 1, ..., d_ij,   (2.14)

Σ_{o=1}^{d_ij} X_ij^o(k) = 1,  ∀ i = 1, ..., n; j = 1, ..., Q_i.   (2.15)

All variables are non-negative.

The objective of M3 is to minimize the makespan. Constraints (2.9) capture precedence

conditions between operations of a job. Constraints (2.10) and (2.11) are disjunctive

constraints that permit only one job to be processed on the machine at any time.

Constraints (2.12) set T_max to be greater than or equal to the largest completion
time amongst all operations. Constraints (2.13) ensure that for the o-th alternative machine
option, the completion time of the first operation must be greater than or equal to its
processing time. Similarly, if the o-th option is not used, then constraints (2.14) state that

the completion time must equal zero. Finally, constraints (2.15) state that only one

alternative machine option can be used.

Three schemes are presented to solve the problem. Under scheme 1, a branch and bound

procedure is used to solve the mathematical formulation and adjustments are made to

accommodate alternative machine options. Branches are pruned by calculating lower

bounds, which are obtained by relaxing the precedence constraints in the original

problem. Even when the precedence constraints are relaxed, the problem continues to be


combinatorial in nature and a second branch and bound scheme is introduced to solve

this relaxed problem wherein an LP solution is used as its lower bound. Under scheme 2,

the original problem is decomposed into a loading problem and a scheduling

subproblem. The loading problem attempts to balance machine loads as it assigns a

unique machine out of the several available for each job-operation. The resulting

scheduling subproblem resembles the classical job-shop problem and it is solved using

yet another branch and bound algorithm. Finally, under scheme 3, an SPT dispatching

rule with a look ahead control policy is used, which chooses the machine with the least

load in order to process an operation when the operation has multiple machines to

choose from. Computational results show that offline schemes 1 and 2 outperform the

online scheme 3, with the differences becoming significant as the routing flexibility

increases. Moreover, scheme 1 outperforms scheme 2 but it does so at the cost of a

higher computational expense. Once again, the differences (between scheme 1 and 2)

increase as the routing flexibility increases.
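A dispatching policy in the spirit of scheme 3, SPT combined with a least-loaded-machine choice among alternative machines, can be sketched as follows; the operation data are hypothetical and precedence and release times are ignored for brevity.

# Illustrative sketch of an online SPT rule with a least-load machine choice,
# in the spirit of Hutchison et al.'s scheme 3 (data and structure are invented).
def spt_least_load(operations, machines):
    """operations: list of (name, {machine: processing_time}).
    Returns a list of (operation, machine, start, finish)."""
    load = {m: 0.0 for m in machines}          # accumulated load per machine
    schedule = []
    # SPT: consider operations in increasing order of their shortest option.
    for name, options in sorted(operations, key=lambda o: min(o[1].values())):
        # Among capable machines, pick the one with the least current load.
        m = min(options, key=lambda k: load[k])
        start = load[m]
        load[m] += options[m]
        schedule.append((name, m, start, load[m]))
    return schedule

ops = [("J1-O1", {"M1": 4.0, "M2": 6.0}),
       ("J2-O1", {"M1": 3.0}),
       ("J3-O1", {"M2": 2.0, "M3": 5.0})]
for row in spt_least_load(ops, ["M1", "M2", "M3"]):
    print(row)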

Subramaniam et al. (2000) consider a dynamic job shop with multipurpose machining

centers. They propose a new two-stage scheduling approach in which a machine

selection rule is first applied to identify a free machine to be scheduled followed by a

dispatching rule. The dynamic job shop considered in this paper includes machine

breakdowns, repairs and alternative routings. They examine two cases: (1) all machines

are identical and can process any operation (complete flexibility), and (2) a job can be

alternatively processed on three machines only (partial flexibility). The performance

measures used are mean job cost and mean job tardiness. They propose three machine

selection rules. The first rule (LAC) calculates the cost of processing every job operation

in the machine queue and the machine with the lowest average cost is given the highest

priority. The second rule (LAP) determines the average processing time of each machine

queue and gives priority to the machine with the lowest average processing time. The

third rule (LACP) gives the highest priority to the machine with the minimum aggregate

cost and processing time. These rules are used to select a machine for scheduling. Once a

machine is chosen, dispatching rules such as random, first-in-first-out, earliest due date

and shortest processing time are used in order to sequence the jobs on the machine. A

simulation analysis was carried out and results indicated that the LAC and LAP rules


perform well for the mean cost and mean tardiness performance measures whereas the

LACP rule gives the best results for the mean cost performance measure although it does

not perform as well with the tardiness criterion.
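The machine selection rules are easy to state in code; the sketch below implements the LAP idea (priority to the machine whose queue has the lowest average processing time) on invented queue data, and LAC follows the same pattern with costs in place of times.

# Illustrative sketch (invented data): LAP machine-selection rule in the spirit
# of Subramaniam et al.: pick the machine whose queue has the lowest average
# processing time; LAC is identical with costs instead of times.
def lap_select(queues):
    """queues: {machine: [processing time of each queued job operation]}.
    Returns the machine with the lowest average processing time in its queue."""
    def avg(machine):
        q = queues[machine]
        return sum(q) / len(q) if q else 0.0   # an empty queue gets top priority
    return min(queues, key=avg)

queues = {"M1": [5.0, 7.0, 6.0],   # average 6.0
          "M2": [3.0, 4.0],        # average 3.5, selected by LAP
          "M3": [9.0]}             # average 9.0
print(lap_select(queues))          # M2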

Alvarez-Valdez et al. (2005) describe a heuristic-based scheduling system for a glass

factory whose structure corresponds to a flexible job shop but with certain unique

characteristics. Certain operations are allowed to overlap whereas others have no-wait

constraints between them. While the primary objective function is a non-convex cost

function corresponding to job completion times, some secondary objectives such as

minimizing work-in-progress are also taken into account.

Using the notation below:

C_i    completion time of job i
p      penalty
d_i    due-date of job i
e_i    early-date of job i
l_i    late-date of job i

The primary objective is to minimize the sum, over all jobs i, of a non-convex penalty function of the completion time C_i. This penalty is piecewise linear over the intervals delimited by the early date e_i, the due date d_i and the late date l_i, where k_1, k_2 and k_3 are parameters with non-decreasing values that reflect the penalty of the placement of C_i with respect to e_i and d_i.

The heuristic is capable of generating due dates or completion times for customer orders.

In case the due-dates are given, it is capable of checking whether a schedule that satisfies


as many due dates as possible can be generated. Jobs are assigned urgency levels or

priorities. The problem is solved by decomposing it into urgency levels and solving each

subproblem in a decreasing order of urgency. Subproblems are solved in two phases. In

Phase-I, an initial solution is generated whereas Phase-II improves upon this solution.

Computational results, however, have not been reported.

2.1.2 Other Approaches:

In addition to the above mathematical programming procedures, there have been other

approaches such as tabu search, hybrid methodologies and multi-agent systems

that have been used successfully to tackle the FJSP.

Tabu Search:

Tabu search is a popular neighborhood search technique that has been used by

researchers for the FJSP. It prevents solutions from cycling by penalizing moves that take

the solution in the following iteration to points in the feasible region that have already

been visited. Recent moves are stored in tabu lists and these form the tabu search

memory. This memory can be dynamic and it can change as the algorithm proceeds. Yet,

another feature of the tabu search is its capability to ‘intensify’ or ‘diversify’ the search.

During intensification, priority is given to solutions that share features similar to the

current solution whereas during diversification the search is spread out in different

regions of the feasible space. Both intensification and diversification are carried out by

adding penalizing terms in the objective function and these terms have weights that can

be adjusted so as to alternately intensify or diversify the search.
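A minimal tabu search skeleton over job sequences, with a short-term memory of recent swap moves, might look as follows; the evaluator is a toy single-machine flow-time function and no intensification or diversification terms are included.

# Minimal tabu-search sketch over permutations (illustrative only): neighbors
# are generated by swapping two positions, recent swaps are tabu for a fixed
# number of iterations, and the best non-tabu neighbor is accepted each step.
def tabu_search(evaluate, sequence, iterations=100, tenure=7):
    best = current = list(sequence)
    best_val = evaluate(best)
    tabu = {}                                   # move -> iteration until which it is tabu
    for it in range(iterations):
        neighbors = []
        for i in range(len(current) - 1):
            for j in range(i + 1, len(current)):
                if tabu.get((i, j), -1) >= it:  # skip recently used swaps
                    continue
                cand = current[:]
                cand[i], cand[j] = cand[j], cand[i]
                neighbors.append(((i, j), cand, evaluate(cand)))
        if not neighbors:
            break
        move, current, val = min(neighbors, key=lambda n: n[2])
        tabu[move] = it + tenure                # forbid reversing the move for a while
        if val < best_val:
            best, best_val = current, val
    return best, best_val

# Toy single-machine evaluator: total completion time of a permutation of job times.
times = [4, 2, 7, 3, 5]
def total_completion(order):
    t, total = 0, 0
    for j in order:
        t += times[j]
        total += t
    return total

print(tabu_search(total_completion, list(range(len(times)))))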

Brandimarte (1993) proposes a hierarchical tabu search approach for the routing and

scheduling problem. One-way and two-way architectures are presented. In the one-way

scheme, the higher-level routing problem is first solved using simple dispatching rules to

assign operations to a machine when multiple machines are available. This is then

followed by a lower level problem, which is handled by tabu search in order to find a

near-optimal sequence of jobs for the machines. In the two-way architecture, the higher


and lower levels interact with each other and iteratively improve upon the solution. That

is, for a given routing, the lower level scheduling problem is first solved and depending

upon the result, a disjunctive graph representation is used and operations on the critical

path are swapped during the tabu search in order to reduce the makespan. The

operations that are swapped are rendered tabu and are not allowed to be swapped back

for a given number of iterations determined by the length of the tabu list. The tabu

search is terminated if after a given number of iterations, no improvement in the solution

is detected.

Barnes and Chambers (1996) describe an adaptive, dynamic tabu search strategy for the

flexible job shop problem. They define a sequencing and a routing move, which allows

concurrent consideration of sequencing and routing decisions. A sequencing move is

defined as the exchange of critical operations whereas a routing move relocates the

critical operation to a feasible alternate machine position. Moves are evaluated depending

on the makespan values the resulting solutions generate. They solve a number of

challenging flexible job shop problems by replicating machines in select benchmark job

shop problems. Results indicate that the tabu search takes advantage of the improved

flexibility owing to alternate machines and gives an overall average improvement nearing

10% as compared to the initial flexible dispatching solution.

Dauzère-Pérès and Paulli (1997) present an integrated approach using tabu search for

routing and sequencing jobs in a flexible job shop with makespan as the objective. They

use a disjunctive graph representation and modify it to accommodate alternative

machines. A disjunctive graph is a directed graph with nodes and two sets of arcs,

namely, conjunctive and disjunctive arcs. The nodes depict the operations that must be

performed on the jobs. Operations of the same job are connected through solid uni-

directional conjunctive arcs (thus representing routes) whereas operations of different

jobs but that must be processed on the same machine are connected through dashed

disjunctive arcs. The arc length of conjunctive and disjunctive arcs radiating from a node

is set equal to the processing time of the operation represented by the node. A source

node with n conjunctive arcs connects to the first operations of the n jobs whereas the

sink node receives n conjunctive arcs from the last operations of the n jobs. A feasible


schedule is generated when one disjunctive arc from each pair can be chosen so that the

resulting directed graph is acyclic. The modified disjunctive graph representation to

accommodate alternative machines is shown below in Figure 2.1.

Figure 2.1: Disjunctive graph depicting alternative machines

Conditions are provided to generate a feasible selection of machine-operation

assignments. A feasible selection is shown below. Note that Figure 2.2 is composed

solely of conjunctive arcs and each operation is assigned to only one machine out of the

several alternatives available.

Figure 2.2: A feasible selection

Once the feasible selection is made, operations can either be resequenced at other

positions on the same machine or can be moved to an alternative machine in order to

improve the objective. The work in this paper is distinctive because the authors show


that just by ‘moving’ an operation, either reassignment to a different machine or

resequencing on the same machine can be achieved. An operation is moved from one

machine to an alternate machine using the following two steps:

1. Delete the conjunctive arcs (a,b) and (b,c) connected to b (see Figure 2.3),

and add the conjunctive arc (a,c) where a, b and c denote operations.

2. Delete a conjunctive arc (e,d) where e and d are two operations assigned to a

machine in Mb (Mb represents the set of machines capable of processing

operation b) and add the conjunctive arcs (e,b) and (b,d).

Figure 2.3: Underlying theme of the integrated approach

If a(b) and a’(b) represent the pre-move and post-move machine assignments for

operation b, then if a(b) ≠ a’(b), the operation is reassigned to another machine. However

if a(b) = a’(b) then the operation is resequenced on the same machine. This non-

distinction between reassignment and resequencing is one of the major features of this

model. They go on to describe a neighborhood structure for a simple tabu search along

with a lower bound scheme to estimate the makespan in order to evaluate the quality of a

move. Although routing and sequencing are indistinguishable in the tabu search, the

initial solution for the search, however, is obtained by first routing the operations to a

single machine out of several available alternatives, and then, sequencing the jobs at the

machines. Their results indicate that the tabu search procedure leads to substantial

improvements over the starting solution and that an integrated approach in which both

routing and sequencing decisions are solved simultaneously gives better results than a


hierarchical approach in which one alternates between operation assignment and

optimizing the classical job shop problem.
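The two-step move can be expressed directly on the machine sequences that the conjunctive machine arcs encode; the sketch below removes an operation b from its current machine sequence and reinserts it between two operations on a (possibly different) machine in M_b, using invented data.

# Illustrative sketch of the move on machine sequences (the conjunctive arcs
# (a,b), (b,c) and (e,d) correspond to adjacencies in these lists); the data
# and operation names are invented.
def move_operation(machine_seq, b, target_machine, position):
    """Remove operation b from its current machine sequence (step 1) and insert
    it at `position` in target_machine's sequence (step 2). If the target is
    b's current machine, this is a resequencing; otherwise a reassignment."""
    seqs = {m: list(s) for m, s in machine_seq.items()}
    for m, s in seqs.items():
        if b in s:
            s.remove(b)                        # step 1: drop arcs (a,b),(b,c), add (a,c)
            break
    seqs[target_machine].insert(position, b)   # step 2: insert b between e and d
    return seqs

seqs = {"M1": ["a", "b", "c"], "M2": ["e", "d"]}
print(move_operation(seqs, "b", "M2", 1))   # {'M1': ['a', 'c'], 'M2': ['e', 'b', 'd']}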

Scrich et al. (2004) present two heuristics, (1) a hierarchical procedure and (2) a multiple start tabu
search procedure, for scheduling jobs in a flexible job shop in order to minimize total

tardiness. In the hierarchical procedure, dispatching rules are used to determine the initial

routing and tabu search is used to generate the best sequencing. After a given number of

iterations, the operations are reassigned to different machines and the new sequence is

determined. In the multiple start procedure, several initial routings are determined using a

penalized dispatching rule with modified priority values, and tabu search is applied to

every sequencing subproblem that originates from each starting solution. Computational

tests comparing the two procedures reveal that the hierarchical method generally

performs better than the multi-start procedure for cases with low tardiness values

whereas the opposite is true for instances with higher tardiness values.

Hybrid Approaches:

A number of hybrid approaches that combine the strength of multiple search techniques

have been reported in the literature for solving the FJSP. The goal of these approaches is

to quickly zero in on a reasonably good solution rather than the optimal solution.

Kacem et al. (2002) propose a Pareto approach that combines a fuzzy multiobjective

evaluation stage and an evolutionary multi-objective optimization stage where the

objective is to minimize makespan, the total workload of machines and the workload of

the most loaded machine. A Pareto optimal set comprises those solutions that cannot

be improved along any one dimension without simultaneously suffering deterioration

along other dimensions. The multiobjective evaluation stage is used for comparing

solution quality based on the different objective functions whereas the optimization stage

comprises the following two phases: (1) localization and (2) controlled genetic

algorithm. Computational results indicate that this approach produces good quality

solutions in reasonable time.
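Extracting a Pareto optimal set from candidate schedules evaluated on the three objectives reduces to a dominance filter such as the one sketched below; the candidate objective vectors (makespan, total workload, maximum machine workload) are invented.

# Illustrative Pareto filter (invented data): keep schedules not dominated on
# (makespan, total workload, workload of the most loaded machine), all minimized.
def dominates(u, v):
    """u dominates v if u is no worse in every objective and better in at least one."""
    return all(a <= b for a, b in zip(u, v)) and any(a < b for a, b in zip(u, v))

def pareto_set(candidates):
    return [c for c in candidates
            if not any(dominates(other, c) for other in candidates if other != c)]

candidates = [(12, 40, 15), (11, 45, 14), (12, 38, 16), (13, 50, 18)]
print(pareto_set(candidates))   # (13, 50, 18) is dominated by (12, 40, 15)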


Baykasoglu et al. (2004) use scheduling grammar to model the flexible job shop. They

draw from an earlier work (Baykasoglu et al. (2002)) in which they show how the

grammar of linguistics can be used to model the flexible job shop problem. The schedule

algebra is different from the traditional algebra in its rules of addition and multiplication

of matrices and its characterization of matrices. By defining problem parameters (like

jobs and machines, among others) as sets and appropriately manipulating them through

the use of this algebra, some scheduling and other problems can be solved effectively.

The problem is solved using the Giffler and Thompson (1960) priority rule-based heuristic

in conjunction with a multiobjective tabu search approach. Makespan, total tardiness and

load balance are chosen as the different objectives to be satisfied. An example is provided

to describe the proposed algorithm.

In a real life case of a flexible job shop scheduling problem, Tanev et al. (2004) propose a

hybrid evolutionary algorithm that combines priority dispatching rules (PDR’s) and a

genetic algorithm (GA) in order to schedule customer orders in factories of plastic

injection machines. Although computationally efficient, PDR’s are myopic and in order

to use them for real life scheduling of plastic injection machines, it becomes necessary to

empirically evolve them or use a combination of them. The GA is, therefore, used to

overcome the above drawbacks. A comparison of the proposed hybrid approach with

GA alone without the PDR’s and PDR’s alone without the GA indicates that the

combined strength of PDR’s and GA in the hybrid approach outperforms both the

above cases. Computational results for an experimental case further indicate that a

schedule for 400 customers can be generated by the hybrid approach in about half an

hour.

Ho and Tay (2004) use composite dispatching rules to provide an initial solution to

GENACE – a methodology that uses evolutionary computation to solve the flexible job

shop scheduling problem. Using a cultural based genetic algorithm, beliefs of effective

schedules that carry good schemata from each generation are preserved. For some

benchmark problems that were solved using GENACE, it was found that better upper


bounds for makespan were obtained than by the methods previously proposed by

Kacem et al. (2002) and Brandimarte (1993).

Xia and Wu (2005) describe a hybrid approach that combines particle swarm

optimization and simulated annealing for the multiobjective scheduling of a flexible job

shop with the objectives of minimizing makespan, total workload of machines and the

workload of the critical machine. Particle swarm optimization is an evolutionary

computation technique that simulates the behavior of birds in flight and their means of

information exchange. In the hybrid approach, particle swarm optimization is used to

determine the routing of the jobs whereas the sequencing is determined by simulated

annealing. Several test instances have been presented to demonstrate the effectiveness of

the procedure.

Multiagent approaches:

Wu and Weng (2005) describe a multiagent scheduling method with integrated job

routing and sequencing for a flexible job shop with earliness and tardiness objectives.

Jobs are classified based on whether they have a single operation or two or more

operations left, and two heuristic algorithms are presented for scheduling jobs that fall in

either category. The multiagent scheduling method proposed in this study is compared

with two contract-net approaches called production reservation (PR) and single-step

production reservation (SSPR) which have been used to minimize average tardiness in

hierarchical scheduling for flexible manufacturing systems. PR works by scheduling each

job as it arrives in the manufacturing system as opposed to scheduling all jobs at once.

PR, however, is unable to handle the need for rescheduling in case of a machine

breakdown or when a job needs to be modified. SSPR takes care of this deficiency by not

choosing a machine for the next operation until the current operation is completed.

Computational results indicate that the proposed multiagent method performs

significantly better than both PR and SSPR for the various utilization levels and due date

settings considered. Moreover, it was found that the multiagent method could determine

a schedule for 2000 jobs on ten machines in about 1.5 minutes. Results also indicated

that the multiagent method is sensitive to the due date tightness factor under all shop

load levels considered in the study.


2.2 Multisite Planning and Scheduling:

In the NMJS problem under study, wafers travel between various fabrication facilities so

as to meet all of the processing requirements. The issue of scheduling jobs or wafers at

these multiple sites is, therefore, rather relevant to our problem. This section presents an

overview of work reported in the area of multisite planning and scheduling.

2.2.1 Mathematical Programming and Heuristic Approaches:

Gascon et al. (1998) attend to the problem of scheduling a set of lumber drying kilns to

meet demand in an environment involving multiple items, multiple machines and

multiple sites. The processing times (here lumber drying times) are stochastic. The

primary issues are: 1) When should a resource be loaded, 2) If multiple resources for the

same task are available, which one should be chosen, and finally, 3) Which items should

be produced. To address these issues, the authors first present a kiln loading heuristic for

the multimachine, multiproduct, single site case, which they later extend to handle the

multisite situation. In this study, since lumber does not travel from one kiln to another

this can be classified as a single stage problem as opposed to the multistage problem that

exists in MEMS scheduling.

Roux et al. (1999) describe an iterative solution methodology that switches between first

solving the planning model (or the lot-sizing problem for each site in each time period)

to yield a plan and then solving N (the total number of sites) independent scheduling

problems to yield a new sequence that minimizes makespan. Both dependent and

independent demand are considered. Set up times and set up costs are not accounted for.

After a finite number of iterations, the procedure generates the best solution found so far

for the entire problem. Note that, in our NMJS problem however, scheduling decisions

for each site cannot be decoupled from each other because jobs traverse the network

during the course of their processing. For a fixed set of sequences

Y = {y_1, y_2, ..., y_S}, where y_s is the sequence of operations on the resources in site s, the
planning model is presented below:

Notation:

s          site
T          planning horizon
t          planning period
S_f        set of final sites
X_it^s     amount of product i to be completed by the end of period t at site s
I_it^s     inventory level of product i at the end of period t at site s
c_t^s      capacity of planning period t at site s
D_it^s     demand for product i in site s at the end of period t
A^s        set of pairs of successive operations on the routings in site s
L^s        set of last operations of jobs in site s
F^s        set of first operations of jobs in site s
pr(u)      product associated with operation u
l_is       lead time of product i in site s
p_u        processing time of operation u per unit of product pr(u)
J_it^s     a job that corresponds to the production of a quantity X_it^s of product i that must be completed by the end of period t and which must start after t − l_is
t_ut       start time of operation u in job J_{pr(u)t}^s
Γ^s        set of products in site s
cs_i^{s+}  inventory cost of product i at site s
cs_i^{s−}  backlog cost of product i at site s
cp_i^s     production cost of product i at site s
K_it^s     dependent demand for product i in period t at site s


M4: Minimize  Σ_{t=1}^{T} [ Σ_{s∈S_f} Σ_{i∈Γ^s} (cs_i^{s+} I_it^{s+} + cs_i^{s−} I_it^{s−}) + Σ_s Σ_{i∈Γ^s} cp_i^s X_it^s ]

subject to

I_it^{s+} − I_it^{s−} − I_{i,t−1}^{s+} + I_{i,t−1}^{s−} − X_it^s + D_it^s = 0,  ∀ s ∈ S_f, ∀ i, t,   (2.16)

I_it^s − I_{i,t−1}^s − X_it^s + K_it^s = 0,  ∀ s ∉ S_f, ∀ i, t,   (2.17)

I_it^s ≥ 0,  ∀ s ∉ S_f, ∀ i, t,   (2.18)

I_it^{s+}, I_it^{s−} ≥ 0,  ∀ s ∈ S_f, ∀ i, t,   (2.19)

t_vt − t_ut − p_u X_{pr(u)t}^s ≥ 0,  ∀ (u, v) ∈ A^s, ∀ s, t,   (2.20)

t_ut ≥ 0,  ∀ u, s, t,   (2.21)

t_vt − t_ut − p_u X_{pr(u)t}^s ≥ 0,  ∀ (u, v) ∈ y_s, ∀ s, t,   (2.22)

t_ut + p_u X_{pr(u)t}^s ≤ t c^s,  ∀ u ∈ L^s, ∀ s, t,   (2.23)

t_ut + p_u X_{pr(u)t}^s ≥ (t − 1) c^s,  ∀ u ∈ L^s, ∀ s, t,   (2.24)

t_ut ≥ (t − l_is) c^s,  ∀ u ∈ F^s, ∀ s, t.   (2.25)

Here, (u, v) ∈ y_s implies that u and v are consecutive operations in the sequence of a
resource in site s.

In the above model, the objective of M4 is a function of the inventory cost, the backlog

cost and production cost (see Roux et al. (1999) for details). Constraints (2.16) and (2.17)

are inventory balance constraints. Constraints (2.20) ensure precedence between

operations. Constraints (2.22) are machine capacity constraints. Constraints (2.23) state

that production X_it^s of a job J_it^s should be completed by the end of period t. At the same

time, it should not be completed before the beginning of period t and this is ensured by

constraints (2.24). Finally, constraints (2.25) prevent the completion of a job if

components from other sites are not available in the relevant period.


Guinet (2001) proposes a two-level production management approach for global multi-

site production planning and local scheduling although the focus of his work is primarily

production planning in the multi-site environment. The multi-site planning problem is

modeled as a flow problem (multiperiod and known demand) where the key variables are

the 0-1 assignment variables and the continuous variables, which comprise the quantities

of each product to be produced at each site in each time period. A dual problem is

generated from the primal (flow problem) and a primal-dual solution is provided. The

model for the flow problem with variable and fixed costs is presented below:

Notation

N Number of products

T number of periods

M number of sites

Dem(i, j) demand of product i for period j (in time units)

Res(p,w) amount of resource available for site w for period p (in time units)

Buf(p,w) space available at site w for period p (in time units).

Set(i,w) set up time of product i at site w

H(i,w) holding cost of product i at site w

D(i) delay cost of product i per unit

P(i,w) processing cost of product i for unit at site w

S(i,w) set-up cost (fixed) per unit of time incurred at site w to satisfy product

i demand

C(i, j, p,w) variable cost per unit of demand of product i for period j satisfied by site

w in period p

x(i, j, p,w) amount of demand for product i in period j satisfied by site w in period

p


y(i, j, p, w)   = 1, if site w in period p is employed to satisfy the demand for product i in period j; 0, otherwise
z(i, p, w)      set-up time required to process product i at site w in period p
f(w)            set of final sites
(i_1, i_2, j)   i_1 is a component of i_2 for the period-j demand

M5: Minimize  Z = Σ_{i=1}^{N} Σ_{j=1}^{T} Σ_{p=1}^{T} Σ_{w=1}^{M} x(i, j, p, w) C(i, j, p, w) + Σ_{i=1}^{N} Σ_{p=1}^{T} Σ_{w=1}^{M} z(i, p, w) S(i, w)

subject to

Σ_{p=1}^{T} Σ_{w=1}^{M} x(i, j, p, w) ≥ Dem(i, j),  ∀ i = 1, ..., N, ∀ j = 1, ..., T,   (2.26)

Σ_{i=1}^{N} Σ_{j=1}^{T} x(i, j, p, w) ≤ Res(p, w),  ∀ p = 1, ..., T, ∀ w = 1, ..., M,   (2.27)

Σ_{i=1}^{N} Σ_{q=1}^{p} Σ_{j=p+1}^{T} x(i, j, q, w) ≤ Buf(p, w),  ∀ p = 1, ..., T, ∀ w = 1, ..., M,   (2.28)

Σ_{p=1}^{T} Σ_{w=1}^{M} y(i, j, p, w) ≥ 1,  ∀ i = 1, ..., N, ∀ j = 1, ..., T,   (2.29)

Σ_{i=1}^{N} Σ_{j=1}^{T} y(i, j, p, w) ≤ N T,  ∀ p = 1, ..., T, ∀ w = 1, ..., M,   (2.30)

x(i, j, p, w) ≤ y(i, j, p, w) Dem(i, j),  ∀ i = 1, ..., N, ∀ j = 1, ..., T, ∀ p = 1, ..., T, ∀ w = 1, ..., M,   (2.31)

y(i, j, p, w) Set(i, w) ≤ z(i, p, w),  ∀ i = 1, ..., N, ∀ p = 1, ..., T, ∀ w = 1, ..., M,   (2.32)

C(i, j, p, w) = P(i, w) + H(i, w)(j − p),  ∀ i = 1, ..., N, ∀ w = 1, ..., M, ∀ j, p = 1, ..., T such that j ≥ p,   (2.33)

C(i, j, p, w) = P(i, w) + D(i)(p − j),  ∀ i = 1, ..., N, ∀ w = 1, ..., M, ∀ j, p = 1, ..., T such that j < p,   (2.34)

C(i_1, j, p, w) = ∞,  ∀ w = 1, ..., M, ∀ p = 1, ..., T such that p ≥ j, where i_1 is a component of i_2 for the period-j demand,   (2.35)

C(i_2, j, p, w) = ∞,  ∀ w = 1, ..., M, ∀ p = 1, ..., T such that p < j, where i_1 is a component of i_2 for the period-j demand,   (2.36)

C(i, j, p, w) = ∞,  ∀ j, p = 1, ..., T such that i ∉ f(w).   (2.37)

and

The objective of M5 is to minimize variable and fixed costs. Constraints (2.26) ensure

that demand for product i in period j is satisfied. Constraints (2.27) and (2.28) apply

limits on the processing capacity and storage space available, respectively at each site in

each period. Constraints (2.29) and (2.30) ensure that each demand is assigned to a

resource and that each site in each period can at most satisfy N demands of T periods.

Constraints (2.31) link the various variables. Constraints (2.32) define the set up times

incurred. Constraints (2.33) and (2.34) specify the make-up of the variable costs.

Constraints (2.35) and (2.36) model precedence constraints. Finally, constraints (2.37)

define dedicated processing areas as they pertain to customer demands.

Jia et al. (2003) present a multifunctional scheduling system for a multifactory

environment, composed of a central scheduling kernel, which houses a GA, and

application components, which comprise a stand-alone module, a scheduling agent

module and an e-scheduling module. The genes in the GA carry encoded information

about the factory-job combinations and the job operations, and the chromosomes that

are generated by mixing all the genes construct a feasible schedule for the entire

distributed scheduling problem including schedules for all the local sites. The kernel

could be combined with any of the three application components to form a complete

scheduling module. Although not explicitly stated, the situation considered here is one


where once a job is assigned to a site/factory, all operations of that job are carried out in

the same factory. This is in contrast to our problem where a job may undergo processing

for only a subset of its operations at one site before traveling to another.

Gnoni et al. (2003) describe a hybrid MILP and simulation approach for lot sizing and

scheduling (LSSP) in a multisite, multiproduct environment with capacity constraints and

uncertain demand. Two different scenarios are considered, the first where each

production site solves its own LSSP, and thereby, pursues what is referred to as the

‘Local optimization strategy’, and second, in which the LSSP for all the sites combined is

solved under what is called the ‘Global optimization strategy’. Although conclusions

derived are restricted to the study in question, the results show that the global

optimization strategy leads to a better overall economic performance than the local

optimization strategy.

2.2.2 Other Approaches: Distributed Artificial Intelligence Approaches:

Sauer et al. (1998) describe a multi-site scheduling problem which is partitioned into a

global scheduling problem and a local scheduling problem that derives its inputs from the

global level. Both the global and local components are capable of predictive as well as

reactive scheduling using heuristics, genetic algorithms, neural networks and multiagents.

Global level predictive scheduling comprises the allocation of orders (for producing

intermediate as well as final products) to the individual plants or sites. The local

predictive scheduling involves generation of detailed production schedules for each site.

The local reactive scheduling component handles machine breakdown, maintenance etc.

Should such reorganization in the production schedule of a plant affect other plants, the

global reactive scheduling component assumes control and ‘reestablishes consistency by

due-date relaxation, redistribution of orders, splitting of orders etc.’ (Sauer et al., 1998).

The global as well as the local scheduling systems are based on knowledge-based

techniques and communication between them is achieved using a blackboard approach.


2.3 Stochastic Scheduling:

The NMJS problem is characterized by stochasticity in job arrival times, processing

times, processing sequence lengths and their compositions. Stochastic scheduling,

especially for flowshops and job shops, focuses almost exclusively on uncertainty

resulting from machine breakdowns and job processing times with a few exceptions that

also consider stochasticity in job arrivals and due dates. It would be insightful to review

them here. A summary of contributions in the area of stochastic flow shop scheduling is

presented in Table 2.1. The literature has been classified based on the number of

machines, number of jobs, stochastic parameters considered, performance measure

evaluated and the methodology used. We elaborate on the literature in the section

following Table 2.1 (Section 2.3.1).

2.3.1 Stochastic Flow Shop Scheduling:

Two-machine stochastic Flow shops:

Makino (1965) studied the 2 job, 2 machine and 2 job, 3 machine stochastic flow shop

problems where job processing times on the machines followed an exponential

distribution. The buffers in between the machines are assumed to have unlimited storage

capacity. It was shown that sequencing the jobs in decreasing order of a_i − b_i, where 1/a_i
is the mean of the exponential processing time distribution for job i on machine A, and
1/b_i is the mean of the exponential processing time distribution for job i on machine B,

minimizes the expected makespan. Talwar (1967) extended the work of Makino to the

case of 3 jobs and 2 machines with exponentially distributed processing times and

provided a conjecture for the optimal sequencing of n jobs. The conjecture (Talwar’s

rule) states that when job processing times are exponentially distributed on machines A

and B with E(A_i) = 1/a_i and E(B_i) = 1/b_i for i = 1, ..., n, job i should precede job j if
and only if a_i − b_i ≥ a_j − b_j.
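Talwar's rule is straightforward to state in code, and a small Monte Carlo simulation of the two-machine flow shop recursion can be used to estimate the expected makespan of the resulting sequence; the rates below are illustrative only.

# Illustrative check of Talwar's rule (invented rates): sort jobs by a_i - b_i
# in non-increasing order and estimate the expected makespan of a 2-machine
# flow shop by Monte Carlo, using the standard completion-time recursion.
import random

rates = {"J1": (0.50, 0.25), "J2": (0.20, 0.40), "J3": (0.30, 0.30)}  # (a_i, b_i)

talwar_seq = sorted(rates, key=lambda j: rates[j][0] - rates[j][1], reverse=True)

def expected_makespan(seq, runs=20000, seed=1):
    rng = random.Random(seed)
    total = 0.0
    for _ in range(runs):
        c1 = c2 = 0.0
        for j in seq:
            a, b = rates[j]
            c1 += rng.expovariate(a)                # completion on machine A
            c2 = max(c1, c2) + rng.expovariate(b)   # completion on machine B
        total += c2
    return total / runs

print(talwar_seq, expected_makespan(talwar_seq))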


Table 2.1: Summary of contributions in stochastic flow shop scheduling

Number of machines | Number of jobs | Stochastic parameters | Objective | Methodology | Special conditions | References
2, 3 | 2 | Processing time – exponential distribution | Expected makespan | Exponential dominance rule | No blocking | Makino (1965)
2 | 3 | Processing time – exponential distribution | Expected makespan | Exponential dominance rule, conjecture for 'n' jobs | No blocking | Talwar (1967)
2 | 4 | Processing time – exponential distribution | Expected makespan | Exponential dominance rule | No blocking | Bagga (1970)
2 | n | Processing time – exponential distribution | Expected makespan | Exponential dominance rule | No blocking | Cunningham et al. (1973)
2 | n | Processing times – independent random variables | Expected makespan | Sufficient conditions on job processing time distributions | No blocking | Ku (1986)
2 | n | Processing time – geometric distribution | Expected makespan | Dominance rule | No blocking | Prasad (1981)
2 | n | Processing time – general distribution | Expected makespan | Shown to be equivalent to the deterministic travelling salesman problem | Blocking | Pinedo (1981)
2 | n | Random processing times | Expected variation of waiting times | Dominance rule | Blocking | Jia (1998)
2 | n | Processing time – exponential distribution | Total expected costs | Dominance rule | No blocking | Forst (1983)
2 | n | Processing time | Expected makespan | Sufficient conditions on job processing time distributions | No blocking | Kamburowski (1999)
2 | n | Processing time – arbitrary distribution | Expected makespan | Exact approach, heuristics | – | Elmaghraby et al. (1999)
2 | n | Processing time – only lower and upper bounds known | Expected makespan | Dominance rules | – | Allahverdi (2003)
3 | n | Processing time – exponential distribution | Expected makespan | Sufficient conditions on job processing time distributions | No blocking | Kamburowski (2000)
m | n | Processing time – non-overlapping distributions | Expected makespan | SEPT-LEPT rule | With and without blocking | Pinedo (1982)
m | n | Processing time – non-overlapping distributions | Expected makespan | SEPT-LEPT dominance rule | Blocking | Foley et al. (1984)
m | n | Processing time – discrete as well as continuous probability distributions | Expected makespan | MILP, branch and bound | Zero-wait flowshop | Balasubramanian et al. (2001)
m | n | Processing time | Expected makespan | Fuzzy set theory, MILP, restrictive tabu search | No blocking | Balasubramanian et al. (2003)
m | n | Processing time | Expected makespan | Hypothesis testing with genetic algorithms | No blocking | Wang et al. (2003)
m | n | Processing time – exponential distribution | Expected makespan | Iterative algorithm based on Markovian analysis | No blocking | Gourgand et al. (2003)


Bagga (1970) proved the optimality of Talwar’s rule for the case of 2, 3 and 4 jobs. For

the case of n jobs, he provides an expression to calculate the expected makespan that

involves n fold integrals. Cunningham and Dutta (1973) rigorously prove the optimality

of Talwar’s rule for the case of n jobs and 2 machines. Using Markov Chain analysis,

they also provide a closed-form expression that involves a set of recursive relations in

order to calculate the expected makespan. Mittal and Bagga (1977), in the same vein,

tackle a slightly different problem involving n jobs and 2 machines where one of the

jobs must be processed before a certain time. The processing times on both machines

follow exponential distributions. A procedure involving the interchange of jobs is used to

determine the optimal sequence. Pinedo and Schrage (1981), for the case of two machine

stochastic flowshops with general processing time distributions and no intermediate

storage, show that the problem of minimizing the expected makespan is equivalent to

maximizing the total distance in a deterministic traveling salesman problem where the

distance matrix is appropriately modified. Prasad (1981) presents an optimal rule to

minimize expected makespan for the n job, 2 machine flow shop problem but with the

important exception that job processing times come from a geometric distribution. This

study also presents yet another optimal rule for the case where processing times on one

machine are identical and follow some general distribution with a finite mean and those

on the other machine follow a geometric (or exponential) distribution. Ku and Niu

(1986) use an adjacent pairwise interchange argument to establish a connection between

Johnson’s rule (1954) for 2 machine deterministic flowshops and Talwar’s rule for 2

machine stochastic flow shops. They also show that Talwar’s rule minimizes the

makespan stochastically. A random variable X is said to be stochastically less than

another random variable Y if P(X > t) ≤ P(Y > t) for all t. Forst (1983) minimizes the

total expected costs by providing a dominance rule and develops dominance relations for

three special cases. These dominance relations enable a partial schedule to be built after

which an enumeration technique can be used to obtain the optimal schedule.

Jia (1998) examines the static case of a two machine flowshop with blocking where jobs

have uncertain processing times. The goal is to determine optimal sequences that

minimize the expected variation of waiting times until processing begins on the second


machine when jobs are stochastically ordered. Towards this end, analytical results are

presented. Elmaghraby et al. (1999) propose an exact approach (for small problems) to

minimize the expected makespan for the two machine flow shop problem where job

processing times are from arbitrary distributions. The exact approach relies on the

estimation of the expected completion time of a project, concepts of control networks

and activity networks. For larger problems two heuristics are presented.

Kamburowski (1999, 2000) deals with the problem of scheduling a set of jobs with

random processing times in two-machine and three-machine flowshops to stochastically

minimize the makespan. Sufficient conditions on job processing time distributions are

presented that show that the makespan can be made stochastically smaller when two

adjacent jobs in the sequence are interchanged. Also, rules developed for the 2-machine,

exponential processing time, stochastic flow shop problem have been extended for the 3-

machine case.

Allahverdi et al. (2003) present dominance rules on a set of schedules that specify the

optimal order of two jobs for the two machine flow shop problem where job processing

times are known only by their lower and upper bounds. They further propose dominance

relations between sequences that differ only in the placement of two adjacent jobs.

Kouvelis et al. (2000) consider an n-job, 2-machine stochastic flow shop where processing

times of jobs on the machines are uncertain with the objective of generating a ‘robust’

schedule that minimizes the expected makespan. Scenarios are considered where the

scenario s ∈ S represents a unique set of processing times of each job on each machine.
The set of processing times for a given scenario is denoted by P^s = { p_ij^s : i = 1, ..., n, j = 1, 2 },
where p_ij^s denotes the processing time of job i on machine j under scenario s. Let θ
represent the set of all permutation sequences constructed using n jobs, let π = {π(1), ..., π(n)}
denote a given sequence in θ, and let π_s* = {π_s*(1), ..., π_s*(n)} denote the optimal sequence
for a given scenario s, where π(k) and π_s*(k) denote the position of a job that occupies the
k-th position in sequences π and π_s*. If φ(π, P^s) gives the makespan of a schedule π ∈ θ
under scenario s, then the optimal schedule π_s* must satisfy:

z_s = φ(π_s*, P^s) = min_{π ∈ θ} φ(π, P^s).

The objective in this study is to minimize the maximum possible regret where regret is

measured by the absolute difference between the makespan of a schedule for a given

scenario and the makespan of the optimal Johnson schedule for that scenario. Therefore,

the quest is for an absolute deviation robust schedule that satisfies

min_{π ∈ θ} max_{s ∈ S} | φ(π, P^s) − φ(π_s*, P^s) |.
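Evaluating this robustness objective for a candidate permutation amounts to computing, for every scenario, the gap between its makespan and the makespan of that scenario's own Johnson-optimal sequence; the sketch below does so for two invented scenarios.

# Illustrative sketch (invented scenarios): maximum absolute regret of a
# candidate sequence, using Johnson's rule to obtain each scenario's optimum.
def makespan(seq, p):                      # p[job] = (time on M1, time on M2)
    c1 = c2 = 0.0
    for j in seq:
        c1 += p[j][0]
        c2 = max(c1, c2) + p[j][1]
    return c2

def johnson(p):
    """Johnson's rule: jobs with p1 <= p2 first (ascending p1), rest last (descending p2)."""
    first = sorted((j for j in p if p[j][0] <= p[j][1]), key=lambda j: p[j][0])
    last = sorted((j for j in p if p[j][0] > p[j][1]), key=lambda j: -p[j][1])
    return first + last

scenarios = [
    {"J1": (3, 6), "J2": (5, 2), "J3": (4, 4)},
    {"J1": (6, 3), "J2": (2, 5), "J3": (4, 4)},
]

candidate = ["J1", "J3", "J2"]
regret = max(makespan(candidate, s) - makespan(johnson(s), s) for s in scenarios)
print(regret)   # worst-case deviation from the scenario-optimal Johnson makespan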

Processing time uncertainty is captured through two different approaches. In the first

approach, scenarios are used whereas in the second approach time intervals that

represent the lower and upper bound on processing times possible for each job on each

machine are employed. A branch and bound procedure as well as a heuristic procedure is

proposed to solve the problem under both of the above approaches. Computational

results show that the branch and bound procedure is feasible only for small problems

and that variability in the individual job processing times significantly influences the

effort expended in solving the problem. Additionally it was found that robust schedules

offer excellent makespan results even in the presence of processing time uncertainty.

Allahverdi and Mittenthal (1995) tackle the problem of minimizing the expected

makespan in a two machine flow shop subject to stochastic breakdowns. The breakdown

of a certain machine k is represented by a sequence of finite-valued, positive random

vectors $(U_{rk}, D_{rk})$ for $r = 1, 2, \ldots$ and $k = 1, 2$, where $U_{rk}$ denotes the $r$th uptime of machine $k$ and $D_{rk}$ denotes the $r$th downtime of machine $k$. A preempt-resume

scenario is assumed. Under these circumstances they show that it is sufficient to consider

the same sequence of jobs on both machines in order to optimize any regular

performance measure. Dominance relationships between the jobs are derived in order to

determine their relative positions in a schedule that minimizes makespan with probability

1. By definition, the makespan of a sequence is a minimum with probability 1 if it is

smaller than the makespan of any other sequence for all realizations ω , where, if Ω

denotes the set of all possible realizations of uptimes and downtimes, then ω denotes

one realization of uptimes and downtimes. Furthermore, they provide conditions that, if

the breakdown distributions were to satisfy, would ensure that Johnson’s rule

stochastically minimizes the makespan.

Alcaide et al. (2002) consider a two machine flow shop problem with stochastic

processing times and breakdowns with the objective of minimizing the expected

makespan. The random variables are $X_{ij}$, the processing time of the $i$th operation $O_{ij}$ of job $J_j$; $Y_{i\tau}$, the time between the $(\tau-1)$th and $\tau$th breakdowns of machine $M_i$; and $Z_{i\tau}$, the time to repair the $\tau$th breakdown of machine $M_i$, $\tau = 1, \ldots, m$, $i = 1, 2$. They

provide a general procedure to convert the original breakdowns problem into a sequence

of subproblems without breakdowns. They show that if the subproblems can be solved

optimally, then an optimal solution to the original problem with breakdowns can be

obtained. The problem is broken down into time intervals whose end points are

determined by instants in time when a machine breaks down or resumes operation after

undergoing repairs. In each of these disjoint intervals a scheduling problem must be

solved that involves sequencing a set of unfinished jobs on a set of currently operating

machines. Of course, with breakdowns and repairs being stochastic, the end point of an

interval is not known in advance and therefore one must decide optimally with whatever

information is available at the beginning of the interval. Hence, the procedure is dynamic.

Computational results study the effect of changes in the random variables on the

expected makespan. The results are intuitive in that they indicate that the makespan can

be positively influenced by investing in machines whose operative or up times are long.

m-machine Flow shops:

Pinedo and Schrage (1981) address the case of m machines with nonoverlapping

processing time distributions and unlimited intermediate storage with expected makespan

as the objective. Distributions $D_k$ and $D_l$ are said to be nonoverlappingly ordered if $P(p_{ik} \geq p_{il})$ is either 0 or 1. This indicates that the probability density functions do not overlap. Here, $p_{ik}$ denotes the processing time of job $k$ on machine $i$. For the above case, they state that any SEPT-LEPT sequence is optimal. A sequence of jobs $1, \ldots, n$ is in a SEPT-LEPT sequence if there exists a $k$ such that

$$E(p_{i1}) \leq E(p_{i2}) \leq \cdots \leq E(p_{ik}) \quad \text{and} \quad E(p_{ik}) \geq E(p_{i,k+1}) \geq \cdots \geq E(p_{in}).$$
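For instance (using hypothetical values), expected processing times of $2, 5, 9, 6, 3$ for jobs $1, \ldots, 5$ on a given machine satisfy this condition with $k = 3$, since $2 \leq 5 \leq 9$ and $9 \geq 6 \geq 3$, so the sequence is SEPT-LEPT.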

Moreover, for the case where $n-2$ jobs have deterministic processing times (not necessarily identical) and 2 jobs have nondeterministic processing time distributions, they argue that any sequence in which one of the stochastic jobs appears first and the other stochastic job appears last is optimal for stochastic minimization of the makespan.

In the more restrictive situation where blocking may occur and where jobs have

nonoverlapping processing time distributions, the above study states that the expected

makespan is minimized if and only if the sequence is SEPT-LEPT. Also presented is a

conjecture for the case where there exist $m_1$ identical machines with distribution $D_1$ and $m_2$ identical machines with distribution $D_2$, with $D_2$ being nonoverlappingly larger than $D_1$. The conjecture states that the optimal sequence alternately places slow

and fast machines in the sequence with a slow machine occupying the first position.

Foley and Suresh (1984) consider the n job, m machine flow shop with no intermediate

storage and prove that for $n-2$ jobs that have identical processing time distributions

that are nonoverlapping and slower than the other two identical jobs, regardless of the

ordering of the machines, the makespan is stochastically minimized by placing one of the

fast jobs first and the other fast job last in the sequence. Another key result of this study

is that given n jobs with nonoverlapping processing time distributions, the makespan is

stochastically minimized if and only if the sequence is SEPT-LEPT.

Balasubramanian et al. (2001) address the problem of scheduling in a stochastic flow

shop where job processing times are uncertain and are modeled using discrete probability

distributions. An MILP that relies on an analytical expression for the expected makespan

for the case of a zero-wait flowshop is proposed. To solve this problem, they present a

successive probability disaggregation branch and bound algorithm that uses a bounding

property of aggregated probability models, for both the zero-wait and unlimited storage flowshops. An extension to the case of continuous distributions is also described, in which the previously proposed branch and bound algorithm is applied through a discretization scheme.

Balasubramanian et al. (2003) examine a flowshop scheduling problem where job

processing times are uncertain and are modeled using concepts from fuzzy set theory.

MILP models for the flowshop problem and a new product development problem are

presented along with a procedure to estimate the most likely, optimistic and pessimistic

values for the expected makespan. While these models are computationally tractable for

moderately sized problems, an alternative approach that uses a restrictive tabu search strategy is proposed for larger problems.

Wang, et al. (2003) combine hypothesis testing with genetic algorithms (GA) to solve

flow shop scheduling problems with stochastic processing times. By doing so, premature

convergence can be avoided in the GA and the search is prevented from cycling in

regions that provide solutions with a similar performance in a statistical sense. In this

approach, using random initialization, initial solutions are generated out of which a few

good ones are chosen as seeds using a certain selection operator. Crossover operators

are then used to generate new solutions, which are ranked based on their estimated mean

performance and rated from the best to worst. The best solution is retained for the next

population and beginning with the second best to the last, each current solution is

compared with the best. Based on a hypothesis test, if there is no significant difference

between their performances, the current solution is discarded; else it is retained and

moved into the next population. Once the hypothesis based comparison process is

completed, the discarded solutions are replaced by new ones that are randomly generated

and these new solutions are transferred to the next population. Thereby population

diversity is ensured. Computational results suggest that the Hypothesis Test based GA

(HTGA) performs much better than the simple GA for stochastic flow shop scheduling

problems and the HTGA gives results very close to the optimum of these problems in

the expected sense. Additionally, for large scale problems, the average performance of

HTGA was found to be better than the simple GA.
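As a rough illustration of this retention test (not the authors' exact procedure), the short Python sketch below keeps a candidate schedule for the next population only when a two-sample t-test judges its sampled makespans to be significantly different from those of the incumbent best; the evaluator shown is a hypothetical placeholder for a stochastic flow shop simulation.

    # Illustrative sketch of hypothesis-test-based retention in a stochastic-scheduling GA.
    import random
    from scipy import stats

    def sample_makespans(schedule, n_reps=30):
        """Hypothetical placeholder: replace with a stochastic flow shop simulator."""
        return [sum(schedule) + random.gauss(0.0, 1.0) for _ in range(n_reps)]

    def retain(candidate, incumbent_best, alpha=0.05):
        """Keep the candidate only if its mean performance differs significantly from
        the incumbent best; otherwise it is discarded and later replaced by a randomly
        generated solution, which preserves population diversity."""
        p_value = stats.ttest_ind(sample_makespans(candidate),
                                  sample_makespans(incumbent_best)).pvalue
        return p_value < alpha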

Gourgand et al. (2003) consider the scheduling of jobs with exponentially distributed

processing times in an m machine flowshop with unlimited buffers where the goal is to

minimize the expected makespan. An iterative algorithm based on Markovian analysis is

used to compute the expected makespan. For a system with a very large number of states and other restraining conditions such as limited buffers, the use of a stochastic simulation

model is suggested. In order to solve stochastic flow shop problems, the authors propose

techniques that integrate heuristics/metaheuristics to generate a schedule and the above

Markovian algorithm or the stochastic simulation model to evaluate the schedule.

We now turn our attention to stochastic job shop scheduling and begin by presenting a

summary of contributions in this area in Table 2.2. Literature is classified based on

stochastic parameters considered, performance measure evaluated and the methodology

used. Also, some special conditions relevant to the study are mentioned.

2.3.2 Stochastic Job Shop Scheduling:

Golenko-Ginzburg, et al. (1995) examine a job shop problem with due dates, priorities and random

processing times. Each task duration has a known average value and variance. The

objective is to determine the starting times for the operations at decision points (i.e. when

a machine or a job becomes available) given that we know $p_1$, the desired probability for a job to be completed on time, and $p_2$, the least permissible probability for the job to

meet its due date. When multiple jobs in queue compete, a pair-wise comparison rule

initiates a competition by calculating the job's delivery performance, i.e., the probability of

being able to meet its due date. Two heuristics are also presented for choosing a job from

the queue and they are compared via simulation.

Table 2.2: Summary of contributions in stochastic job shop scheduling:

- Golenko-Ginzburg et al. (1995). Stochastic parameters: processing time (mean and variance known). Objective: maximize the probability of a job finishing before its due date; minimize makespan. Methodology: heuristics. Special conditions: for a job, the desired as well as the least permissible probability of it completing before its due date is known.

- Golenko-Ginzburg et al. (1997). Stochastic parameters: processing time (mean and variance known). Objective: maximize the probability of a job finishing before its due date; minimize makespan. Methodology: look-ahead techniques and pair-wise comparison. Special conditions: for a job, the desired as well as the least permissible probability of it completing before its due date is known.

- Golenko-Ginzburg et al. (2003). Stochastic parameters: processing time (exponential distribution). Objective: minimize the average value of total penalty and storage expenses. Methodology: heuristics and simulation. Special conditions: penalty and storage costs are given.

- Luh et al. (1999). Stochastic parameters: processing times, arrival times, due dates and part priorities. Objective: expected part tardiness and earliness costs. Methodology: Lagrangian relaxation and stochastic dynamic programming.

- Singer (2000). Stochastic parameters: processing time. Objective: expected total weighted tardiness. Methodology: heuristic and deterministic scheduling algorithm.

- Kutanoglu (2001). Stochastic parameters: machine breakdowns and processing times. Objective: weighted tardiness. Methodology: simulation.

- Yoshitomi (2002). Stochastic parameters: processing time. Objective: minimize the expected value of the total elapsed time. Methodology: genetic algorithm.

- Lai et al. (2004). Stochastic parameters: processing time. Objective: mean flow time. Methodology: heuristics and algorithms. Special conditions: processing time can take on any real value between a lower and an upper bound.

- Moghaddam et al. (2005). Stochastic parameters: processing time (mean and variance known). Objective: minimize the sum of variation between actual and planned processing times, operational costs and idle costs. Methodology: neural network and simulated annealing.

This problem is extended by Golenko-Ginzburg, et al. (1997), wherein they combine 'look-ahead'

techniques and pair-wise comparison to choose a job from the queue. The look-ahead

techniques account for the case where it may be advantageous to wait for a ‘bottleneck

job’ instead of picking a job from the queue and mounting it on the machine. A further

extension to the same problem was later proposed by Golenko-Ginzburg, et al. (2003), but with the

objective of determining the earliest start times for all the jobs so as to minimize the

average penalty and storage costs. A new rule which is a combination of the previous pair

wise comparison rule and the cost objective is proposed to pick a job from the queue. A

simulation model is presented to determine the average costs over many replications and

since these costs are a function of the operation starting times, the coordinate descent

method is applied to the simulation model to determine the optimal starting times that

minimize the expenses.

Luh, et al. (1999) describe an approach that combines Lagrangian relaxation and

stochastic dynamic programming for a flexible job shop problem with uncertain arrival

times, processing times, due dates and part priorities for the objective of minimizing the

expected part tardiness and earliness cost. From the set of dual solutions obtained with

this approach, the best one is selected using ordinal optimization and an on-line heuristic

massages the dual solution to remove infeasibilities. An implementable schedule is then

dynamically constructed based on the realization of random events. A lower bound to

the stochastic problem is derived in order to evaluate the quality of the schedules

obtained with this approach.

Singer (2000) presents a heuristic (‘augmentation forecasting policy’) for a job shop with

uncertain processing times that exaggerates expected processing times by a suitable

factor and feeds them into a deterministic scheduling algorithm to minimize a risk-averse

penalty function. This policy is based on the observation that, under random

perturbations, the Gantt chart pertaining to a schedule shifts to the right.

Kutanoglu (2001) conducts an experimental study to evaluate the effectiveness of an

iterative simulation based scheduling technique for dynamic stochastic job shops. The

impact of machine breakdown and processing time variation on the system is analyzed

and various settings of tunable parameters such as the look-ahead parameter and the

scheduling window are explored for this approach. Results in this work reveal that

although with the appropriate parameters this method is effective, its efficacy diminishes

under dynamic and stochastic conditions.

Yoshitomi (2002) proposes a genetic algorithm based technique to solve the stochastic

job shop problem with uncertain processing times. In this work, the fitness function is

determined by a random number generated according to the stochastic distribution

functions for the stochastic variables. For each generation, all individuals and their

frequencies are stored and the hypothesis that the individual with the highest frequency

through all generations presents the best solution in terms of the expected value, is tested

through numerical experiments.

Lai, et al. (2004) examine a job shop scheduling problem with random processing times

that can take on any real value between an upper and a lower bound with mean flow time

as the performance criterion. They develop algorithms and heuristics to determine a set of

schedules that contains amongst them an optimal schedule for any permissible realization

of the processing times. In particular, they present a formula for calculating the stability

radius of the optimal schedule, which is the largest possible perturbation in the

processing times that can be tolerated without altering the optimal schedule. A similar

approach was followed by Lai, et al. (1999, 1997) for the same problem but with the

objective of minimizing the makespan.

2.4 MEMS Scheduling:

Wang, et al. (2002) focus on a production scheduling problem in a MEMS manufacturing

environment. The MEMS production process considered is a three-stage process

comprising the wafer front-end, the wafer cap and the wafer back-end process. The

authors develop scheduling rules to keep the cycle time and WIP low in the line. An

important aspect is the synchronization of flow of the MEMS wafers into the wafer

front-end and the wafer cap process so that they exit their respective lines at

approximately the same time, upon which the two wafers are bonded before being

dispatched to wafer back end. A discrete event simulation model of the process flow is

described that accounts for 106 process steps in the wafer front end, 24 in the cap

process and 18 in the back end. Although reentrant flow is considered, it is assumed that

the shop manufactures a single product. The authors propose four new synchronization

rules called Simplesyn, Delaysyn, Wbsyn and Littlesyn. These are used in conjunction

with two release rules, namely, Poisson input and CONWIP, and scheduling rules,

namely, FIFO, LIFO, SRPT, LRPT and EDD, and their performance is compared using

cycle time and WIP as the performance measures. A total of 40 rule combinations are

considered. Results indicate that while Littlesyn achieves the best co-ordination between

the wafer release and wafer cap processes, the best combination of rules is Littlesyn-

CONWIP-SRPT.

Wang, et al. (2002) extend the above approach by proposing additional synchronization

rules: LINESYN and SASYN. They perform an extensive analysis using these new rules

in conjunction with other release rules such as Uniform, starvation avoidance and

workload regulation. Furthermore, the above rules are coupled with a set of newly

developed dispatching rules: CAPFIFO, FRONTFIFO, CAPSRPT and FRONTSRPT.

Their performances are recorded using cycle time and WIP as the measures. A total of

150 rule combinations are analyzed. Results indicate significant two- and three-factor

interactions among the rules and suggest that WR-SIMPLESYN-FRONTSRPT or WR-

DELAYSYN-FRONTSRPT is the best combination if cycle time is the primary concern.

On the other hand, WR-SASYN-CAPFIFO or WR-LINESYN-CAPFIFO is the best

choice if WIP assumes more importance.

Benard (2003) proposes two new job and staff dispatching policies for a MEMS job shop

with characteristics similar to the one considered in the study here. However, the focus is

only on a single shop as opposed to a network of job shops. The objective is to maximize

the realizable capacity of the shop while minimizing the cycle time of the jobs processed

in the shop. The job dispatching policies developed are normalized shortest remaining

processing time and nominal expected delay. In the case of the first policy, the remaining

process duration of a job is multiplied by the raw process duration of the job and priority

is given to the job with the smallest value of this product. The second policy comprises

of three steps. In the first step, the current total delay for a machine is calculated by

summing the processing time of operations waiting in queue and the one in progress. In

the second step, the current delay for the remaining processing steps of a job is calculated

by summing up the delays on the machines to be used in these operations. Observe that

the delays on the machines are calculated from step 1. In the third step, priority is

assigned to each job j by adding the job delays from step 2 to the total estimated time for

job j augmented by a multiplier. Priority is given to the job with the lowest value

determined in step 3. Results indicate that both the job dispatch policies produce

excellent mean X-factor performance. Integrated queue and distribution preservation

were the two staff dispatching policies developed. The nominal expected delay job

dispatch policy, in conjunction with the integrated queue staff dispatch policy, was found

to result in savings of nearly 40% in terms of normalized delay which is defined as the

delay divided by the amount of process work (given in time units) in the job.
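A simplified sketch of how such delay-based priorities might be computed is given below; the data structures, the multiplier value and the exact combination of terms are illustrative assumptions rather than Benard's precise definitions.

    # Illustrative delay-based dispatching priority (lowest value is dispatched first).
    def machine_delay(queued_times, in_progress_time):
        """Step 1: total delay on a machine = queued processing work + work in progress."""
        return sum(queued_times) + in_progress_time

    def job_delay(remaining_route, delays):
        """Step 2: delay for a job's remaining steps = delays on the machines it will visit."""
        return sum(delays[m] for m in remaining_route)

    def priority(job, delays, multiplier=1.5):
        """Step 3: add the job delay to the job's total estimated time scaled by a multiplier."""
        return job_delay(job["route"], delays) + multiplier * job["total_time"]

    # Hypothetical usage:
    delays = {"M1": machine_delay([3.0, 2.5], 1.0), "M2": machine_delay([4.0], 0.5)}
    jobs = [{"route": ["M1", "M2"], "total_time": 6.0}, {"route": ["M2"], "total_time": 2.0}]
    next_job = min(jobs, key=lambda j: priority(j, delays))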

2.5 Stochastic Programming Models:

Two-stage stochastic linear programs with recourse:

This class of problems was independently proposed by Dantzig (1955) and Beale (1955).

The problems were varyingly referred to as ‘Linear Programming under Uncertainty’ or

‘Linear Programming with Random Coefficients’. The distinguishing feature about these

problems is that the decisions can be split into two groups, those that must be made

before the future (or the uncertainty) is realized and those that are made in light of

decisions already taken and the realized uncertainty. The decisions that must be made in

the present before events actually unfold are called first-stage decisions whereas those

that are made after the uncertainty is revealed are called second-stage or recourse

decisions. The first-stage decisions are actually made assuming that the probability

distribution of the random events is known.

The general formulation of a two-stage stochastic program with recourse can be given as:

P1 (first-stage problem):

$$\begin{aligned}
\text{Minimize} \quad & \mathbf{c}^T\mathbf{x} + E[Q(\mathbf{x}, \boldsymbol{\xi}(\omega_s))] && (2.38)\\
\text{subject to} \quad & A\mathbf{x} = \mathbf{b}, \quad \mathbf{x} \geq \mathbf{0} && (2.39)
\end{aligned}$$

where,

P2 (second-stage problem):

$$\begin{aligned}
Q(\mathbf{x}, \boldsymbol{\xi}(\omega_s)) = \text{Minimize} \quad & \mathbf{q}(\omega_s)^T\mathbf{y} && (2.40)\\
\text{subject to} \quad & W(\omega_s)\mathbf{y} = \mathbf{h}(\omega_s) - T(\omega_s)\mathbf{x}, \quad \mathbf{y} \geq \mathbf{0} &&
\end{aligned}$$

Here, $\mathbf{c} \in \mathbb{R}^{n_1}$ and $\mathbf{b} \in \mathbb{R}^{m_1}$ are known vectors, $A$ is of size $m_1 \times n_1$ and is assumed to be given. $W$ and $T$ denote the recourse and technology matrices, respectively, and $\boldsymbol{\xi}(\omega_s) = \big(\mathbf{q}(\omega_s), \mathbf{h}(\omega_s), W_1(\omega_s), \ldots, W_{m_2}(\omega_s), T_1(\omega_s), \ldots, T_{m_2}(\omega_s)\big)$ is a vector containing all the random coefficients, where $W_i(\omega_s)$ and $T_i(\omega_s)$ represent the $i$th row of the matrices $W(\omega_s)$ and $T(\omega_s)$, respectively. Given that $\mathbf{x}$ is the first-stage

decision, the goal of the second-stage or recourse problem is to find the best recourse

decision y for every possible realization of the random event. On the other hand, the

first-stage decision vector x must be chosen such that it allows for a feasible recourse

action to be taken in order to correct for any discrepancies.

If we denote the set of all those values of $\mathbf{x}$ or the first-stage variables that are feasible to the first-stage problem by $K_1 = \{\mathbf{x} \in \mathbb{R}^{n_1} \mid A\mathbf{x} = \mathbf{b},\ \mathbf{x} \geq \mathbf{0}\}$, and the set of all those values of $\mathbf{x}$ or the first-stage variables that result in a feasible second-stage problem by $K_2 = \{\mathbf{x} \in \mathbb{R}^{n_1} \mid Q(\mathbf{x}, \boldsymbol{\xi}(\omega)) < \infty\}$, then the following types of recourse problems can

be identified.

1. Fixed Recourse: $W(\omega_s) = W$, i.e., $W$ is non-stochastic.

2. Complete Recourse: $\{W\mathbf{y} \mid \mathbf{y} \geq \mathbf{0}\} = \mathbb{R}^{m_2} \Rightarrow K_2 = \mathbb{R}^{n_1}$. This indicates that the second-stage problem is feasible for any first-stage decision.

3. Relatively Complete Recourse: The second-stage problem is feasible for any feasible first-stage decision.

4. Simple Recourse: $W = [I, -I] \Rightarrow \mathbf{y}^+ - \mathbf{y}^- = \mathbf{h}(\omega_s) - T(\omega_s)\mathbf{x}$. Note that simple recourse automatically implies complete recourse.

2.5.1 Applications of two-stage stochastic linear programs:

Two-stage stochastic linear programs with recourse have found wide applicability in

problems involving electricity distribution, power generation and expansion, capacity

planning, agriculture, disaster response, supply chain planning, scheduling etc. This

section provides a summary of prior work in an attempt to highlight methods and

applications.

Haneveld and van der Vlerk (2000) consider a planning problem in electricity distribution

with demand uncertainty where electricity can be supplied either from power plants or

small generators. Quotas that define capacity ranges for supply at any moment during the

contract year need to be decided upon for the power plants and these decisions must be

made assuming uncertain demand and the possibility of also using small generators to

supply electricity. Demand is modeled using discrete variables and the objective is to

minimize immediate costs for reserving capacity and expected future costs for satisfying

demand. The decisions on quotas constitute the first-stage variables whereas those

pertaining to the small generators make up the second-stage or recourse variables. A two-

stage mixed integer recourse model is described which is solved using valid cuts and

Lagrangian relaxation.

Darby-Dowman et al. (2000) describe a two-stage stochastic programming with recourse

model to determine optimal planting plans for a vegetable crop where crop yield is a

major cause of uncertainty. The first-stage variables generate a planting plan while the

second-stage variables determine a harvesting schedule for each yield scenario. The

objective function is a trade-off between expected profit and risk, where a risk aversion

coefficient is used to control the weight assigned to each component of the objective

function. The deterministic equivalent of the stochastic model was solved using an

interior point method of a commercial solver.

Chen et al. (2002) develop a scenario based stochastic program to determine technology

and capacity choices under uncertain demand. The objective function minimizes the sum

of investment and operational costs, and the scenario dependent variables involve

capacity additions to the different technology types and the allocation of different

technology types to products in time periods. The solution methodology involves an

augmented Lagrangian method where the Lagrangian function is solved using restricted

simplicial decomposition. Computational results reveal that strategies that call for all

dedicated or all flexible equipment result in severe cost penalties. Also, the timing and the

amount of capacity additions depend on several factors related to demand and costs, and

so generalizations based on limited data could produce misleading results. It was also

found that with increase in demand variability and decrease in cost of flexible technology,

flexible capacity tends to increase.

Nurnberg and Romisch (2002) consider a planning problem in power plant scheduling

with thermal units, pumped hydro storage plants and delivery contracts. A two-stage

stochastic programming model is described for the mid-term cost optimal power

production planning with electrical load, fuel and electricity prices being the uncertain

parameters. The first-stage decisions involve on/off decisions, production levels for the

thermal units and generation and pumping levels for the hydro units. The second-stage

decisions comprise the recourse actions for each unit in each time period, made in light

of the realized load and prices and the first-stage decisions. A stochastic version of

Lagrangian relaxation is used that employs stochastic multipliers with the coupling

constraints. The problem is decomposed into single unit subproblems that are solved

using stochastic dynamic programming. The Lagrangian dual problem itself is solved

through a proximal bundle method. Finally, Lagrangian heuristics are used to obtain a

feasible primal solution, and an economic dispatch method produces a near optimal

solution. Test results on real data from a German utility power plant are reported.

Alonso-Ayuso et al. (2003) develop a splitting variable representation of a two-stage 0-1

stochastic programming model for strategic supply chain planning where the first-stage

decisions involve plant sizing, product allocation to plants and raw materials vendor

selection. The second-stage decisions are tactical and pertain to raw material volume

supply from vendors, inventory, product volume to be processed at the plants and

component volume to be transported from plants to destinations. The uncertainties lie in

product net price and demand and these are modeled using scenarios. A branch and fix

coordination algorithmic approach, originally developed for a multistage problem, has

been modified to fit the two-stage stochastic problem in this case. It works by

coordinating the selection of branching nodes and branching variables in the scenario

subproblems so that they can be jointly optimized. Computational results for large data

sets are presented and reveal the stochastic model to be a more robust approach that

never has worse expected performance than the average scenario based solution.

Albornoz et al. (2004) present a two-stage stochastic integer programming model with

recourse to obtain the optimum policy in a capacity expansion planning problem for a

thermal-electric power system. The first-stage decision variables comprise binary

investment decisions and total capacity decisions for different kinds of diesel engines, gas

engines and gas turbines. The random parameters pertain to the availability of current

operating plants and these are modeled using discrete variables. The second-stage

variables determine the power generated by the different engines in each time period and

duration mode of the demand curve assuming the available capacities are known for all

the engines. The stochastic integer model is solved using the L-shaped method (Van

Slyke and Wets, 1969) and results indicate that the stochastic model provides superior

solutions compared to the deterministic model. We describe the L-shaped method of

Van Slyke and Wets (1969) in section 2.6.

Schaefer and Schaefer (2004) use the L-shaped method (Van Slyke and Wets, 1969) for

optimally locating power generation units with the goal of minimizing location as well as

power distribution costs. Demand is the uncertain variable and it is modeled through a

discrete number of scenarios. The first-stage determines the location of the generators

whereas the second-stage optimally delivers power for a given demand scenario and

location of generators. They implement the model on real-life data and provide results

for the single-cut, multi-cut and extensive form formulations of the problem. Their

results show that the single-cut method is preferable to the multi-cut method for

stochastic integer programs with continuous second-stage variables even when the

number of scenarios is small.

Engell et al. (2004) present a two-stage stochastic programming model for aggregated

scheduling of a multiproduct batch plant in the chemical industry under demand

uncertainty. The first-stage decision variables involve the number of polymerization starts

and the states of the finishing lines whereas the second-stage decision variables are

comprised of mixer content variables, production surplus/deficit variables and auxiliary

variables. Both stages of their model involve integrality restrictions and the deterministic

equivalent has a block-angular structure. They use scenario decomposition to split the

problem into subproblems, where the subproblems are coupled by non-anticipativity

constraints. Specialized pre-processing procedures and heuristics are developed to solve

the problem, and test results indicate that the heuristics provide good upper bounds for

all computed instances.

Barbarosoglu and Arda (2004) develop a two-stage stochastic program with full recourse

using a multi-commodity, multi-modal network flow formulation to plan transportation

of first-aid commodities for disaster relief during emergency. Uncertainty is represented

through scenarios and occurs in the supply, demand and arc capacity parameters. The

first-stage of the problem allocates initial supply quantities from the supply nodes to the

other nodes before second-stage demand is realized whereas the second-stage solves a

transportation problem for a given supply plan and realized demand and arc capacities.

The objective function minimizes the sum of first-stage transportation costs and

expected recourse costs. The model is validated using actual data from an earthquake in

Turkey and is solved using a commercial optimization solver. Computational results are

reported.

From the foregoing survey, it is clearly evident that situations that involve decision-

making under uncertainty call for a methodology that is capable of generating robust

solutions which perform well under all scenarios of interest. A comparison of such a

methodology with its deterministic counterpart has repeatedly highlighted the

shortcomings of the latter approach. However, because of the presence of scenarios, a stochastic program is much larger than a pure deterministic model. Nevertheless, it

must be emphasized that decomposition methods, relaxation techniques and heuristics,

specifically tailored for these problems, can be immensely valuable in making these

problems tractable, especially with the kind of computational speeds achievable today.

The above survey also reveals that a significant percentage of the literature dealing with

two-stage stochastic linear programs has focused on problems with complete or relatively

complete recourse. Since the first-stage solution is always second-stage feasible in these

problems, feasibility cuts are not required. For problems that do not possess relatively

complete recourse, feasibility cuts or constraints that induce second-stage feasibility will

become necessary. These cuts increase the size of the master problem, thereby, making it

a lot harder to solve, especially if the master problem is an integer program. Moreover,

with each addition of a feasibility cut, the entire master problem must be re-solved thus

increasing computation times.

2.6 The L-shaped Method of Van Slyke and Wets:

Consider the general two-stage stochastic linear program described in Section 2.5

(problems P1 and P2) for the case with fixed recourse. If we let $\Xi \subseteq \mathbb{R}^N$ be the support of $\boldsymbol{\xi}$, i.e., $\Xi$ denotes the smallest subset of $\mathbb{R}^N$ such that $P(\boldsymbol{\xi} \in \Xi) = 1$, then, if $\boldsymbol{\xi}$ has finite support, there exists a discrete probability distribution that describes $\boldsymbol{\xi}$. Given elementary events $\omega_s$, $s = 1, \ldots, S$, with probabilities $p_s$, $s = 1, \ldots, S$, if we define $\boldsymbol{\xi}(\omega_s) = \boldsymbol{\xi}^s$, $\mathbf{q}(\omega_s) = \mathbf{q}^s$, $\mathbf{h}(\omega_s) = \mathbf{h}^s$ and $T(\omega_s) = T^s$, then the two-stage stochastic linear program can be written in its equivalent deterministic form as:

$$\begin{aligned}
\textbf{P3:} \quad \text{Minimize} \quad & \mathbf{c}^T\mathbf{x} + \sum_{s=1}^{S} p_s\, Q(\mathbf{x}, \boldsymbol{\xi}^s) && (2.41)\\
\text{subject to} \quad & A\mathbf{x} = \mathbf{b}, \quad \mathbf{x} \geq \mathbf{0} && (2.42)
\end{aligned}$$

where,

$$\begin{aligned}
\textbf{P4:} \quad Q(\mathbf{x}, \boldsymbol{\xi}^s) = \text{Minimize} \quad & (\mathbf{q}^s)^T\mathbf{y} && \\
\text{subject to} \quad & W\mathbf{y} = \mathbf{h}^s - T^s\mathbf{x} && (2.43)\\
& \mathbf{y} \geq \mathbf{0} && (2.44)
\end{aligned}$$

In expanded form, this is given as:

$$\begin{aligned}
\textbf{P5:} \quad \text{Minimize} \quad & \mathbf{c}^T\mathbf{x} + p_1(\mathbf{q}^1)^T\mathbf{y}^1 + p_2(\mathbf{q}^2)^T\mathbf{y}^2 + \cdots + p_S(\mathbf{q}^S)^T\mathbf{y}^S \\
\text{subject to} \quad & A\mathbf{x} = \mathbf{b} \\
& T^1\mathbf{x} + W\mathbf{y}^1 = \mathbf{h}^1 \\
& T^2\mathbf{x} + W\mathbf{y}^2 = \mathbf{h}^2 \\
& \quad\vdots \\
& T^S\mathbf{x} + W\mathbf{y}^S = \mathbf{h}^S \\
& \mathbf{x} \geq \mathbf{0}, \quad \mathbf{y}^s \geq \mathbf{0}.
\end{aligned}$$

The expanded form typically involves a large number of constraints and variables making

the stochastic program rather difficult to solve. Decomposition methods provide the

means to solve such problems by transforming them into a sequence of smaller, easier

problems. Additionally, the subproblems that are generated using decomposition

methods are generally linear, quadratic or non-linear problems and these are required to

be solved over rather simple deterministic versions of the model. Readily available

commercial optimization solvers can, hence, be used towards this end (Ruszczynski

2003).
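To make the structure of P5 concrete, the following minimal sketch (in Python, with purely hypothetical data) builds and solves the extensive form of a tiny two-scenario problem with simple recourse directly, i.e., without any decomposition; the costs, demands and probabilities are illustrative assumptions only.

    # Extensive (deterministic equivalent) form of a tiny two-stage problem.
    #   minimize  x + sum_s p_s * q * y_s
    #   s.t.      x + y_s >= d_s  for each scenario s;   x, y_s >= 0
    # All data below are hypothetical and chosen only for illustration.
    from scipy.optimize import linprog

    probs, demands, q = [0.5, 0.5], [5.0, 15.0], 3.0   # probabilities, demands, recourse cost
    c = [1.0] + [p * q for p in probs]                 # variables ordered as (x, y_1, y_2)
    A_ub = [[-1.0, -1.0, 0.0],                         # -(x + y_1) <= -d_1
            [-1.0, 0.0, -1.0]]                         # -(x + y_2) <= -d_2
    b_ub = [-d for d in demands]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 3, method="highs")
    print(res.x)   # with this data the optimal first-stage decision is x = 15

As the number of scenarios grows, the second-stage variables and their constraints are replicated once per scenario, which is precisely what makes the expanded form large and motivates the decomposition methods discussed next.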

The characteristic structure of the problem in its expanded form (P5) makes it amenable

to a primal decomposition technique. Primal decomposition techniques work by solving

many subproblems (as many as the number of events) of type P4 and using information

from the solution of these subproblems to create approximations (cuts) of the recourse

costs $Q(\mathbf{x}, \boldsymbol{\xi}^s)$, $s = 1, \ldots, S$, and their expected value. These approximations are used in a

‘master problem’ that produces new values of the stage-I (first-stage) variables at each

iteration. The master and the subproblems are, thus, solved iteratively with new

approximations being appended to the master problem, until certain termination criteria

are met.

One such primal decomposition technique is the L-shaped method originally proposed

by Van Slyke and Wets (1969) for solving two-stage stochastic linear programs. This

method employs feasibility and optimality cuts that are included as part of the master

problem and each iteration comprises of solving the master (first-stage) problem and the

subproblem (second-stage problem) in an alternating fashion until termination criteria are

met. We now describe these cuts in an attempt to develop the algorithm gradually. For

the general two-stage stochastic linear program with fixed recourse as described by

problems P1 and P2 in section 2.5, if we let $Q(\mathbf{x}) = E[Q(\mathbf{x}, \boldsymbol{\xi}(\omega_s))]$, then P1 can be rewritten as

$$\begin{aligned}
\textbf{P6:} \quad \text{Minimize} \quad & \mathbf{c}^T\mathbf{x} + Q(\mathbf{x}) && (2.45)\\
\text{subject to} \quad & A\mathbf{x} = \mathbf{b}, \quad \mathbf{x} \geq \mathbf{0} && (2.46)
\end{aligned}$$

Recall that $K_1$ is the set of all those values of $\mathbf{x}$ or the first-stage variables that are feasible to the first-stage problem and $K_2$ is the set of all those values of $\mathbf{x}$ or the first-stage variables that result in a feasible second-stage problem.

We are interested in those values of the first-stage variables that lie in the intersection of $K_1$ and $K_2$. More specifically, we would like to:

$$\begin{aligned}
\textbf{P7:} \quad \text{Minimize} \quad & \mathbf{c}^T\mathbf{x} + Q(\mathbf{x}) && (2.47)\\
\text{subject to} \quad & \mathbf{x} \in K_1 \cap K_2 &&
\end{aligned}$$

P7 can be expressed equivalently as

$$\begin{aligned}
\textbf{P8:} \quad \text{Minimize} \quad & \mathbf{c}^T\mathbf{x} + \theta && \\
\text{subject to} \quad & \theta \geq Q(\mathbf{x}) && (2.48)\\
& \mathbf{x} \in K_1 \cap K_2 && (2.49)
\end{aligned}$$

Denoting the vector of dual variables corresponding to problem P2 by $\boldsymbol{\pi}$, the dual formulation for this problem is given by:

$$\begin{aligned}
\textbf{D2:} \quad \text{Maximize} \quad & \boldsymbol{\pi}^T(\mathbf{h}(\omega_s) - T(\omega_s)\mathbf{x}) && (2.50)\\
\text{subject to} \quad & \boldsymbol{\pi}^T W \leq \mathbf{q}(\omega_s)^T &&
\end{aligned}$$

Therefore, at the $v$th iteration of the algorithm, when dealing with the $s$th realization of the random variables, if an optimal solution is obtained, the following equation must hold (obtained by equating the primal and dual objective values at optimality):

$$Q(\mathbf{x}^v, \boldsymbol{\xi}^s) = (\boldsymbol{\pi}^v_s)^T(\mathbf{h}^s - T^s\mathbf{x}^v). \quad (2.51)$$

Also, from (2.50) and due to the convexity of the recourse function $Q$, the following relation must hold:

$$Q(\mathbf{x}, \boldsymbol{\xi}^s) \geq (\boldsymbol{\pi}^v_s)^T(\mathbf{h}^s - T^s\mathbf{x}). \quad (2.52)$$

Now, taking the expectation of equation (2.51), we have

$$Q(\mathbf{x}^v) = E[Q(\mathbf{x}^v, \boldsymbol{\xi}^s)] = E\big[(\boldsymbol{\pi}^v_s)^T(\mathbf{h}^s - T^s\mathbf{x}^v)\big] = \sum_{s=1}^{S} p_s(\boldsymbol{\pi}^v_s)^T(\mathbf{h}^s - T^s\mathbf{x}^v). \quad (2.53)$$

Once again, owing to convexity, we must have

$$Q(\mathbf{x}) \geq \sum_{s=1}^{S} p_s(\boldsymbol{\pi}^v_s)^T(\mathbf{h}^s - T^s\mathbf{x}). \quad (2.54)$$

Now, if we set $e = \sum_{s=1}^{S} p_s(\boldsymbol{\pi}^v_s)^T\mathbf{h}^s$ and $E = \sum_{s=1}^{S} p_s(\boldsymbol{\pi}^v_s)^T T^s$, then in conjunction with $\theta \geq Q(\mathbf{x})$ and (2.54) we obtain the optimality cut

$$E\mathbf{x} + \theta \geq e. \quad (2.55)$$

The optimality cut is a linear approximation of $Q(\mathbf{x})$ and, as seen from (2.54), it provides a lower bound or a linear support on $Q(\mathbf{x})$.

The feasibility cut on the other hand, is meant to ensure that a first-stage decision results

in a feasible second-stage problem. More precisely, a first-stage decision, say $\mathbf{x}^v$, is said to be second-stage feasible if there exists a vector $\mathbf{y}$ such that the constraints in the

second-stage problem are satisfied. To determine if a feasible solution to the recourse

problem exists, we solve a Phase-I problem for the second-stage problem. The Phase-I

problem involves adding artificial variables to the constraints of the second-stage

problem and minimizing the sum of these artificial variables in the objective function. If

the objective function value is positive, then it indicates that $\mathbf{x}^v$ does not permit a feasible

solution to the recourse problem. Mathematically, if we let

$$\begin{aligned}
\textbf{P9:} \quad z = \text{Minimize} \quad & \mathbf{e}^T\mathbf{v}^+ + \mathbf{e}^T\mathbf{v}^- && (2.56)\\
\text{subject to} \quad & W\mathbf{y} + I\mathbf{v}^+ - I\mathbf{v}^- = \mathbf{h} - T\mathbf{x}^v && (2.57)\\
& \mathbf{y} \geq \mathbf{0}, \quad \mathbf{v}^+ \geq \mathbf{0}, \quad \mathbf{v}^- \geq \mathbf{0} &&
\end{aligned}$$

then for the dual vector $\boldsymbol{\gamma}$, the dual formulation of P9 is given by

$$\begin{aligned}
\textbf{D9:} \quad Z = \text{Maximize} \quad & \boldsymbol{\gamma}^T(\mathbf{h} - T\mathbf{x}^v) && (2.58)\\
\text{subject to} \quad & \boldsymbol{\gamma}^T W \leq \mathbf{0}, \quad |\boldsymbol{\gamma}| \leq \mathbf{e} && (2.59)
\end{aligned}$$

From P9, it is obvious that $z \geq 0$. If $z = 0$, then a feasible second-stage solution is reached. If $z > 0$, then it implies that the sum of artificial variables in P9 is positive, which, in turn, implies that the second-stage problem given by P2 is infeasible for the given first-stage decision $\mathbf{x}^v$. For the case where $z > 0$, note that, by the duality theorem, we must also have $\boldsymbol{\gamma}^T(\mathbf{h} - T\mathbf{x}^v) > 0$. For the $v$th iteration of the algorithm and the $s$th realization of random variables, feasibility in the second stage can be enforced

through the following constraint:

$$(\boldsymbol{\gamma}^v)^T(\mathbf{h}^s - T^s\mathbf{x}) \leq 0. \quad (2.60)$$

If we let $d = (\boldsymbol{\gamma}^v)^T\mathbf{h}^s$ and $D = (\boldsymbol{\gamma}^v)^T T^s$, then the feasibility cut is given by

$$D\mathbf{x} \geq d. \quad (2.61)$$

The L-shaped algorithm of Van Slyke and Wets (1969) is presented as a flow-chart in

Figure 2.4.
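To complement the flow chart, the sketch below walks through the same single-cut logic for the toy two-scenario problem used earlier. It is only a minimal illustration under simplifying assumptions: the data are hypothetical, the recourse is simple (so no feasibility cuts arise), and the subproblem dual multipliers are written in closed form instead of being extracted from an LP solver.

    # Minimal single-cut L-shaped sketch on a toy problem with simple recourse.
    # First stage:   minimize x + theta  s.t. optimality cuts, x >= 0, theta >= 0
    # Second stage:  Q_s(x) = min { q*y : y >= d_s - x, y >= 0 }   (shortage cost q)
    from scipy.optimize import linprog

    probs, demands, q = [0.5, 0.5], [5.0, 15.0], 3.0   # hypothetical scenario data
    cuts = []                                          # each cut: theta >= e_v - E_v * x

    for _ in range(20):
        # Master problem over (x, theta); a cut theta >= e_v - E_v*x is passed to the
        # solver as -E_v*x - theta <= -e_v.
        A_ub = [[-E_v, -1.0] for (e_v, E_v) in cuts] or None
        b_ub = [-e_v for (e_v, E_v) in cuts] or None
        master = linprog(c=[1.0, 1.0], A_ub=A_ub, b_ub=b_ub,
                         bounds=[(0, None), (0, None)], method="highs")
        x, theta = master.x

        # Scenario subproblems: for this toy recourse the optimal dual of y >= d_s - x
        # is q when the constraint is binding (d_s > x) and 0 otherwise.
        pis = [q if d > x else 0.0 for d in demands]
        e_v = sum(p * pi * d for p, pi, d in zip(probs, pis, demands))
        E_v = sum(p * pi for p, pi in zip(probs, pis))

        if theta >= e_v - E_v * x - 1e-9:              # theta already supports Q(x): stop
            break
        cuts.append((e_v, E_v))                        # otherwise append the optimality cut

    print(x)   # terminates with x = 15 for this data, matching the extensive form

The lower bound theta >= 0 used here is valid only because the recourse costs in this toy example are nonnegative; in general the first iteration is run without theta or with a suitable lower bound.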

2.7 The Multicut Algorithm:

The multicut method (Birge and Louveaux 1988) is a variation of the single cut L-shaped

method (Van Slyke and Wets 1969) where multiple optimality cuts, each corresponding

to a scenario, are added to the stage-I (master) problem in each major iteration.

The L-shaped method solves an approximation of problem P6 (constraints 2.45 and 2.46) through an outer linearization of $Q(\mathbf{x})$. This outer linearization is carried out

through optimality cuts that are generated using dual information from scenario-based

recourse problems with finite objective values. A single optimality cut is generated at

each major iteration and this cut aggregates information from various scenario based

recourse problems. Such aggregation leads to a loss of information. The multicut method

on the other hand, creates an outer linearization of $Q(\mathbf{x}, \boldsymbol{\xi}(\omega_s))$ by generating an

optimality cut for each scenario based recourse problem. This increases the size of the

master problem compared to the single cut method. However, by doing so, this method

sends more information on the shape of $Q(\mathbf{x})$ back to the master problem at each major

iteration. While this property makes this method more effective, Birge and Louveaux

(1988) also provide examples where the single cut L-shaped method does better than the

multicut method. More specifically, they prove that, for a sequence of iterates $\mathbf{x}^v_m$, generated by the multicut method, and $\mathbf{x}^v_n$, generated by the single cut method, if $\mathbf{x}^v_m$ and $\mathbf{x}^v_n$ belong to the same cells of the decomposition of $Q(\mathbf{x})$, then the number

[Figure 2.4: Flow chart depicting the L-shaped method of Van Slyke and Wets (1969). The algorithm alternates between solving the master problem for $(\mathbf{x}^v, \theta^v)$, adding a feasibility cut $D_r\mathbf{x} \geq d_r$ whenever some scenario's Phase-I subproblem has $z_s > 0$, and adding an optimality cut $E\mathbf{x} + \theta \geq e$ computed from the optimal dual multipliers, stopping with $\mathbf{x}^v$ optimal once $\theta^v \geq w^v$.]

of major iterations required by the multicut method will be less than or equal to that

required by the single cut method. $\mathbf{x}^v_m$ and $\mathbf{x}^v_n$ are said to belong to the same cells of the decomposition of $Q(\mathbf{x})$ if $Q(\mathbf{x}, \boldsymbol{\xi}(\omega_s))$ is linear in $\boldsymbol{\xi}$ for each iterate $\mathbf{x}$. However, if it is

non-linear, the iterates are believed to generally diverge. Numerical studies show that the

multicut method is preferable when the number of realizations of random variables is not

significantly larger than the number of first-stage constraints. However, the work of

Smith, Schaefer and Yen (2004) reveals that when the first-stage is an integer program,

the single cut L-shaped method performs better than the multicut even when the number

of scenarios is far fewer than the number of first-stage constraints. Adding further

credence to this hypothesis is another study by Schaefer and Schaefer (2004), who show

that the single cut L-shaped method performs better than the multicut method for a

power generating unit location problem with first-stage integer variables and second-

stage continuous variables, where the number of realizations is far smaller than the

number of first-stage constraints.
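To make the distinction concrete, a multicut version of the earlier single-cut sketch keeps one approximation variable per scenario and appends one cut per scenario at each major iteration; the data are again hypothetical, and on this toy instance the loop happens to stop after fewer master iterations than the single-cut version, consistent with the behaviour described above.

    # Multicut variant of the toy L-shaped sketch: one theta_s per scenario.
    from scipy.optimize import linprog

    probs, demands, q = [0.5, 0.5], [5.0, 15.0], 3.0   # hypothetical scenario data
    S = len(probs)
    cuts = [[] for _ in range(S)]                      # cuts[s] holds duals pi: theta_s >= pi*(d_s - x)

    for _ in range(20):
        # Master over (x, theta_1, ..., theta_S) with objective x + sum_s p_s * theta_s.
        c = [1.0] + probs
        A_ub, b_ub = [], []
        for s in range(S):
            for pi in cuts[s]:
                row = [-pi] + [0.0] * S                # -pi*x - theta_s <= -pi*d_s
                row[1 + s] = -1.0
                A_ub.append(row)
                b_ub.append(-pi * demands[s])
        master = linprog(c, A_ub=A_ub or None, b_ub=b_ub or None,
                         bounds=[(0, None)] * (1 + S), method="highs")
        x, thetas = master.x[0], master.x[1:]

        # Scenario subproblems (closed-form duals, as in the single-cut sketch),
        # adding a separate optimality cut for every scenario not yet supported.
        done = True
        for s, d in enumerate(demands):
            pi = q if d > x else 0.0
            if thetas[s] < pi * (d - x) - 1e-9:
                cuts[s].append(pi)
                done = False
        if done:
            break

    print(x)   # again terminates with x = 15 for this illustrative data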

Chapter 3

THE STOCHASTIC MODEL FOR A NETWORK OF

MEMS JOB SHOPS (NMJS)

3.1 Background:

Unlike the semiconductor industry, where processes over the years have converged to a few standard technologies such as CMOS, BICMOS or Bipolar, the MEMS industry is still struggling to grasp the manufacturing problems associated with fabricating these

intricate devices. Since a large percentage of the MEMS devices manufactured today tend

to be highly application specific, each device design calls for a unique processing

sequence, which is why there has been little to no standardization of technologies in this

field yet. The low volume and sophisticated processing for these custom-built devices

necessitate the use of expensive and specialized manufacturing equipment, which is

often available only in certain geographically remote facilities. The devices may, therefore,

be required to travel to various fabs that house the needed equipment, until all of their

processing requirements are met. Hence, at any given fab, depending on its

manufacturing expertise, a subset of the total number of operations required for a device

may be performed. The number of operations in this subset may, however, vary

significantly from fab to fab for the same device and will also vary from one device to

another within the same fab. Since the machinery in these fabs is expected to process a

broad spectrum of devices typically in low volumes, the use of dedicated machinery for

individual parameter settings of a specific process is not economically justified. Hence,

we assume that the processing time durations for the operations to be executed on the

machines are not deterministic. To further compound the situation, job arrivals are

dynamic and their arrival times may not be known in advance. Also, the composition of

these jobs or in other words, the number and type of component processing steps

required for each individual job, often become known only after the arrival of the job.

In this chapter, we present a stochastic programming based approach for the NMJS

problem by taking into consideration a real-life, dynamic, stochastic environment in the

presence of transfer times, transfer costs, and sequence-dependent set-ups. In

comparison with other techniques such as Markovian decision processes or discrete

event simulation, a stochastic programming approach has important benefits in that

models and approaches in this domain have been well-studied over the past few decades.

Therefore, a general purpose modeling framework with the ability to capture real-world

features through constraints on states and decision variables is readily possible. Besides,

the tools of convex analysis and duality theory can be applied to yield insights and

decompose large problems into tractable chunks that lead to the attainment of near-

optimal solutions. Advances in computational speed over the years have only served to

make this approach all the more attractive for the NMJS problem.

In what follows, we present the stochastic NMJS problem, modeled as a two-stage

stochastic program with recourse, where the first-stage variables are binary and the

second-stage variables are continuous, assuming uncertainty in processing time durations

only. Stochasticity in processing times is a major cause of concern since this parameter

directly impacts lead time or cycle time. The key decision variables are binary and pertain

to the assignment of jobs to machines and their sequencing for processing on the

machines. The assignment variables essentially fix the route of a job as it travels through

the network because these variables decide the machine on which each job-operation

must be performed out of several candidate machines. Once the assignment is decided

upon, sequencing of job-operations on each machine follows. Since job-operations that

are processed on machines in the network entail both time and cost, associated with travel between the various manufacturing sites as well as with processing within each site, the assignment and sequencing of jobs must be such that the dual objectives of minimizing their completion times while staying within a pre-specified budget are achieved. Moreover, the assignment and sequencing must be such that they offer the best

solution (in terms of the objective) possible in light of all the processing time scenarios

that can be realized.

We present two approaches for solving the stochastic NMJS problem. The first approach

is based on the L-shaped method (Van Slyke and Wets, 1969). Since the NMJS problem

lacks relatively complete recourse, in the later sections, we develop constraints that

induce second-stage feasibility. In the second approach we use a scheme that relies on

the solution of the LP relaxation of the deterministic equivalent of the stochastic

program.

Even though, above, we have identified each fab separately, we can incorporate the

fabs in our formulation by assigning to every machine of the fabs a unique identification

number that distinguishes it from the other machines and by appropriately considering

the intermachine transportation time and costs. To illustrate, the transportation time and

costs between machines belonging to the same fab would either be zero or a relatively

small number depending on whether intra-fab transportation times and costs are

significant. For all other cases (i.e. those involving machines from different fabs), we

assume that a non-trivial cost and time will be incurred when the wafers are being

transported. Therefore, in our formulation, we do not consider the fab index explicitly.

3.2 The two-stage stochastic program with recourse for the NMJS problem:

Notation

Indices:
$i$ = job index, $i = 1, \ldots, N$
$j$ = operation index, $j = 1, \ldots, J_i$
$m$ = machine index, $m = 1, \ldots, |M|$
$s$ = scenario index, $s = 1, \ldots, S$

Parameters:
$H$ = a large positive number
$w_i$ = number of wafers in job $i$
$M$ = set of all machines
$\phi_s$ = probability of scenario $s$
$c^m_{(i,j)}$ = cost of performing operation $j$ of job $i$ on machine $m$
$p^{m,s}_{(i,j)}$ = processing time of operation $j$ of job $i$ on machine $m$ under scenario $s$
$z_m$ = set of job-operations that can be processed on machine $m$
$M_{(i,j)}$ = set of machines capable of processing operation $j$ of job $i$
$u^m_{(i,j,k,l)}$ = set-up time between operation $j$ of job $i$ and operation $l$ of job $k$ on machine $m$
$b_i$ = budget for job $i$
$r_i$ = ready time for job $i$
$d_{(e,f)}$ = travel time between machines $e$ and $f$
$dc_{(e,f)}$ = travel cost between machines $e$ and $f$
$\alpha_i$ = penalty term for job $i$ corresponding to its completion time
$\beta_i$ = penalty term for job $i$ corresponding to its budget excess

Decision Variables:
$x^m_{(i,j)}$ = 1 if operation $j$ of job $i$ is assigned to machine $m$, and 0 otherwise
$y^m_{(i,j,k,l)}$ = 1 if operation $j$ of job $i$ directly precedes operation $l$ of job $k$ on machine $m$, and 0 otherwise
$v^{(e,f)}_{(i,j,j+1)}$ = 1 if operation $j$ of job $i$ is performed on machine $e$ and operation $j+1$ of job $i$ is performed on machine $f$, and 0 otherwise
$t^s_{(i,j)}$ = completion time of operation $j$ of job $i$ under scenario $s$
$\Delta^s_i$ = budget surplus for job $i$ under scenario $s$

Formulation NMJSP:

$$\begin{aligned}
\text{Minimize } z = {} & \sum_{i=1}^{N}\sum_{s=1}^{S} \phi_s\, \alpha_i\, t^s_{(i,J_i)} + \sum_{i=1}^{N}\sum_{s=1}^{S} \phi_s\, \beta_i\, \Delta^s_i + \sum_{s=1}^{S}\sum_{i=1}^{N}\sum_{j=1}^{J_i}\sum_{m \in M_{(i,j)}} \phi_s\, p^{m,s}_{(i,j)}\, x^m_{(i,j)} \\
& + \sum_{m \in M}\sum_{(i,j) \in z_m}\sum_{\substack{(k,l) \in z_m \\ (k,l) \neq (i,j)}} u^m_{(i,j,k,l)}\, y^m_{(i,j,k,l)} + \sum_{i=1}^{N}\sum_{j=0}^{J_i-1}\sum_{e \in M_{(i,j)}}\sum_{f \in M_{(i,j+1)}} d_{(e,f)}\, v^{(e,f)}_{(i,j,j+1)} \qquad (3.1)
\end{aligned}$$


Subject to:

\[
t^{s}_{(i,j)} + \sum_{m \in M_{(i,j+1)}} p^{(m,s)}_{(i,j+1)}\, x^{m}_{(i,j+1)} + \sum_{e \in M_{(i,j)}} \sum_{f \in M_{(i,j+1)}} d_{(e,f)}\, v^{(e,f)}_{(i,j,j+1)} \;\le\; t^{s}_{(i,j+1)}, \quad \forall\, i = 1,\dots,N,\; j = 0,\dots,J_i-1,\; s = 1,\dots,S. \tag{3.2}
\]

\[
t^{s}_{(i,j)} + p^{(m,s)}_{(k,l)}\, x^{m}_{(k,l)} + u^{m}_{(i,j,k,l)} \;\le\; t^{s}_{(k,l)} + H\bigl(1 - y^{m}_{(i,j,k,l)}\bigr), \quad \forall\, (i,j), (k,l) \in z_m,\; (i,j) \neq (k,l),\; s = 1,\dots,S,\; m \in M. \tag{3.3}
\]

\[
\sum_{m \in M_{(i,j)}} x^{m}_{(i,j)} = 1, \quad \forall\, i = 1,\dots,N,\; j = 1,\dots,J_i. \tag{3.4}
\]

\[
\sum_{\substack{(i,j) \in z_m \\ (i,j) \neq (k,l)}} y^{m}_{(i,j,k,l)} \;\le\; x^{m}_{(k,l)}, \quad \forall\, k = 1,\dots,N,\; l = 1,\dots,J_k,\; m \in M_{(k,l)}. \tag{3.5}
\]

\[
\sum_{\substack{(i,j) \in z_m \\ (i,j) \neq (k,l)}} y^{m}_{(k,l,i,j)} \;\le\; x^{m}_{(k,l)}, \quad \forall\, k = 1,\dots,N,\; l = 1,\dots,J_k,\; m \in M_{(k,l)}. \tag{3.6}
\]

\[
\sum_{\substack{(i,j) \in z_m}} \sum_{\substack{(k,l) \in z_m \\ (k,l) \neq (i,j)}} y^{m}_{(i,j,k,l)} \;\ge\; \sum_{(i,j) \in z_m} x^{m}_{(i,j)} - 1, \quad \forall\, m \in M. \tag{3.7}
\]

\[
\sum_{j=1}^{J_i} \sum_{m \in M_{(i,j)}} c^{m}_{(i,j)}\, p^{(m,s)}_{(i,j)}\, x^{m}_{(i,j)} + \sum_{j=1}^{J_i-1} \sum_{e \in M_{(i,j)}} \sum_{f \in M_{(i,j+1)}} dc_{(e,f)}\, v^{(e,f)}_{(i,j,j+1)}\, w_i - b_i \;\le\; \Delta^{s}_{i}, \quad \forall\, i = 1,\dots,N,\; s = 1,\dots,S. \tag{3.8}
\]

\[
v^{(e,f)}_{(i,j,j+1)} \le x^{e}_{(i,j)}, \qquad v^{(e,f)}_{(i,j,j+1)} \le x^{f}_{(i,j+1)}, \qquad v^{(e,f)}_{(i,j,j+1)} \ge x^{e}_{(i,j)} + x^{f}_{(i,j+1)} - 1, \quad \forall\, i = 1,\dots,N,\; j = 1,\dots,J_i-1,\; e \in M_{(i,j)},\; f \in M_{(i,j+1)}. \tag{3.9}
\]

\[
t^{s}_{(i,0)} \ge r_i, \quad \forall\, i = 1,\dots,N,\; s = 1,\dots,S. \tag{3.10}
\]

\[
t^{s}_{(i,j)} \ge 0, \quad \forall\, i = 1,\dots,N,\; j = 1,\dots,J_i,\; s = 1,\dots,S. \tag{3.11}
\]

\[
\Delta^{s}_{i} \ge 0, \quad \forall\, i = 1,\dots,N,\; s = 1,\dots,S. \tag{3.12}
\]


\[
x^{m}_{(i,j)} \in \{0,1\}, \quad \forall\, i = 1,\dots,N,\; j = 1,\dots,J_i,\; m \in M_{(i,j)}. \tag{3.13}
\]

\[
y^{m}_{(i,j,k,l)} \in \{0,1\}, \quad \forall\, m = 1,\dots,|M|,\; (i,j) \in z_m,\; (k,l) \in z_m,\; (i,j) \neq (k,l). \tag{3.14}
\]

\[
v^{(e,f)}_{(i,j,j+1)} \in \{0,1\}, \quad \forall\, i = 1,\dots,N,\; j = 1,\dots,J_i-1,\; e \in M_{(i,j)},\; f \in M_{(i,j+1)}. \tag{3.15}
\]

The objective function (Expression 3.1) is composed of five terms. The first and second

terms minimize the sum of job completion times and budget surpluses. Penalty

coefficients in the objective function model customer emphasis on lead time and cost

incurred. The third term minimizes the sum of expected processing times for all

operations of all jobs, the fourth term minimizes the total set up time on the machines

whereas the fifth term minimizes the sum of travel times incurred by all the jobs. Note

that terms three, four and five in the objective function support the first term by promoting lower completion times; while they may appear redundant, they are necessary. This is

because when the NMJSP is decomposed into the first and second-stage problems (see

Section 3.3), terms one and two of the objective function, owing to the scenario-dependent variables ($t^{s}_{(i,j)}$ and $\Delta^{s}_{i}$), automatically form the objective of the Stage-II problem, which would leave the Stage-I problem devoid of an objective function if terms

three, four and five were absent. In the absence of an objective function, the Stage-I

problem may generate random feasible solutions which could ultimately result in longer

convergence times because a large number of optimality cuts may be required to force

the master problem to converge to the optimum. It is, therefore, important to choose an

appropriate Stage-I objective function. The terms (three, four and five in Expression 3.1)

chosen here, determine an assignment of values to the variables in the Stage-I problem

such that they support the goal of the second-stage problem. Note that all the terms in

Expression (3.1), except for the second, are measured in the same units of time. The

second term, however, is measured in terms of dollars. In order to ensure consistent

units, we assume henceforth (unless specified otherwise) that a unit of time costs a dollar.

Constraints (3.2) capture precedence relationships between operations of the same job. Specifically, they state that the completion time of operation $j+1$ of job $i$ must be at least


equal to the completion time of operation $j$ of job $i$ plus the processing time of operation $j+1$ plus any travel time incurred between the two operations.

Constraints (3.3) express the relationship between operations to be performed on the same machine. Given two job-operations, say $(i,j)$ and $(k,l)$, capable of being performed on a certain machine $m$, if $(i,j)$ were to directly precede $(k,l)$ (i.e., $y^{m}_{(i,j,k,l)} = 1$), then the completion time of $(k,l)$ must be at least equal to the completion time of $(i,j)$ plus the processing time of $(k,l)$ plus a sequence-dependent set-up time between the two operations. Observe that, when $y^{m}_{(i,j,k,l)} = 0$, that is, $(i,j)$ does not directly precede $(k,l)$, the constraint becomes redundant in that it only requires the left side of the constraint to be less than a large positive number.
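As an illustration of how such constraints can be enumerated, the minimal sketch below (hypothetical data; the inequalities are merely printed as strings rather than passed to a solver) generates the big-M sequencing constraints of type (3.3) for a single machine and a single scenario.

```python
# Minimal sketch: enumerate the big-M sequencing constraints (3.3) for one machine
# and one scenario as readable strings (no solver involved). Data are hypothetical.

H = 10_000                      # big-M constant
z_m = [(1, 1), (2, 2), (3, 1)]  # job-operations machine m can process
p = {(1, 1): 5, (2, 2): 7, (3, 1): 4}                 # processing times under scenario s
u = {(a, b): 2 for a in z_m for b in z_m if a != b}   # set-up times (flat, for brevity)

constraints = []
for (i, j) in z_m:
    for (k, l) in z_m:
        if (i, j) == (k, l):
            continue
        # t[i,j] + p[k,l]*x[m,(k,l)] + u[(i,j),(k,l)] <= t[k,l] + H*(1 - y[m,(i,j),(k,l)])
        constraints.append(
            f"t[{i},{j}] + {p[(k, l)]}*x[m,({k},{l})] + {u[((i, j), (k, l))]}"
            f" <= t[{k},{l}] + {H}*(1 - y[m,({i},{j}),({k},{l})])"
        )

for c in constraints:
    print(c)
```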

Constraints (3.4) ensure that each job-operation is assigned to exactly one machine out of

the several candidates available for processing it.

Constraints (3.5) and (3.6) state that if a job-operation, say $(i,j)$, is assigned to a machine $m$, it can be preceded (succeeded) by at most one job-operation from the set of operations that the machine is capable of processing. Note that if $(i,j)$ is the first operation to be processed on this machine, it will not be preceded by any other operation, and likewise, if $(i,j)$ is the last operation to be processed, it will not be succeeded by any other operation. In both of these cases, the left side of Expressions (3.5) and (3.6) will sum to zero despite the fact that $(i,j)$ is assigned to machine $m$ (i.e., the right side is equal to 1).

Constraints (3.5) and (3.6) also state that if $(i,j)$ is not assigned to machine $m$, then all direct precedence variables that relate $(i,j)$ with other operations on machine $m$ must be equal to zero. Clearly, if an operation is not assigned to a machine, it cannot directly precede or succeed any other operation that the machine is capable of processing. This relationship is important to model because constraints (3.3) are enforced for every


machine and every combination of operations that the machine is capable of processing. While a machine may be capable of processing many operations (as defined by the set $z_m$), it is not necessary that all of the operations in this set are assigned to it for processing. For those operations in this set not assigned to a machine, constraints (3.5) and (3.6) ensure that the direct precedence variables (the $y$ vector) linking these operations with other operations on the machine are set equal to zero. This has the effect of making constraints (3.3) redundant whenever they are enforced for a pair of operations of which at most one is assigned to the machine.
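The sketch below (again with hypothetical data and plain strings in place of solver calls) enumerates, for a single machine, the linking constraints (3.5)-(3.7) that tie the direct precedence variables $y$ to the assignment variables $x$.

```python
# Minimal sketch (hypothetical data): for a single machine m, emit the linking
# constraints (3.5)-(3.7) that tie the direct-precedence variables y to the
# assignment variables x, as readable strings.

z_m = [(1, 1), (2, 2), (3, 3)]   # job-operations machine m can process

rows = []
for (k, l) in z_m:
    preds = [f"y[({i},{j}),({k},{l})]" for (i, j) in z_m if (i, j) != (k, l)]
    succs = [f"y[({k},{l}),({i},{j})]" for (i, j) in z_m if (i, j) != (k, l)]
    rows.append(" + ".join(preds) + f" <= x[({k},{l})]")   # (3.5): at most one predecessor
    rows.append(" + ".join(succs) + f" <= x[({k},{l})]")   # (3.6): at most one successor

all_y = [f"y[({i},{j}),({k},{l})]" for (i, j) in z_m for (k, l) in z_m if (i, j) != (k, l)]
all_x = [f"x[({i},{j})]" for (i, j) in z_m]
rows.append(" + ".join(all_y) + " >= " + " + ".join(all_x) + " - 1")   # (3.7)

print("\n".join(rows))
```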

Note that the following relation is implicitly enforced by constraints (3.5) and (3.6):

If $y^{m}_{(i,j,k,l)} = 1$, then $x^{m}_{(i,j)} = 1$ and $x^{m}_{(k,l)} = 1$.

This relation states that if $(i,j)$ directly precedes $(k,l)$ on machine $m$, then both $(i,j)$ and $(k,l)$ must be assigned to machine $m$.

To illustrate this, let $(i,j) = (2,2)$, $(k,l) = (3,3)$, $M_{(2,2)} = \{M1, M2\}$ and $M_{(3,3)} = \{M2, M3\}$. Also, let $y^{M2}_{(2,2,3,3)} = 1$; in other words, (2,2) directly precedes (3,3) on machine M2. To see whether this implies that $x^{M2}_{(2,2)} = 1$ and $x^{M2}_{(3,3)} = 1$, note that constraint (3.5) for $k = 3$, $l = 3$ and $m = M2$ reads as

\[
\sum_{\substack{(i,j) \in z_{M2} \\ (i,j) \neq (3,3)}} y^{M2}_{(i,j,3,3)} \;\le\; x^{M2}_{(3,3)}.
\]

Also, $z_{M2} = \{(2,2), (3,3)\}$. Therefore, $y^{M2}_{(2,2,3,3)} \le x^{M2}_{(3,3)}$, and since $y^{M2}_{(2,2,3,3)} = 1$, it must be that $x^{M2}_{(3,3)} = 1$.

Likewise, constraint (3.6) for $k = 2$, $l = 2$ and $m = M2$ reads as


\[
\sum_{\substack{(i,j) \in z_{M2} \\ (i,j) \neq (2,2)}} y^{M2}_{(2,2,i,j)} \;\le\; x^{M2}_{(2,2)}.
\]

Once again, with $z_{M2} = \{(2,2), (3,3)\}$, it must be that $y^{M2}_{(2,2,3,3)} \le x^{M2}_{(2,2)}$, and since $y^{M2}_{(2,2,3,3)} = 1$, $x^{M2}_{(2,2)} = 1$ must hold.

Constraints (3.7) guarantee that if a machine has $n$ operations assigned to it for processing, then there must exist $n-1$ direct precedence variables that take a value of one. This constraint is specified as an inequality instead of an equality to account for the case where the number of operations assigned to a machine is actually zero. In this case, the right side of the constraint equals $-1$, a value that the nonnegative summation of the binary $y$ variables on the left side can never attain.

Constraints (3.8) enforce budget restrictions on each job $i$ under every processing time scenario $s$. This constraint permits the sum of processing costs and travel costs for all operations of a job to exceed the budget by an amount equal to $\Delta^{s}_{i}$, but with a corresponding penalty in the objective function.

Constraint set (3.9) depicts the relationship between the tracking variables $v$ and the assignment variables $x$. Specifically, it ensures that whenever two successive operations of a job are assigned to different machines, a tracking variable records this information about the job and its operations, as well as the machines they have been assigned to, by setting itself equal to 1. Note that $v^{(e,f)}_{(i,j,j+1)} = 1$ if and only if both $x^{e}_{(i,j)} = 1$ and $x^{f}_{(i,j+1)} = 1$; if either $x^{e}_{(i,j)} = 0$ or $x^{f}_{(i,j+1)} = 0$, then $v^{(e,f)}_{(i,j,j+1)} = 0$. The $v$ variables are essential in our formulation because they assist in accounting for the time required to travel between machines in constraint (3.2).
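The relationship enforced by constraints (3.9) is the standard linearization of a logical AND of two binary variables. A minimal sketch of how these inequalities might be generated for one pair of consecutive operations is given below; the machine sets and indices are hypothetical.

```python
# Minimal sketch: the standard linearization of v = (x_e AND x_f) used by
# constraints (3.9), written as a tiny generator of constraint strings for one
# job i with operations j and j+1 and hypothetical machine sets.

def tracking_constraints(i, j, M_ij, M_ij1):
    """Yield the three (3.9)-type inequalities for every machine pair (e, f)."""
    for e in M_ij:
        for f in M_ij1:
            v = f"v[({e},{f}),({i},{j},{j+1})]"
            xe, xf = f"x[{e},({i},{j})]", f"x[{f},({i},{j+1})]"
            yield f"{v} <= {xe}"
            yield f"{v} <= {xf}"
            yield f"{v} >= {xe} + {xf} - 1"

for row in tracking_constraints(i=2, j=1, M_ij=[1, 3], M_ij1=[3, 4]):
    print(row)
```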


Constraints (3.10) set the completion time of the zeroth operation of every job to be at least equal to the ready time of the job. The zeroth operation is a dummy operation solely intended to model job arrival times. Observe that constraints (3.2), which establish precedence relationships between operations of the same job, require information on the completion time of the zeroth operation of every job. This information is supplied as the job ready-time parameter.

Constraints (3.11) – (3.15) place non-negativity and binary restrictions on the decision

variables. Next, we present two approaches for the solution of the NMJS problem.

Approach 1 is based on the adaptation of the L-shaped method to the NMJS problem

and Approach 2 relies on the deterministic equivalent of the NMJS problem. These are

presented in Sections 3.3 and 3.4 respectively.

3.3 Approach 1: The L-shaped method for the NMJS problem

Formulation NMJSP can be split into the following Stage-I (master) and Stage-II

(recourse) problems.

Stage 1: Master Problem

MP:
\[
\text{Minimize } \sum_{s=1}^{S}\phi_s \sum_{i=1}^{N}\sum_{j=1}^{J_i}\sum_{m \in M_{(i,j)}} p^{(m,s)}_{(i,j)}\, x^{m}_{(i,j)} + \sum_{m=1}^{|M|} \sum_{\substack{(i,j) \in z_m}} \sum_{\substack{(k,l) \in z_m \\ (k,l) \neq (i,j)}} u^{m}_{(i,j,k,l)}\, y^{m}_{(i,j,k,l)} + \sum_{i=1}^{N}\sum_{j=0}^{J_i-1} \sum_{e \in M_{(i,j)}} \sum_{f \in M_{(i,j+1)}} d_{(e,f)}\, v^{(e,f)}_{(i,j,j+1)} + \sum_{s=1}^{S} \phi_s\, Q(\mathbf{x}, \mathbf{y}, \mathbf{v}, \xi_s) \tag{3.16}
\]

Subject to:

\[
\sum_{m \in M_{(i,j)}} x^{m}_{(i,j)} = 1, \quad \forall\, i = 1,\dots,N,\; j = 1,\dots,J_i. \tag{3.17}
\]

\[
\sum_{\substack{(i,j) \in z_m \\ (i,j) \neq (k,l)}} y^{m}_{(i,j,k,l)} \;\le\; x^{m}_{(k,l)}, \quad \forall\, k = 1,\dots,N,\; l = 1,\dots,J_k,\; m \in M_{(k,l)}. \tag{3.18}
\]

\[
\sum_{\substack{(i,j) \in z_m \\ (i,j) \neq (k,l)}} y^{m}_{(k,l,i,j)} \;\le\; x^{m}_{(k,l)}, \quad \forall\, k = 1,\dots,N,\; l = 1,\dots,J_k,\; m \in M_{(k,l)}. \tag{3.19}
\]


\[
\sum_{\substack{(i,j) \in z_m}} \sum_{\substack{(k,l) \in z_m \\ (k,l) \neq (i,j)}} y^{m}_{(i,j,k,l)} \;\ge\; \sum_{(i,j) \in z_m} x^{m}_{(i,j)} - 1, \quad \forall\, m \in M. \tag{3.20}
\]

\[
v^{(e,f)}_{(i,j,j+1)} \le x^{e}_{(i,j)}, \qquad v^{(e,f)}_{(i,j,j+1)} \le x^{f}_{(i,j+1)}, \qquad v^{(e,f)}_{(i,j,j+1)} \ge x^{e}_{(i,j)} + x^{f}_{(i,j+1)} - 1, \quad \forall\, i = 1,\dots,N,\; j = 1,\dots,J_i-1,\; e \in M_{(i,j)},\; f \in M_{(i,j+1)}. \tag{3.21}
\]

\[
x^{m}_{(i,j)} \in \{0,1\}, \quad \forall\, i = 1,\dots,N,\; j = 1,\dots,J_i,\; m \in M_{(i,j)}. \tag{3.22}
\]

\[
y^{m}_{(i,j,k,l)} \in \{0,1\}, \quad \forall\, m = 1,\dots,|M|,\; (i,j) \in z_m,\; (k,l) \in z_m,\; (i,j) \neq (k,l). \tag{3.23}
\]

\[
v^{(e,f)}_{(i,j,j+1)} \in \{0,1\}, \quad \forall\, i = 1,\dots,N,\; j = 1,\dots,J_i-1,\; e \in M_{(i,j)},\; f \in M_{(i,j+1)}. \tag{3.24}
\]

where $Q(\mathbf{x}, \mathbf{y}, \mathbf{v}, \xi_s)$ is the recourse function corresponding to the optimal value of the subproblem that minimizes the sum of job completion times and budget surpluses for a given assignment vector $\mathbf{x}$, sequencing vector $\mathbf{y}$, tracking vector $\mathbf{v}$, and a processing time scenario $\xi_s$. The linear subproblem for scenario $s$ is given by:

Stage II: Recourse Problem

RP:
\[
Q(\mathbf{x}, \mathbf{y}, \mathbf{v}, \xi_s) = \text{Min } \sum_{i=1}^{N} \alpha_i\, t^{s}_{(i,J_i)} + \sum_{i=1}^{N} \beta_i\, \Delta^{s}_{i} \tag{3.25}
\]

Subject to:

\[
t^{s}_{(i,j)} + \sum_{m \in M_{(i,j+1)}} p^{(m,s)}_{(i,j+1)}\, x^{m}_{(i,j+1)} + \sum_{e \in M_{(i,j)}} \sum_{f \in M_{(i,j+1)}} d_{(e,f)}\, v^{(e,f)}_{(i,j,j+1)} \;\le\; t^{s}_{(i,j+1)}, \quad \forall\, i = 1,\dots,N,\; j = 0,\dots,J_i-1,\; s = 1,\dots,S. \tag{3.26}
\]

\[
t^{s}_{(i,j)} + p^{(m,s)}_{(k,l)}\, x^{m}_{(k,l)} + u^{m}_{(i,j,k,l)} \;\le\; t^{s}_{(k,l)} + H\bigl(1 - y^{m}_{(i,j,k,l)}\bigr), \quad \forall\, (i,j), (k,l) \in z_m,\; (i,j) \neq (k,l),\; s = 1,\dots,S,\; m \in M. \tag{3.27}
\]

\[
\sum_{j=1}^{J_i} \sum_{m \in M_{(i,j)}} c^{m}_{(i,j)}\, p^{(m,s)}_{(i,j)}\, x^{m}_{(i,j)} + \sum_{j=1}^{J_i-1} \sum_{e \in M_{(i,j)}} \sum_{f \in M_{(i,j+1)}} dc_{(e,f)}\, v^{(e,f)}_{(i,j,j+1)}\, w_i - b_i \;\le\; \Delta^{s}_{i}, \quad \forall\, i = 1,\dots,N,\; s = 1,\dots,S. \tag{3.28}
\]


\[
t^{s}_{(i,0)} \ge r_i, \quad \forall\, i = 1,\dots,N,\; s = 1,\dots,S. \tag{3.29}
\]

\[
t^{s}_{(i,j)} \ge 0, \quad \forall\, i = 1,\dots,N,\; j = 1,\dots,J_i,\; s = 1,\dots,S. \tag{3.30}
\]

\[
\Delta^{s}_{i} \ge 0, \quad \forall\, i = 1,\dots,N,\; s = 1,\dots,S. \tag{3.31}
\]

We note that, in the decomposition outlined above for formulation NMJSP, the master

problem (Expressions (3.16)-(3.24)) could generate an assignment and sequencing

solution that might not be feasible to the subproblem (Expressions (3.25)-(3.31)). There

are two reasons for this. Firstly, note that constraints (3.2) in formulation NMJSP, upon

decomposition, form a part of the subproblem (Expressions (3.25)-(3.31)). These

constraints capture the fact that the completion time of a lower indexed operation of a job must be less than or equal to that of any higher indexed operation of the same job. In

formulation NMJSP, these constraints in conjunction with other constraints that

determine the value of the y variables (Expressions (3.3), (3.5), (3.6) and (3.7)) ensure

that, in the case of reentrant flow where a job visits a machine for multiple operations,

the lower indexed operations of a job are sequenced before a higher indexed operation of

the same job. But, since constraints (3.2) no longer form a part of the master problem, their absence may result in an assignment and sequencing vector that does not honor the

reentrant flow conditions.

In addition to violating the reentrant flow conditions, the assignment and sequencing

vector from the master problem may create yet another infeasibility, called a deadlock.

This occurs in the face of a certain configuration of assignment and sequencing decisions

that results in a circuit or a cycle wherein each operation in the cycle waits for another

operation within the cycle to complete processing. Under these circumstances, none of

the operations in the cycle are in a position to begin processing and this ultimately causes

a ‘deadlock’. Shown below in Figure 3.1 is a deadlock situation for the case of two

machines involving four operations.


Figure 3.1: A deadlock configuration involving two machines

Note that on M1, (4,2) must wait for (2,2) to finish processing owing to sequencing

restrictions. Operation (2,2) on M1 depends on (2,1) on M2 owing to operation

precedence constraints. On M2, (2,1) depends on (4,3) to complete processing due to

sequencing restrictions. Finally, (4,3) on M2 depends on (4,2) on M1 owing to operation

precedence constraints. Thus, none of the four operations can begin processing resulting

in a deadlock. Once again, this occurs due to the absence of the crucial operation

precedence constraints (Expression 3.2) in the master problem.
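A deadlock of this kind can be detected mechanically by building a precedence digraph from a candidate master solution (arcs for operation precedence within a job and arcs for the prescribed sequencing on each machine) and testing it for a directed cycle. The following minimal sketch, which reproduces the configuration of Figure 3.1 with a simple depth-first search, is offered only to illustrate the idea; it is not part of the solution procedure developed here.

```python
# Minimal sketch: detect a deadlock in a candidate assignment/sequencing solution
# by building a precedence digraph and testing for a directed cycle (DFS with
# colors). The example reproduces the two-machine deadlock of Figure 3.1.

def has_cycle(nodes, arcs):
    """Return True if the directed graph (nodes, arcs) contains a cycle."""
    succ = {n: [] for n in nodes}
    for a, b in arcs:
        succ[a].append(b)
    WHITE, GREY, BLACK = 0, 1, 2
    color = {n: WHITE for n in nodes}

    def dfs(n):
        color[n] = GREY
        for m in succ[n]:
            if color[m] == GREY or (color[m] == WHITE and dfs(m)):
                return True
        color[n] = BLACK
        return False

    return any(color[n] == WHITE and dfs(n) for n in nodes)

ops = [(2, 1), (2, 2), (4, 2), (4, 3)]
arcs = [
    ((2, 1), (2, 2)),  # job precedence: (2,1) before (2,2)
    ((4, 2), (4, 3)),  # job precedence: (4,2) before (4,3)
    ((2, 2), (4, 2)),  # machine M1 sequencing: (2,2) directly precedes (4,2)
    ((4, 3), (2, 1)),  # machine M2 sequencing: (4,3) directly precedes (2,1)
]
print(has_cycle(ops, arcs))   # True -> the schedule is deadlocked
```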

Thus, when formulation NMJSP is decomposed into the first and second-stage problems

as given by Expressions (3.16)-(3.24) and (3.25)-(3.31), respectively, we see that the first-

stage solution can be infeasible to the second-stage problem. The NMJS problem,

therefore, does not possess the property of relatively complete recourse wherein any

feasible first-stage solution is always feasible to the second-stage problem. In order to

render the first-stage solution feasible to the second-stage, it is necessary to alleviate the

infeasibility arising due to the reentrant flow and deadlock. One of the ways to achieve

this is through the use of artificial variables as described by Van Slyke and Wets (see

Section 2.6). These variables are inserted into the subproblems for every scenario, and

feasibility cuts are developed that become a part of the master problem which is then

re-solved to obtain a master solution that is feasible to the subproblems. For a given

output of x , y and v vectors from the master problem, the following subproblem

(second-stage) is solved, one for each scenario. This comprises phase-I of the approach:



3.3.1 Subproblem/Stage-II (Recourse) problem augmented with artificial

variables – Phase I:

ARP:
\[
\text{Minimize } \sum_{i=1}^{N}\sum_{j=0}^{J_i} \bigl( a_{1(i,j,s)} + a_{2(i,j,s)} \bigr) + \sum_{m=1}^{|M|} \sum_{\substack{(i,j) \in z_m}} \sum_{\substack{(k,l) \in z_m \\ (k,l) \neq (i,j)}} \bigl( a^{(m,s)}_{3(i,j,k,l)} + a^{(m,s)}_{4(i,j,k,l)} \bigr) \tag{3.32}
\]

Subject to:

\[
t^{s}_{(i,j)} + \sum_{m \in M_{(i,j+1)}} p^{(m,s)}_{(i,j+1)}\, x^{m}_{(i,j+1)} + \sum_{e \in M_{(i,j)}} \sum_{f \in M_{(i,j+1)}} d_{(e,f)}\, v^{(e,f)}_{(i,j,j+1)} + a_{1(i,j+1,s)} - a_{2(i,j+1,s)} \;\le\; t^{s}_{(i,j+1)}, \quad \forall\, i = 1,\dots,N,\; j = 0,\dots,J_i-1,\; s = 1,\dots,S. \tag{3.33}
\]

\[
t^{s}_{(i,j)} + p^{(m,s)}_{(k,l)}\, x^{m}_{(k,l)} + u^{m}_{(i,j,k,l)} + a^{(m,s)}_{3(i,j,k,l)} - a^{(m,s)}_{4(i,j,k,l)} \;\le\; t^{s}_{(k,l)} + H\bigl(1 - y^{m}_{(i,j,k,l)}\bigr), \quad \forall\, m \in M,\; (i,j), (k,l) \in z_m,\; (i,j) \neq (k,l),\; s = 1,\dots,S. \tag{3.34}
\]

\[
\sum_{j=1}^{J_i} \sum_{m \in M_{(i,j)}} c^{m}_{(i,j)}\, p^{(m,s)}_{(i,j)}\, x^{m}_{(i,j)} + \sum_{j=1}^{J_i-1} \sum_{e \in M_{(i,j)}} \sum_{f \in M_{(i,j+1)}} dc_{(e,f)}\, v^{(e,f)}_{(i,j,j+1)}\, w_i - b_i \;\le\; \Delta^{s}_{i}, \quad \forall\, i = 1,\dots,N. \tag{3.35}
\]

\[
t^{s}_{(i,0)} + a_{1(i,0,s)} - a_{2(i,0,s)} = r_i, \quad \forall\, i = 1,\dots,N,\; s = 1,\dots,S. \tag{3.36}
\]

\[
a_{1(i,j,s)},\; a_{2(i,j,s)} \ge 0, \quad \forall\, i = 1,\dots,N,\; j = 0,\dots,J_i,\; s = 1,\dots,S. \tag{3.37}
\]

\[
a^{(m,s)}_{3(i,j,k,l)},\; a^{(m,s)}_{4(i,j,k,l)} \ge 0, \quad \forall\, m \in M,\; (i,j), (k,l) \in z_m,\; (i,j) \neq (k,l),\; s = 1,\dots,S. \tag{3.38}
\]

\[
t^{s}_{(i,j)} \ge 0, \quad \forall\, i = 1,\dots,N,\; j = 1,\dots,J_i. \tag{3.39}
\]

\[
\Delta^{s}_{i} \ge 0, \quad \forall\, i = 1,\dots,N. \tag{3.40}
\]

The objective function (Expression (3.32)) minimizes the sum of artificial variables.

Expressions (3.33) and (3.34) are operation precedence and sequence-dependent set up

constraints augmented with artificial variables, respectively. Expressions (3.35) do not

require any artificial variables because these constraints will always be feasible owing to


the budget surplus variables $\Delta^{s}_{i}$. Expression (3.36) depicts the ready time constraints

augmented with artificial variables. Expressions (3.37)-(3.40) represent non-negativity

restrictions on artificial variables and variables native to the subproblem.
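The role of the artificial variables can be illustrated on a toy system. The sketch below (hypothetical data, not an NMJS instance) uses SciPy's linprog to minimize the sum of artificial variables added to a small set of inequalities; a strictly positive optimum signals infeasibility of the original system, exactly as in Expression (3.32).

```python
# Minimal sketch of the Phase-I idea: test feasibility of a small system
# {t >= 0 : A t <= b} by adding artificial variables a >= 0 (A t - a <= b) and
# minimizing their sum; a zero optimum means the original system is feasible.
# Data are hypothetical, not an NMJS instance.

import numpy as np
from scipy.optimize import linprog

A = np.array([[1.0, -1.0],    # t1 - t2 <= -3  (t2 must exceed t1 by 3)
              [-1.0, 1.0]])   # t2 - t1 <= -2  (t1 must exceed t2 by 2) -> infeasible
b = np.array([-3.0, -2.0])

m, n = A.shape
# Variables: [t (n entries), a (m entries)]; objective: minimize sum of artificials.
c = np.concatenate([np.zeros(n), np.ones(m)])
A_ub = np.hstack([A, -np.eye(m)])
res = linprog(c, A_ub=A_ub, b_ub=b, bounds=[(0, None)] * (n + m), method="highs")

print(res.fun)   # > 0 here, so the original system is infeasible; in recent SciPy
                 # versions res.ineqlin.marginals exposes the duals used for cuts
```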

If the value of the objective function (Expression (3.32)) equals zero for all the

subproblems, then it indicates that the solution from the master program (first-stage) is

feasible to the recourse (second-stage) problem. However, if there exists a scenario, say

s , such that the subproblem corresponding to this scenario has a non-trivial objective

value, then feasibility cuts are generated so as to eliminate the corresponding solution

from the master program. We describe this next.

3.3.2 Development of Feasibility Cuts

Re-arranging constraints (3.32) - (3.40) such that all the constants appear on the right

hand side and all the variables appear on the left hand side, we have

\[
\text{Minimize } \sum_{i=1}^{N}\sum_{j=0}^{J_i} \bigl( a_{1(i,j,s)} + a_{2(i,j,s)} \bigr) + \sum_{m=1}^{|M|} \sum_{\substack{(i,j) \in z_m}} \sum_{\substack{(k,l) \in z_m \\ (k,l) \neq (i,j)}} \bigl( a^{(m,s)}_{3(i,j,k,l)} + a^{(m,s)}_{4(i,j,k,l)} \bigr) \tag{3.41}
\]

Subject to:

\[
t^{s}_{(i,j)} - t^{s}_{(i,j+1)} + a_{1(i,j+1,s)} - a_{2(i,j+1,s)} \;\le\; -\Bigl( \sum_{m \in M_{(i,j+1)}} p^{(m,s)}_{(i,j+1)}\, x^{m}_{(i,j+1)} + \sum_{e \in M_{(i,j)}} \sum_{f \in M_{(i,j+1)}} d_{(e,f)}\, v^{(e,f)}_{(i,j,j+1)} \Bigr), \quad \forall\, i = 1,\dots,N,\; j = 0,\dots,J_i-1,\; s = 1,\dots,S. \tag{3.42}
\]

\[
t^{s}_{(i,j)} - t^{s}_{(k,l)} + a^{(m,s)}_{3(i,j,k,l)} - a^{(m,s)}_{4(i,j,k,l)} \;\le\; H - \Bigl( H\, y^{m}_{(i,j,k,l)} + p^{(m,s)}_{(k,l)}\, x^{m}_{(k,l)} + u^{m}_{(i,j,k,l)} \Bigr), \quad \forall\, m \in M,\; (i,j), (k,l) \in z_m,\; (i,j) \neq (k,l),\; s = 1,\dots,S. \tag{3.43}
\]

\[
-\Delta^{s}_{i} \;\le\; b_i - \Bigl( \sum_{j=1}^{J_i} \sum_{m \in M_{(i,j)}} c^{m}_{(i,j)}\, p^{(m,s)}_{(i,j)}\, x^{m}_{(i,j)} + \sum_{j=1}^{J_i-1} \sum_{e \in M_{(i,j)}} \sum_{f \in M_{(i,j+1)}} dc_{(e,f)}\, v^{(e,f)}_{(i,j,j+1)}\, w_i \Bigr), \quad \forall\, i = 1,\dots,N. \tag{3.44}
\]

\[
t^{s}_{(i,0)} + a_{1(i,0,s)} - a_{2(i,0,s)} = r_i, \quad \forall\, i = 1,\dots,N,\; s = 1,\dots,S. \tag{3.45}
\]


We define the following dual variables corresponding to constraints (3.42)-(3.45) for scenario $s$. Here, $n = 1,\dots,fCUT$ indexes the feasibility cuts and $fCUT$ denotes the total number of feasibility cuts generated:

$fo^{(s,n)}_{(i,j)}$ ($\le 0$): the dual variables associated with the operation precedence constraints (3.42);

$fs^{(m,s,n)}_{(i,j,k,l)}$ ($\le 0$): the dual variables associated with the sequence-dependent set-up constraints (3.43);

$fb^{(s,n)}_{i}$ ($\le 0$): the dual variables associated with the budget constraints (3.44);

$fr^{(s,n)}_{i}$ (unrestricted): the dual variables associated with the ready time constraints (3.45).

Then, in accordance with Expression (2.60), we have the following nth feasibility cut for

scenario s ,

\[
\begin{aligned}
&\sum_{i=1}^{N}\sum_{j=0}^{J_i-1} fo^{(s,n)}_{(i,j)} \Bigl( \sum_{m \in M_{(i,j+1)}} p^{(m,s)}_{(i,j+1)}\, x^{m}_{(i,j+1)} + \sum_{e \in M_{(i,j)}} \sum_{f \in M_{(i,j+1)}} d_{(e,f)}\, v^{(e,f)}_{(i,j,j+1)} \Bigr) \\
&\quad + \sum_{m=1}^{|M|} \sum_{\substack{(i,j),(k,l) \in z_m \\ (i,j) \neq (k,l)}} fs^{(m,s,n)}_{(i,j,k,l)} \Bigl( H\, y^{m}_{(i,j,k,l)} + p^{(m,s)}_{(k,l)}\, x^{m}_{(k,l)} \Bigr) \\
&\quad + \sum_{i=1}^{N} fb^{(s,n)}_{i} \Bigl( \sum_{j=1}^{J_i}\sum_{m \in M_{(i,j)}} c^{m}_{(i,j)}\, p^{(m,s)}_{(i,j)}\, x^{m}_{(i,j)} + \sum_{j=1}^{J_i-1}\sum_{e \in M_{(i,j)}}\sum_{f \in M_{(i,j+1)}} dc_{(e,f)}\, v^{(e,f)}_{(i,j,j+1)}\, w_i \Bigr) \\
&\;\ge\; \sum_{i=1}^{N} fb^{(s,n)}_{i}\, b_i + \sum_{m=1}^{|M|} \sum_{\substack{(i,j),(k,l) \in z_m \\ (i,j) \neq (k,l)}} fs^{(m,s,n)}_{(i,j,k,l)} \bigl( H - u^{m}_{(i,j,k,l)} \bigr) + \sum_{i=1}^{N} fr^{(s,n)}_{i}\, r_i
\end{aligned}
\tag{3.46}
\]

Each feasibility cut is appended to the master program, which, in turn, gives a new

solution to be used in the subproblems. This process is iterative and terminates when

the objective function values for all the subproblems in Phase-I equal zero.
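A high-level sketch of this iterative Phase-I logic is given below. The functions solve_master, solve_phase1_subproblem and build_feasibility_cut are placeholders standing in for the Stage-I MIP, the scenario LPs (3.32)-(3.40) and the cut (3.46); they do not refer to any particular solver API.

```python
# High-level sketch of the Phase-I loop described above. The three callables are
# placeholders for the MIP/LP solves and the cut (3.46); they are not library calls.

def l_shaped_phase1(scenarios, solve_master, solve_phase1_subproblem,
                    build_feasibility_cut, max_iters=100):
    feasibility_cuts = []
    for it in range(max_iters):
        x, y, v = solve_master(feasibility_cuts)              # Stage-I MIP
        violated = False
        for s in scenarios:
            obj, duals = solve_phase1_subproblem(x, y, v, s)  # ARP, one LP per scenario
            if obj > 1e-6:                                    # artificials needed -> infeasible
                feasibility_cuts.append(build_feasibility_cut(duals, s))
                violated = True
        if not violated:                                      # all subproblems feasible:
            return x, y, v, feasibility_cuts                  # proceed to optimality cuts
    raise RuntimeError("Phase-I did not converge within max_iters")
```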

Note that the entire master problem needs to be solved each time a new feasibility cut is

derived. In addition to the master problem, linear programs corresponding to the

augmented subproblems need to be solved, one for each scenario, in order to check the


feasibility of the master solution and generate another feasibility cut. There could

potentially exist a large number of infeasibilities in the master solution. This, in turn,

translates into an equal number of feasibility cuts and a re-solving of as many master

problems. The master problem is a binary-integer program, and in the presence of feasibility cuts its complexity increases further; its repeated resolution could, therefore, incur a burden on computational resources and time. The other alternative is to make provisions within the master problem so as to generate solutions with as little infeasibility as possible, so that very few feasibility cuts, as represented by Expression (3.46), would need to be generated. This approach definitely merits investigation. This

would mean introducing additional constraints within the master problem to induce

second-stage feasibility. While this alternative would nevertheless contribute to increasing

the complexity of the master problem, its performance can best be predicted through

experimentation.

In the L-shaped method for the NMJS problem involving M machines, if the master

problem carries additional constraints for reentrant flow and deadlock prevention for all M machines, over and above those given by Expressions (3.16)-(3.24), then no feasibility cuts will be required. This follows from the fact that such a master solution will trivially satisfy the operation precedence, machine capacity/set-up and budget constraints that are present in the Stage-II problem. Note that the budget constraints cannot be violated because any budget excess is absorbed by the surplus variables that are penalized in the objective function. Therefore, completion times can be determined for such a schedule in Stage-II.

In the sequel, we show how provisions can be made within the master problem to

generate solutions with as little infeasibility as possible.

3.3.3 Eliminating infeasibilities due to reentrant flow:-

Operation precedence constraints owing to reentrant flow wherein lower indexed

operations must necessarily precede higher indexed operations can be enforced in the

master problem by appending additional constraints to it. This is accomplished by

introducing indirect precedence variables. Let us define:


$g^{m}_{(i,j,k,l)}$ = 1, if operation $j$ of job $i$ precedes operation $l$ of job $k$ anywhere on machine $m$; 0, otherwise.

The following constraints are essential in order to tie the newly defined g variable

with the existing variables in the master problem:

\[
g^{m}_{(i,j,k,l)} + g^{m}_{(k,l,i,j)} \;\le\; 1, \quad \forall\, m \in M,\; (i,j), (k,l) \in z_m,\; (i,j) \neq (k,l),\; \mathrm{ord}(k,l) > \mathrm{ord}(i,j). \tag{3.47}
\]

\[
g^{m}_{(i,j,k,l)} \le x^{m}_{(i,j)}, \qquad g^{m}_{(k,l,i,j)} \le x^{m}_{(i,j)}, \quad \forall\, m \in M,\; (i,j), (k,l) \in z_m,\; (i,j) \neq (k,l). \tag{3.48}
\]

Transitivity Constraints:

\[
g^{m}_{(i,j,\gamma,\eta)} \;\ge\; g^{m}_{(i,j,k,l)} + g^{m}_{(k,l,\gamma,\eta)} - 1, \quad \forall\, m \in M,\; (i,j), (k,l), (\gamma,\eta) \in z_m,\; (i,j) \neq (k,l) \neq (\gamma,\eta). \tag{3.49}
\]

Reentrant Flow Constraints:-

\[
g^{m}_{(i,j,k,l)} \;\ge\; x^{m}_{(i,j)} + x^{m}_{(k,l)} - 1, \quad \forall\, m \in M,\; (i,j), (k,l) \in z_m,\; i = k \text{ and } j < l. \tag{3.50}
\]

Expression (3.47) states that, given two job-operations on a machine, at most one of the two precedence directions can hold between them; that is, one of them either precedes or succeeds the other, but not both. The expression $\mathrm{ord}(k,l) > \mathrm{ord}(i,j)$ implies that the position of $(k,l)$ is greater than the position of $(i,j)$ in the set $z_m$. For instance, if $z_m = \{(1,1), (2,2), (3,2)\}$, then the position of (1,1) is 1, that of (2,2) is 2 and that of (3,2) is 3. Expression (3.48) ensures that if a job-operation is not assigned to a machine, then all indirect precedence variables that relate the job-operation with other job-operations on the machine are set to zero. The transitivity constraints (Expression (3.49)) state that, given three job-operations $(i,j)$, $(k,l)$ and $(\gamma,\eta)$ on machine $m$, if operation $(i,j)$ is scheduled somewhere before operation $(k,l)$ and operation $(k,l)$ is scheduled somewhere before operation $(\gamma,\eta)$, then operation $(i,j)$ must necessarily be scheduled before operation $(\gamma,\eta)$. Finally, the reentrant flow conditions (Expression (3.50)) ensure that if two


operations of the same job are assigned to a machine, the lower indexed job operation

precedes the higher indexed operation.
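The sketch below (hypothetical capability set $z_m$, constraints emitted as strings) enumerates the reentrant-flow constraints (3.50) for a single machine.

```python
# Minimal sketch (hypothetical z_m): enumerate the reentrant-flow constraints
# (3.50) for one machine as strings; whenever two operations of the same job can
# be processed on machine m, the lower-indexed one is forced to precede the other.

z_m = [(2, 1), (2, 4), (3, 2), (2, 6)]   # job-operations machine m can process

rows = []
for (i, j) in z_m:
    for (k, l) in z_m:
        if i == k and j < l:
            rows.append(
                f"g[m,({i},{j}),({k},{l})] >= x[m,({i},{j})] + x[m,({k},{l})] - 1"
            )

print("\n".join(rows))
# e.g. g[m,(2,1),(2,4)] >= x[m,(2,1)] + x[m,(2,4)] - 1, and similarly for the
# pairs (2,1)/(2,6) and (2,4)/(2,6)
```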

Logical Constraints connecting the indirect and the direct precedence variables:-

\[
g^{m}_{(i,j,k,l)} \;\ge\; y^{m}_{(i,j,k,l)}, \quad \forall\, m \in M,\; (i,j), (k,l) \in z_m,\; (i,j) \neq (k,l). \tag{3.51}
\]

The above constraint (Expression (3.51)) states that if a direct precedence variable is

equal to one, then the corresponding indirect precedence variable must be set equal to

one as well. In addition to the above constraints, we can also capture the exact number of indirect

precedence variables that must exist between job-operations on a machine. To that end,

\[
\sum_{\substack{(i,j) \in z_m}} \sum_{\substack{(k,l) \in z_m \\ (k,l) \neq (i,j)}} g^{m}_{(i,j,k,l)} \;=\; \frac{\Bigl( \sum_{(i,j) \in z_m} x^{m}_{(i,j)} \Bigr)\Bigl( \sum_{(i,j) \in z_m} x^{m}_{(i,j)} \Bigr) - \sum_{(i,j) \in z_m} x^{m}_{(i,j)}}{2}, \quad \forall\, m \in M. \tag{3.52}
\]

Therefore,

\[
2 \sum_{\substack{(i,j) \in z_m}} \sum_{\substack{(k,l) \in z_m \\ (k,l) \neq (i,j)}} g^{m}_{(i,j,k,l)} \;=\; \Bigl( \sum_{(i,j) \in z_m} x^{m}_{(i,j)} \Bigr)\Bigl( \sum_{(i,j) \in z_m} x^{m}_{(i,j)} \Bigr) - \sum_{(i,j) \in z_m} x^{m}_{(i,j)}. \tag{3.53}
\]

Consider the first term on the right side of Expression (3.53). For the purposes of illustration, say $m = 1$ and $z_m = z_1 = \{(i,j), (k,l), (\gamma,\eta)\}$. Upon expanding this term, we get

\[
\Bigl( \sum_{(i,j) \in z_m} x^{m}_{(i,j)} \Bigr)\Bigl( \sum_{(i,j) \in z_m} x^{m}_{(i,j)} \Bigr) = x^{m}_{(i,j)} x^{m}_{(i,j)} + x^{m}_{(i,j)} x^{m}_{(k,l)} + x^{m}_{(i,j)} x^{m}_{(\gamma,\eta)} + x^{m}_{(k,l)} x^{m}_{(i,j)} + x^{m}_{(k,l)} x^{m}_{(k,l)} + x^{m}_{(k,l)} x^{m}_{(\gamma,\eta)} + x^{m}_{(\gamma,\eta)} x^{m}_{(i,j)} + x^{m}_{(\gamma,\eta)} x^{m}_{(k,l)} + x^{m}_{(\gamma,\eta)} x^{m}_{(\gamma,\eta)} \tag{3.54}
\]

If we define a new variable h such that


$h^{m}_{(i,j,k,l)}$ = 1, if operation $j$ of job $i$ and operation $l$ of job $k$ are both assigned to machine $m$ (i.e., $x^{m}_{(i,j)} = 1$ and $x^{m}_{(k,l)} = 1$); 0, otherwise,

then the right side of Expression (3.54) can be written as

\[
h^{m}_{(i,j,i,j)} + h^{m}_{(i,j,k,l)} + h^{m}_{(i,j,\gamma,\eta)} + h^{m}_{(k,l,i,j)} + h^{m}_{(k,l,k,l)} + h^{m}_{(k,l,\gamma,\eta)} + h^{m}_{(\gamma,\eta,i,j)} + h^{m}_{(\gamma,\eta,k,l)} + h^{m}_{(\gamma,\eta,\gamma,\eta)} \tag{3.55}
\]

The following logical constraints depict the desired relationship between the h and the x

variables. In particular, the h variables are set equal to 1 if and only if both the

corresponding x variables are found to be 1; else, these variables take on a value of 0.

\[
h^{m}_{(i,j,k,l)} \le x^{m}_{(i,j)}, \qquad h^{m}_{(i,j,k,l)} \le x^{m}_{(k,l)}, \qquad h^{m}_{(i,j,k,l)} \ge x^{m}_{(i,j)} + x^{m}_{(k,l)} - 1, \quad \forall\, m = 1,\dots,|M|,\; (i,j), (k,l) \in z_m,\; (i,j) \neq (k,l),\; \mathrm{ord}(k,l) > \mathrm{ord}(i,j). \tag{3.56}
\]

From the definition of the $h$ variables, we have $h^{m}_{(i,j,i,j)} = x^{m}_{(i,j)}\, x^{m}_{(i,j)} = x^{m}_{(i,j)}$. A similar reasoning holds for $h^{m}_{(k,l,k,l)}$ and $h^{m}_{(\gamma,\eta,\gamma,\eta)}$. Also, $h^{m}_{(i,j,k,l)} = x^{m}_{(i,j)}\, x^{m}_{(k,l)}$ and $h^{m}_{(k,l,i,j)} = x^{m}_{(k,l)}\, x^{m}_{(i,j)}$; therefore, $h^{m}_{(i,j,k,l)} = h^{m}_{(k,l,i,j)}$. A similar substitution can be made for the other variables. Consequently, Expression (3.55) reduces to

\[
x^{m}_{(i,j)} + x^{m}_{(k,l)} + x^{m}_{(\gamma,\eta)} + 2\,h^{m}_{(i,j,k,l)} + 2\,h^{m}_{(i,j,\gamma,\eta)} + 2\,h^{m}_{(k,l,\gamma,\eta)} \tag{3.57}
\]

Now, considering the entire right side of Expression (3.53), we have

\[
\Bigl( \sum_{(i,j) \in z_m} x^{m}_{(i,j)} \Bigr)\Bigl( \sum_{(i,j) \in z_m} x^{m}_{(i,j)} \Bigr) - \sum_{(i,j) \in z_m} x^{m}_{(i,j)} = \Bigl( x^{m}_{(i,j)} + x^{m}_{(k,l)} + x^{m}_{(\gamma,\eta)} + 2\,h^{m}_{(i,j,k,l)} + 2\,h^{m}_{(i,j,\gamma,\eta)} + 2\,h^{m}_{(k,l,\gamma,\eta)} \Bigr) - \Bigl( x^{m}_{(i,j)} + x^{m}_{(k,l)} + x^{m}_{(\gamma,\eta)} \Bigr) \tag{3.58}
\]
\[
= 2\,h^{m}_{(i,j,k,l)} + 2\,h^{m}_{(i,j,\gamma,\eta)} + 2\,h^{m}_{(k,l,\gamma,\eta)}.
\]

Hence, referring to (3.53), we have


\[
2 \sum_{\substack{(i,j) \in z_m}} \sum_{\substack{(k,l) \in z_m \\ (k,l) \neq (i,j)}} g^{m}_{(i,j,k,l)} \;=\; 2\,h^{m}_{(i,j,k,l)} + 2\,h^{m}_{(i,j,\gamma,\eta)} + 2\,h^{m}_{(k,l,\gamma,\eta)}. \tag{3.59}
\]

Upon generalizing the above result, we get

\[
\sum_{\substack{(i,j) \in z_m}} \sum_{\substack{(k,l) \in z_m \\ (k,l) \neq (i,j)}} g^{m}_{(i,j,k,l)} \;=\; \sum_{\substack{(i,j), (k,l) \in z_m \\ (i,j) \neq (k,l) \\ \mathrm{ord}(k,l) > \mathrm{ord}(i,j)}} h^{m}_{(i,j,k,l)}, \quad \forall\, m \in M. \tag{3.60}
\]
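The correctness of this substitution rests on the fact that constraints (3.56) force $h$ to equal the product of the two assignment variables. The following minimal sketch verifies this by brute force over all binary assignments; the variable names are generic and purely illustrative.

```python
# Minimal sketch: brute-force check that the logical constraints (3.56) force
# h = x1 * x2 for binary variables, which is what lets the quadratic count in
# (3.53) be rewritten linearly in (3.60).

from itertools import product

def feasible_h(x1, x2):
    """All binary h satisfying h >= x1 + x2 - 1, h <= x1, h <= x2."""
    return [h for h in (0, 1) if h >= x1 + x2 - 1 and h <= x1 and h <= x2]

for x1, x2 in product((0, 1), repeat=2):
    assert feasible_h(x1, x2) == [x1 * x2]
print("h is forced to equal x1*x2 for every binary assignment")
```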

3.3.4 Eliminating infeasibilities due to cyclic scheduling:-

With the inclusion of the reentrant flow constraints the master (Stage-I) program is now

expressed as

MMP: min (3.16) s.t. (3.17)-(3.24), (3.47)-(3.51), (3.56), (3.60).

However, despite the addition of the reentrant flow constraints to prevent operation precedence infeasibilities in the subproblem, it is easy to foresee a situation in which the subproblem is still infeasible for a master solution that honors all of the

constraints in the master problem. Consider the example introduced earlier in Figure 3.1.

The cyclic nature of this schedule generated by the master problem is shown below in

Figure 3.2.

Figure 3.2: Cyclic and acyclic schedules for the case of two machines



As seen in Figure 3.2(a), operations (2,2) and (4,2) are assigned to machine M1, where

(2,2) directly precedes (4,2). Also assigned are operations (4,3) and (2,1) to machine M2,

where (4,3) directly precedes (2,1). Note that on machine M1, (2,2) may not begin

processing until its preceding operation (2,1) completes processing on machine M2. But

(2,1) cannot begin processing until (4,3) finishes processing because (4,3) directly

precedes it on M2. Now, (4,3) cannot begin processing before its preceding operation

(4,2) finishes processing on M1. However, (4,2) may not begin processing until (2,2) finishes processing because (2,2) directly precedes it on M1. This completes a cycle and

produces what is called a ‘deadlock’ wherein none of the operations in the cycle can begin

processing because each depends on the completion of another in the cycle. In Figure

3.2(a), the dashed black arrows represent operation precedence constraints that exist

between operations of the same job, whereas the block arrows represent precedence

constraints due to the prescribed sequencing on the machines.

Note that, for a given assignment of job-operations to the machines, Figures 3.2(a), (b),

(c) and (d) depict all the possible configurations that can be obtained by varying their

sequencing on the machines. Out of all these configurations, only the sequence in Figure

(a) results in a deadlock. In Figures 3.2(b), (c) and (d), the green arrows represent a valid

sequence for processing operations on the two machines. While there might exist

multiple valid sequences, the Figures above depict only one of them. For instance,

although Figure 3.2(d) depicts (2,1) as being performed before (4,2), this need not necessarily hold true: (2,1) and (4,2) can begin processing at the same time on their respective machines because these operations are unrelated. Likewise, once (2,1) and

(4,2) complete processing, operations (2,2) and (4,3) can begin processing without (4,3)

having to wait for (2,2) because these are again unrelated. The key deduction from these

figures is that, given the above four operations and two machines with assignments as

shown, there exists only one sequencing configuration that produces a deadlock or

infeasibility. Note that for a given assignment of (2,2) and (4,2) on M1 such that (2,2)

precedes (4,2) and an assignment of (4,3) and (2,1) on M2, such that (4,3) precedes (2,1),

if the sequence on M2 is altered so as to have (2,1) precede (4,3), then it would result in

an acyclic schedule as depicted by configuration 3.2(b).


Figure 3.3 illustrates other deadlock possibilities.

Figure 3.3: Other forms of deadlock

In Figure 3.3(b), the deadlock that exists is the same as in Figure 3.2(a); however, the

operations involved in the deadlock may not directly precede each other on their

respective machines. Figure 3.3(c) represents a deadlock between operations of the same

job. Careful observation reveals that this situation is not possible because the reentrant

flow constraints (Expressions (3.70)-(3.74), (3.79) and (3.83)) would prevent (2,5) from

being sequenced before (2,2) on machine M2. Figure 3.3(d) illustrates multiple deadlocks,

the first between operations (3,3) and (4,2) on M1 and between operations (4,3) and (3,2)

on M2, as shown by block arrows, and the second between operations (2,2) and (4,2) on

M1 and between operations (4,3) and (2,1) on M2, as shown by the dashed black and blue

arrows. Consider the first deadlock between operations (3,3) and (4,2) on M1 and

between operations (4,3) and (3,2) on M2. As noted earlier, operations (4,3) and (3,2)



must be swapped so as to have (3,2) precede (4,3) in order to resolve the deadlock. This

move, however, does not resolve the second deadlock that exists between operations

(2,2) and (4,2) on M1 and between operations (4,3) and (2,1) on M2. In order to resolve

the second deadlock, the positions of (4,3) and (2,1) must now be swapped so as to have

(2,1) precede (4,3). The final deadlock free configuration is depicted in Figure 3.4 below.

Figure 3.4: A deadlock free configuration

We now present a generalization of the above arguments.

3.3.5 Analysis of Deadlock Occurrence and Prevention

The two-machine case:

Definitions:

Relationship R2: A relationship R2 is said to exist between job operations (a,b) and (c,d)

assigned to machine M1 and (e,f) and (g,h) assigned to machine M2 (see Figure 3.5) if and

only if

1. (a,b) and (e,f) are related such that a = e and f = b - α, where α > 0

2. (c,d) and (g,h) are related such that c = g and h = d + β, where β > 0

3. (a,b) and (c,d) are unrelated.

4. (e,f) and (g,h) are unrelated.



Sequence S2: A sequence S2 is said to exist between job operations related by R2 if and

only if

1. (a,b) precedes (c,d) on M1.

2. (g,h) precedes (e,f) on M2.

Proposition 1: A two-machine deadlock results if and only if the job operations (a,b),

(c,d), (e,f) and (g,h) are related by R2 and sequenced according to S2.

Proof: We begin by first showing that if job operations (a,b), (c,d), (e,f) and (g,h) are related

by R2 and sequenced according to S2, then it results in a deadlock. To see this, note that

the above conditions result in the following configuration.

Figure 3.5: Operations related by R2 and sequenced according to S2

1. On M1, (c,d) must wait for (a,b) to finish processing owing to sequencing

restrictions.

2. (a,b) must wait for (a,b-α) on M2 to finish processing owing to operation

precedence constraints that exist between operations of job a.

3. On M2, (a,b-α) must wait for (c,d+β) to finish processing owing to sequencing

restrictions.

4. (c,d+β) must wait for (c,d) on M1 to finish processing owing to operation

precedence constraints that exist between operations of job c.

The above configuration along with the resulting precedences that it generates can be

represented as a network as shown in Figure 3.6.



Figure 3.6: Network representation of a two-machine deadlock

The network in Figure 3.6 represents a directed cycle. Therefore, none of the operations

can begin processing thereby resulting in a deadlock.

Proceeding further, we must now show that if there exists a deadlock in the case of two

machines, then it must be caused by

i. Job operations related by R2.

ii. Job operations sequenced according to S2.

The above statement is of the type P ⇒ Q. We shall prove it by proving its

contrapositive i.e. ∼Q ⇒ ∼P. We, therefore, need to show that, in the case of two

machines, if job operations are not related by R2 or if they are not sequenced according

to S2, the resulting configuration is deadlock free. Towards that end, we shall prove the

following three cases:

i. Job operations not related by R2 but sequenced according to S2 result in a

deadlock free configuration.

ii. Job operations related by R2 but not sequenced according to S2 result in a

deadlock free configuration.

iii. Job operations neither related by R2 nor sequenced according to S2 result in a

deadlock free configuration.



Case i: Job operations not related by R2 but sequenced according to S2.

The configurations under this category comprise those that arise by negating each clause under the definition of relationship R2 and their combinations thereof. We first examine a subset of these configurations denoted by Ω, where Ω = configurations obtained by negating clauses 1 or 2 or both under (the definition of relationship) R2. Also, let Π = configurations where clauses 3 and 4 under R2 are preserved.

Subset Ω ∩ Π:

I. (a,b) and (c,d) assigned to M1, (a, b+α) (=(e,f)) and (c,d-β) (=(g,h)) assigned to M2.

II. (a,b) and (c,d) assigned to M1, (a, b-α) (=(e,f)) and (c,d-β) (=(g,h)) assigned to M2.

III. (a,b) and (c,d) assigned to M1, (a, b+α) (=(e,f)) and (c,d+β) (=(g,h)) assigned to M2.

IV. (a,b) and (c,d) assigned to M1, (e,f) and (c,d-β) (=(g,h)) assigned to M2 where (a,b) and

(e,f) are unrelated.

V. (a,b) and (c,d) assigned to M1, (e,f) and (c,d+β) (=(g,h)) assigned to M2 where (a,b)

and (e,f) are unrelated.

VI. (a,b) and (c,d) assigned to M1, (a, b-α) (=(e,f)) and (g,h) assigned to M2 where (c,d)

and (g,h) are unrelated.

VII. (a,b) and (c,d) assigned to M1, (a, b+α) (=(e,f)) and (g,h) assigned to M2 where (c,d)

and (g,h) are unrelated.

Figure 3.7 illustrates the network representation for the above configurations.


Figure 3.7: Acyclic network configurations

In contrast to the directed cyclic network represented by the deadlock configuration in Figure 3.6, all seven configurations in Figure 3.7 represent acyclic networks. Note that, with respect to Figure 3.6, in Cases (i)-I, (i)-II and (i)-III illustrated above, at least one diagonal arrow changes direction, thus breaking the cycle. In the remaining cases, bi-directional arrows connect operations that are unrelated, thus breaking the cycle.



Let Π1 = Configurations where clauses 3 or 4 or both under R2 are violated

Subset Ω ∩ Π1:

Configurations in which clauses 3 or 4 or both under R2 are violated do not cause a change in the direction of the horizontal arrows in the configurations ∈ Ω because, if (a,b) and (c,d) or (e,f) and (g,h) or both are related, we must have c = a, d > b, or e = g, h > f, or both, since the job-operations in Case (i) are sequenced according to S2. Therefore, configurations ∈ Ω ∩ Π1 will continue to stay acyclic.

Let Σ = Configurations where clauses 1 and 2 under R2 are preserved

Subset Σ ∩ Π1:

A contradiction arises when clauses 1 and 2 under R2 co-exist with (a,b) and (c,d) being related, or (e,f) and (g,h) being related, as c = a, d > b, or e = g, h > f, because it imposes the contradictory conditions of all four operations being related and two operations being unrelated simultaneously.

Moreover, clauses 1 and 2 under the definition of relationship R2 cannot co-exist with (a,b) and (c,d) being related and (e,f) and (g,h) being related as c = a, d > b, and e = g, h > f, because it imposes the contradictory conditions of (g,h) having to precede and succeed (c,d) simultaneously.

Case ii: Job operations related by R2 but not sequenced according to S2.

The following configurations exist when job operations are related by R2 but not

sequenced according to S2.

I. (a,b) precedes (c,d) on M1 but (a,b- α) (=(e,f)) precedes (c,d+β ) (=(g,h)) on M2.

II. (c,d) precedes (a,b) on M1 and (c,d+β) (=(g,h)) precedes (a,b-α) (=(e,f)) on M2.

III. (c,d) precedes (a,b) on M1 but (a,b-α) (=(e,f)) precedes (c,d+β) (=(g,h)) on M2

Figure 3.8 depicts the resulting acyclic network obtained for each of the above

configurations. Note that they are all acyclic.



Figure 3.8: Operations related by R2 but not sequenced according to S2

Case iii: Job operations neither related by R2 nor sequenced according to S2.

The configurations under this category comprise those that arise by negating each clause under the definition of relationship R2 and sequence S2 and their combinations thereof. We first examine a subset of these configurations (relationships) denoted by Γ ∩ ς ∩ Π, where Γ = configurations obtained by negating clauses 1 or 2 or both under R2 such that a = e and c = g, ς = configurations where one or more clauses under S2 are violated, and Π = configurations where clauses 3 and 4 under R2 are preserved.

Subset Γ ∩ ς ∩ Π:

I. (a,b) and (c,d) assigned to M1 with (a,b) preceding (c,d), (a, b+α) (=(e,f)) and (c,d-β)

(=(g,h)) assigned to M2 with (a, b+α) preceding (c,d-β).



II. (a,b) and (c,d) assigned to M1 with (c,d) preceding (a,b), (a, b+α) (=(e,f)) and (c,d-β)

(=(g,h)) assigned to M2 with (c,d-β) preceding (a, b+α).

III. (a,b) and (c,d) assigned to M1 with (c,d) preceding (a,b), (a, b+α) (=(e,f)) and (c,d-β)

(=(g,h)) assigned to M2 with (a, b+α) preceding (c,d-β).

IV. (a,b) and (c,d) assigned to M1 with (a,b) preceding (c,d), (a, b-α) (=(e,f)) and (c,d-β)

(=(g,h)) assigned to M2 with (a, b-α) preceding (c,d-β) .

V. (a,b) and (c,d) assigned to M1 with (c,d) preceding (a,b), (a, b-α) (=(e,f)) and (c,d-β)

(=(g,h)) assigned to M2 with (a, b-α) preceding (c,d-β).

VI. (a,b) and (c,d) assigned to M1 with (c,d) preceding (a,b), (a, b-α) (=(e,f)) and (c,d-β)

(=(g,h)) assigned to M2 with (c,d-β) preceding (a, b-α).

VII. (a,b) and (c,d) assigned to M1 with (a,b) preceding (c,d), (a, b+α) (=(e,f)) and (c,d+β)

(=(g,h)) assigned to M2 with (a, b+α) preceding (c,d+β).

VIII. (a,b) and (c,d) assigned to M1 with (c,d) preceding (a,b), (a, b+α) (=(e,f)) and (c,d+β)

(=(g,h)) assigned to M2 with (c,d+β) preceding (a, b+α).

IX. (a,b) and (c,d) assigned to M1 with (c,d) preceding (a,b), (a, b+α) (=(e,f)) and (c,d+β)

(=(g,h)) assigned to M2 with (a, b+α) preceding (c,d+β).

Figure 3.9 illustrates the network representation for configurations ∈ Γ∩ ς ∩ Π.


Figure 3.9: Network representation of the configurations obtained by negating clauses 1

or 2 or both under R2 and S2

All of the above cases (I-IX) except (III) deviate from the directed cyclic network represented by Figure 3.6 in that the directions of at least one horizontal and one diagonal arrow are reversed. Hence, the cycle is broken and we have an acyclic network. In case (III), however, the directions of all arrows are reversed, thereby resulting in a directed cyclic network. This configuration is the antithesis of the one represented by Figure 3.6. Note that operations in case (III) are in fact related according to R2 and sequenced as per S2; the roles of operations (a,b) and (c,d) are merely reversed.

Let Ψ = configurations obtained by negating clauses 1 or 2 or both under R2 such that a ≠ e or c ≠ g or both.

Subset Ψ∩ ς ∩ Π :

Configurations belonging to this subset have bi-directional diagonal arrows because one

or more pair(s) of operations are unrelated thus yielding an acyclic network.

Recall that, Σ = Configurations where clauses 1 and 2 under R2 are preserved and

Π1 = Configurations where clauses 3 or 4 or both under R2 are violated



Subset Γ ∩ ς ∩ Π1:

Configurations belonging to Γ ∩ ς but where (a,b) and (c,d) or (e,f) and (g,h) or both are

related, are acyclic because the direction of the horizontal arrows continues to stay the

same.

Subset Σ ∩ Π1 ∩ ς:

Configurations that result by preserving clauses 1 and 2 under R2 with (a,b) and (c,d)

being related and (e,f) and (g,h) being related but where operations are not sequenced

according to S2, are similar to those described in Case (ii) and are acyclic. Observe that, a

contradiction arises when configurations are born as a result of preserving clauses 1 and

2 under R2 with either (a,b) and (c,d) being related or (e,f) and (g,h) being related but where

operations are not sequenced according to S2. This is because it imposes contradictory

conditions of all four operations being related and two operations being unrelated

simultaneously.

Proposition 2: For the case of two machines, where job operations are related by R2 and

are sequenced according to S2, the deadlock can be broken by switching the sequence on

any one of the two machines.

Proof: Figure 3.6 depicts the two-machine deadlock configuration in the form of a

directed cyclic network. In this configuration, job operations are related by R2 and sequenced according to S2. If the sequence on either of the two machines were to be switched, this would have the effect of reversing the direction of one of the horizontal arrows, thus breaking the cycle.

Thus, in the case of two machines, if operations (a,b) and (c,d) are assigned to one of the

machines such that (a,b) precedes (c,d) and if operations (a,b-α) and (c,d+β) where α, β > 0

are assigned to the other machine, then (a,b-α) must necessarily precede (c,d+β) to avoid a

deadlock. Mathematically, this is expressed as:

\[
g^{n}_{(a,b-\alpha,c,d+\beta)} \;\ge\; g^{m}_{(a,b,c,d)} + h^{m}_{(a,b,c,d)} + h^{n}_{(a,b-\alpha,c,d+\beta)} - 2, \quad \text{for } m, n \in M,\; (a,b), (c,d) \in z_m,\; (a,b-\alpha), (c,d+\beta) \in z_n,\; \alpha, \beta > 0. \tag{3.61}
\]
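The sketch below (hypothetical capability sets, constraints emitted as strings) enumerates constraints of type (3.61) for an ordered pair of machines; on data matching Figure 3.1 it produces exactly the inequality that forbids the deadlocked sequence.

```python
# Minimal sketch (hypothetical capability sets): enumerate the two-machine
# deadlock-prevention constraints (3.61) for an ordered machine pair (m, n).

z = {
    "M1": [(2, 2), (4, 2)],   # job-operations M1 can process
    "M2": [(2, 1), (4, 3)],   # job-operations M2 can process
}

def deadlock_prevention(m, n):
    rows = []
    for (a, b) in z[m]:
        for (c, d) in z[m]:
            if (a, b) == (c, d):
                continue
            for (e, f) in z[n]:
                for (g_, h_) in z[n]:
                    # (e,f) = (a, b - alpha) and (g_,h_) = (c, d + beta), alpha, beta > 0
                    if e == a and f < b and g_ == c and h_ > d:
                        rows.append(
                            f"g[{n},({e},{f}),({g_},{h_})] >= "
                            f"g[{m},({a},{b}),({c},{d})] + h[{m},({a},{b}),({c},{d})]"
                            f" + h[{n},({e},{f}),({g_},{h_})] - 2"
                        )
    return rows

print("\n".join(deadlock_prevention("M1", "M2")))
# -> forces (2,1) to precede (4,3) on M2 whenever (2,2) precedes (4,2) on M1
#    and all four operations are assigned as in Figure 3.1
```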


The three-machine case:

Definitions:

Relationship R3: A relationship R3 is said to exist between job operations (a,b) and (c,d)

assigned to machine M1 and (e,f) and (g,h) assigned to machine M2 and (i,j) and (k,l)

assigned to machine M3 if and only if

1. (a,b) and (g,h) are related such that a = g and h = b - α where α > 0

2. (e,f) and (k,l) are related such that e = k and l= f- β where β > 0

3. (c,d) and (i,j) are related such that c = i and j = d+γ where γ > 0

4. (a,b) and (c,d) are unrelated

5. (e,f) and (g,h) are unrelated

6. (i,j) and (k,l) are unrelated.

Sequence S3: A sequence S3 is said to exist between job operations related by R3 if and

only if

1. (a,b) precedes (c,d) on M1.

2. (e,f) precedes (g,h) on M2.

3. (i,j) precedes (k,l) on M3.

Proposition 3: In a three-machine configuration that is free from all two-machine deadlocks, a deadlock results if and only if job operations (a,b), (c,d), (e,f), (g,h) and (i,j), (k,l) are related by R3 and sequenced according to S3.

Proof: We begin by first showing that, in a three-machine configuration that is free of all two-machine deadlocks, if job operations (a,b), (c,d), (e,f), (g,h) and (i,j), (k,l) are related by R3 and sequenced according to S3, then a deadlock results. To see this, note that the above conditions lead to the configuration shown in Figure 3.10.


Figure 3.10: A deadlock involving three machines

Note that

i. On M1, (c,d) must wait for (a,b) to finish processing owing to sequencing

restrictions.

ii. (a,b) must wait for (a,b-α) on M2 to finish processing owing to operation

precedence constraints that exist between operations of job a.

iii. On M2, (a,b-α) must wait for (e,f) to finish processing owing to sequencing

restrictions.

iv. (e,f) must wait for (e,f-β) on M3 to finish processing owing to operation

precedence constraints that exist between operations of job e.

v. On M3, (e,f-β) must wait for (c,d+γ) to finish processing owing to sequencing

restrictions.

vi. (c,d+γ) must wait for (c,d) on M1 to finish processing owing to operation

precedence constraints that exist between operations of job c.

The above configuration along with the resulting precedences that it generates can be

represented as a network as shown in Figure 3.11.


Figure 3.11: Network representation of a three-machine deadlock

Clearly, the network in Figure 3.11 is a directed cycle. Therefore, none of the operations

can begin processing, thereby, resulting in a deadlock.

Proceeding further, we must now show that if there exists a deadlock in the case of three

machines, then it must be caused by

i. Job operations related by R3.

ii. Job operations sequenced according to S3.

The above statement is of the type P ⇒ Q. We shall show it by proving its contrapositive

i.e. ∼Q ⇒ ∼P. We, therefore, need to show that, in the case of three machines, if job

operations are not related by R3 or if they are not sequenced according to S3, the

resulting configuration is deadlock free. Towards that end, we shall prove the following

three cases:

i. Job operations not related by R3 but sequenced according to S3 result in a

deadlock free configuration.

ii. Job operations related by R3 but not sequenced according to S3 result in a

deadlock free configuration.

iii. Job operations neither related by R3 nor sequenced according to S3 result in a

deadlock free configuration.


Case i: Job operations not related by R3 but sequenced according to S3.

The configurations under this category comprise those that arise by negating each clause under the definition of relationship R3 and their combinations thereof. We first examine a subset of these configurations denoted by Ω ∩ Π, where Ω = configurations obtained by negating clauses 1, 2 or 3, or all three, under (the definition of relationship) R3, and Π = configurations where clauses 4, 5 and 6 under R3 are preserved.

Subset Ω ∩ Π:

I. (a,b), (c,d) assigned to M1, (e,f), (a,b-α ) assigned to M2, (c,d+γ), (e,f+β) assigned to

M3.

II. (a,b), (c,d) assigned to M1, (e,f), (a,b-α ) assigned to M2, (c,d-γ), (e,f-β) assigned to M3.

III. (a,b), (c,d) assigned to M1, (e,f), (a,b-α ) assigned to M2, (c,d-γ), (e,f+β) assigned to

M3.

IV. (a,b), (c,d) assigned to M1, (e,f), (a,b+α ) assigned to M2, (c,d-γ), (e,f-β) assigned to

M3.

V. (a,b), (c,d) assigned to M1, (e,f), (a,b+α ) assigned to M2, (c,d-γ), (e,f+β) assigned to

M3

VI. (a,b), (c,d) assigned to M1, (e,f), (a,b+α ) assigned to M2, (c,d+γ), (e,f-β) assigned to

M3

VII. (a,b), (c,d) assigned to M1, (e,f), (a,b+α ) assigned to M2, (c,d+γ), (e,f+β) assigned to

M3

VIII. Configurations where clauses 1 or 2 or 3 or all three are negated such that one or

more pairs of operations are unrelated.


Figure 3.12: Acyclic network configurations (panels Case (i)-I through Case (i)-VII).

In contrast to the directed cyclic network represented by the deadlock configuration in Figure 3.11, the first seven configurations (Case (i)-I, …, Case (i)-VII) in Figure 3.12 represent acyclic networks. Note that, with respect to Figure 3.11, one or more diagonal arrows change direction, thus breaking the cycle. Moreover, for the configurations under Case (i)-VIII, where clauses 1 or 2 or 3 or all three are negated such that one or more pairs of operations are unrelated, this implies bi-directional diagonal arrows in the network configuration of Figure 3.11. Therefore, such configurations will be acyclic.

Let Π1 = Configurations where one or more of clauses 4, 5 and 6 under R3 are

violated

Subset Ω ∩ Π1:

The configurations that exist under this subset fall into one of the following categories.

They are either: 1) Feasible, or 2) Infeasible. Infeasible configurations are born as a result

of contradictory conditions arising out of the intersection of Ω and Π1. We exclude the

infeasible configurations from further consideration. Amongst the feasible configurations

that are possible in this subset, it is easy to see that, when one or more of clauses 4, 5 and

6 under R3 are violated, we must have one or more of the relations c = a,d > b or

e = g,h > f or i = k, l > j hold, since job-operations in case (i) must be sequenced


according to S3. These configurations will, therefore, be acyclic since they do not cause

any reversal in the direction of horizontal arrows for configurations ∈ Ω that are

sequenced according to S3.

Let Σ = Configurations where clauses 1, 2 and 3 under R3 are preserved

Subset Σ ∩ Π1:

Configurations belonging to Σ but where one of the clauses 4, 5 and 6 is negated cannot exist, because these configurations produce a two-machine deadlock, which we have already assumed our configurations to be free of. Figure 3.13 depicts an instance where clauses 1, 2 and 3 under R3 are preserved but clause 4 is negated in that c = a, d > b.

Figure 3.13: Resulting configuration obtained by preserving clauses 1, 2 and 3 under R3

but negating clause 4.

By temporarily eliminating M1 from consideration, we have a two-machine deadlock as

seen in Figure 3.14.


Figure 3.14: A two-machine deadlock

Note that, configurations belonging to Σ, but where any two of the clauses 4, 5 and 6 are

negated, cannot exist because this automatically implies that the third clause is also

negated. Also, since job-operations are sequenced according to S3, configurations belonging to Σ but where all of clauses 4, 5 and 6 are negated (such that c = a, d > b or e = g, f > h or i = k, l > j) are not possible, because these configurations impose the contradictory conditions of (i,j) succeeding as well as preceding (c,d) simultaneously.

Case ii: Job operations related by R3 but not sequenced according to S3.

The following sequences exist when job operations are related by R3 but not sequenced

according to S3.

I. (c,d) precedes (a,b) on M1, (e,f) precedes (a,b- α) on M2, (c,d+γ ) precedes (e,f-β) on

M3.

II. (a,b) precedes (c,d) on M1, (a,b- α) precedes (e,f) on M2, (c,d+γ ) precedes (e,f-β) on

M3.

III. (c,d) precedes (a,b) on M1, (a,b- α) precedes (e,f) on M2, (c,d+γ ) precedes (e,f-β) on

M3.

IV. (a,b) precedes (c,d) on M1, (e,f) precedes (a,b-α) on M2, (e,f-β) precedes (c,d+γ ) on

M3.

V. (c,d) precedes (a,b) on M1, (e,f) precedes (a,b-α) on M2, (e,f-β) precedes (c,d+γ ) on

M3.


VI. (a,b) precedes (c,d) on M1, (a,b-α) precedes (e,f) on M2, (e,f-β) precedes (c,d+γ ) on

M3.

VII. (c,d) precedes (a,b) on M1, (a,b-α) precedes (e,f) on M2, (e,f-β) precedes (c,d+γ ) on

M3.

Note that, when job-operations are related by R3 but not sequenced according to S3, the direction of one or more horizontal arrows in Figure 3.11 gets reversed, thus breaking the cycle. For clarity, we illustrate the configurations under this case in Figure 3.15 below.

Figure 3.15: Operations related by R3 but not sequenced according to S3 (panels Case (ii)-I through Case (ii)-VII).


Case iii: Job operations neither related by R3 nor sequenced according to S3 result in a

deadlock free configuration.

Since it is tedious to prove this case through enumeration, we shall prove the above

statement for the case of m machines where 2 ≤ m ≤ |M|, and thereby, use that proof to

justify the above case for three machines.

Proposition 4: For the case of three machines, where job operations are related by R3

and are sequenced according to S3, the deadlock can be broken by switching the

sequence on any one of the three machines.

Proof: Figure 3.11 depicts the three-machine deadlock configuration in the form of a directed cyclic network. In this configuration, job operations are related by R3 and sequenced according to S3. If the sequence on any of the three machines were to be switched, this would have the effect of reversing the direction of one of the horizontal arrows, thus breaking the cycle.

From Proposition 4, we see that, in the case of three machines that is free of all two-machine deadlocks, if operations (a,b) and (c,d) are assigned to machine 1 such that (a,b) precedes (c,d), if operations (e,f) and (a,b-α) are assigned to machine 2 such that (e,f) precedes (a,b-α), and if operations (c,d+γ) and (e,f-β) are assigned to machine 3, then (e,f-β) must necessarily precede (c,d+γ) on machine 3 to avoid a deadlock. Here, machines may be numbered in any order and α, β, γ > 0. Mathematically, this is expressed as:

$$h^{k}_{(e,f-\beta,\,c,d+\gamma)} \;\ge\; h^{m}_{(a,b,\,c,d)} + g^{m}_{(a,b,\,c,d)} + h^{n}_{(e,f,\,a,b-\alpha)} + g^{n}_{(e,f,\,a,b-\alpha)} + g^{k}_{(e,f-\beta,\,c,d+\gamma)} - 4,$$
$$\forall\; m,n,k \in M,\ (a,b),(c,d) \in z_m,\ (e,f),(a,b-\alpha) \in z_n,\ (c,d+\gamma),(e,f-\beta) \in z_k,\ \alpha,\beta,\gamma > 0. \qquad (3.62)$$


The ‘m’-machine case:

Relationship Rm: In the case of m machines, a relationship Rm is said to exist between

the job operations (a,b) and (c,d) assigned to machine M1, (e,f) and (g,h) assigned to

machine M2, (i,j) and (k,l) assigned to machine M3 and,…, (p,q-δ) and (r,s) are assigned to

machine M(m-1), where δ >0, and (u,v) and (x,y) assigned to machine Mm if and only if

1) (a,b) and (e,f) are related such that a = e and f = b - α, where α > 0

2) (g,h) and (i,j) are related such that g = i and j = h - β, where β > 0

and so on…

….. until

m-1) (r,s) and (u,v) are related such that r = u and v = s - φ, where φ > 0

m) (c,d) and (x,y) are related such that c = x and y = d+ γ, where γ > 0

and

1*) (a,b) and (c,d) are unrelated.

2*) (e,f) and (g,h) are unrelated.

3*) (i,j) and (k,l) are unrelated and so on…..

…..until

m-1*) (p,q-δ) and (r,s) are unrelated

m*) (u,v) and (x,y) are unrelated.

Sequence Sm: In the case of m machines, a sequence Sm is said to exist between job operations related by Rm if (a,b) precedes (c,d) on machine M1, (g,h) precedes (e,f) on machine M2, (k,l) precedes (i,j) on machine M3, and so on, until (r,s) precedes (p,q-δ) on machine M(m-1) and (x,y) precedes (u,v) on machine Mm.


Proposition 5: In an m machine configuration, that is free from all 1,2,…,(m-1) machine

deadlocks, a deadlock results if and only if job operations are related by Rm and

sequenced according to Sm.

Proof: We begin by first showing in the case of m machines, that is free of all 1,2,…,

(m-1) machine deadlocks, that, if job operations are related by Rm and sequenced

according to Sm, then it results in a deadlock. To see this, note that the above conditions

lead to the following configuration.

Figure 3.16: A deadlock configuration involving ‘m’ machines where operations are

related by Rm and sequenced according to Sm.

Note that

1. On M1, (c,d) must wait for (a,b) to finish processing owing to sequencing

restrictions.

2. (a,b) must wait for (a,b-α) on M2 to finish processing owing to operation

precedence constraints that exist between operations of job a.


3. On M2, (a,b-α) must wait for (g,h) to finish processing owing to sequencing

restrictions.

4. (g,h) must wait for (g,h-β) on M3 to finish processing owing to operation

precedence constraints that exist between operations of job g.

5. On M3, (g,h-β) must wait for (k,l) to finish processing owing to sequencing

restrictions and so on until…

6. On M(m-1), (p,q-δ) must wait for (r,s) to finish processing owing to sequencing restrictions.

7. (r,s) must wait for (r,s-φ) on Mm to finish processing owing to operation precedence constraints that exist between operations of job r.

8. On Mm, (r,s-φ) must wait for (c,d+γ) to finish processing owing to sequencing restrictions.

9. Finally, (c,d+γ) must wait for (c,d) on M1 to finish processing owing to operation precedence constraints that exist between operations of job c.

The above configuration along with the resulting precedences that it generates can be represented as a network as shown in Figure 3.17. Clearly, this network is a directed cycle. Therefore, none of the operations can begin processing, thereby resulting in a deadlock.

Figure 3.17: Network representation of an m-machine deadlock

Proceeding further, we must now show that if there exists a deadlock in the case of ‘m’

machines, then it must be caused by

i. Job operations related by Rm.

ii. Job operations sequenced according to Sm.

The above statement is of the type P ⇒ Q. We shall prove it by proving its

contrapositive i.e. ∼Q ⇒ ∼P. We therefore need to show that, in the case of m machines,

if job operations are not related by Rm or if they are not sequenced according



to Sm, the resulting configuration is deadlock free. Towards that end, we shall prove the

following three cases:

i. Job operations not related by Rm but sequenced according to Sm result in a

deadlock free configuration.

ii. Job operations related by Rm but not sequenced according to Sm result in a

deadlock free configuration.

iii. Job operations neither related by Rm nor sequenced according to Sm result in a

deadlock free configuration.

We shall use the following terminology in our proofs for the different cases. Observe

that in Figure 3.16 which represents the deadlock configuration for the case of m


machines, the operations on the right side of each machine are actually preceding operations of those on the left-hand side of the previous machine. For instance, operation

(a,b-α) on M2 is the preceding operation of (a,b) on M1, (g,h-β) on M3 is the preceding

operation of (g,h) on M2 and so on. In the same vein, (c,d) on M1 can be thought of as

the preceding operation of (c,d+γ) on Mm. We shall refer to operations (c,d), (a,b-α), (g,h-

β),…,(r,s-φ) as n (as in negative) operations and the operations that succeed them (c,d+γ),

(a,b), (g,h),…,(r,s) as p (as in positive) operations. Thus, in the case of an m machine

deadlock, every machine contains a p operation followed by an n operation where this n

operation is a preceding operation of the p operation on the previous machine. This is

illustrated below in Figure 3.18:

Figure 3.18: n and p operations

In the proofs that follow, we shall also use the concept of ‘preceding’ and ‘succeeding’

machines. For instance, in Figure 3.18, machine M(m-1) precedes machine Mm and

machine M(m-2) precedes machine M(m-1) and so on until machine M1 precedes

machine M2 and machine M(m) precedes machine M1. Likewise, machine M2 succeeds

machine M1, machine M3 succeeds machine M2 and so on until machine M(m) succeeds

machine M(m-1) and machine M1 succeeds machine M(m).


Case i: Job operations not related by Rm but sequenced according to Sm.

Proof: The configurations under this category comprise those that arise by negating each clause under the definition of relationship Rm and their combinations thereof. We first examine a subset of these configurations (relationships) denoted by Ω ∩ Π, where Ω = configurations obtained by negating clauses 1,…,m under (the definition of relationship) Rm and Π = configurations where clauses 1*,…,m* under Rm are preserved.

Subset Ω ∩ Π:

These configurations must involve at least one or more machines, where the n operation

on the right hand side of the machines is now replaced by an operation that is positive

with respect to the p operation on the previous machine or the n operation is no longer

related to the p operation on the previous machine. To clarify, if (a,b-α) on machine M2,

is replaced by (a,b+α), it would succeed (a,b) on machine M1 as opposed to preceding it.

We shall refer to such operations as np, that is, an operation that was originally negative

in the deadlock configuration with respect to its p operation on the previous machine has

now turned positive with respect to the same p operation. In light of the directed cyclic

network represented by Figure 3.17, this has the effect of reversing the direction of one or more diagonal arrows, thus breaking the cycle. In the case where the n operation is no

longer related to the p operation on the previous machine, the diagonal arrows become

bi-directional thus breaking the cycle.
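The n, p and np labels can be computed directly from the machine assignments. The sketch below is illustrative only: operations are encoded as (job, step) pairs, the machines are listed so that each machine's preceding machine is the previous entry (cyclically), and the function name is an assumption made here for exposition.

```python
# Sketch: label the right-hand operation on each machine as 'n', 'np', or 'unrelated'
# relative to the p (left-hand) operation on the preceding machine, following the
# terminology introduced above. machines[i] = (left_op, right_op); the preceding machine
# of entry i is entry i-1, and the preceding machine of the first entry is the last one.

def label_right_operations(machines):
    labels = []
    m = len(machines)
    for i, (_, right) in enumerate(machines):
        prev_left, _ = machines[(i - 1) % m]      # p operation on the preceding machine
        if right[0] != prev_left[0]:
            labels.append("unrelated")
        elif right[1] < prev_left[1]:
            labels.append("n")                    # earlier step of the same job
        else:
            labels.append("np")                   # later step of the same job
    return labels

# A three-machine instance of the deadlock pattern of Figure 3.16/3.18, with the offsets
# alpha = beta = gamma = 1 for illustration: every right-hand operation is an n operation.
config = [(("a", 5), ("c", 8)),     # M1: (a,b), (c,d)
          (("g", 4), ("a", 4)),     # M2: (g,h), (a,b-alpha)
          (("c", 9), ("g", 3))]     # Mm: (c,d+gamma), (g,h-beta)
print(label_right_operations(config))   # ['n', 'n', 'n']
```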

Let Π1 = Configurations where one or more of clauses 1*,…,m* under Rm are

violated

Subset Ω ∩ Π1:

The configurations that exist under this subset fall into one of the following categories.

They are either: 1) Feasible, or 2) Infeasible. Infeasible configurations are born as a result

of contradictory conditions arising out of the intersection of Ω and Π1. We exclude the

infeasible configurations under this subset from further consideration. Amongst the

feasible configurations that are possible in this subset, it is easy to see that, when one or

more of clauses 1*,…,m* under Rm are violated, we must have one or more of the


relations c = a, d > b or e = g, f > h or i = k, j > l, and so on until p = r, q - δ > s or u = x, v > y, hold, since job-operations in case (i) are sequenced according to Sm. These configurations will, therefore, be acyclic, since they do not cause any reversal in the direction of horizontal arrows for configurations belonging to Ω that are sequenced according to Sm.

Let Σ = Configurations where clauses 1,…,m under Rm are preserved

Subset Σ ∩ Π1:

Configurations belonging to Σ but where one or more (up to m*-2) of the clauses 1*,…,m* are negated, cannot exist because these configurations produce a deadlock

involving less than m machines which we have already assumed our configurations to be

free of. Figure 3.19 depicts a configuration belonging to Σ but where clauses 1* and 2*

under Rm are negated such that c = a,d > b and e = g, f > h

Note that, if we temporarily ignore machine M1, then, there exists an m-1 machine

deadlock amongst the other machines. The same is true, if we temporarily ignore

machine M2 and view the other machines.

Moreover, configurations belonging to Σ but where (m*-1) of the clauses 1*,…,m* are

negated, cannot exist because these configurations automatically imply that all m* clauses

must be negated. Finally, since job-operations are sequenced according to Sm,

configurations belonging to Σ but where all of clauses 1*,…,m* are negated (such that

c = a,d > b and e = g, f > h and i = k, j > l and so on until

p = r, q - δ > s and u = x, v > y), are not possible because these configurations impose the contradictory conditions of (x,y) succeeding as well as preceding (c,d) simultaneously.


Figure 3.19: Network representation of a configuration belonging to Σ but where clauses

1* and 2* under Rm are negated

Case ii: Job operations related by Rm but not sequenced according to Sm.

The ‘m’ machine configurations that exist under this category must involve at least one

or more machines where the n operation on the right hand side of the machines now

moves to the left hand side and the p operation on the left hand side moves to the right

hand side.

In terms of the directed cyclic network represented by Figure 3.17, this implies that the

direction of one or more of the horizontal arrows is reversed, thus breaking the cycle.


Note that, all the configurations that are possible in the case of m machines, by negating

one or more of clauses 1,…,m under Rm, and one or more of clauses 1,…,m under Sm,

are those where each machine falls into one of the following categories.

C1: A machine with a p operation on the left hand side and a np operation on the right

hand side.

C2: A machine with a np operation on the left hand side and a p operation on the right

hand side.

C3: A machine with a p operation on the left hand side and a n operation on the right

hand side.

C4: A machine with a n operation on the left hand side and a p operation on the right

hand side.

We have already seen that configurations that contain machines solely from category C1

or a combination of C1 and C3 are cycle (deadlock) free. This was shown under Case (i).

Moreover, we have also seen that configurations that contain machines solely from

category C4 or a combination of C3 and C4 are also deadlock free. This was shown

under Case (ii). Configurations that contain machines solely from either category C2 or

C3 result in a deadlock. This was seen in Figure 3.17 for the case where machines solely

belong to category C3. Now consider Figure 3.20 for the case where all machines solely

belong to category C2.


Figure 3.20: Deadlock configuration – machines belonging to category C2.

Note that, the above configuration is equivalent to the one illustrated in Figure 3.21

because it merely involves a renumbering of the machines:

Figure 3.21: An equivalent configuration after re-numbering the machines


The operations in the above configuration now appear to be related in a form similar to

that seen in Figure 3.16. Note that, in the above configuration, machines can be thought

of as belonging entirely to category C3 because operations on the right hand side of each

machine are n operations as they constitute the preceding operations of the p operations

on the left hand side of the previous machine. The network representation of Figure 3.21

is shown below in Figure 3.22. Observe that it is similar to Figure 3.17.

Figure 3.22: Network representation of Figure 3.21

We now turn our attention to the remaining configurations and note that these must be a subset of all the configurations possible under Case (iii), which comprises configurations whose jobs are neither related by Rm nor sequenced by Sm. In the sequel, we show that these configurations are also deadlock free.


Case iii: Job operations neither related by Rm nor sequenced according to Sm.

Let Γ = configurations obtained by negating one or more of clauses 1,…,m under Rm such that a = e, g = i, …, r = u and x = c; ς = configurations where one or more clauses under Sm are violated; and Π = configurations where clauses 1*,…,m* under Rm are preserved.

Subset Γ ∩ ς ∩ Π:

1. Combinations of C2 and C3.

2. Combinations of C2 and C4.

3. Combinations of C1 and C4.

4. Combinations of C1 and C2.

5. Combinations of C1, C2 and C3.

6. Combinations of C1, C3 and C4.

7. Combinations of C2, C3 and C4

8. Combinations of C1, C2 and C4.

9. Combinations of C1, C2, C3 and C4.

Case 1: Combinations of C2 and C3.

Since there must exist at least one machine belonging to category C2, this has the effect of reversing the directions of at least one horizontal arrow (because the np operation precedes the p operation on a C2 machine) and one diagonal arrow (because the np operation on a C2 machine must be preceded by the p operation on a previous machine) in Figure 3.17, thus breaking the cycle.

Case 2: Combinations of C2 and C4

Follows the same reasoning as that for Case 1.

Case 3: Combinations of C1 and C4

Since there must exist at least one machine belonging to category C1, this has the effect of reversing the direction of at least one diagonal arrow in Figure 3.17 (because the np operation on a C1 machine must be preceded by the p operation on a previous machine). Also, since there must exist at least one machine belonging to category C4, this has the


effect of reversing at least one horizontal arrow in Figure 3.17 (because the n operation precedes the p operation on a C4 machine), thus breaking the cycle.

Case 4: Combinations of C1 and C2

Since there must exist at least one machine belonging to category C1, this has the effect of reversing the direction of at least one diagonal arrow in Figure 3.17 (because the np operation on a C1 machine must be preceded by the p operation on a previous machine). Also, since there must exist at least one machine belonging to category C2, this has the effect of reversing at least one horizontal arrow (because the np operation precedes the p operation on a C2 machine) and one diagonal arrow in Figure 3.17 (because the np operation on a C2 machine must be preceded by the p operation on a previous machine), thus breaking the cycle.

Case 5: Combinations of C1, C2 and C3.

Follows the same reasoning as Case 4.

Case 6: Combinations of C1, C3 and C4.

Follows the same reasoning as Case 3.

Case 7: Combinations of C2, C3 and C4.

Proof: Follows the same reasoning as Case 1. Moreover, since there must exist at least one machine belonging to category C4, this has the effect of reversing at least one horizontal arrow in Figure 3.17 (because the n operation precedes the p operation on a C4 machine), thus breaking the cycle.

Case 8: Combinations of C1, C2 and C4.

Proof: Follows the same reasoning as Case 4.

Case 9: Combinations of C1, C2, C3 and C4.

Proof: Follows the same reasoning as Case 4.
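The case analysis above can be summarized operationally: for a configuration that is already free of all smaller deadlocks, once each machine has been placed in one of the categories C1-C4, a deadlock can arise only when every machine belongs to C2 or every machine belongs to C3. The sketch below illustrates that test; the category labels are taken as given and the function name is an assumption of this sketch.

```python
# Sketch: given the category (C1-C4) of every machine in a configuration that is free of
# all smaller deadlocks, decide whether the configuration can be deadlocked. Per the case
# analysis above, only 'all C2' or 'all C3' closes the directed cycle of Figure 3.17; any
# machine from C1 or C4, or any mix of C2 with C3, breaks it.

def is_deadlocked(categories):
    cats = set(categories)
    return cats == {"C3"} or cats == {"C2"}

print(is_deadlocked(["C3", "C3", "C3"]))        # True  (the configuration of Figure 3.17)
print(is_deadlocked(["C2", "C2", "C2", "C2"]))  # True  (Figure 3.20, after renumbering)
print(is_deadlocked(["C2", "C3", "C3"]))        # False (Case 1 above)
print(is_deadlocked(["C1", "C3", "C4"]))        # False (Case 6 above)
```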


Let Ψ = configurations obtained by negating one or more of clauses 1,…,m under Rm such that pairs of operations are unrelated; that is, one or more of a ≠ e, or g ≠ i, and so on until r ≠ u or x ≠ c, is satisfied.

Subset Ψ∩ ς ∩ Π :

Configurations belonging to this subset have bi-directional diagonal arrows because one

or more pair(s) of operations are unrelated thus yielding an acyclic network.

Recall that, Σ = Configurations where clauses 1,…,m under Rm are preserved and

Π1 = Configurations where one or more of clauses 1*,…,m* under Rm is violated

Subset Γ ∩ ς ∩ Π1:

Configurations belonging to Γ ∩ ς, but where one or more of clauses 1*,…,m* under Rm is negated (such that the right and left side operations on a machine are now related), will continue to remain acyclic. This is because the directions of the horizontal and diagonal arrows in these configurations (i.e., Γ ∩ ς) stay preserved even if one or more of clauses 1*,…,m* under Rm is negated.

Subset Σ ∩ Π1 ∩ ς:

Configurations that belong to Σ ∩ ς have been shown to be acyclic under Case (ii). In

conjunction with the scenario where one or more of clauses 1*,…m* under Rm may be

negated, these configurations will continue to stay acyclic because the directions of the

horizontal and diagonal arrows in these configurations (i.e., Σ ∩ ς) stay preserved. It is

easy to see that when any m*-1 clauses under Rm are negated (such that the right and left

side operations on a machine are now related), the last clause is automatically negated

too.

Proposition 6: For the case of m machines, where job operations are related by Rm and

are sequenced according to Sm, the deadlock can be broken by switching the sequence

on any one of the m machines.


Proof: By switching the sequence on any one of the m machines, where job operations

are related by Rm and sequenced according to Sm, the direction of the horizontal arrow

on this machine gets reversed thus breaking the cycle.

Thus, for the case of m machines, that is free of all 2, 3,…,(m-1)-machine deadlocks, if operations

(a,b) and (c,d) are assigned to machine 1 such that (a,b) precedes (c,d), and if operations (g,h)

and (a,b-α) are assigned to machine 2 such that (g,h) precedes (a,b-α) and operations (k,l)

and (g,h-β ) are assigned to machine 3 such that (k,l) precedes (g,h-β ) on machine 3 and so

on until (r,s) and (p,q-δ) are assigned to machine m-1 such that (r,s) precedes (p,q-δ) and

(c,d+γ) and (r,s-φ) are assigned to machine m, then (r,s-φ) must necessarily precede (c,d+γ)

on machine m to avoid a deadlock. Here, machines may be numbered in any order and α,

β,δ,γ,φ > 0. Mathematically, this can be expressed as:

$$h^{Mm}_{(r,s-\phi,\,c,d+\gamma)} \;\ge\; h^{M1}_{(a,b,\,c,d)} + g^{M1}_{(a,b,\,c,d)} + h^{M2}_{(g,h,\,a,b-\alpha)} + g^{M2}_{(g,h,\,a,b-\alpha)} + h^{M3}_{(k,l,\,g,h-\beta)} + g^{M3}_{(k,l,\,g,h-\beta)} + \cdots$$
$$+\; h^{M(m-1)}_{(r,s,\,p,q-\delta)} + g^{M(m-1)}_{(r,s,\,p,q-\delta)} + g^{Mm}_{(r,s-\phi,\,c,d+\gamma)} - (2m-2),$$
$$\forall\; M1, M2, M3, \ldots, M(m-1), Mm \in M,\ (a,b),(c,d) \in z_{M1},\ (g,h),(a,b-\alpha) \in z_{M2},\ (k,l),(g,h-\beta) \in z_{M3},\ \ldots,$$
$$(r,s),(p,q-\delta) \in z_{M(m-1)},\ (c,d+\gamma),(r,s-\phi) \in z_{Mm},\ \alpha,\beta,\delta,\gamma,\phi > 0. \qquad (3.63)$$

Note that, in an environment comprising m machines, in addition to the reentrant flow constraints, deadlock prevention constraints must be included in the Stage-I problem for all 2, 3,…, m-1, m machines in order to completely eliminate the need for any feasibility cuts. A k-machine deadlock constraint, where 2 ≤ k ≤ m, only eliminates deadlocks pertaining to k machines, assuming that other deadlocks pertaining to 1, 2,…, k-1 machines have been eradicated by their corresponding deadlock prevention constraints. The deadlock prevention constraints quickly become cumbersome to specify with an increment in the number of machines. This is because a k-machine deadlock constraint demands that a relationship and sequencing pattern be specified amongst the 2k


operations involved in the deadlock. Ideally, one might wish to include these constraints for all 2, 3,…, m machines; practically, however, it may be possible to include constraints for only up to a few machines.
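The growth referred to here can be made concrete with a simple count. The sketch below is purely illustrative: it enumerates only the ordered k-machine selections for a hypothetical shop with 10 machines and ignores the (much larger) number of compatible operation chains per machine selection; nothing about the actual NMJS data is assumed.

```python
# Rough counting sketch: ordered selections of k machines out of 10, as an indication of
# how quickly the number of candidate k-machine deadlock-prevention constraints of the
# form (3.61)-(3.63) grows with k. Illustrative only; not part of the formulation.

from math import perm

machines = 10
for k in range(2, machines + 1):
    print(k, perm(machines, k))
```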

3.3.6 Optimality cut:

After the Stage-I (master) problem generates a solution that is feasible to the Stage-II

(recourse) problem, this problem (Expressions 3.25 – 3.31) is solved for each scenario.

For a given Stage-I problem solution, the Stage-II problems determine optimal values for the completion time and budget surplus variables for each scenario. After the expected Stage-II objective function $Q(\mathbf{x},\mathbf{y},\mathbf{v}) = \sum_{s=1}^{S} Q(\mathbf{x},\mathbf{y},\mathbf{v},\xi_s)$ is evaluated, it is compared with the value of its lower bound (θ) previously obtained by solving the master problem. Note that $Q(\mathbf{x},\mathbf{y},\mathbf{v})$ provides an upper bound for the problem. If the value of

the lower bound is found to be less than that of the upper bound, optimality cuts

(Expression (2.55)) can be used in order to close the gap between the two. We present

the optimality cut after introducing the following additional variables:

nCUT = total number of optimality cuts.

n = index for optimality cuts, n = 1,…, nCUT.

θ = lower bound on the expected second-stage objective function value.

$op^{(s,n)}_{(i,j)}$ (≤ 0) = dual variables associated with the operation precedence constraints (3.26), where s = 1,…, S.

$sp^{(m,s,n)}_{(i,j,k,l)}$ (≤ 0) = dual variables associated with the sequence-dependent setup constraints (3.27).

$bp^{(s,n)}_{(i)}$ (≤ 0) = dual variables associated with the budget constraints (3.28).

$rp^{(s,n)}_{(i,j)}$ (unrestricted) = dual variables associated with the ready time constraints (3.29).

Unlike the feasibility cut, which is generated for a scenario that results in a positive objective function value for the Stage-II problem augmented by artificial variables (Expressions 3.32-3.40), the optimality cut is generated by aggregating all scenarios.
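Because the cut aggregates all scenarios, its coefficients can be assembled by summing the scenario-wise dual contributions. The sketch below shows only this bookkeeping step; the mapping from the dual multipliers op, sp, bp and rp to the constant and coefficients of each scenario follows Expression (3.64) and is assumed to be available rather than reproduced here.

```python
# Minimal sketch: aggregate per-scenario dual information into a single optimality cut
#     theta >= const + sum_j coef[j] * stage1_var[j]
# 'scenario_cuts' holds one (const_s, coef_s) pair per scenario, assumed to have been
# computed from the scenario's optimal dual multipliers.

def aggregate_optimality_cut(scenario_cuts):
    const = sum(c for c, _ in scenario_cuts)
    coef = {}
    for _, coef_s in scenario_cuts:
        for var, val in coef_s.items():
            coef[var] = coef.get(var, 0.0) + val
    return const, coef

# Two-scenario toy illustration with made-up numbers.
cuts = [(4.0, {"x[1]": -2.0, "y[1,2]": -5.0}),
        (3.5, {"x[1]": -1.0})]
print(aggregate_optimality_cut(cuts))   # (7.5, {'x[1]': -3.0, 'y[1,2]': -5.0})
```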


The optimality cut is given by:

$$\theta \;+\; \sum_{s=1}^{S}\sum_{i=1}^{N}\sum_{j=1}^{J_i-1} op^{(s,n)}_{(i,j)}\Big(\sum_{m\in M_{(i,j+1)}} p^{m,s}_{(i,j+1)}\,x^{m}_{(i,j+1)} + \sum_{(e,f):\,e\in M_{(i,j)},\,f\in M_{(i,j+1)}} d_{(e,f)}\,v^{(e,f)}_{(i,j,j+1)}\Big)$$
$$+\; \sum_{s=1}^{S}\sum_{m=1}^{M}\;\sum_{(i,j)\in z_m}\;\sum_{(k,l)\in z_m:(k,l)\neq(i,j)} sp^{(m,s,n)}_{(i,j,k,l)}\Big(H\,y^{m}_{(i,j,k,l)} + p^{m,s}_{(k,l)}\,x^{m}_{(k,l)}\Big)$$
$$+\; \sum_{s=1}^{S}\sum_{i=1}^{N} bp^{(s,n)}_{(i)}\Big(\sum_{j=1}^{J_i}\sum_{m\in M_{(i,j)}} pc^{m,s}_{(i,j)}\,x^{m}_{(i,j)} + \sum_{j=1}^{J_i-1}\ \sum_{(e,f):\,e\in M_{(i,j)},\,f\in M_{(i,j+1)}} dc_{(e,f)}\,v^{(e,f)}_{(i,j,j+1)}\Big)$$
$$\;\ge\; \sum_{s=1}^{S}\sum_{m=1}^{M}\;\sum_{(i,j)\in z_m}\;\sum_{(k,l)\in z_m:(k,l)\neq(i,j)} sp^{(m,s,n)}_{(i,j,k,l)}\big(H - u^{m}_{(i,j,k,l)}\big) \;+\; \sum_{s=1}^{S}\sum_{i=1}^{N} bp^{(s,n)}_{(i)}\,b_i \;+\; \sum_{s=1}^{S}\sum_{i=1}^{N} rp^{(s,n)}_{(i,0)}\,r_i \qquad (3.64)$$

The optimality cut is appended to the MMP and the new MMP is solved. Should any

infeasibilities be detected in the master problem solution, feasibility cuts are generated

and added on to the MMP, which is re-solved. This is done until all infeasibilities are

removed from the master solution. Thereafter, subproblems for each scenario are solved

and using dual multiplier values, optimality cuts are generated if a gap between the lower

and upper bound is detected. These cuts are added to the MMP and the cycle continues

until the lower and upper bounds converge at the optimum. We define the MMP with the

necessary deadlock prevention constraints, feasibility and optimality cuts as follows:

MMP1 = min (3.16) s.t. (3.17)-(3.24), (3.47)-(3.51), (3.56), (3.60), (3.63),

r cuts of type (3.46) and nCUT cuts of type (3.64).

The entire procedure is depicted in a flow chart in Figure 3.23.


Figure 3.23: The L-shaped method as applied to the NMJS problem.

1. Initialize v = fCUT = nCUT = 0.

2. Solve MMP1 (initially, MP is solved in the absence of feasibility and optimality cuts) and obtain the solution (x^v, y^v, v^v, θ^v).

3. For s = 1,…, S, solve the ARP. If Expression (3.32) > 0 for some s, generate the feasibility cut, Expression (3.46), add it to the master problem, set fCUT = fCUT + 1, set v = v + 1 and return to Step 2.

4. If Expression (3.32) = 0 for all s, then for s = 1,…, S solve the RP and determine the optimal dual multipliers; compute w^v = expected second-stage objective function value.

5. If θ^v ≥ w^v, stop: (x^v, y^v, v^v) is optimal. Otherwise, add the optimality cut, Expression (3.64), to the master problem, set nCUT = nCUT + 1, set v = v + 1 and return to Step 2.
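In code, the loop of Figure 3.23 has the following shape. This is only a control-flow sketch: the solver routines are placeholders assumed for exposition (they stand for solving MMP1, the ARP and the RP as described above) and are not implementations of those models; probability weights, if any, are taken to be folded into the scenario costs.

```python
# Control-flow sketch of the L-shaped method of Figure 3.23. Placeholder callables
# (assumed, not defined here): solve_master(cuts) returns the Stage-I solution and the
# lower bound theta; solve_arp(s, sol) returns (artificial objective, duals) for scenario
# s; solve_rp(s, sol) returns (scenario cost, duals).

def l_shaped(scenarios, make_feasibility_cut, make_optimality_cut,
             solve_master, solve_arp, solve_rp, tol=1e-6):
    cuts = []
    while True:
        sol, theta = solve_master(cuts)                  # MMP1 (MP on the first pass)

        # Feasibility phase: add cuts until every scenario's ARP objective is zero.
        infeasible = [s for s in scenarios if solve_arp(s, sol)[0] > tol]
        if infeasible:
            s = infeasible[0]
            cuts.append(make_feasibility_cut(s, solve_arp(s, sol)[1]))   # Expression (3.46)
            continue

        # Optimality phase: solve the RP for every scenario and compare bounds.
        results = [solve_rp(s, sol) for s in scenarios]
        w = sum(cost for cost, _ in results)             # expected second-stage objective
        if theta >= w - tol:                             # lower bound meets upper bound
            return sol
        cuts.append(make_optimality_cut(results))        # Expression (3.64)
```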


3.3.7 Alternative optimality cut:

Observe that in the optimality cut given by Expression (3.64), the dual prices bp and sp on the right hand side of the equation are less than or equal to zero, whereas rp is unrestricted. Also, H is a large positive number which, in conjunction with sp, contributes heavily to making the right hand side negative. Likewise, all the terms on the left hand side of Expression (3.64) are less than or equal to zero except for theta (θ). The

lower bound theta is assigned a value of negative infinity at the beginning of the

algorithm and its value is determined by the master problem in conjunction with the

optimality cut that is appended to it at every iteration. Since the objective function in the

master problem attempts to minimize θ , the value of θ increases by an amount just

enough to satisfy the optimality cut. Given the presence of large negative numbers on

either side of the equation, prominently due to the big H terms, there is little increase in

the value of θ with each iteration and this results in a large number of optimality cuts

before the gap between the lower and upper bounds is reduced to zero. If the right hand

side of the optimality cut is made ‘less negative’, the cut may become stronger and this

could accelerate the ascent of θ (lower bound) towards the upper bound. One of the

ways to achieve this is by eliminating some or all of the terms containing big H. In order

to accomplish this let us reconsider the Stage-II problem (RP). In particular, we would

like to see if Expression 3.27 of the RP that contains the big H terms can be modified.

Consider Expression (3.27):

$$t^{s}_{(i,j)} + p^{m,s}_{(k,l)}\,x^{m}_{(k,l)} + u^{m}_{(i,j,k,l)} \;\le\; t^{s}_{(k,l)} + H\big(1 - y^{m}_{(i,j,k,l)}\big), \quad \forall\, m = 1,\ldots,M,\ (i,j)\in z_m,\ (k,l)\in z_m: (k,l)\neq(i,j),\ s = 1,\ldots,S. \qquad (3.27)$$

Upon multiplying both sides of Expression (3.27) by $y^{m}_{(i,j,k,l)}$ and observing that $y^{m}_{(i,j,k,l)}\,y^{m}_{(i,j,k,l)} = y^{m}_{(i,j,k,l)}$, we get

$$\big(t^{s}_{(i,j)} + p^{m,s}_{(k,l)}\,x^{m}_{(k,l)} + u^{m}_{(i,j,k,l)} - t^{s}_{(k,l)}\big)\,y^{m}_{(i,j,k,l)} \;\le\; 0, \quad \forall\, m = 1,\ldots,M,\ (i,j)\in z_m,\ (k,l)\in z_m: (k,l)\neq(i,j),\ s = 1,\ldots,S. \qquad (3.65)$$


Upon expanding Expression (3.65), we have

$$t^{s}_{(i,j)}\,y^{m}_{(i,j,k,l)} + p^{m,s}_{(k,l)}\,x^{m}_{(k,l)}\,y^{m}_{(i,j,k,l)} + u^{m}_{(i,j,k,l)}\,y^{m}_{(i,j,k,l)} - t^{s}_{(k,l)}\,y^{m}_{(i,j,k,l)} \;\le\; 0, \quad \forall\, m = 1,\ldots,M,\ (i,j)\in z_m,\ (k,l)\in z_m: (k,l)\neq(i,j),\ s = 1,\ldots,S. \qquad (3.66)$$

We can further simplify Expression (3.66) as stated in the proposition below.

Proposition 7: The first, second and fourth terms of Expression (3.66) can be simplified

to the following form:

a) $t^{s}_{(i,j)}\,y^{m}_{(i,j,k,l)} = ty^{m,s}_{(i,j,k,l)}$, where $0 \le ty^{m,s}_{(i,j,k,l)} \le H\,y^{m}_{(i,j,k,l)}$, $\forall\, m = 1,\ldots,M,\ (i,j)\in z_m,\ (k,l)\in z_m: (k,l)\neq(i,j),\ s = 1,\ldots,S$.

b) $p^{m,s}_{(k,l)}\,x^{m}_{(k,l)}\,y^{m}_{(i,j,k,l)} = p^{m,s}_{(k,l)}\,y^{m}_{(i,j,k,l)}$, $\forall\, m = 1,\ldots,M,\ (i,j)\in z_m,\ (k,l)\in z_m: (k,l)\neq(i,j),\ s = 1,\ldots,S$.

c) $t^{s}_{(k,l)}\,y^{m}_{(i,j,k,l)} = tyy^{m,s}_{(k,l,i,j)}$, where $0 \le tyy^{m,s}_{(k,l,i,j)} \le H\,y^{m}_{(i,j,k,l)}$, $\forall\, m = 1,\ldots,M,\ (k,l)\in z_m,\ (i,j)\in z_m: (k,l)\neq(i,j),\ s = 1,\ldots,S$.

Proof: a) Consider the first term $t^{s}_{(i,j)}\,y^{m}_{(i,j,k,l)}$. Let

$$t^{s}_{(i,j)}\,y^{m}_{(i,j,k,l)} = ty^{m,s}_{(i,j,k,l)}, \quad \forall\, m = 1,\ldots,M,\ (i,j)\in z_m,\ (k,l)\in z_m: (k,l)\neq(i,j),\ s = 1,\ldots,S. \qquad (3.67)$$

Summing both sides of Expression (3.67),

$$\sum_{m\in M_{(i,j)}}\ \sum_{(k,l)\in z_m:(k,l)\neq(i,j)} t^{s}_{(i,j)}\,y^{m}_{(i,j,k,l)} \;=\; \sum_{m\in M_{(i,j)}}\ \sum_{(k,l)\in z_m:(k,l)\neq(i,j)} ty^{m,s}_{(i,j,k,l)}, \quad \forall\, i = 1,\ldots,N,\ j = 1,\ldots,J_i,\ s = 1,\ldots,S. \qquad (3.68)$$

We can extricate $t^{s}_{(i,j)}$ from the left side of Expression (3.68), so that we now have

$$t^{s}_{(i,j)}\ \sum_{m\in M_{(i,j)}}\ \sum_{(k,l)\in z_m:(k,l)\neq(i,j)} y^{m}_{(i,j,k,l)} \;=\; \sum_{m\in M_{(i,j)}}\ \sum_{(k,l)\in z_m:(k,l)\neq(i,j)} ty^{m,s}_{(i,j,k,l)}, \quad \forall\, i = 1,\ldots,N,\ j = 1,\ldots,J_i,\ s = 1,\ldots,S. \qquad (3.69)$$

Now, note that, in Expression (3.69) we must have

$$\sum_{m\in M_{(i,j)}}\ \sum_{(k,l)\in z_m:(k,l)\neq(i,j)} y^{m}_{(i,j,k,l)} \;\le\; 1, \quad \forall\, i = 1,\ldots,N,\ j = 1,\ldots,J_i. \qquad (3.70)$$

This is because every job-operation (i, j) can be assigned to only one of the machines

$\big(m \in M_{(i,j)}\big)$ capable of processing it and, on this machine, it may either directly precede

another job-operation (k, l) or it may be the last job on that machine in which case it does

not directly precede any other job-operation, or it may be the only job-operation assigned

to that machine in which case too, there are no other job-operations on that machine that

it can directly precede.

Expression (3.70) is now augmented with slack variables $ys_{(i,j)}$:

$$\sum_{m\in M_{(i,j)}}\ \sum_{(k,l)\in z_m:(k,l)\neq(i,j)} y^{m}_{(i,j,k,l)} + ys_{(i,j)} \;=\; 1, \quad \forall\, i = 1,\ldots,N,\ j = 1,\ldots,J_i. \qquad (3.71)$$

Now, using Expression (3.71) in (3.69) we get,

$$t^{s}_{(i,j)}\big(1 - ys_{(i,j)}\big) \;=\; \sum_{m\in M_{(i,j)}}\ \sum_{(k,l)\in z_m:(k,l)\neq(i,j)} ty^{m,s}_{(i,j,k,l)}, \quad \forall\, i = 1,\ldots,N,\ j = 1,\ldots,J_i,\ s = 1,\ldots,S. \qquad (3.72)$$

Let

$$\lambda^{s}_{(i,j)} = t^{s}_{(i,j)}\,ys_{(i,j)}, \quad \forall\, i = 1,\ldots,N,\ j = 1,\ldots,J_i,\ s = 1,\ldots,S. \qquad (3.73)$$

From Expression (3.72), we then have,

$$t^{s}_{(i,j)} \;=\; \sum_{m\in M_{(i,j)}}\ \sum_{(k,l)\in z_m:(k,l)\neq(i,j)} ty^{m,s}_{(i,j,k,l)} + \lambda^{s}_{(i,j)}, \quad \forall\, i = 1,\ldots,N,\ j = 1,\ldots,J_i,\ s = 1,\ldots,S. \qquad (3.74)$$

Using Expressions (3.67) and (3.73), we can set the lower and upper bounds for $ty^{m,s}_{(i,j,k,l)}$ and $\lambda^{s}_{(i,j)}$ as

$$0 \;\le\; ty^{m,s}_{(i,j,k,l)} \;\le\; H\,y^{m}_{(i,j,k,l)}, \quad \forall\, m = 1,\ldots,M,\ (i,j)\in z_m,\ (k,l)\in z_m: (k,l)\neq(i,j),\ s = 1,\ldots,S. \qquad (3.75)$$

$$0 \;\le\; \lambda^{s}_{(i,j)} \;\le\; H\,ys_{(i,j)}, \quad \forall\, i = 1,\ldots,N,\ j = 1,\ldots,J_i,\ s = 1,\ldots,S. \qquad (3.76)$$

b) Now, let us consider the second term $p^{m,s}_{(k,l)}\,x^{m}_{(k,l)}\,y^{m}_{(i,j,k,l)}$.

Observe that replacing the expression $p^{m,s}_{(k,l)}\,x^{m}_{(k,l)}\,y^{m}_{(i,j,k,l)}$ by $p^{m,s}_{(k,l)}\,y^{m}_{(i,j,k,l)}$ does not alter it, because the product $x^{m}_{(k,l)}\,y^{m}_{(i,j,k,l)}$ takes on a value of 1 only when both $x^{m}_{(k,l)}$ and $y^{m}_{(i,j,k,l)}$ are equal to 1; otherwise the expression equals 0. Moreover, if $y^{m}_{(i,j,k,l)}$ equals 1, $x^{m}_{(k,l)}$ must also equal 1, because operation (i, j) cannot directly precede operation (k, l) on machine m unless operation (k, l) is also assigned to the same machine. On the other hand, when $y^{m}_{(i,j,k,l)}$ equals 0, $x^{m}_{(k,l)}$ could be either 0 or 1; but whenever $y^{m}_{(i,j,k,l)}$ equals 0, the expression vanishes.

c) Finally, consider the fourth term $t^{s}_{(k,l)}\,y^{m}_{(i,j,k,l)}$. Let

$$t^{s}_{(k,l)}\,y^{m}_{(i,j,k,l)} = tyy^{m,s}_{(k,l,i,j)}, \quad \forall\, m = 1,\ldots,M,\ (k,l)\in z_m,\ (i,j)\in z_m: (k,l)\neq(i,j),\ s = 1,\ldots,S. \qquad (3.77)$$

Summing both sides of Expression (3.77),

$$\sum_{m\in M_{(k,l)}}\ \sum_{(i,j)\in z_m:(i,j)\neq(k,l)} t^{s}_{(k,l)}\,y^{m}_{(i,j,k,l)} \;=\; \sum_{m\in M_{(k,l)}}\ \sum_{(i,j)\in z_m:(i,j)\neq(k,l)} tyy^{m,s}_{(k,l,i,j)}, \quad \forall\, k = 1,\ldots,N,\ l = 1,\ldots,J_k,\ s = 1,\ldots,S. \qquad (3.78)$$

By extricating $t^{s}_{(k,l)}$ from the left side of Expression (3.78), we now have

$$t^{s}_{(k,l)}\ \sum_{m\in M_{(k,l)}}\ \sum_{(i,j)\in z_m:(i,j)\neq(k,l)} y^{m}_{(i,j,k,l)} \;=\; \sum_{m\in M_{(k,l)}}\ \sum_{(i,j)\in z_m:(i,j)\neq(k,l)} tyy^{m,s}_{(k,l,i,j)}, \quad \forall\, k = 1,\ldots,N,\ l = 1,\ldots,J_k,\ s = 1,\ldots,S. \qquad (3.79)$$

Now, note that in Expression (3.79), we must have


$$\sum_{m\in M_{(k,l)}}\ \sum_{(i,j)\in z_m:(i,j)\neq(k,l)} y^{m}_{(i,j,k,l)} \;\le\; 1, \quad \forall\, k = 1,\ldots,N,\ l = 1,\ldots,J_k. \qquad (3.80)$$

This is because every job-operation (k, l) can be assigned to only one of the machines

$\big(m \in M_{(k,l)}\big)$ capable of processing it and, on this machine, it may either directly succeed

another job-operation (i, j) or it may be the first job on that machine in which case it

does not directly succeed any other job-operation or it may be the only job-operation

assigned to that machine in which case too, there are no other job-operations on that

machine that it can directly succeed.

By augmenting Expression (3.80) with slack variables $yss_{(k,l)}$, we have

$$\sum_{m\in M_{(k,l)}}\ \sum_{(i,j)\in z_m:(i,j)\neq(k,l)} y^{m}_{(i,j,k,l)} + yss_{(k,l)} \;=\; 1, \quad \forall\, k = 1,\ldots,N,\ l = 1,\ldots,J_k. \qquad (3.81)$$

Now, using Expression (3.81) in Expression (3.79) we get,

$$t^{s}_{(k,l)}\big(1 - yss_{(k,l)}\big) \;=\; \sum_{m\in M_{(k,l)}}\ \sum_{(i,j)\in z_m:(i,j)\neq(k,l)} tyy^{m,s}_{(k,l,i,j)}, \quad \forall\, k = 1,\ldots,N,\ l = 1,\ldots,J_k,\ s = 1,\ldots,S. \qquad (3.82)$$

Let

$$\eta^{s}_{(k,l)} = t^{s}_{(k,l)}\,yss_{(k,l)}, \quad \forall\, k = 1,\ldots,N,\ l = 1,\ldots,J_k,\ s = 1,\ldots,S. \qquad (3.83)$$

From Expression (3.82), we then have,

$$t^{s}_{(k,l)} \;=\; \sum_{m\in M_{(k,l)}}\ \sum_{(i,j)\in z_m:(i,j)\neq(k,l)} tyy^{m,s}_{(k,l,i,j)} + \eta^{s}_{(k,l)}, \quad \forall\, k = 1,\ldots,N,\ l = 1,\ldots,J_k,\ s = 1,\ldots,S. \qquad (3.84)$$

Using Expressions (3.77) and (3.83), we can set the lower and upper bounds for $tyy^{m,s}_{(k,l,i,j)}$ and $\eta^{s}_{(k,l)}$ as

$$0 \;\le\; tyy^{m,s}_{(k,l,i,j)} \;\le\; H\,y^{m}_{(i,j,k,l)}, \quad \forall\, m = 1,\ldots,M,\ (k,l)\in z_m,\ (i,j)\in z_m: (k,l)\neq(i,j),\ s = 1,\ldots,S. \qquad (3.85)$$

$$0 \;\le\; \eta^{s}_{(k,l)} \;\le\; H\,yss_{(k,l)}, \quad \forall\, k = 1,\ldots,N,\ l = 1,\ldots,J_k,\ s = 1,\ldots,S. \qquad (3.86)$$
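Proposition 7 is an instance of the standard linearization of a product of a continuous variable and a binary variable using the big-H bounds. The small numeric check below (illustrative values only; H and t are arbitrary) confirms that Expressions (3.71), (3.74), (3.75) and (3.76) together force $ty^{m,s}_{(i,j,k,l)}$ to equal $t^{s}_{(i,j)}\,y^{m}_{(i,j,k,l)}$:

```python
# Small numeric check of the Proposition 7 linearization (illustrative values only).
# For a fixed operation (i,j): exactly one of the binaries y[0..K-1] or the slack ys is 1
# (Expression (3.71)); t = sum(ty) + lam (Expression (3.74)); and 0 <= ty[k] <= H*y[k],
# 0 <= lam <= H*ys (Expressions (3.75)-(3.76)). These bounds force ty[k] = t * y[k].

H = 1000.0
t = 37.5                      # completion time of (i,j) in some scenario (illustrative)

def linearized_ty(y, ys):
    # Every ty[k] with y[k] = 0 must be 0, and lam must be 0 unless ys = 1; the equality
    # t = sum(ty) + lam then pins down the remaining value.
    ty = [t if yk == 1 else 0.0 for yk in y]
    lam = t if ys == 1 else 0.0
    assert abs(sum(ty) + lam - t) < 1e-9 and all(v <= H * yk for v, yk in zip(ty, y))
    return ty

print(linearized_ty([0, 1, 0], 0))   # [0.0, 37.5, 0.0] == [t * yk for yk in y]
print(linearized_ty([0, 0, 0], 1))   # [0.0, 0.0, 0.0]  -- (i,j) is last on its machine
```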


In light of the above analysis, the modified Stage-II (recourse) problem (MSSP) can now

be rewritten as follows:

MSSP: Min $\displaystyle\sum_{i=1}^{N} \alpha_i\,t^{s}_{(i,J_i)} + \sum_{i=1}^{N} \beta_i\,\Delta^{s}_{i} \;=\; \text{Min}\ \sum_{i=1}^{N}\big(\alpha_i\,t^{s}_{(i,J_i)} + \beta_i\,\Delta^{s}_{i}\big)$   (3.87)

Subject to:

$$t^{s}_{(i,j)} + \sum_{m\in M_{(i,j+1)}} p^{m,s}_{(i,j+1)}\,x^{m}_{(i,j+1)} + \sum_{(e,f):\,e\in M_{(i,j)},\,f\in M_{(i,j+1)}} d_{(e,f)}\,v^{(e,f)}_{(i,j,j+1)} \;\le\; t^{s}_{(i,j+1)}, \quad \forall\, i = 1,\ldots,N,\ j = 0,\ldots,J_i-1,\ s = 1,\ldots,S. \qquad (3.88)$$

$$ty^{m,s}_{(i,j,k,l)} + p^{m,s}_{(k,l)}\,y^{m}_{(i,j,k,l)} + u^{m}_{(i,j,k,l)}\,y^{m}_{(i,j,k,l)} \;\le\; tyy^{m,s}_{(k,l,i,j)}, \quad \forall\, m = 1,\ldots,M,\ (i,j)\in z_m,\ (k,l)\in z_m: (k,l)\neq(i,j),\ s = 1,\ldots,S. \qquad (3.89)$$

$$\sum_{j=1}^{J_i}\sum_{m\in M_{(i,j)}} pc^{m,s}_{(i,j)}\,x^{m}_{(i,j)} + \sum_{j=1}^{J_i-1}\ \sum_{(e,f):\,e\in M_{(i,j)},\,f\in M_{(i,j+1)}} dc_{(e,f)}\,v^{(e,f)}_{(i,j,j+1)} - b_i \;\le\; \Delta^{s}_{i}, \quad \forall\, i = 1,\ldots,N,\ s = 1,\ldots,S. \qquad (3.90)$$

$$t^{s}_{(i,j)} \;=\; \sum_{m\in M_{(i,j)}}\ \sum_{(k,l)\in z_m:(k,l)\neq(i,j)} ty^{m,s}_{(i,j,k,l)} + \lambda^{s}_{(i,j)}, \quad \forall\, i = 1,\ldots,N,\ j = 1,\ldots,J_i,\ s = 1,\ldots,S. \qquad (3.91)$$

$$t^{s}_{(k,l)} \;=\; \sum_{m\in M_{(k,l)}}\ \sum_{(i,j)\in z_m:(i,j)\neq(k,l)} tyy^{m,s}_{(k,l,i,j)} + \eta^{s}_{(k,l)}, \quad \forall\, k = 1,\ldots,N,\ l = 1,\ldots,J_k,\ s = 1,\ldots,S. \qquad (3.92)$$

$$0 \;\le\; ty^{m,s}_{(i,j,k,l)} \;\le\; H\,y^{m}_{(i,j,k,l)}, \quad \forall\, m = 1,\ldots,M,\ (i,j)\in z_m,\ (k,l)\in z_m: (k,l)\neq(i,j),\ s = 1,\ldots,S. \qquad (3.93)$$

$$0 \;\le\; \lambda^{s}_{(i,j)} \;\le\; H\,ys_{(i,j)}, \quad \forall\, i = 1,\ldots,N,\ j = 1,\ldots,J_i,\ s = 1,\ldots,S. \qquad (3.94)$$

$$0 \;\le\; tyy^{m,s}_{(k,l,i,j)} \;\le\; H\,y^{m}_{(i,j,k,l)}, \quad \forall\, m = 1,\ldots,M,\ (k,l)\in z_m,\ (i,j)\in z_m: (k,l)\neq(i,j),\ s = 1,\ldots,S. \qquad (3.95)$$

$$0 \;\le\; \eta^{s}_{(k,l)} \;\le\; H\,yss_{(k,l)}, \quad \forall\, k = 1,\ldots,N,\ l = 1,\ldots,J_k,\ s = 1,\ldots,S. \qquad (3.96)$$

$$t^{s}_{(i,0)} \;\ge\; r_i, \quad \forall\, i = 1,\ldots,N. \qquad (3.97)$$

$$t^{s}_{(i,j)} \;\ge\; 0, \quad \forall\, i = 1,\ldots,N,\ j = 1,\ldots,J_i,\ s = 1,\ldots,S. \qquad (3.98)$$

$$\Delta^{s}_{i} \;\ge\; 0, \quad \forall\, i = 1,\ldots,N,\ s = 1,\ldots,S. \qquad (3.99)$$

We now define dual multipliers for the modified second-stage problem (MSSP).

$op^{(s,n)}_{(i,j)}$ (≤ 0) = dual variables associated with the operation precedence constraints (3.88), where n = 1,…, nCUT and s = 1,…, S.

$msp^{(m,s,n)}_{(i,j,k,l)}$ (≤ 0) = dual variables associated with the sequence-dependent setup constraints (3.89).

$bp^{(s,n)}_{(i)}$ (≤ 0) = dual variables associated with the budget constraints (3.90).

$rp^{(s,n)}_{(i,j)}$ (unrestricted) = dual variables associated with the ready time constraints (3.97).

$typ^{(m,s,n)}_{(i,j,k,l)}$ (≤ 0) = dual variables associated with the upper bound constraints (3.93).

$lap^{(s,n)}_{(i,j)}$ (≤ 0) = dual variables associated with the upper bound constraints (3.94).

$tyyp^{(m,s,n)}_{(k,l,i,j)}$ (≤ 0) = dual variables associated with the upper bound constraints (3.95).

$etp^{(s,n)}_{(k,l)}$ (≤ 0) = dual variables associated with the upper bound constraints (3.96).

In light of the MSSP and the dual multipliers defined above, the alternative optimality cut

(AOC) is given by:


$$\theta \;+\; \sum_{s=1}^{S}\sum_{i=1}^{N}\sum_{j=1}^{J_i-1} op^{(s,n)}_{(i,j)}\Big(\sum_{m\in M_{(i,j+1)}} p^{m,s}_{(i,j+1)}\,x^{m}_{(i,j+1)} + \sum_{(e,f):\,e\in M_{(i,j)},\,f\in M_{(i,j+1)}} d_{(e,f)}\,v^{(e,f)}_{(i,j,j+1)}\Big)$$
$$+\; \sum_{s=1}^{S}\sum_{m=1}^{M}\;\sum_{(i,j)\in z_m}\;\sum_{(k,l)\in z_m:(k,l)\neq(i,j)} msp^{(m,s,n)}_{(i,j,k,l)}\Big(p^{m,s}_{(k,l)} + u^{m}_{(i,j,k,l)}\Big)\,y^{m}_{(i,j,k,l)}$$
$$+\; \sum_{s=1}^{S}\sum_{i=1}^{N} bp^{(s,n)}_{(i)}\Big(\sum_{j=1}^{J_i}\sum_{m\in M_{(i,j)}} pc^{m,s}_{(i,j)}\,x^{m}_{(i,j)} + \sum_{j=1}^{J_i-1}\ \sum_{(e,f):\,e\in M_{(i,j)},\,f\in M_{(i,j+1)}} dc_{(e,f)}\,v^{(e,f)}_{(i,j,j+1)}\Big)$$
$$-\; \sum_{s=1}^{S}\sum_{m=1}^{M}\;\sum_{(i,j)\in z_m}\;\sum_{(k,l)\in z_m:(k,l)\neq(i,j)} \Big(typ^{(m,s,n)}_{(i,j,k,l)} + tyyp^{(m,s,n)}_{(k,l,i,j)}\Big)\,H\,y^{m}_{(i,j,k,l)}$$
$$-\; \sum_{s=1}^{S}\sum_{i=1}^{N}\sum_{j=1}^{J_i} lap^{(s,n)}_{(i,j)}\,H\,ys_{(i,j)} \;-\; \sum_{s=1}^{S}\sum_{k=1}^{N}\sum_{l=1}^{J_k} etp^{(s,n)}_{(k,l)}\,H\,yss_{(k,l)}$$
$$\;\ge\; \sum_{s=1}^{S}\sum_{i=1}^{N} bp^{(s,n)}_{(i)}\,b_i \;+\; \sum_{s=1}^{S}\sum_{i=1}^{N} rp^{(s,n)}_{(i,0)}\,r_i \qquad (3.100)$$

Observe that the right side of Expression (3.100) contains no big H terms and is

therefore ‘less negative’ compared to the right side of Expression (3.64). However, the

left side of Expression (3.100) contains a number of big-H terms and these in

conjunction with the non-positive dual multipliers appear to render the left side of

Expression (3.100) ‘more positive’ than the left side of Expression (3.64). A few points

are of interest here, though. Notice that in Expressions (3.93) and (3.95), whenever $y^{m}_{(i,j,k,l)}$ equals 1, the slack terms must be positive because H is a big positive number. Therefore, the corresponding dual prices $typ^{(m,s,n)}_{(i,j,k,l)}$ and $tyyp^{(m,s,n)}_{(k,l,i,j)}$ must equal zero due to the complementary slackness conditions that are satisfied at optimality. On the other hand, whenever $y^{m}_{(i,j,k,l)}$ equals 0 in Expressions (3.93) and (3.95), these dual prices $typ^{(m,s,n)}_{(i,j,k,l)}$ and $tyyp^{(m,s,n)}_{(k,l,i,j)}$ on the left side of Expression (3.100) could either be 0 or negative.

Likewise, whenever $ys_{(i,j)}$ ($yss_{(k,l)}$) equals 1 in Expression (3.94) ((3.96)), the slack terms must be positive because H is a big positive number. Therefore, the corresponding dual prices $lap^{(s,n)}_{(i,j)}$ ($etp^{(s,n)}_{(k,l)}$) must equal zero due to the complementary slackness conditions. On the other hand, whenever $ys_{(i,j)}$ ($yss_{(k,l)}$) equals 0, Expression (3.94) ((3.96)) is tight,

Page 147: Stochastic Scheduling for a Network of MEMS Job Shops · 2020. 1. 17. · stochastic solution for various data instances. The observed savings of up to 8.8% over the mean value approach

132

and hence, the corresponding dual prices ( , )( , )s ni jlap ( ( , )

( , )s nk letp ) must either be negative or

zero.

Note that, ( , )i jys equals 1 only when (i, j) is the only job-operation assigned to a

machine or when (i, j) is the last operation in sequence on a machine. On the other

hand, ( , )k lyss equals 1 when (k, l) is the only job-operation assigned to a machine or

when it is the first operation in sequence on a machine.

Although the alternative optimality cut given by Expression (3.100) succeeds in reducing the number of negative terms on the right side of Expression (3.64), it also simultaneously adds a number of positive terms to its left side. It is, therefore, difficult to predict analytically whether either optimality cut would outperform the other. However, it would be interesting to compare the performance of these two cuts experimentally.

To unify the analysis in this section within the framework of the L-shaped method, we

begin by first solving the master problem, which is now defined as

AMP: min (3.16) s.t. (3.17)-(3.24), (3.47)-(3.51), (3.56), (3.60), (3.63), (3.71), (3.81).

The solution of the master problem is input into the following Stage-II problem

augmented with artificial variables for scenario s.

AMSSP: Minimize

\[
\begin{aligned}
&\sum_{i=1}^{N}\sum_{j=1}^{J_i}\bigl(a_{1(i,j,s)}+a_{2(i,j,s)}+a_{9(i,j,s)}+a_{10(i,j,s)}\bigr)
+\sum_{m=1}^{M}\sum_{(i,j)\in Z_m}\sum_{\substack{(k,l)\in Z_m\\ (k,l)\ne(i,j)}}\bigl(a^{(m,s)}_{3(i,j,k,l)}+a^{(m,s)}_{4(i,j,k,l)}+a^{(m,s)}_{5(i,j,k,l)}+a^{(m,s)}_{6(i,j,k,l)}\bigr)\\
&+\sum_{m=1}^{M}\sum_{(k,l)\in Z_m}\sum_{\substack{(i,j)\in Z_m\\ (i,j)\ne(k,l)}}\bigl(a^{(m,s)}_{7(k,l,i,j)}+a^{(m,s)}_{8(k,l,i,j)}\bigr)
+\sum_{k=1}^{N}\sum_{l=1}^{J_k}\bigl(a_{11(k,l,s)}+a_{12(k,l,s)}\bigr)
\end{aligned}
\tag{3.101}
\]


Subject to:

\[
t^{s}_{(i,j)}+\sum_{m\in M_{(i,j+1)}}p^{(m,s)}_{(i,j+1)}\,x^{m}_{(i,j+1)}+\sum_{e\in M_{(i,j)}}\sum_{f\in M_{(i,j+1)}}v^{ef}_{(i,j)(i,j+1)}\,d_{ef}+a_{1(i,j+1,s)}-a_{2(i,j+1,s)}\le t^{s}_{(i,j+1)},\quad \forall\, i=1,\dots,N,\ j=0,\dots,J_i-1,\ s=1,\dots,S.
\tag{3.102}
\]

\[
ty^{(m,s)}_{(i,j,k,l)}+\bigl(p^{(m,s)}_{(k,l)}+u^{m}_{(i,j,k,l)}\bigr)y^{m}_{(i,j,k,l)}+a^{(m,s)}_{3(i,j,k,l)}-a^{(m,s)}_{4(i,j,k,l)}\le tyy^{(m,s)}_{(k,l,i,j)},\quad \forall\, m=1,\dots,M,\ (i,j)\in Z_m,\ (k,l)\in Z_m:(k,l)\ne(i,j),\ s=1,\dots,S.
\tag{3.103}
\]

\[
\sum_{j=1}^{J_i}\sum_{m\in M_{(i,j)}}p^{(m,s)}_{(i,j)}\,c^{m}_{(i,j)}\,x^{m}_{(i,j)}+\sum_{j=1}^{J_i-1}\sum_{e\in M_{(i,j)}}\sum_{f\in M_{(i,j+1)}}v^{ef}_{(i,j)(i,j+1)}\,dc_{ef}\,w_{i}-b_{i}\le \Delta^{s}_{i},\quad \forall\, i=1,\dots,N,\ s=1,\dots,S.
\tag{3.104}
\]

\[
t^{s}_{(i,j)}=\sum_{m\in M_{(i,j)}}\ \sum_{\substack{(k,l)\in Z_m\\ (k,l)\ne(i,j)}}ty^{(m,s)}_{(i,j,k,l)}+\lambda^{s}_{(i,j)},\quad \forall\, i=1,\dots,N,\ j=1,\dots,J_i,\ s=1,\dots,S.
\tag{3.105}
\]

\[
t^{s}_{(k,l)}=\sum_{m\in M_{(k,l)}}\ \sum_{\substack{(i,j)\in Z_m\\ (i,j)\ne(k,l)}}tyy^{(m,s)}_{(k,l,i,j)}+\eta^{s}_{(k,l)},\quad \forall\, k=1,\dots,N,\ l=1,\dots,J_k,\ s=1,\dots,S.
\tag{3.106}
\]

\[
t^{s}_{(i,0)}+a_{1(i,0,s)}-a_{2(i,0,s)}=r_{i},\quad \forall\, i=1,\dots,N,\ s=1,\dots,S.
\tag{3.107}
\]

\[
ty^{(m,s)}_{(i,j,k,l)}+a^{(m,s)}_{5(i,j,k,l)}-a^{(m,s)}_{6(i,j,k,l)}\le H\,y^{m}_{(i,j,k,l)},\quad \forall\, m=1,\dots,M,\ (i,j)\in Z_m,\ (k,l)\in Z_m:(k,l)\ne(i,j),\ s=1,\dots,S.
\tag{3.108}
\]

\[
\lambda^{s}_{(i,j)}+a^{s}_{9(i,j)}-a^{s}_{10(i,j)}\le H\,ys_{(i,j)},\quad \forall\, i=1,\dots,N,\ j=1,\dots,J_i,\ s=1,\dots,S.
\tag{3.109}
\]

\[
tyy^{(m,s)}_{(k,l,i,j)}+a^{(m,s)}_{7(k,l,i,j)}-a^{(m,s)}_{8(k,l,i,j)}\le H\,y^{m}_{(i,j,k,l)},\quad \forall\, m=1,\dots,M,\ (k,l)\in Z_m,\ (i,j)\in Z_m:(i,j)\ne(k,l),\ s=1,\dots,S.
\tag{3.110}
\]

\[
\eta^{s}_{(k,l)}+a^{s}_{11(k,l)}-a^{s}_{12(k,l)}\le H\,yss_{(k,l)},\quad \forall\, k=1,\dots,N,\ l=1,\dots,J_k,\ s=1,\dots,S.
\tag{3.111}
\]


\[
t^{s}_{(i,j)}\ge 0,\quad \forall\, i=1,\dots,N,\ j=1,\dots,J_i,\ s=1,\dots,S. \tag{3.112}
\]
\[
ty^{(m,s)}_{(i,j,k,l)}\ge 0,\quad \forall\, m=1,\dots,M,\ (i,j)\in Z_m,\ (k,l)\in Z_m:(k,l)\ne(i,j),\ s=1,\dots,S. \tag{3.113}
\]
\[
\lambda^{s}_{(i,j)}\ge 0,\quad \forall\, i=1,\dots,N,\ j=1,\dots,J_i,\ s=1,\dots,S. \tag{3.114}
\]
\[
tyy^{(m,s)}_{(k,l,i,j)}\ge 0,\quad \forall\, m=1,\dots,M,\ (k,l)\in Z_m,\ (i,j)\in Z_m:(i,j)\ne(k,l),\ s=1,\dots,S. \tag{3.115}
\]
\[
\eta^{s}_{(k,l)}\ge 0,\quad \forall\, k=1,\dots,N,\ l=1,\dots,J_k,\ s=1,\dots,S. \tag{3.116}
\]
\[
\Delta^{s}_{i}\ge 0,\quad \forall\, i=1,\dots,N,\ s=1,\dots,S. \tag{3.117}
\]

If there exists a scenario s for which the objective function given by Expression (3.101)

is positive, then a feasibility cut of the type shown below must be added.

The ‘nth’ feasibility cut for scenario s is given by:

\[
\begin{aligned}
&\sum_{i=1}^{N}\sum_{j=0}^{J_i-1} fop^{(s,n)}_{(i,j)}\Bigl(\sum_{m\in M_{(i,j+1)}} p^{(m,s)}_{(i,j+1)}\,x^{m}_{(i,j+1)} + \sum_{e\in M_{(i,j)}}\sum_{f\in M_{(i,j+1)}} v^{ef}_{(i,j)(i,j+1)}\,d_{ef}\Bigr)\\
&+ \sum_{m=1}^{M}\sum_{(i,j)\in Z_m}\sum_{\substack{(k,l)\in Z_m\\ (k,l)\ne(i,j)}} fsp^{(m,s,n)}_{(i,j,k,l)}\bigl(p^{(m,s)}_{(k,l)}+u^{m}_{(i,j,k,l)}\bigr)\,y^{m}_{(i,j,k,l)}\\
&+ \sum_{i=1}^{N} fbp^{(s,n)}_{i}\Bigl(\sum_{j=1}^{J_i}\sum_{m\in M_{(i,j)}} p^{(m,s)}_{(i,j)}\,c^{m}_{(i,j)}\,x^{m}_{(i,j)} + \sum_{j=1}^{J_i-1}\sum_{e\in M_{(i,j)}}\sum_{f\in M_{(i,j+1)}} v^{ef}_{(i,j)(i,j+1)}\,dc_{ef}\,w_{i}\Bigr)\\
&- \sum_{m=1}^{M}\sum_{(i,j)\in Z_m}\sum_{\substack{(k,l)\in Z_m\\ (k,l)\ne(i,j)}}\bigl(ftyp^{(m,s,n)}_{(i,j,k,l)}+ftyyp^{(m,s,n)}_{(k,l,i,j)}\bigr)\,H\,y^{m}_{(i,j,k,l)}\\
&- \sum_{i=1}^{N}\sum_{j=1}^{J_i} flap^{(s,n)}_{(i,j)}\,H\,ys_{(i,j)} - \sum_{k=1}^{N}\sum_{l=1}^{J_k} fetp^{(s,n)}_{(k,l)}\,H\,yss_{(k,l)}\\
&\ge \sum_{i=1}^{N} fbp^{(s,n)}_{i}\,b_{i} + \sum_{i=1}^{N} frp^{(s,n)}_{(i,0)}\,r_{i}
\end{aligned}
\tag{3.118}
\]

The dual prices in the above expression correspond to Expressions (3.102)-(3.117) but

belong to the scenario with a positive objective function. Expression (3.118) is appended

to the master problem and the master problem is re-solved. The procedure of solving the

master problem and adding feasibility cuts continues until a feasible solution to the

Stage-II problem is obtained. Once the master solution is deemed feasible, the MSSP

(Expressions (3.87)-(3.99)) is solved for each scenario and optimality cuts of the kind


given by Expression (3.100) are added to the master problem and the cycle continues

until the gap between the lower and upper bounds reduces to zero, at which point we

have the optimal solution.
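The overall flow of this procedure can be summarized by the following minimal Java sketch (for illustration only; the MasterSolver, SubproblemSolver and MasterSolution types are hypothetical placeholders, probability weighting of the recourse values is omitted, and the actual implementation in this work is carried out through AMPL model and data files rather than Java):

import java.util.List;

public class LShapedSkeleton {

    // Hypothetical solver interfaces standing in for the AMPL master and subproblem models.
    interface MasterSolver     { MasterSolution solve(); void addCut(Object cut); }
    interface SubproblemSolver {
        double phaseOneObjective(MasterSolution x, int scenario);   // AMSSP value (artificials)
        Object feasibilityCut(MasterSolution x, int scenario);      // cut of type (3.46)/(3.118)
        double recourseValue(MasterSolution x, int scenario);       // MSSP value
        Object optimalityCut(MasterSolution x);                     // cut of type (3.64)/(3.100)
    }
    interface MasterSolution   { double lowerBound(); }

    static void run(MasterSolver master, SubproblemSolver sub, int nScenarios, double eps) {
        double upperBound = Double.POSITIVE_INFINITY;
        while (true) {
            MasterSolution x = master.solve();                      // Stage-I solution and lower bound
            boolean feasible = true;
            for (int s = 1; s <= nScenarios; s++) {                 // Phase I: restore feasibility
                if (sub.phaseOneObjective(x, s) > 0) {
                    master.addCut(sub.feasibilityCut(x, s));
                    feasible = false;
                }
            }
            if (!feasible) continue;                                // re-solve master with new cuts
            double expectedRecourse = 0.0;                          // Phase II: evaluate recourse
            for (int s = 1; s <= nScenarios; s++)
                expectedRecourse += sub.recourseValue(x, s);        // probability weights omitted here
            upperBound = Math.min(upperBound, expectedRecourse);
            if (upperBound - x.lowerBound() <= eps) break;          // bounds have converged
            master.addCut(sub.optimalityCut(x));                    // single aggregated optimality cut
        }
    }

    public static void main(String[] args) {
        // Placeholders only; plug in actual master and subproblem solvers to run the procedure.
        System.out.println("L-shaped skeleton: supply master and subproblem solvers to execute.");
    }
}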

3.3.8 Determining the magnitude of big H:

The big H terms first appear on the right side of the following disjunctive constraints

(Expression (3.27)) in the Stage-II problem. These disjunctive constraints partition a

machine’s capacity amongst jobs in such a manner that, at any given instant, not more

than one job can be processed on the machine, that is,

\[
t^{s}_{(i,j)}+p^{(m,s)}_{(k,l)}\,x^{m}_{(k,l)}+u^{m}_{(i,j,k,l)}\le t^{s}_{(k,l)}+H\bigl(1-y^{m}_{(i,j,k,l)}\bigr),\quad \forall\,(i,j)\in Z_m,\ (k,l)\in Z_m,\ (i,j)\ne(k,l),\ s=1,\dots,S,\ m\in M.
\tag{3.27}
\]

For a given $m\in M$, $(i,j)\in Z_m$, $(k,l)\in Z_m$, $(i,j)\ne(k,l)$, if $y^{m}_{(i,j,k,l)}$ is equal to 0, then Expression (3.27) reduces to

\[
t^{s}_{(i,j)}+p^{(m,s)}_{(k,l)}\,x^{m}_{(k,l)}+u^{m}_{(i,j,k,l)}\le t^{s}_{(k,l)}+H
\tag{3.119}
\]

H can be thought of as an upper bound on the value of the right side in Expression (3.119). It may also be possible to develop a conservative upper bound for $t^{s}_{(i,j)}$ and use this number as the value for H. We may safely do so because $p^{(m,s)}_{(k,l)}$ and $u^{m}_{(i,j,k,l)}$ are problem parameters that are known beforehand, and their values can be absorbed in our calculations in determining the conservative upper bound for $t^{s}_{(i,j)}$. A judicious selection of values for H is important because it restricts the search for completion time variables, thereby enabling easier fathoming in the branch and bound tree. To develop an appropriate upper bound for $t^{s}_{(i,j)}$, the concept of operation due-dates can be applied here. Although due-dates for jobs are not explicitly specified for the NMJS environment considered in this study, hypothetical due dates can be established using processing and travel time information for job-operations as follows:

\[
JDD^{s}_{i}=X\_factor\times\Bigl(\,\sum_{j=1}^{J_i}\max_{m\in M_{(i,j)}}p^{(m,s)}_{(i,j)}+\sum_{j=1}^{J_i-1}\max_{e\in M_{(i,j)},\,f\in M_{(i,j+1)}}d_{ef}\Bigr)
\tag{3.120}
\]

Here, $JDD^{s}_{i}$ represents the job due date for job i under scenario s, and X_factor is a multiplier that accounts for job-waiting times at machines and other incidental delays that

might occur in the course of the processing of the job. Job due-dates are determined by

considering the sum of the maximum processing times for all operations and the

maximum travel times between machines for successive operations and multiplying the

net sum by the X_factor. The maximum processing time for each operation is obtained

by examining the processing time for the operation on all the machines which are

capable of processing it and choosing the maximum from amongst them. Likewise, the

maximum travel time between successive operations is obtained by considering all

combinations of machines capable of performing the operations under consideration and

choosing the maximum from amongst them. By adjusting the value of the X_factor,

conservative estimates for job due-dates may be obtained. Working backwards from the

hypothetical job due-dates obtained above, operation due dates are given by:

\[
ODD^{s}_{(i,j)}=JDD^{s}_{i}-\Bigl(\,\sum_{j'=j+1}^{J_i}\min_{m\in M_{(i,j')}}p^{(m,s)}_{(i,j')}+\sum_{j'=j}^{J_i-1}\min_{e\in M_{(i,j')},\,f\in M_{(i,j'+1)}}d_{ef}\Bigr)
\tag{3.121}
\]

where $ODD^{s}_{(i,j)}$ represents the operation due-date for job-operation (i, j) under scenario s. The second term in the above expression is composed of the sum of the

minimum processing times of all those operations that succeed the current operation

whose due-date is being determined and the minimum travel times between machines for

successive operations. ‘Minimum’ values of processing and travel times are used in this

expression because we are interested in determining conservative estimates for operation

due-dates that will most likely be satisfied in the majority of cases. Having determined the

values for the operation due-dates, these can be used in place of H in Expression (3.27).
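As an illustration of Expressions (3.120) and (3.121), the following Java sketch computes the hypothetical job due-date and the operation due-dates of a single job under one scenario; the array layout, names and numerical values are assumptions made for the example and are not taken from the actual problem generator.

import java.util.Arrays;

public class DueDates {

    // p[j][m]   : processing time of operation j on its m-th eligible machine (scenario already applied)
    // d[j][e][f]: travel time from the e-th machine of operation j to the f-th machine of operation j+1
    static double jobDueDate(double[][] p, double[][][] d, double xFactor) {
        double sum = 0.0;
        for (double[] times : p)                          // maximum processing time of each operation
            sum += Arrays.stream(times).max().getAsDouble();
        for (double[][] leg : d)                          // maximum travel time between successive operations
            sum += Arrays.stream(leg).flatMapToDouble(Arrays::stream).max().getAsDouble();
        return xFactor * sum;                             // Expression (3.120)
    }

    static double operationDueDate(int j, double jdd, double[][] p, double[][][] d) {
        double tail = 0.0;
        for (int q = j + 1; q < p.length; q++)            // minimum processing times of succeeding operations
            tail += Arrays.stream(p[q]).min().getAsDouble();
        for (int q = j; q < d.length; q++)                // minimum travel times from operation j onwards
            tail += Arrays.stream(d[q]).flatMapToDouble(Arrays::stream).min().getAsDouble();
        return jdd - tail;                                // Expression (3.121)
    }

    public static void main(String[] args) {
        double[][] p = { {3.0, 4.5}, {2.0, 2.5}, {5.0, 6.0} };                 // 3 operations, 2 machines each
        double[][][] d = { {{0.4, 9.0}, {9.0, 0.4}}, {{0.4, 11.0}, {11.0, 0.4}} };
        double jdd = jobDueDate(p, d, 1.5);
        System.out.println("JDD = " + jdd);
        for (int j = 0; j < p.length; j++)
            System.out.println("ODD(" + j + ") = " + operationDueDate(j, jdd, p, d));
    }
}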


3.3.9 An alternative Stage-I (master) objective function:

In the L-shaped method, the Stage-I (master) problem outputs a feasible solution which

is fed into the Stage-II (recourse) problem to determine recourse decisions for the given

first-stage solution. An optimality cut is generated if a gap is detected between the lower

and upper bounds. This cut when appended to the Stage-I problem serves to eliminate

the previous solution and forces the master problem to generate a new feasible solution

that complies with all the existing constraints and optimality cuts generated so far.

In section 3.3, we considered an objective function (Expression 3.16) for the Stage-I

problem that minimizes the sum of the expected processing times for all operations of all

jobs and the sum of the set up times between operations on the machines and the sum of

the travel times between successive operations of all jobs. The premise behind this

approach is to force an alignment of assignment, sequencing and tracking variables such

that they support the goal of achieving smaller completion time values for the second-

stage problem. However, the performance of this objective function with respect to the

amount of budget surplus incurred (second term in the objective function of the Stage-II

problem) is arguable. This is because an assignment of jobs to machines with the smallest

processing times and inter-machine travel times may not necessarily incur the least cost

also.

The alternative objective function addresses this latter need. Here, jobs are assigned to

machines such that the total expected cost of processing for all jobs and the cost of travel

for each job is minimized. Also, the sequencing on the machines is such that the total set

up time is minimized. If such an assignment makes it possible for certain jobs to stay

within their budget, this would mean that the slacks in constraints 3.28 of the Stage-II

problem are positive. Hence, the corresponding dual prices fbp and bp for these jobs

in the feasibility (Expressions (3.46) and (3.118)) and optimality cuts (Expressions (3.64)

and (3.100)), respectively, can be reduced to zero because of complementary slackness

conditions. The net effect is that both the right side and the left side of these expressions

become more positive. The gain in positiveness on either side would, however, be interesting to observe through experimentation. Mathematically, this objective function is

expressed as:

\[
\min\ \sum_{s=1}^{S}\phi_{s}\sum_{i=1}^{N}\sum_{j=1}^{J_i}\sum_{m\in M_{(i,j)}}p^{(m,s)}_{(i,j)}\,c^{m}_{(i,j)}\,x^{m}_{(i,j)}
+\sum_{m=1}^{M}\sum_{(i,j)\in Z_m}\sum_{\substack{(k,l)\in Z_m\\ (k,l)\ne(i,j)}}u^{m}_{(i,j,k,l)}\,y^{m}_{(i,j,k,l)}
+\sum_{i=1}^{N}\sum_{j=1}^{J_i-1}\sum_{e\in M_{(i,j)}}\sum_{f\in M_{(i,j+1)}}v^{ef}_{(i,j)(i,j+1)}\,dc_{ef}
\tag{3.122}
\]

Note that all the terms in Expression (3.122) except the second are measured in the same

units of cost. The second term, however, is measured in the units of time. In order to

ensure consistent units, we assume that a unit of time costs a dollar.

3.3.10 Generating Test Instances:

The NMJS environment treated in this study can be viewed as a flexible job shop with

stochastic processing times, reentrant flow, sequence dependent set ups and interfab

travel. While test problems are available for flexible job shops (Barnes and Chambers,

1996) along with certain generalizations such as reentrant flow and sequence dependent

set ups (Deepu Phillip, 2005), not much has been reported in the literature on the NMJS

environment which, in addition to the above, also calls for interfab transportation times

and stochasticity in processing times. Therefore, a problem generator has been written in

JAVA to create test instances for this rather general setting. The necessary inputs for the

generator are provided through a command line interface. These are:

• Number of jobs

• Number of operations for each job type.

• Number of reentrant loops for each job

• Number of workcenters

• Number of fabs

• Set of machines for each fab.

• Set of machines for each workcenter

• Intrafab travel time for each fab.


• Interfab travel time between each pair of fabs.

• Intrafab travel cost for each fab.

• Interfab travel cost between each pair of fabs.

• Lower and upper limits of mean processing time and its deviation for each

workcenter.

• Lower and upper limits of mean processing cost and its deviation for each

workcenter.

• Number of scenarios

• Probability of each scenario and its processing time multiplicative factor.

• Mean of interarrival time.

• Number of wafers

• Weights for completion time and budget surplus variables in the objective

function.

• Set up factor

• X-factor

• Value for big H.

The details regarding these inputs are presented below:

Number of jobs: This number indicates the total number of customer orders received

where each order is composed of a set of wafers. The number of wafers in each order is

also specified as an input. If a customer places multiple orders (orders for multiple

products) then each order is treated as a different job.

Number of operations for each job type: This number refers to the total number of processing

steps involved in each job.

Number of reentrant loops for each job: The total number of operations for each job is

categorized into loops or passes where each loop or pass is composed of a certain set of

operations. Re-entrancy is captured in the NMJS environment through these loops


because after a job passes through a set of workcenters in the first loop, it re-enters the beginning workcenter of the first loop and passes through the same set of workcenters for its operations in the second loop, and proceeds similarly until its final operation in the last loop.

Number of workcenters: Each workcenter is composed of a set of similar machines. These

machines could, however, belong to different fabs. The number of workcenters,

therefore, refers to the total number of such groups of similar machines.

Number of fabs: This number provides the total number of fabs under consideration.

Set of machines for each fab: This is a set providing machine identification numbers for each

machine in a fab. Machines from different fabs cannot have the same identification

number. For instance, if a machine from Fab 1 is named M1, then a machine from Fab 2

cannot be named M1.

Set of machines for each workcenter: This is a set that uses the same machine identification

numbers such as those used to describe machines in a fab. Identification numbers

corresponding to machines that share similar characteristics to form a workcenter are

provided in this set.

Intrafab travel time for each fab: This number represents the travel time between machines

within a fab.

Interfab travel time between each pair of fabs: This number represents the travel time between

fabs. It must be mentioned here that the travel time between machines from different

fabs is considered to be the same as the travel time between fabs.

Lower and upper limits of mean processing time and its deviation for each workcenter: Each

workcenter is assigned lower and upper limits for its mean processing time and

processing time deviation. Using a uniform distribution, the generator picks a random

number between the two limits as the mean processing time for the workcenter. Similarly


it uses the two limits for the processing time deviation and picks a random number using

uniform distribution between these limits. In order to determine the processing times

for the individual machines within the workcenter, the generator first defines a lower

limit given by the mean processing time minus the processing time deviation and an

upper limit defined by the mean processing time plus the processing time deviation. It,

then, uses uniform distribution to pick as many random numbers between these limits as

there are machines in the workcenter.
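A minimal Java sketch of this two-level sampling scheme is given below; the limits, the seed and the method names are illustrative assumptions (in particular, the deviation is treated here as an absolute quantity rather than a factor of the mean):

import java.util.Random;

public class ProcessingTimeSampler {

    static double uniform(Random rng, double lo, double hi) {
        return lo + (hi - lo) * rng.nextDouble();          // U[lo, hi]
    }

    // Returns one processing time per machine in the workcenter.
    static double[] sampleWorkcenter(Random rng, double meanLo, double meanHi,
                                     double devLo, double devHi, int nMachines) {
        double mean = uniform(rng, meanLo, meanHi);        // mean processing time of the workcenter
        double dev  = uniform(rng, devLo, devHi);          // processing time deviation of the workcenter
        double[] times = new double[nMachines];
        for (int m = 0; m < nMachines; m++)                // machine times drawn from U[mean - dev, mean + dev]
            times[m] = uniform(rng, mean - dev, mean + dev);
        return times;
    }

    public static void main(String[] args) {
        Random rng = new Random(42);
        double[] t = sampleWorkcenter(rng, 1.0, 9.0, 0.5, 2.0, 3);   // 3 machines in the workcenter
        for (double x : t) System.out.printf("%.2f%n", x);
    }
}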

Lower and upper limits of mean processing cost and its deviation for each workcenter: The inputs for

these are similar to those for mean processing time and its deviation.

Number of scenarios: This number provides the total number of processing time scenarios

considered. Each scenario represents a processing time realization for all operations of all

jobs.

Probability of each scenario and its processing time multiplicative factor: The probability of

occurrence of each scenario is provided as an input where the sum of these probabilities

must equal ‘1’. Also processing time multiplicative factors are provided for each scenario.

These are multiplied with the processing times derived for each job-operation, in order to

create unique processing time realizations corresponding to each scenario.

Mean of interarrival time: This number is used as the parameter for an exponential

distribution to determine arrival times of jobs.
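For illustration only, arrival times can be generated by cumulating exponentially distributed interarrival times, as in the following Java sketch (the inverse-transform routine and all names are assumed, not quoted from the generator):

import java.util.Random;

public class ArrivalTimes {

    // Draws an Exp(mean) interarrival time by inverse transform sampling.
    static double exponential(Random rng, double mean) {
        return -mean * Math.log(1.0 - rng.nextDouble());
    }

    public static void main(String[] args) {
        Random rng = new Random(11);
        double mean = 0.3;                     // mean interarrival time in hours
        double clock = 0.0;
        for (int job = 1; job <= 5; job++) {   // arrival time of each job = running sum of interarrival times
            clock += exponential(rng, mean);
            System.out.printf("Job %d arrives at %.2f%n", job, clock);
        }
    }
}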

Number of wafers: This number describes the order size for each job.

Weights for completion time and budget surplus variables in the objective function: These numbers

represent customer priorities for completion time and budget.

Set up factor: This number expressed as a fraction, when multiplied by the sum of the

processing times of a pair of jobs on a machine, gives the sequence dependent set up


time incurred when either job directly precedes or succeeds the other. The set up time

between operations of the same job, however, is set to zero.
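A minimal illustration of this calculation is given below (Java; the names and values are hypothetical):

public class SetupTime {

    // Sequence dependent set up time between two different jobs on the same machine:
    // setUpFactor times the sum of their processing times; zero within the same job.
    static double setup(double setUpFactor, int jobA, int jobB, double procA, double procB) {
        return (jobA == jobB) ? 0.0 : setUpFactor * (procA + procB);
    }

    public static void main(String[] args) {
        System.out.println(setup(0.1, 1, 2, 4.0, 6.0));  // 0.1 * (4.0 + 6.0) = 1.0 hour
        System.out.println(setup(0.1, 1, 1, 4.0, 6.0));  // same job -> 0.0
    }
}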

X-factor: This is a multiplier that accounts for job-waiting times at machines and other

incidental delays that might occur in the course of the processing of the job. It is used in

determining hypothetical job due dates.

Value for big ‘H: A large positive number.

Using the above input information from the user, the problem generator generates a

unique processing route for each job. It does so by generating a sequence of workcenters

that the job must visit in its first pass. For each successive pass thereafter, the generator

forces the job to visit the same workcenter in order to capture the characteristic

reentrant flow that marks the NMJS environment. The number of passes for each job is

specified by the user through reentrant loops. It must, however, be noted that although a

job may visit the same workcenter multiple times, it may or may not be assigned to the

same machine within the workcenter each time because each workcenter is composed of

a group of similar machines and the assignment of a job to a particular machine within

the workcenter depends on a host of factors such as the processing time of the job on

the machine, cost of processing and travel time between machines. The assignment of

jobs to machines is a decision variable whereas the workcenter (set of machines) assigned

for each job operation by the problem generator constitutes the parameter ( , )i jM (see

formulation NMJSP under Section 3.2).

In order to generate a unique processing route for the first pass of a job, the problem

generator randomly picks a workcenter from the set of workcenters and assigns it to the

first operation of the job. For each successive operation of the job, workcenters left over

in the set are randomly picked and assigned. This is done until all the operations in the

first pass are assigned workcenters such that no two operations in the first pass of a job

have the same workcenter. Once the processing route for the first pass of a job is

determined, the routes for each successive pass stay the same.
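The route-construction step just described can be sketched as follows (illustrative Java only; the shuffling approach and all names are assumptions, and the actual generator may differ in detail):

import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.Random;

public class RouteGenerator {

    // Builds the workcenter route of one job: a random, repetition-free sequence for the first
    // pass, repeated once per reentrant loop so that every pass visits the same workcenters.
    static List<Integer> buildRoute(int nWorkcenters, int opsPerPass, int nLoops, Random rng) {
        List<Integer> pool = new ArrayList<>();
        for (int w = 0; w < nWorkcenters; w++) pool.add(w);
        Collections.shuffle(pool, rng);                    // random pick without replacement
        List<Integer> firstPass = pool.subList(0, opsPerPass);
        List<Integer> route = new ArrayList<>();
        for (int loop = 0; loop < nLoops; loop++)          // reentrant flow: identical passes
            route.addAll(firstPass);
        return route;
    }

    public static void main(String[] args) {
        // 5 workcenters, 3 operations per pass, 2 reentrant loops -> 6 operations in total.
        System.out.println(buildRoute(5, 3, 2, new Random(7)));
    }
}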


In addition to the processing routes, the problem generator also determines the budget

for each job. Note that each job-operation is assigned a workcenter and is therefore

capable of being performed on multiple machines in the workcenter. For each machine

in the workcenter, across all possible scenarios, the maximum and the minimum product

of processing cost and time is first determined. The machines in the workcenter on

which the maximum and minimum occur for each job-operation are also recorded. The

upper limit for the budget of a job is given by the sum of the maximum value of the

product of processing cost and time for each operation and the travel cost between

successive operations on the machine on which this maximum occurs. Similarly, the

lower limit for the budget of a job is given by the sum of the minimum value of the

product of processing cost and time for each operation and the travel cost between

successive operations on the machine on which this minimum occurs. Using a uniform

distribution, the generator picks a random number between the upper and lower limits as

the budget for the job.
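For illustration, the budget calculation just described may be sketched as follows (Java; the data structures and names are hypothetical, and the travel-cost bookkeeping is simplified to a single per-operation figure):

import java.util.Random;

public class BudgetGenerator {

    // costTime[j][m][s] : processing cost x processing time of operation j on machine m under scenario s
    // travelCost[j][m]  : travel cost to the next operation if operation j is performed on machine m
    static double budget(double[][][] costTime, double[][] travelCost, Random rng) {
        double upper = 0.0, lower = 0.0;
        for (int j = 0; j < costTime.length; j++) {
            double bestMax = Double.NEGATIVE_INFINITY, bestMin = Double.POSITIVE_INFINITY;
            int argMax = 0, argMin = 0;
            for (int m = 0; m < costTime[j].length; m++) {
                for (double v : costTime[j][m]) {          // scan all scenarios on this machine
                    if (v > bestMax) { bestMax = v; argMax = m; }
                    if (v < bestMin) { bestMin = v; argMin = m; }
                }
            }
            upper += bestMax + travelCost[j][argMax];      // machine on which the maximum occurs
            lower += bestMin + travelCost[j][argMin];      // machine on which the minimum occurs
        }
        return lower + (upper - lower) * rng.nextDouble(); // uniform pick between the two limits
    }

    public static void main(String[] args) {
        double[][][] costTime = { { {120, 150}, {100, 180} }, { {200, 210}, {190, 260} } };
        double[][] travelCost = { {10, 12}, {0, 0} };      // last operation has no onward travel
        System.out.println("Budget = " + budget(costTime, travelCost, new Random(3)));
    }
}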

Finally, the problem generator also provides upper bounds on the completion times for

job-operations for all jobs. These calculations are performed as described in Section

3.3.8. Although the problem generator is in JAVA, its output (test instance) is generated

in a format acceptable to AMPL as a data file. This data file provides the necessary model

parameters to the AMPL model file which contains the first and second-stage problems

corresponding to the L-shaped method.

3.3.11 Optimality cut for the multicut Algorithm:

The multicut method (Birge and Louveaux, 1988) as described in section 2.7 differs

from the single cut L-shaped method (Van Slyke and Wets, 1969) in that multiple

optimality cuts, each corresponding to a scenario, are added to the master problem in

each major iteration.


Thus, the optimality cut given by Expression (3.64) for the single cut method would, in this case, be generated at every iteration n, for each scenario s, where s = 1, ..., S. This cut is shown below:

\[
\begin{aligned}
&\theta_{s} \;+\; \sum_{i=1}^{N}\sum_{j=0}^{J_i-1} op^{(s,n)}_{(i,j)}\Bigl(\sum_{m\in M_{(i,j+1)}} p^{(m,s)}_{(i,j+1)}\,x^{m}_{(i,j+1)} + \sum_{e\in M_{(i,j)}}\sum_{f\in M_{(i,j+1)}} v^{ef}_{(i,j)(i,j+1)}\,d_{ef}\Bigr)\\
&+\; \sum_{m=1}^{M}\sum_{(i,j)\in Z_m}\sum_{\substack{(k,l)\in Z_m\\ (k,l)\ne(i,j)}} sp^{(m,s,n)}_{(i,j,k,l)}\bigl(p^{(m,s)}_{(k,l)}\,x^{m}_{(k,l)} + H\,y^{m}_{(i,j,k,l)}\bigr)\\
&+\; \sum_{i=1}^{N} bp^{(s,n)}_{i}\Bigl(\sum_{j=1}^{J_i}\sum_{m\in M_{(i,j)}} p^{(m,s)}_{(i,j)}\,c^{m}_{(i,j)}\,x^{m}_{(i,j)} + \sum_{j=1}^{J_i-1}\sum_{e\in M_{(i,j)}}\sum_{f\in M_{(i,j+1)}} v^{ef}_{(i,j)(i,j+1)}\,dc_{ef}\,w_{i}\Bigr)\\
&\ge\; \sum_{i=1}^{N} bp^{(s,n)}_{i}\,b_{i} \;+\; \sum_{m=1}^{M}\sum_{(i,j)\in Z_m}\sum_{\substack{(k,l)\in Z_m\\ (k,l)\ne(i,j)}} sp^{(m,s,n)}_{(i,j,k,l)}\bigl(H - u^{m}_{(i,j,k,l)}\bigr) \;+\; \sum_{i=1}^{N} rp^{(s,n)}_{(i,0)}\,r_{i}
\end{aligned}
\tag{3.123}
\]

Since the Stage-I problem consists solely of binary variables and the Stage-II problem

is continuous, it will be interesting to compare the performance of the multicut method

with the single-cut method on the generated problem instances.

3.3.12 Experimentation - I:

In this section, we present computational results pertaining to

1. The impact of reentrant flow and deadlock prevention constraints on the CPU

time (run time) required to obtain the optimal solution and the number of

feasibility and optimality cuts generated.

2. The run time performance of various models that either employ (a) the standard

optimality and feasibility cut with a big H value fixed at 1000, or (b) the

standard optimality and feasibility cut with a conservative estimate (section

3.3.8) of big H, or (c) the alternative optimality and feasibility cut in conjunction

with a conservative estimate of big H, or (d) the multicut algorithm. The master

problem in all these models uses Expression (3.16) as the objective function and


carries reentrant flow constraints for all the machines as well as two and three

machine deadlock prevention constraints.

3. The run time performance of the above models but where the master problem

uses Expression (3.122) as the objective function.

The test instances (datasets) for the experimentation have been generated to include data

within the following ranges.

Number of jobs: Up to 5 jobs

Number of operations for each job type: Up to 10 operations per job

Number of workcenters: Up to 5

Number of fabs: Up to 3

Set of machines for each fab: Up to 3

Set of machines for each workcenter: Up to 3

Intrafab travel time for each fab: 0.4 hour

Interfab travel time between each pair of fabs in hours: ~ U [8, 13]

Intrafab travel cost for each fab: $ 0.01 per wafer

Interfab travel cost between each pair of fabs: $ 0.4 per wafer

Lower limit of mean processing time for each workcenter in hours: ~U [1, 9]

Upper limit of mean processing time for each workcenter in hours: ~U [2, 11]

Lower limit for the processing time deviation factor: ~ U [0.25, 0.5]

Upper limit for the processing time deviation factor: ~ U [0.4, 0.7]

Lower limit of mean processing cost for each workcenter in dollars per hour: ~ U [30,

150]

Upper limit of mean processing cost for each workcenter in dollars per hour: ~ U [40,

170]

Lower limit for the processing cost deviation factor in dollars: ~U [3, 10]

Upper limit for the processing cost deviation factor in dollars: ~ U[5 , 15]

Number of scenarios: Upto 7

Processing time multiplicative factors: 0.8 – 1.5


Mean of interarrival time: 0.3, 0.4

Number of wafers: 25

Weights for completion time and budget surplus: Equally weighted.

Set up factor: 0.1

X-factor: 1.5, 3.0

Value for big H: 1000

All computations have been carried out on a DELL OPTIPLEX computer with a 3.4

GHz Pentium 4 processor and 1.0 GB of RAM.

3.3.12.1 Experimentation on the Impact of Reentrant Flow and Deadlock

Prevention Constraints on Performance:

In this section, Tables 3.1, 3.2 and 3.3 present a comparison of run-time performance

and Table 3.4 presents the number of feasibility and optimality cuts generated in the

absence and the presence of the reentrant flow and deadlock prevention constraints in

the master problem. The first column in Table 3.1 denotes the instance of a particular

dataset, the second column gives a description of the dataset and the third, fourth and

fifth columns specify the number of machines whose expected traffic (or number of

possible visits by operations) equals 2, 4 or 6 operations. Recall that, every job-operation

has alternative machines on which it can be processed. For every machine, if the set of

operations that it is capable of processing were determined, then the cardinality of this set

gives an estimate of the frequency with which the machine appears on the processing

routes of the various jobs thus providing an estimate of the traffic that the machine

might encounter. In the instances considered in Table 3.1, the expected traffic for a

machine is either 2, 4 or 6 operations. The run time performance of different models is

depicted in columns 6 - 10. Model A relies solely on the feasibility cuts to eliminate

infeasible master problem solutions; Model B employs indirect precedence variables and

reentrant flow constraints for machines whose expected traffic equals 2 and 4 operations;

Model C is similar to Model B but, in addition, incorporates two machine deadlock

prevention constraints.


Table 3.1: Impact of the reentrant flow and deadlock prevention constraints on the CPU time required to obtain the optimal solution

(Columns: No. | Dataset Description | Number of machines with expected traffic = 2 | = 4 | = 6 | CPU time in seconds required by Model A | B | C | D | E)

1 2 2 3 778.61 872.91 855.67 1292.19 1059.22 2 2 2 3 685.80 564.00 583.75 687.20 562.05 3 2 2 3 556.36 495.86 457.72 725.09 641.13 4 2 2 3 479.48 446.70 429.14 611.70 706.77 5

3 jobs, 6 operations/job, 8

machines, 5 scenarios

2 2 3 937.44 794.95 869.83 1558.02 1516.67 6 4 3 1 122.37 114.98 114.62 112.00 111.68 7 4 3 1 234.77 203.44 204.09 186.41 188.31 8 4 3 1 186.80 147.70 148.83 138.72 139.20 9 4 3 1 210.31 182.97 183.80 167.39 168.03 10

3 jobs, 6 operations/job, 8

machines, 5 scenarios

4 3 1 233.78 200.13 202.20 184.45 186.67 11 0 5 2 26.39 21.12 22.43 22.21 23.14 12 0 5 2 51.19 38.02 41.70 37.48 40.09 13 0 5 2 49.72 28.88 30.69 34.91 34.27 14 0 5 2 27.14 9.70 10.42 14.45 14.02 15

3 jobs, 6 operations/job, 8

machines, 5 scenarios

0 5 2 116.72 122.73 110.92 128.28 131.52 16 0 4 2 124.66 64.53 64.05 75.48 71.11 17 0 4 2 164.73 51.92 55.27 75.20 124.19 18 0 4 2 185.09 83.78 89.52 121.08 120.27 19 0 4 2 205.70 119.41 126.97 162.22 154.47 20

3 jobs, 6 operations/job, 8

machines, 5 scenarios

0 4 2 198.89 101.64 111.38 118.42 121.89 21 3 4 1 135.28 119.98 125.64 95.30 96.56 22 3 4 1 103.16 87.53 82.55 71.83 71.92 23 3 4 1 80.52 70.83 69.45 56.05 56.08 24 3 4 1 136.58 121.70 123.33 119.70 119.48 25

3 jobs, 6 operations/job, 8

machines, 5 scenarios

3 4 1 97.75 86.70 87.66 78.13 78.89 26 2 1 3 628.05 491.56 485.78 513.61 571.13 27 2 1 3 570.22 495.09 489.17 556.38 479.53 28 2 1 3 604.86 510.83 512.11 627.38 519.53 29 2 1 3 598.80 469.06 469.78 584.42 470.92 30

3 jobs, 6 operations/job, 8

machines, 5 scenarios

2 1 3 607.22 533.08 537.09 617.28 519.53 31 1 1 4 337.00 193.81 191.73 192.19 217.73 32 1 1 4 302.73 282.72 284.02 790.70 571.39 33 1 1 4 771.34 746.88 749.70 1731.03 1595.39 34 1 1 4 603.55 652.38 651.94 1221.34 1586.05 35

3 jobs, 6 operations/job, 8

machines, 5 scenarios

1 1 4 335.66 212.05 212.80 979.17 532.27 36 2 1 3 302.69 266.66 262.88 301.61 283.00 37 2 1 3 244.11 241.94 258.08 292.83 250.92 38 2 1 3 335.11 244.48 259.41 381.81 281.14 39 2 1 3 651.89 669.13 671.56 1235.80 1006.92 40

3 jobs, 6 operations/job, 8

machines, 5 scenarios

2 1 3 274.48 250.27 250.42 299.77 232.59 41 5 0 3 171.34 159.69 157.45 130.05 126.66 42 5 0 3 142.97 149.55 149.91 159.00 105.36 43 5 0 3 170.22 178.50 178.17 205.66 130.66 44 5 0 3 153.41 147.02 147.25 163.81 117.03 45

3 jobs, 6 operations/job, 8

machines, 5 scenarios

5 0 3 151.16 143.50 143.70 168.47 121.94


Table 3.1 (continued): Impact of the reentrant flow and deadlock prevention constraints on the CPU time required to obtain the optimal solution

(Columns: No. | Dataset Description | Number of machines with expected traffic = 2 | = 4 | = 6 | CPU time in seconds required by Model A | B | C | D | E)

46 2 2 3 521.79 523.42 522.01 834.48 925.50 47 2 2 3 812.66 761.25 762.00 1179.73 1148.83 48 2 2 3 638.95 598.02 674.17 973.11 944.47 49 2 2 3 611.28 638.02 620.45 1048.27 800.84 50

3 jobs, 6 operations/job, 8

machines, 5 scenarios

2 2 3 445.72 421.83 459.98 651.25 662.20 51 2 1 3 589.38 387.30 386.11 335.94 369.27 52 2 1 3 417.47 229.03 241.47 177.75 175.61 53 2 1 3 815.94 634.67 667.41 716.63 730.06 54 2 1 3 607.83 415.84 444.22 352.50 367.77 55

3 jobs, 6 operations/job, 8

machines, 7 scenarios

2 1 3 614.19 475.75 506.16 420.17 328.77 56 2 2 2 999.42 844.59 825.81 716.30 744.44 57 2 2 2 725.80 602.89 646.84 632.83 534.05 58 2 2 2 696.69 670.97 623.27 517.19 568.55 59 2 2 2 713.06 611.67 650.84 570.33 466.91 60

3 jobs, 6 operations/job, 8

machines, 7 scenarios

2 2 2 710.91 546.34 602.41 488.48 468.55 61 3 4 1 143.39 100.11 99.48 83.80 89.67 62 3 4 1 199.48 163.42 162.13 140.89 142.23 63 3 4 1 253.02 186.92 185.83 160.02 161.44 64 3 4 1 181.38 170.88 148.66 123.91 124.47 65

3 jobs, 6 operations/job, 8

machines, 7 scenarios

3 4 1 234.94 179.17 186.55 164.95 165.41 66 4 2 2 76.98 55.30 52.81 56.30 60.20 67 4 2 2 71.45 43.67 42.27 50.28 53.22 68 4 2 2 69.63 44.67 43.34 55.14 54.84 69 4 2 2 68.06 48.00 45.84 54.05 52.11 70

3 jobs, 6 operations/job, 8

machines, 7 scenarios

4 2 2 77.03 54.19 54.11 68.91 66.09 71 4 0 3 657.75 603.11 627.09 1186.05 1149.23 72 4 0 3 2452.12 2566.45 2540.64 3622.45 3578.41 73 4 0 3 265.67 342.80 340.84 342.72 364.52 74 4 0 3 300.42 376.44 391.66 665.91 751.17 75

4 jobs, 4 operations/job, 8

machines, 3 scenarios

4 0 3 417.23 406.94 424.84 755.59 850.91 76 0 6 2 28224.80 32554.8 23916.20 19786.50 15406.60 77 0 6 2 35933.20 23778.6 37752.40 24694.50 28896.50 78 0 6 2 ** 25564.8 26665.70 18865.10 19956.40 79 0 6 2 15054.80 3934.12 4053.70 3780.05 3785.56 80

4 jobs, 4 operations/job, 8

machines, 7 scenarios

0 6 2 30507.60 15037.5 16030.40 9988.36 8611.52 81 1 2 5 2476.30 3302.94 2698.05 7668.64 6271.98 82 1 2 5 3367.19 4062.48 3397.28 6204.02 8745.91 83 1 2 5 2412.23 2335.39 2180.53 3134.52 2555.78 84 1 2 5 1834.28 1824.83 1553.33 1691.09 2318.31 85

4 jobs, 4 operations/job, 8

machines, 7 scenarios

1 2 5 549.52 695.09 637.73 662.33 654.09 86 0 5 3 29514.60 7868.66 8465.91 9601.88 9141.47 87 0 5 3 22328.50 4851.98 3524.41 4974.11 7592.45 88 0 5 3 38680.20 12375.0 11159.20 20679.50 10463.30 89 0 5 3 28733.50 4367.41 4889.69 5264.61 8201.03 90

4 jobs, 4 operations/job, 8

machines, 7 scenarios

0 5 3 38263.50 12783.9 16111.00 21352.50 15694.40

** Exceeded time limit


Model D includes indirect precedence variables and reentrant flow constraints solely for

machines whose expected traffic equals 6 operations. Finally, Model E is similar to

Model D but, in addition, it employs two machine deadlock prevention constraints. Note

that, in Models B-E, any infeasibilities not eliminated by their inherent reentrant flow and

deadlock prevention constraints are handled by the presence of feasibility cuts.

Table 3.2 presents the difference in run times for the different models over the 90

instances presented above whereas Table 3.3 summarizes the run time performance for

the models. Given models X and Y, Table 3.3 records the average difference in

performances, the number of instances where X performed better than Y (frequency),

the largest positive and negative differences between X and Y, where a positive

difference indicates the amount by which X performed worse than Y and a negative

difference indicates the amount by which X performed better than Y. From Tables 3.2

and 3.3 we observe that all models outperform Model A in terms of average performance

with Models B and C performing better than Model A in virtually 80% of the instances

and Models D and E outperforming Model A in nearly 62% of the instances. Between

Models B and C, although Model B on average performs better than C, it does so in

only 33 of the 90 instances. Note that minor differences of up to 7 seconds between run

times for Models B and C have been disregarded because Table 3.4 indicates an identical

performance by both models in terms of the number of feasibility and optimality cuts

generated. This shows that for certain instances both models perform identically and

minor differences in run times can be attributed to the inherent variability in the

performance of the processor. In a comparison of Models D and E, on average

Model E performs better and does so in nearly 62 out of the 90 instances. The

experimental evidence suggests that when there exist one or two machines whose expected traffic equals six, Model E on average and most frequently performs the

best. This is because indirect precedence variables and reentrant flow and deadlock

prevention constraints are specified for very few machines (in this case for a maximum

of two machines) in the master problem and the remaining infeasibilities are handled by

the feasibility cuts. Model E appears to induce just enough complexity into the master

problem so as to yield appreciable computational time savings whilst ensuring that the


master problem is not too heavily saddled in each iteration. As the number of machines

whose expected traffic equals six increases to three or beyond, the run time performance

of Models D and E begins to deteriorate. This behavior is explained by the increasing

number of indirect precedence variables and reentrant flow and deadlock prevention

constraints that must be specified within the master problem causing it to become

sluggish. In these instances, Models B and C offer a better and near identical

performance. Despite closely competing performances by Models B and C, since Model

B performs better in only about a third of the instances, it appears that Model C would

be a better choice.

In light of the experimental evidence, it appears that a judicious incorporation of the reentrant flow and deadlock prevention constraints within the master problem offers

appreciable time savings over the basic model (Model A) that does not include any.

Amongst Models B, C, D and E we see that models that also include the two machine

deadlock prevention constraints (Models C and E) over and above the reentrant flow

constraints perform well.


Table 3.2: Difference in CPU time between models

(Columns: No. | Dataset Description | Difference in CPU time in seconds between Models: A-B | A-C | A-D | A-E | B-C | D-E | B-D | C-E)

1 -94.30 -77.06 -513.58 -280.61 17.24 232.97 -419.28 -203.55 2 121.80 102.05 -1.40 123.75 -19.75 125.15 -123.20 21.70 3 60.50 98.64 -168.73 -84.77 38.14 83.96 -229.23 -183.41 4 32.78 50.34 -132.22 -227.29 17.56 -95.07 -165.00 -277.63 5

3 jobs, 6 operations/job8 machines, 5

scenarios

142.49 67.61 -620.58 -579.23 -74.88 41.35 -763.07 -646.84 6 7.39 7.75 10.37 10.69 0.36 0.32 2.98 2.94 7 31.33 30.68 48.36 46.46 -0.65 -1.90 17.03 15.78 8 39.10 37.97 48.08 47.60 -1.13 -0.48 8.98 9.63 9 27.34 26.51 42.92 42.28 -0.83 -0.64 15.58 15.77 10

3 jobs, 6 operations/job8 machines, 5

scenarios

33.65 31.58 49.33 47.11 -2.07 -2.22 15.68 15.53 11 5.27 3.96 4.18 3.25 -1.31 -0.93 -1.09 -0.71 12 13.17 9.49 13.71 11.10 -3.68 -2.61 0.54 1.61 13 20.84 19.03 14.81 15.45 -1.81 0.64 -6.03 -3.58 14 17.44 16.72 12.69 13.12 -0.72 0.43 -4.75 -3.60 15

3 jobs, 6 operations/job8 machines, 5

scenarios

-6.01 5.80 -11.56 -14.80 11.81 -3.24 -5.55 -20.60 16 60.13 60.61 49.18 53.55 0.48 4.37 -10.95 -7.06 17 112.81 109.46 89.53 40.54 -3.35 -48.99 -23.28 -68.92 18 101.31 95.57 64.01 64.82 -5.74 0.81 -37.30 -30.75 19 86.29 78.73 43.48 51.23 -7.56 7.75 -42.81 -27.50 20

3 jobs, 6 operations/job 8 machines, 5

scenarios

97.25 87.51 80.47 77.00 -9.74 -3.47 -16.78 -10.51 21 15.30 9.64 39.98 38.72 -5.66 -1.26 24.68 29.08 22 15.63 20.61 31.33 31.24 4.98 -0.09 15.70 10.63 23 9.69 11.07 24.47 24.44 1.38 -0.03 14.78 13.37 24 14.88 13.25 16.88 17.10 -1.63 0.22 2.00 3.85 25

3 jobs, 6 operations/job 8 machines, 5

scenarios

11.05 10.09 19.62 18.86 -0.96 -0.76 8.57 8.77 26 136.49 142.27 114.44 56.92 5.78 -57.52 -22.05 -85.35 27 75.13 81.05 13.84 90.69 5.92 76.85 -61.29 9.64 28 94.03 92.75 -22.52 85.33 -1.28 107.85 -116.55 -7.42 29 129.74 129.02 14.38 127.88 -0.72 113.50 -115.36 -1.14 30

3 jobs, 6 operations/job8 machines, 5

scenarios

74.14 70.13 -10.06 87.69 -4.01 97.75 -84.20 17.56 31 143.19 145.27 144.81 119.27 2.08 -25.54 1.62 -26.00 32

20.01 18.71 -487.97 -268.66 -1.30 219.31 -507.98 -287.37


Table 3.2 (continued): Difference in CPU time between models

(Columns: No. | Dataset Description | Difference in CPU time in seconds between Models: A-B | A-C | A-D | A-E | B-C | D-E | B-D | C-E)

33 24.46 21.64 -959.69 -824.05 -2.82 135.64 -984.15 -845.69 34 -48.83 -48.39 -617.79 -982.50 0.44 -364.71 -568.96 -934.11 35

3 jobs, 6 operations/job8 machines, 5

scenarios 123.61 122.86 -643.51 -196.61 -0.75 446.90 -767.12 -319.47 36 36.03 39.81 1.08 19.69 3.78 18.61 -34.95 -20.12 37 2.17 -13.97 -48.72 -6.81 -16.14 41.91 -50.89 7.16 38 90.63 75.70 -46.70 53.97 -14.93 100.67 -137.33 -21.73 39 -17.24 -19.67 -583.91 -355.03 -2.43 228.88 -566.67 -335.36 40

3 jobs, 6

operations/job8 machines, 5

scenarios 24.21 24.06 -25.29 41.89 -0.15 67.18 -49.50 17.83

41 11.65 13.89 41.29 44.68 2.24 3.39 29.64 30.79 42 -6.58 -6.94 -16.03 37.61 -0.36 53.64 -9.45 44.55 43 -8.28 -7.95 -35.44 39.56 0.33 75.00 -27.16 47.51 44 6.39 6.16 -10.40 36.38 -0.23 46.78 -16.79 30.22 45

3 jobs, 6 operations/job 8 machines, 5

scenarios

7.66 7.46 -17.31 29.22 -0.20 46.53 -24.97 21.76 46 -1.63 -0.22 -312.69 -403.71 1.41 -91.02 -311.06 -403.49 47 51.41 50.66 -367.07 -336.17 -0.75 30.90 -418.48 -386.83 48 40.93 -35.22 -334.16 -305.52 -76.15 28.64 -375.09 -270.30 49 -26.74 -9.17 -436.99 -189.56 17.57 247.43 -410.25 -180.39 50

3 jobs, 6 operations/job8 machines, 5

scenarios

23.89 -14.26 -205.53 -216.48 -38.15 -10.95 -229.42 -202.22 51 202.08 203.27 253.44 220.11 1.19 -33.33 51.36 16.84 52 188.44 176.00 239.72 241.86 -12.44 2.14 51.28 65.86 53 181.27 148.53 99.31 85.88 -32.74 -13.43 -81.96 -62.65 54 191.99 163.61 255.33 240.06 -28.38 -15.27 63.34 76.45 55

3 jobs, 6 operations/job 8 machines, 7

scenarios

138.44 108.03 194.02 285.42 -30.41 91.40 55.58 177.39 56 154.83 173.61 283.12 254.98 18.78 -28.14 128.29 81.37 57 122.91 78.96 92.97 191.75 -43.95 98.78 -29.94 112.79 58 25.72 73.42 179.50 128.14 47.70 -51.36 153.78 54.72 59 101.39 62.22 142.73 246.15 -39.17 103.42 41.34 183.93 60

3 jobs, 6 operations/job8 machines, 7

scenarios

164.57 108.50 222.43 242.36 -56.07 19.93 57.86 133.86 61 43.28 43.91 59.59 53.72 0.63 -5.87 16.31 9.81 62 36.06 37.35 58.59 57.25 1.29 -1.34 22.53 19.90 63 66.10 67.19 93.00 91.58 1.09 -1.42 26.90 24.39 64

3 jobs, 6 operations/job8 machines, 7

scenarios 10.50 32.72 57.47 56.91 22.22 -0.56 46.97 24.19


Table 3.2 (continued): Difference in CPU time between models

(Columns: No. | Dataset Description | Difference in CPU time in seconds between Models: A-B | A-C | A-D | A-E | B-C | D-E | B-D | C-E)

65 55.77 48.39 69.99 69.53 -7.38 -0.46 14.22 21.14 66 21.68 24.17 20.68 16.78 2.49 -3.90 -1.00 -7.39 67 27.78 29.18 21.17 18.23 1.40 -2.94 -6.61 -10.95 68 24.96 26.29 14.49 14.79 1.33 0.30 -10.47 -11.50 69 20.06 22.22 14.01 15.95 2.16 1.94 -6.05 -6.27 70

3 jobs, 6 operations/job8 machines, 7

scenarios

22.84 22.92 8.12 10.94 0.08 2.82 -14.72 -11.98 71 54.64 30.66 -528.30 -491.48 -23.98 36.82 -582.94 -522.14 72 -114.33 -88.52 -1170.33 -1126.29 25.81 44.04 -1056.00 -1037.77 73 -77.13 -75.17 -77.05 -98.85 1.96 -21.80 0.08 -23.68 74 -76.02 -91.24 -365.49 -450.75 -15.22 -85.26 -289.47 -359.51 75

4 jobs, 4 operations/job 8 machines, 3

scenarios

10.29 -7.61 -338.36 -433.68 -17.90 -95.32 -348.65 -426.07 76 -4330.00 4308.60 8438.30 12818.20 8638.60 4379.90 12768.30 8509.60 77

12154.60 -1819.20 11238.70 7036.70 -

13973.8 -4202.0 -915.90 8855.90 78 ** ** ** ** -1100.9 -1091.30 6699.70 6709.30 79 11120.68 11001.10 11274.9 11269.24 -119.58 -5.51 154.07 268.14 80

4 jobs, 4 operations/job 8 machines, 7

scenarios

15470.10 14477.20 20519.2 21896.08 -992.90 1376.84 5049.14 7418.88 81 -826.64 -221.75 -5192.34 -3795.68 604.89 1396.66 -4365.70 -3573.93 82 -695.29 -30.09 -2836.83 -5378.72 665.20 -2541.89 -2141.54 -5348.63 83 76.84 231.70 -722.29 -143.55 154.86 578.74 -799.13 -375.25 84 9.45 280.95 143.19 -484.03 271.50 -627.22 133.74 -764.98 85

4 jobs, 4 operations/job8 machines, 7

scenarios

-145.57 -88.21 -112.81 -104.57 57.36 8.24 32.76 -16.36 86 21645.94 21048.69 19912.7 20373.13 -597.25 460.41 -1733.22 -675.56 87 17476.52 18804.09 17354.4 14736.05 1327.57 -2618.34 -122.13 -4068.04 88 26305.20 27521.00 18000.7 28216.90 1215.80 10216.20 -8304.50 695.90 89 24366.09 23843.81 23468. 9 20532.47 -522.28 -2936.42 -897.20 -3311.34 90

4 jobs, 4 operations/job 8 machines, 7

scenarios

25479.60 22152.50 16911.0 22569.10 -

3327.10 5658.10 -8568.60 416.60

Table 3.3: Performance difference between the models

Criteria | A-B | A-C | A-D | A-E | B-C | D-E | B-D | C-E
Average (seconds) | 1704.4 | 1626.3 | 1491.02 | 1638.9 | -89.4 | 134.1 | -136.5 | 86.9
Number of instances in which X outperformed Y in X-Y | 15 | 18 | 34 | 26 | 33 | 28 | 57 | 48
Largest positive difference (seconds) | 26305.2 | 27521 | 23468.89 | 28216.9 | 8638.6 | 10216.2 | 12768.3 | 8855.9
Largest negative difference (seconds) | -4330 | -1819.2 | -5192.34 | -5378.72 | -13973.8 | -4202 | -8568.6 | -5348.6


We now present the impact of reentrant flow and deadlock prevention constraints on the

number of feasibility and optimality cuts generated for each of the models. The

description of the table is similar to Table 3.1 except that the entries here depict the

number of feasibility and optimality cuts generated.

Table 3.4: Impact of the reentrant flow and deadlock prevention constraints on the number of feasibility and optimality cuts generated.

(Columns: No. | Dataset Description | Number of machines with expected traffic = 2 | = 4 | = 6 | Number of cuts (feasibility, optimality) generated by Model A | B | C | D | E)

1 2 2 3 (339,367) (200,396) (243,368) (106,368) (19,364) 2 2 2 3 (491,294) (274,260) (264,292) (109,319) (24,281) 3 2 2 3 (367,267) (204,259) (195,242) (97,254) (20,252) 4 2 2 3 (300,247) (166,242) (165,242) (60,256) (23,282) 5

3 jobs, 6 operations/job, 8

machines, 5 scenarios 2 2 3 (490,402) (311,342) (289,351) (161,364) (24,363)

6 4 3 1 (191,131) (89,131) (90,130) (55,130) (55,130) 7 4 3 1 (181,149) (85,149) (86,148) (57,148) (57,148) 8 4 3 1 (162,113) (74,110) (74,111) (49,113) (49,113) 9 4 3 1 (179,128) (86,132) (86,132) (57,132) (57,132) 10

3 jobs, 6 operations/job, 8

machines, 5 scenarios 4 3 1 (182,147) (84,147) (85,147) (58,147) (58,147)

11 0 5 2 (53,32) (14,32) (14,32) (19,32) (19,32) 12 0 5 2 (49,29) (12,35) (12,36) (15,29) (21,29) 13 0 5 2 (50,27) (11,27) (11,27) (19,27) (18,27) 14 0 5 2 (37,8) (6,8) (6,8) (13,8) (13,8) 15

3 jobs, 6 operations/job, 8

machines, 5 scenarios 0 5 2 (67,84) (46,88) (38,82) (35,87) (36,90)

16 0 4 2 (26,54) (27,54) (33,60) (32,55) (26,54) 17 0 4 2 (170,95) (22,47) (22,47) (37,57) (44,95) 18 0 4 2 (251,70) (30,75) (30,75) (59,84) (61,83) 19 0 4 2 (273,81) (62,97) (61,95) (89,102) (85,107) 20

3 jobs, 6 operations/job, 8

machines, 5 scenarios 0 4 2 (230,99) (20,100) (23,103) (50,92) (50,92)

21 3 4 1 (96,64) (52,92) (54,97) (26,83) (26,83) 22 3 4 1 (96,67) (45,67) (44,63) (19,64) (19,64) 23 3 4 1 (74,55) (35,55) (35,55) (10,54) (10,54) 24 3 4 1 (112,92) (62,80) (62,80) (46,90) (46,90) 25

3 jobs, 6 operations/job, 8

machines, 5 scenarios 3 4 1 (77,72) (36,72) (36,72) (11,76) (11,76)

26 2 1 3 (491,322) (248,306) (248,306) (100,305) (21,344) 27 2 1 3 (457,298) (242,317) (242,317) (90,293) (22,298) 28 2 1 3 (458,322) (237,330) (237,330) (86,339) (22,319) 29 2 1 3 (482,310) (247,298) (247,298) (92,314) (25,304) 30

3 jobs, 6 operations/job, 8

machines, 5 scenarios 2 1 3 (463,318) (246,342) (246,342) (94,305) (25,315)

31 1 1 4 (217,208) (130,115) (130,115) (29,97) (22,109) 32 1 1 4 (218,148) (146,148) (146,148) (86,200) (32,186) 33 1 1 4 (254,343) (181,350) (181,350) (94,340) (31,345) 34 1 1 4 (288,299) (224,318) (224,318) (96,237) (32,246) 35

3 jobs, 6 operations/job, 8

machines, 5 scenarios 1 1 4 (216,200) (128,136) (128,136) (73,247) (23,174)

36 2 1 3 (231,186) (151,177) (151,177) (76,199) (20,191) 37 2 1 3 (202,144) (132,182) (132,182) (51,140) (20,158) 38 2 1 3 (225,207) (140,179) (140,179) (54,178) (19,187) 39 2 1 3 (290,406) (212,411) (212,411) (118,443) (33,458) 40

3 jobs, 6 operations/job, 8

machines, 5 scenarios 2 1 3 (237,149) (157,161) (157,161) (69,156) (24,156)


Table 3.4 (continued): Impact of the reentrant flow and deadlock prevention constraints on the number of feasibility and optimality cuts generated.

(Columns: No. | Dataset Description | Number of machines with expected traffic = 2 | = 4 | = 6 | Number of cuts (feasibility, optimality) generated by Model A | B | C | D | E)

41 5 0 3 (136,105) (116,101) (116,101) (41,87) (1,100) 42 5 0 3 (122,86) (120,93) (120,93) (33,100) (1,89) 43 5 0 3 (130,105) (128,113) (128,113) (56,113) (1,102) 44 5 0 3 (134,89) (119,91) (119,91) (48,96) (1,96) 45

3 jobs, 6 operations/job, 8

machines, 5 scenarios

5 0 3 (120,93) (105,93) (105,93) (37,98) (1,96) 46 2 2 3 (400,360) (244,362) (234,352) (101,365) (20,370) 47 2 2 3 (466,353) (239,358) (239,356) (97,365) (16,382) 48 2 2 3 (249,341) (175,332) (161,321) (94,349) (24,329) 49 2 2 3 (452,297) (247,330) (253,312) (119,335) (25,328) 50

3 jobs, 6 operations/job, 8

machines, 5 scenarios

2 2 3 (296,255) (146,243) (171,260) (73,263) (19,277) 51 2 1 3 (450,213) (186,201) (186,201) (70,202) (27,227) 52 2 1 3 (398,115) (143,115) (143,115) (37,112) (14,111) 53 2 1 3 (476,344) (245,347) (245,347) (131,358) (33,366) 54 2 1 3 (435,233) (183,236) (183,236) (68,202) (21,214) 55

3 jobs, 6 operations/job, 8

machines, 7 scenarios

2 1 3 (487,209) (223,256) (223,256) (99,221) (22,188) 56 2 2 2 (538,400) (319.381) (307,394) (108,350) (26,376) 57 2 2 2 (477,290) (297,292) (302,294) (140,286) (31,299) 58 2 2 2 (515,264) (298,335) (306,284) (121,269) (25,293) 59 2 2 2 (479,285) (276,318) (274,317) (95,292) (21,275) 60

3 jobs, 6 operations/job, 8

machines, 7 scenarios

2 2 2 (477,282) (276,265) (275,279) (90,267) (20,274) 61 3 4 1 (105,59) (35,59) (35,59) (18,59) (18,59) 62 3 4 1 (121,93) (53,91) (52,90) (32,90) (32,90) 63 3 4 1 (187,98) (62,99) (56,99) (40,97) (40,97) 64 3 4 1 (130,75) (61,92) (53,82) (32,77) (32,77) 65

3 jobs, 6 operations/job, 8

machines, 7 scenarios

3 4 1 (164,97) (53,101) (52,101) (41,101) (41,101) 66 4 2 2 (54,34) (20,34) (16,34) (24,34) (25,34) 67 4 2 2 (53,31) (11,30) (8,30) (18,30) (19,32) 68 4 2 2 (54,29) (14,29) (11,29) (27,29) (27,29) 69 4 2 2 (49,30) (17,30) (13,30) (23,30) (21,30) 70

3 jobs, 6 operations/job, 8

machines, 7 scenarios

4 2 2 (55,34) (20,33) (17,34) (28,35) (28,35) 71 4 0 3 (138,559) (128,530) (128,530) (23,545) (4,576) 72 4 0 3 (179,936) (170,936) (170,936) (36,929) (4,921) 73 4 0 3 (129,275) (150,354) (150,354) (18,298) (5,302) 74 4 0 3 (148,289) (149,375) (149,375) (30,342) (3,372) 75

4 jobs, 4 operations/job, 8

machines, 3 scenarios

4 0 3 (157,396) (157,402) (157,402) (44,459) (3,448) 76 0 6 2 (3409,1407) (336,1511) (294,1410) (239,1520) (133,1448) 77 0 6 2 (2879,1719) (329,1636) (307,1885) (223,1595) (121,1669) 78 0 6 2 ** (368,1207) (385,1266) (282,1172) (172,1178) 79 0 6 2 (3369,675) (224,732) (223,737) (154,719) (77,719) 80

4 jobs, 4 operations/job, 8

machines, 7 scenarios

0 6 2 (4742,1215) (440,1156) (414,1114) (286,1016) (178,1058) 81 1 2 5 (390,728) (339,783) (325,702) (116,673) (24,699) 82 1 2 5 (640,987) (562,1086) (541,960) (132,1092) (15,1269) 83 1 2 5 (1244,593) (918,610) (888,591) (146,702) (4,629) 84 1 2 5 (802,490) (640,527) (639,467) (140,420) (10,461) 85

4 jobs, 4 operations/job, 8

machines, 7 scenarios

1 2 5 (420,193) (380,254) (366,233) (66,237) (9,238) 86 0 5 3 (3615,1137) (375.1202 (357,1283) (350,1044) (183,1153) 87 0 5 3 (3256,632) (381,909) (333,682) (319,772) (177,961) 88 0 5 3 (3571,1506) (375,1529) (329,1255) (314,1625) (183,1396) 89 0 5 3 (3837,1022) (359,755) (332,881) (287,830) (193,948) 90

4 jobs, 4 operations/job, 8

machines, 7 scenarios

0 5 3 (4634,1483) (357,1486) (334,1580) (279,1613) (160,1361)


In the following section (see section 3.3.12.2), we compare the performances of four

models on generated test instances (see Table 3.5 for details). All four models include

reentrant flow constraints for all the machines and two and three machine deadlock

prevention constraints within the master problem. Models 1 and 2 differ in that Model 1 assigns to H a fixed value of 1000, whereas Model 2 uses a job-operation-dependent

completion time upper bound (Expression (3.121) under Section 3.3.8) as the value for H

in the solution of the sub-problems as well as for the feasibility and optimality cuts.

Model 3 utilizes a modified disjunctive constraint in conjunction with lower-upper

bounded continuous variables in the subproblems. The resulting optimality and feasibility

cuts differ from those used in Models 1 and 2. Here, the Stage-I (master) problem carries

additional binary variables compared to those in Models 1 and 2. Finally, Model 4 differs from the previous models in that it generates an optimality cut for each scenario, as opposed to a single optimality cut aggregated over all scenarios. Models 3 and 4 use the same

upper bound for H as used in Model 2.

3.3.12.2 Run Time Performance of Different Models with a Time Based

Objective Function.

Table 3.5: Description of Models tested

No. | Algorithm Component | Model 1 | Model 2 | Model 3 | Model 4
1 | Master Problem Objective Function | 3.16* | 3.16 | 3.16 | 3.16
2 | First-stage/Master Problem | MMP1 | MMP1 | AMP | MMP1
3 | Phase-I Second Stage/Subproblem | ARP | ARP (3.121 as H) | AMSSP (3.121 as H) | ARP (3.121 as H)
4 | Feasibility Cut | 3.46 | 3.46 (3.121 as H) | 3.118 (3.121 as H) | 3.46 (3.121 as H)
5 | Phase-II Second Stage/Subproblem | RP | RP (3.121 as H) | MSSP (3.121 as H) | RP (3.121 as H)
6 | Optimality Cut | 3.64 | 3.64 (3.121 as H) | 3.100 (3.121 as H) | 3.123 (3.121 as H)

*The numerical entries in the table represent expression numbers.


Table 3.6: Run time performance of models with a time based objective for dataset

2/6/8/5/3

No. | Description | Dataset | Model | Objective Value | Time (secs) | Feasibility cuts | Optimality cuts

Model 1 1764.23 3.68 0 12 1 2/6/8/5/3 2121 Model 2 1764.23 3.54 0 10 Model 3 1764.23 3.875 1 11 Model 4 1764.23 3.59 0 30 2122 Model 1 1431.52 2.29 0 7 Model 2 1431.52 2.171 0 7 Model 3 1431.52 1.95 0 7 Model 4 1431.52 2.375 0 21 2123 Model 1 1143.99 2.65 0 9 Model 2 1143.99 2.57 0 9 Model 3 1143.99 2.0 0 7 Model 4 1143.99 2.57 0 27 2124 Model 1 670.62 6.04 0 16 Model 2 670.62 3.73 0 18 Model 3 670.62 5.73 0 17 Model 4 670.62 3.78 0 45 Model 1 1393.8 2.46 0 12 2125 Model 2 1393.8 2.765 0 11 Model 3 1393.8 2.70 0 9 Model 4 1393.8 2.01 0 33 Model 1 1269.78 3.96 0 15 2126 Model 2 1269.78 2.45 0 15 Model 3 1269.78 4.06 0 15 Model 4 1269.78 2.32 0 42 Model 1 1672.02 2.1 0 12 2127 Model 2 1672.02 1.78 0 12 Model 3 1672.02 2.73 0 10

Model 4 1672.02 2.17 0 39
* Model 1: Standard optimality and feasibility cut with a big H value of 1000.
* Model 2: Standard optimality and feasibility cut with a conservative (operation due-date based) estimate of big H.
* Model 3: Alternative optimality and feasibility cut with a conservative estimate of big H.
* Model 4: Multicut algorithm.


Table 3.7: Run time performance of models with a time based objective for dataset

3/4/8/5/3

* Model 1: Standard optimality and feasibility cut with a big H value of 1000 * Model 2: Standard optimality and feasibility cut with a conservative (operation due-date based) estimate of big H. * Model 3: Alternative optimality and feasibility cut with a conservative estimate of big H. * Model 4: Multicut algorithm.

No. | Description | Dataset | Model | Objective Value | Time (secs) | Feasibility cuts | Optimality cuts

Model 1 1573.33 32.01 0 80 2 3/4/8/5/3 3121 Model 2 1573.33 33.42 0 82 Model 3 1573.33 33.23 0 70 Model 4 1573.33 36.78 0 234 3122 Model 1 3065.16 48.85 0 115 Model 2 3065.16 46.04 0 113 Model 3 3065.35 63.92 20 101 Model 4 3065.16 53.26 0 303 3123 Model 1 1573.39 3.65 0 12 Model 2 1573.39 3.51 0 11 Model 3 1573.39 3.46 1 11 Model 4 1573.39 3.09 0 30 3124 Model 1 1856.2 74.45 0 169 Model 2 1856.2 59.87 0 159 Model 3 1856.2 69.51 0 133 Model 4 1856.2 82.65 0 477 Model 1 2395.42 8.18 0 19 3125 Model 2 2395.42 7.2 0 17 Model 3 2395.42 9.25 0 18 Model 4 2395.42 6.25 0 45 Model 1 469.06 2.39 0 7 3126 Model 2 469.06 2.35 0 7 Model 3 469.06 2.79 0 7 Model 4 469.06 2.46 0 21 Model 1 615.07 6.04 0 16 3127 Model 2 615.07 6.18 0 16 Model 3 615.07 4.84 0 16

Model 4 615.07 6.23 0 48


Table 3.8: Run time performance of models with a time based objective for dataset

4/4/8/5/3

No. | Description | Dataset | Model | Objective Value | Time (secs) | Feasibility cuts | Optimality cuts

Model 1 1760.93 985.17 0 567 3 4/4/8/5/3 4161 Model 2 1760.93 735.95 6 444 Model 3 1760.93 1194.72 21 422 Model 4 1760.93 1232.52 2 1233 4162 Model 1 1588.52 5.56 0 18 Model 2 1588.52 5.75 0 18 Model 3 1588.52 5.79 0 18 Model 4 1588.52 6.06 0 54 4163 Model 1 1224.67 52.14 0 105 Model 2 1224.67 44.9 0 96 Model 3 1224.67 65.54 1 98 Model 4 1224.67 51.85 0 258 4164 Model 1 698.01 22.56 1 58 Model 2 698.01 18.17 1 40 Model 3 698.01 31.51 1 54 Model 4 698.01 15.84 1 111 Model 1 962.93 4.81 0 15 4165 Model 2 962.93 4.43 0 13 Model 3 962.93 4.43 0 13 Model 4 962.93 4.25 0 39 Model 1 675.69 4.26 0 13 4166 Model 2 675.69 5.14 0 15 Model 3 675.69 5.09 0 16 Model 4 675.69 4.95 0 45 Model 1 381.43 2.18 0 7 4167 Model 2 381.43 2 0 6 Model 3 381.43 2.76 0 8 Model 4 381.43 2.04 0 18

* Model 1: Standard optimality and feasibility cut with a big H value of 1000 * Model 2: Standard optimality and feasibility cut with a conservative (operation due-date based) estimate of big H. * Model 3: Alternative optimality and feasibility cut with a conservative estimate of big H. * Model 4: Multicut algorithm.


Table 3.9: Run time performance of models with a time based objective for dataset

3/6/8/5/3

No. | Description | Dataset | Model | Objective Value | Time (secs) | Feasibility cuts | Optimality cuts

Model 1 1938.1 9.64 0 28 4 3/6/8/5/3 318 Model 2 1938.1 7.84 0 23 Model 3 1938.1 8.20 4 20 Model 4 1938.1 8.03 0 60 3181_1 Model 1 1469.04 36.89 0 60 Model 2 1469.04 34.46 0 59 Model 3 1469.04 55.23 3 49 Model 4 1469.04 41.18 0 150 3182_1 Model 1 654.19(gap) 4390.83 0 800 Model 2 147.90(gap) 5056.91 0 700 Model 3 228.19(gap) 3666.17 1 500 Model 4 1183.91 5704.59 0 1734 3183 Model 1 1215.43 149.188 2 122 Model 2 1215.43 75.89 1 95 Model 3 1215.43 108.7 0 91 Model 4 1215.43 79.28 2 243 Model 1 571.73 21.85 0 40 3184 Model 2 571.73 22.26 0 40 Model 3 571.73 29.28 0 30 Model 4 571.73 34.09 0 126 Model 1 693.08 25.98 0 46 3187 Model 2 693.08 14.34 0 31 Model 3 693.08 31.71 2 37 Model 4 693.08 21.23 0 102 Model 1 1644.3 88.6 1 94 3188 Model 2 1644.3 80.14 1 97 Model 3 1644.3 142.219 3 99 Model 4 1644.3 71.46 0 225

* Model 1: Standard optimality and feasibility cut with a big H value of 1000 * Model 2: Standard optimality and feasibility cut with a conservative (operation due-date based) estimate of big H. * Model 3: Alternative optimality and feasibility cut with a conservative estimate of big H. * Model 4: Multicut algorithm.


Table 3.10: Run time performance of models with a time based objective for dataset

3/6/8/5/5

No. | Description | Dataset | Model | Objective Value | Time (secs) | Feasibility cuts | Optimality cuts

Model 1 1483.58 41.25 0 56 5 3/6/8/5/5 318S1 Model 2 1483.58 34.82 2 59 Model 3 1483.58 35.87 3 53 Model 4 1483.58 34.42 1 210 3181_1S1 Model 1 1086.5 159.45 0 90 Model 2 1086.5 127.85 0 79 Model 3 1086.5 140.01 0 51 Model 4 1086.5 117.6 0 270 3183S1 Model 1 618.197 54.03 3 68 Model 2 618.197 43.21 3 67 Model 3 618.917 33.17 0 53 Model 4 618.917 54.125 2 310 Model 1 973.11 4.54 0 8 3184S1 Model 2 973.11 3.53 0 8 Model 3 973.11 4.98 0 10 Model 4 973.11 4.6 0 40 Model 1 997.47 391.312 0 315 3187S1 Model 2 997.47 411.56 1 329 Model 3 997.47 596.25 0 288 Model 4 997.47 717.62 0 1525 Model 1 513.08 94.95 0 90 3188S1 Model 2 513.08 126.2 3 97 Model 3 513.08 225.5 0 97 Model 4 513.08 264.23 3 585

* Model 1: Standard optimality and feasibility cut with a big H value of 1000 * Model 2: Standard optimality and feasibility cut with a conservative (operation due-date based) estimate of big H. * Model 3: Alternative optimality and feasibility cut with a conservative estimate of big H. * Model 4: Multicut algorithm.


In a comparison of the four models we observe from the experimental results that model

2 almost always outperforms model 1 in terms of computational time. It also generates

fewer optimality cuts. Model 2 differs from Model 1 in that H is replaced by a

conservative estimate as described in section 3.3.8 whereas H is fixed at a value of 1000

in Model 1. Model 3, which uses the alternative optimality cut, requires fewer iterations to converge to the optimum in most test instances than Models 1 and 2. However, it pays a higher premium in computational time per iteration; indeed, Model 3 takes the longest computational time of all the models in solving the problems. The reason for this is that Model 3 adds extra binary variables as well as

extra constraints given by Expressions (3.71) and (3.81) to the master problem.

Moreover, a number of continuous variables as well as constraints are added to the

subproblem. Another point to be noted here is that when the upper bound in

Expressions (3.108), (3.109), (3.110) and (3.111) is set too tight, the second-stage

problem (AMSSP) could be rendered infeasible thus necessitating a feasibility cut and

ultimately leading to the master problem having to be re-solved. All of these factors work

either independently or in unison to contribute to the increased computational times

observed in the case of model 3.

Finally, Model 4 which implements the multicut version of the L-shaped method

generates the maximum number of optimality cuts because an optimality cut pertaining

to each scenario is added to the master problem as opposed to an aggregate cut that is

added in Models 1, 2 and 3. On the one hand, the larger number of optimality cuts reflects the shape of the second-stage (recourse) function more accurately and may result in fewer overall iterations; on the other hand, these cuts burden the master problem more heavily as they accumulate over the iterations. Therefore, depending on the trade-

off between the total overall number of iterations required to converge to the optimum

versus the computational overhead incurred in solving a more complicated master

problem, model 4 exhibits a variable performance in terms of computational time for the

various data sets.
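To make this distinction concrete, the following sketch (in Python) contrasts the single aggregated optimality cut used by Models 1, 2 and 3 with the per-scenario cuts used by Model 4. It is only an illustrative sketch: the ScenarioCut container and its coefficient fields are hypothetical placeholders for the dual-based quantities of Expressions (3.64) and (3.123), which are not reproduced here.

    # Illustrative sketch only: placeholder cut data, not the actual Expressions (3.64)/(3.123).
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class ScenarioCut:
        coeff_x: List[float]   # coefficients on the assignment variables x (placeholder)
        coeff_y: List[float]   # coefficients on the sequencing variables y (placeholder)
        rhs: float             # right-hand side derived from the scenario subproblem duals

    def aggregated_cut(cuts: List[ScenarioCut], prob: List[float]) -> ScenarioCut:
        """Models 1-3: one probability-weighted cut on theta per master iteration."""
        agg = ScenarioCut([0.0] * len(cuts[0].coeff_x), [0.0] * len(cuts[0].coeff_y), 0.0)
        for p, c in zip(prob, cuts):
            agg.coeff_x = [a + p * b for a, b in zip(agg.coeff_x, c.coeff_x)]
            agg.coeff_y = [a + p * b for a, b in zip(agg.coeff_y, c.coeff_y)]
            agg.rhs += p * c.rhs
        return agg             # master receives: coeff_x*x + coeff_y*y + theta >= rhs

    def multicut(cuts: List[ScenarioCut]) -> List[ScenarioCut]:
        """Model 4: one cut per scenario (typically paired with one theta_s per scenario)."""
        return list(cuts)      # master receives |S| cuts per iteration instead of one

The aggregated version adds a single row to the master problem per iteration, whereas the multicut version adds one row per scenario; the latter approximates the recourse function more finely at the cost of a faster-growing master problem, which is exactly the trade-off observed in the tables above.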

In the following section, we once again compare the performances of the four models,

but we now do so with the alternative Stage-I objective function (see section 3.3.9).


3.3.12.3 Run Time Performance of Different Models with a Cost Based Objective

Function.

Table 3.11: Description of Models Tested

No. | Algorithm Component | Model 1 | Model 2 | Model 3 | Model 4
1 | Master Problem Objective Function | 3.122 | 3.122 | 3.122 | 3.122
2 | First-stage/Master Problem | MMP1 | MMP1 | AMP | MMP1
3 | Phase-I Second-stage/Subproblem | ARP | ARP (3.121 as H) | AMSSP (3.121 as H) | ARP (3.121 as H)
4 | Feasibility Cut | 3.46 | 3.46 (3.121 as H) | 3.118 (3.121 as H) | 3.46 (3.121 as H)
5 | Phase-II Second-stage/Subproblem | RP | RP (3.121 as H) | MSSP (3.121 as H) | RP (3.121 as H)
6 | Optimality Cut | 3.64 | 3.64 (3.121 as H) | 3.100 (3.121 as H) | 3.123 (3.121 as H)

*The numerical entries in the table represent expression numbers.


Table 3.12: Run time performance of models with a cost based objective for dataset

2/6/8/5/3

No. | Description | Dataset | Model | Objective Value | Time (secs) | Feasibility cuts | Optimality cuts

Model 1 7534.99 3.76 0 11 1 2/6/8/5/3 2121 Model 2 7534.99 3.26 3 8 Model 3 7534.99 5.71 3 13 Model 4 7534.99 3.35 3 24 2122 Model 1 6388.67 2.15 0 7 Model 2 6388.67 2.18 0 7 Model 3 6388.67 1.98 0 7 Model 4 6388.67 2.04 0 21 2123 Model 1 8147.82 2.82 0 9 Model 2 8147.82 2.54 0 9 Model 3 8147.82 1.81 0 7 Model 4 8147.82 2.79 0 18 2124 Model 1 8005.66 4.01 0 11 Model 2 8005.16 4.125 0 11 Model 3 8005.16 4.04 0 11 Model 4 8005.16 3.82 0 33 Model 1 5926.34 4.56 0 12 2125 Model 2 5926.34 4.2 0 11 Model 3 5926.34 2.48 0 9 Model 4 5926.34 4.09 0 33 Model 1 8144.81 5.37 0 15 2126 Model 2 8144.81 5.35 0 15 Model 3 8144.81 4.12 0 15 Model 4 8144.81 4.85 0 42 Model 1 8079.89 4.29 0 12 2127 Model 2 8079.89 4.28 0 12 Model 3 8079.89 1.89 0 7 Model 4 8079.89 4.4 0 36

* Model 1: Standard optimality and feasibility cut with a big H value of 1000 * Model 2: Standard optimality and feasibility cut with a conservative (operation due-date based) estimate of big H. * Model 3: Alternative optimality and feasibility cut with a conservative estimate of big H. * Model 4: Multicut algorithm.


Table 3.13: Run time performance of models with a cost based objective for dataset

3/4/8/5/3

* Model 1: Standard optimality and feasibility cut with a big H value of 1000 * Model 2: Standard optimality and feasibility cut with a conservative (operation due-date based) estimate of big H. * Model 3: Alternative optimality and feasibility cut with a conservative estimate of big H. * Model 4: Multicut algorithm.

No. | Description | Dataset | Model | Objective Value | Time (secs) | Feasibility cuts | Optimality cuts

Model 1 10435.01 31.32 0 80 2 3/4/8/5/3 3121 Model 2 10435.01 31.07 0 79 Model 3 10435.01 35.64 0 77 Model 4 10435.01 34.39 0 228 3122 Model 1 12289.15 44.79 0 108 Model 2 12289.15 50.82 0 115 Model 3 12289.15 64.29 16 106 Model 4 12289.15 45.29 0 279 3123 Model 1 7396.34 3.53 0 12 Model 2 7396.34 3.21 0 11 Model 3 7396.34 3.29 1 11 Model 4 7396.34 3.23 0 33 3124 Model 1 10133.54 74.56 0 169 Model 2 10133.54 65.56 0 159 Model 3 10133.54 68.5 0 131 Model 4 10133.54 85.92 0 471 Model 1 9807.48 4.6 0 12 3125 Model 2 9807.48 5.25 0 13 Model 3 9807.48 5.65 0 13 Model 4 9807.48 4.34 0 33 Model 1 7408.41 2.29 0 7 3126 Model 2 7408.41 2.64 0 7 Model 3 7408.41 2.79 0 7 Model 4 7408.41 2.53 0 21 Model 1 4527.02 5.32 0 14 3127 Model 2 4527.02 5.62 0 14 Model 3 4527.02 4.03 0 14 Model 4 4527.02 5.6 0 42


Table 3.14: Run time performance of models with a cost based objective for dataset

4/4/8/5/3

* Model 1: Standard optimality and feasibility cut with a big H value of 1000 * Model 2: Standard optimality and feasibility cut with a conservative (operation due-date based) estimate of big H. * Model 3: Alternative optimality and feasibility cut with a conservative estimate of big H. * Model 4: Multicut algorithm.

No. | Description | Dataset | Model | Objective Value | Time (secs) | Feasibility cuts | Optimality cuts

Model 1 11633.58 5.84 0 17

3 4/4/8/5/3 4162 Model 2 11633.58 5.76 0 17 Model 3 11633.58 5.18 0 17 Model 4 11633.58 5.79 0 51 4163 Model 1 8515.08 26.89 0 63 Model 2 8515.08 24.79 0 58 Model 3 8515.08 22.04 1 52 Model 4 8515.08 22.18 0 150 4164 Model 1 5175.59 23.18 0 55 Model 2 5175.59 19.42 0 45 Model 3 5175.59 23.14 2 51

Model 4 5175.59 19.82 0 129 Model 1 6098.7 7.25 0 21 4165 Model 2 6098.7 8.9 0 24 Model 3 6098.7 8.92 1 26 Model 4 6098.7 6.5 0 54 Model 1 5113 6.14 0 18 4166 Model 2 5113 6.4 0 18 Model 3 5113 6.32 0 21 Model 4 5113 6.46 0 54 Model 1 5043.33 2.07 0 6 4167 Model 2 5043.33 2.07 0 6 Model 3 5043.33 1.625 0 6 Model 4 5043.33 2.42 0 18


Table 3.15: Run time performance of models with a cost based objective for dataset

3/6/8/5/3

No. | Description | Dataset | Model | Objective Value | Time (secs) | Feasibility cuts | Optimality cuts

Model 1 10904.36 10.9 0 28 4 3/6/8/5/3 318 Model 2 10904.36 8.17 0 21 Model 3 10904.36 6.71 1 19 Model 4 10904.36 7.35 0 57 3181 Model 1 11808.11 19.71 0 38 Model 2 11808.11 20.21 0 37 Model 3 11808.11 15.32 0 25 Model 4 11808.11 13.84 0 75 3182_1 Model 1 11054.05 1123.7 1 371 Model 2 11054.05 383.219 0 204 Model 3 11054.05 670.9 1 166 Model 4 11054.05 345.37 0 483 3183 Model 1 7706.86 181.266 1 145 Model 2 7706.86 33.85 0 61 Model 3 7706.86 40.45 0 68 Model 4 7706.86 53.51 0 228 Model 1 9041.58 23.06 0 47 3184 Model 2 9041.58 24.15 0 46 Model 3 9041.58 27.93 0 43 Model 4 9041.58 22.9 0 126 Model 1 9345.89 7.98 1 20 3187 Model 2 9345.89 8.9 1 22 Model 3 9345.89 10.93 2 25 Model 4 9345.89 9.65 1 69 Model 1 10517.98 22.04 2 39 3188 Model 2 10517.98 16.35 2 31 Model 3 10517.98 14.23 0 26 Model 4 10517.98 15.43 1 87

* Model 1: Standard optimality and feasibility cut with a big H value of 1000 * Model 2: Standard optimality and feasibility cut with a conservative (operation due-date based) estimate of big H. * Model 3: Alternative optimality and feasibility cut with a conservative estimate of big H. * Model 4: Multicut algorithm.


Table 3.16: Run time performance of models with a cost based objective for dataset

3/6/8/5/5

No. | Description | Dataset | Model | Objective Value | Time (secs) | Feasibility cuts | Optimality cuts

Model 1 9173.69 46.03 3 58 5 3/6/8/5/5 318S1 Model 2 9173.69 35.06 3 52 Model 3 9173.68 42.25 3 67 Model 4 9173.69 37.96 3 260 3181_1S1 Model 1 6730.58 89.875 0 62 Model 2 6730.58 59.7 0 48 Model 3 6730.58 81.01 0 43 Model 4 6370.58 23.5 0 120 3183S1 Model 1 6611.21 48.06 3 64 Model 2 6611.21 41 3 57 Model 3 6611.21 27.14 0 50 Model 4 6611.21 42.85 2 265 Model 1 7623.41 34.28 0 51 3184S1 Model 2 7623.41 35.57 0 51 Model 3 7623.41 38.18 0 53 Model 4 7623.41 36.65 0 240 Model 1 8481.91 866.797 0 486 3187S1 Model 2 8481.91 664.34 1 362 Model 3 8481.91 2299.91 0 408 Model 4 8481.91 1117.5 0 1720 Model 1 5846.76 80.98 4 71 3188S1 Model 2 5846.76 69.15 0 66 Model 3 5846.76 80.62 8 59 Model 4 5846.76 81.31 0 300

* Model 1: Standard optimality and feasibility cut with a big H value of 1000 * Model 2: Standard optimality and feasibility cut with a conservative (operation due-date based) estimate of big H. * Model 3: Alternative optimality and feasibility cut with a conservative estimate of big H. * Model 4: Multicut algorithm.


Results from Section 3.3.12.3 reveal a similar trend to that observed in Section 3.3.12.2.

A comparison of results shows that the computational time taken for the different data

sets remains more or less the same in both cases. For the case of 3 jobs, 6

operations/job, 8 machines, 5 workcenters and 3 scenarios, results from Section 3.3.12.3

seem to outperform those from Section 3.3.12.2. For the above case, when the number

of scenarios is increased to 5, we see that for two of the data sets in this category, the

converse is true.

It is evident from the tabulated results that the time required to solve these problems

using the L-shaped method regardless of the master problem objective function or the

model used is fairly large. One of the many reasons is that the stochastic NMJS problem

lacks relatively complete recourse and therefore additional constraints (either by means of

feasibility cuts or through pre-built reentrant flow and deadlock prevention constraints)

must be added to the master problem to restore second-stage feasibility.

Among the four models, those that contain bounds on the continuous variables in the

Stage-II problem may be more prone to generate feasibility cuts. In certain cases, although a master problem solution may be feasible (i.e., it satisfies the reentrant flow and deadlock prevention constraints), it may result in completion time values for certain operations that exceed these bounds if the bounds are set too tight. The large master problem

and its repeated resolution until feasibility is achieved is believed to be one of the main

drivers behind the large computation times seen.

The L-shaped method also suffers from a few drawbacks: all of the feasibility and optimality cuts must be retained throughout the computation to ensure convergence. Moreover, during the early iterations, before a reasonably good approximation of the second-stage (recourse) function becomes available, the steps taken tend to be large, and the method is therefore not well suited to exploiting a superior starting solution.

The experimental evidence gathered so far appears to suggest that a simple master

problem with a balanced trade-off between pre-built constraints (such as those for

reentrant flow and deadlock prevention) versus externally added feasibility cuts in


conjunction with a conservative estimate for big H in the second-stage problem may be

most conducive to solving the stochastic NMJS problem using the L-shaped method.

Recall that the master problem in the L-shaped method for the NMJS problem (section

3.3) is composed solely of binary integer variables. Laporte and Louveaux (1993) present

an integer L- shaped method for the case where some of the first stage variables are

binary. The integer L-shaped method incorporates the classical L-shaped method within

a branch and bound scheme in order to resolve the integrality of the master problem

variables. The steps involved in the integer L-shaped algorithm as applied to the NMJS

problem are presented in the following section.

3.3.13 The Integer L-shaped Algorithm for the NMJS Problem: We define the current problem as follows:

Minimize

  Σ_{s=1}^{S} p_s [ Σ_{i=1}^{N} Σ_{j=1}^{J_i} Σ_{m∈M_{(i,j)}} φ^m_{(i,j)s} x^m_{(i,j)} + Σ_{m∈M} Σ_{(i,j)∈Z_m} Σ_{(k,l)∈Z_m} u^m_{(i,j)(k,l)s} y^m_{(i,j)(k,l)} ]
  + Σ_{i=1}^{N} Σ_{j=0}^{J_i−1} Σ_{e∈M_{(i,j)}} Σ_{f∈M_{(i,j+1)}} d_{(e,f)} v^{(e,f)}_{(i,j)(i,j+1)} + θ

subject to (3.17), (3.18), (3.19), (3.20), (3.21) and

  A_m x + B_m y + C_m v ≥ γ_m,          m = 1, …, fCUT
  P_n x + Q_n y + R_n v + θ ≥ λ_n,      n = 1, …, nCUT

  x^m_{(i,j)} ∈ [0,1]              ∀ i = 1, …, N; j = 1, …, J_i; m ∈ M_{(i,j)}
  y^m_{(i,j)(k,l)} ∈ [0,1]         ∀ m = 1, …, M; (i,j) ∈ Z_m; (k,l) ∈ Z_m; (i,j) ≠ (k,l)
  v^{(e,f)}_{(i,j)(i,j+1)} ∈ [0,1] ∀ i = 1, …, N; j = 0, …, J_i − 1; e ∈ M_{(i,j)}; f ∈ M_{(i,j+1)}

Let q be the iteration index, fCUT and nCUT be the number of feasibility and optimality cuts generated so far, and Ẑ be the objective value of the best integer first-stage solution found so far.


Algorithm:

Step 1: Set q = 0, fCUT = 0, nCUT = 0, Ẑ = ∞. The value of θ is set equal to −∞ or an appropriate lower bound and is ignored in the computation. The node list presently consists of one node corresponding to the initial current problem.

Step 2: Pick a node from the list. If the list is empty; stop.

Step 3: Set q = q + 1. Solve the current problem. If the current problem has no feasible solution, fathom the current node and go to Step 2. Otherwise, let (x^q, y^q, v^q, θ^q) be the optimal solution.

Step 4: If p x^q + u y^q + d v^q + θ^q > Ẑ, fathom the current node and return to Step 2.

Step 5: If integrality restrictions are violated, pick a fractional variable and create two

nodes that correspond to setting this variable’s value at 0 or 1. Replace the current node

in the list by the two new nodes and return to Step 2.

Step 6: Using expressions (3.32)-(3.40), compute the objective of problem ARP (see section 3.3.1) for every scenario s. If the value of the objective function equals zero for all scenarios, proceed to Step 7; else, pick a scenario s for which the objective function value is positive and, using the dual solutions, compute A_{m+1}, B_{m+1}, C_{m+1} and γ_{m+1}. Append the feasibility cut A_{m+1} x + B_{m+1} y + C_{m+1} v ≥ γ_{m+1} in accordance with expression (3.46). Set fCUT = fCUT + 1 and return to Step 3.

Step 7: Using expressions (3.25) – (3.31), determine Q(x^q, y^q, v^q, s) for every scenario s. Let

  Z^q = p x^q + u y^q + d v^q + Q(x^q, y^q, v^q).

If Z^q < Ẑ, set Ẑ = Z^q and save x^q, y^q, v^q.

Step 8: If θ^q ≥ Q(x^q, y^q, v^q), fathom the current node and return to Step 2. Else, using the dual solutions for expressions (3.26) – (3.31), compute P_{n+1}, Q_{n+1}, R_{n+1} and λ_{n+1}, and append the optimality cut P_{n+1} x + Q_{n+1} y + R_{n+1} v + θ ≥ λ_{n+1} in accordance with expression (3.64). Set nCUT = nCUT + 1 and return to Step 3.
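A compact structural sketch of this branch-and-bound loop is given below (in Python). It is a sketch only, not the implementation used in this work: solve_relaxation, scenario_infeasibility, recourse_value and the two cut builders are hypothetical placeholders standing in for the relaxed current problem, problem ARP, the recourse evaluation Q(·), and Expressions (3.46)/(3.64); for simplicity, a node that receives a new cut is re-solved by pushing it back onto the node list.

    # Sketch of the integer L-shaped loop (Steps 1-8); all callables are placeholders.
    def integer_l_shaped(solve_relaxation, scenario_infeasibility, recourse_value,
                         build_feasibility_cut, build_optimality_cut, scenarios):
        feasibility_cuts, optimality_cuts = [], []           # cuts retained for the whole run
        best_obj, best_sol = float("inf"), None              # incumbent Z-hat (Step 1)
        nodes = [{}]                                         # root node: no variables fixed yet

        while nodes:                                         # Step 2: pick a node, stop if empty
            fixings = nodes.pop()
            sol = solve_relaxation(fixings, feasibility_cuts, optimality_cuts)   # Step 3
            if sol is None:                                  # infeasible node -> fathom
                continue
            if sol.first_stage_cost + sol.theta > best_obj:  # Step 4: bound test against Z-hat
                continue
            frac = sol.fractional_variable()                 # Step 5: branch on a fractional x/y/v
            if frac is not None:
                nodes.append({**fixings, frac: 0})
                nodes.append({**fixings, frac: 1})
                continue
            violated = [s for s in scenarios if scenario_infeasibility(sol, s) > 0]
            if violated:                                     # Step 6: append a feasibility cut
                feasibility_cuts.append(build_feasibility_cut(sol, violated[0]))
                nodes.append(fixings)                        # re-solve this node
                continue
            q_val = sum(s.prob * recourse_value(sol, s) for s in scenarios)      # Step 7
            if sol.first_stage_cost + q_val < best_obj:
                best_obj, best_sol = sol.first_stage_cost + q_val, sol
            if sol.theta >= q_val:                           # Step 8: fathom or add optimality cut
                continue
            optimality_cuts.append(build_optimality_cut(sol, scenarios))
            nodes.append(fixings)
        return best_sol, best_obj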


In the next section, we examine the deterministic equivalent for the stochastic NMJS

problem and analyse its performance over the generated test instances.

3.4 Approach 2: The Deterministic Equivalent for the Stochastic NMJS Problem

This section outlines Approach 2, which involves using CPLEX to solve the

deterministic equivalent of the stochastic NMJS problem given by Expressions 3.1-3.15.

Experimental results detailing the performance of three models are presented. The first

model is given by Expressions 3.1-3.15 where a big H value of 1000 is used. The second

model involves replacing the value of big H by the estimated upper bound on

completion times as described in section 3.3.8. Since the LP relaxation of Expressions

3.1-3.15 is rather tight, a third model that makes use of this LP relaxation to fix some of

the binary assignment variables is proposed. In the third model, the LP relaxation of a

given test instance is first solved and those assignment variables that take on values of

either 0 or 1 in the relaxed model are similarly fixed in the MIP, which is solved

subsequently. Observe that the third model is basically an LP heuristic procedure because

the values of the assignment variables fixed either at 0 or 1 based on the solution to the

relaxed model may not necessarily be optimal. The LP heuristic procedure is guided by

the belief that, since the LP relaxation is rather tight, the probability that the values fixed

at either 0 or 1 in the relaxation are close to the optimal is rather high. Based on the

assignment variables whose values are fixed at 1, additional constraints can be developed

that lead to a fixing of some of the sequencing variables. To illustrate this, consider the

following two cases:

1) Assignment variables x^m_{(i,j)} = 1 and x^m_{(k,l)} = 1, where i = k and j < l. That is, two operations of the same job are assigned to the same machine, and one of them precedes the other in the job's operation sequence.

- Under these circumstances, in order for the operation precedence constraints to hold, it

is necessary that the earlier occurring operation be performed before the operation that

occurs later. That is, there cannot exist a sequence where the later occurring operation is

performed before the earlier occurring operation. Therefore, it must be that ( , , , ) 0mk l i jy =

2) Assignment variable x^m_{(i,j)} = 0. That is, operation (i, j) has not been assigned to machine m.


- In this case, since operation (i, j) has not been assigned to a machine that is capable of processing it, this operation cannot precede or succeed any other operation on that machine. Therefore, for any operation (k, l) capable of being processed on machine m, we must have both y^m_{(i,j)(k,l)} = 0 and y^m_{(k,l)(i,j)} = 0.

In the third model, with some of the assignment variables fixed at their LP relaxation

values, we include additional constraints over and above those given by Expressions 3.1-

3.15 to cover the above two cases.
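A small sketch of this fixing step is shown below (in Python). It is an illustration under stated assumptions rather than the actual implementation: lp_values is assumed to map an (i, j, m) triple to the LP-relaxation value of x^m_{(i,j)}, machine_ops[m] is assumed to list the operations that machine m can process, and sequencing variables are encoded here by (i, j, k, l, m) keys purely for illustration.

    # Sketch of the LP-based partial fixing of assignment and sequencing variables.
    def fix_from_lp(lp_values, machine_ops, tol=1e-6):
        fixed_x, fixed_y = {}, {}
        for (i, j, m), val in lp_values.items():
            if val <= tol:
                fixed_x[(i, j, m)] = 0
                # Case 2: (i, j) is not on machine m, so it can neither precede nor
                # succeed any operation (k, l) that machine m is capable of processing.
                for (k, l) in machine_ops[m]:
                    if (k, l) != (i, j):
                        fixed_y[(i, j, k, l, m)] = 0
                        fixed_y[(k, l, i, j, m)] = 0
            elif val >= 1.0 - tol:
                fixed_x[(i, j, m)] = 1
        # Case 1: two operations of the same job fixed on the same machine must follow
        # the job's operation order, so the reverse sequencing variable is set to zero.
        ones = [key for key, v in fixed_x.items() if v == 1]
        for (i, j, m) in ones:
            for (k, l, m2) in ones:
                if m2 == m and k == i and j < l:
                    fixed_y[(k, l, i, j, m)] = 0
        return fixed_x, fixed_y

The returned dictionaries are then imposed as bounds on the corresponding binary variables before the MIP is solved.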

In the sequel, we present experimental results for Approach 2. The first column

describes the problem, the second contains the datasets tested, the third provides details

on the type of model used to solve the problem, the fourth and the fifth columns give

the IP objective and the LP relaxation values respectively, the sixth column gives the

percentage gap between the LP heuristic solution and the IP objective value, the seventh

column shows the time taken to solve the problem in CPU seconds, and finally, the

eighth and the ninth columns give details on the number of MIP simplex iterations and

branch and bound nodes taken to solve the problem.


3.4.1 Experimentation – II

Table 3.17: Performance of Models 1, 2 and the LP heuristic for dataset 3/6/8/5/3

No. | Description | Dataset | Model | Objective Value | LP Value | Gap* (%) | Time (secs) | # MIP simplex iter | # B&B nodes

Model 1 1938.1 1919.9 0.047 146 18 1 3/6/8/5/3 318 Model 2 1938.1 1920.61 0.047 137 15

LP

Heuristic 1938.1 0 0.015 110 16 3181 Model 1 2423.96 2371.44 0.063 356 74 Model 2 2423.96 2371.77 0.078 299 57

LP

Heuristic 2423.96 0 0.063 250 38 3182 Model 1 2378.96 2336.69 0.219 1122 219 Model 2 2378.96 2337.08 0.11 394 59

LP

Heuristic 2378.96 0 0.094 601 95 3183 Model 1 1215.43 1188.46 0.078 291 30 Model 2 1215.43 1188.46 0.078 268 33

LP

Heuristic 1215.43 0 0.063 216 29 Model 1 571.73 552.68 0.062 216 24 3184 Model 2 571.73 552.88 0.14 327 49

LP

Heuristic 571.73 0 0.031 118 15 Model 1 1121.35 1003.12 0.812 4930 1101 3185 Model 2 1121.35 1006.63 0.594 4093 614

LP

Heuristic 1121.35 0 0.188 1438 262 Model 1 750.16 690.48 2.375 19491 4085 3186 Model 2 750.16 690.48 2.422 17897 4188

LP

Heuristic 750.16 0 1.844 18625 3683

** Denotes the % gap between the objective value of the LP heuristic and that for Model 2.

* Model 1 uses CPLEX and a big H value of 1000 in the deterministic equivalent. * Model 2 uses CPLEX and a conservative (operation due-date based) estimate of big H. * The LP Heuristic uses a conservative estimate of big H and relies on the partial fixing of assignment and

sequencing variables.

Average time savings of model 2 over model 1: 0.031 seconds.

Average time savings of LP heuristic over model 1: 0.194 seconds

Average time savings of LP heuristic over model 2: 0.167 seconds


Table 3.18: Performance of Models 1, 2 and the LP heuristic for dataset 3/10/8/5/3

No. | Description | Dataset | Model | Objective Value | LP Value | Gap (%) | Time (secs) | # MIP simplex iter | # B&B nodes

Model 1 2953.52 2879.68 1.485 8028 1569 2 3/10/8/5/3 330 Model 2 2953.52 2879.82 0.688 2815 515

LP

Heuristic 2953.52 0 1.641 13787 2072 3301 Model 1 3603.76 3539.17 1.297 6858 1060 Model 2 3603.76 3539.45 0.985 4173 691

LP

Heuristic 3603.76 0 0.281 1481 264 3302 Model 1 3817.02 3766.72 1.438 9097 1190 Model 2 3817.02 3766.94 1.328 7367 965

LP

Heuristic 3817.94 0.0241 0.406 2771 419 3303 Model 1 2106.73 2063.19 1.266 6646 1045 Model 2 2106.73 2063.6 0.688 2459 377

LP

Heuristic 2110.37 0.173 0.266 662 107 Model 1 2640.65 2590.22 1.375 5930 1330 3304 Model 2 2640.65 2590.31 1.297 7215 1301

LP

Heuristic 2644.12 0.131 0.469 3787 543 Model 1 1346.19 1291.67 1.375 7160 1085 3305 Model 2 1346.19 1291.83 0.781 2546 432

LP

Heuristic 1346.19 0 0.265 717 95 Model 1 2332.19 2270.89 0.766 1856 314 3306 Model 2 2332.19 2271.03 0.5 982 128

LP

Heuristic 2332.19 0 0.578 2710 494

* Model 1 uses CPLEX and a big H value of 1000 in the deterministic equivalent. * Model 2 uses CPLEX and a conservative (operation due-date based) estimate of big H. * The LP Heuristic uses a conservative estimate of big H and relies on the partial fixing of assignment and

sequencing variables.

Average time savings of model 2 over model 1: 0.39 seconds

Average time savings of LP heuristic over model 1: 0.728 seconds

Average time savings of LP heuristic over model 2: 0.337 seconds


Table 3.19: Performance of Models 1, 2 and the LP heuristic for dataset 4/10/8/5/3

* Model 1 uses CPLEX and a big H value of 1000 in the deterministic equivalent. * Model 2 uses CPLEX and a conservative (operation due-date based) estimate of big H. * The LP Heuristic uses a conservative estimate of big H and relies on the partial fixing of assignment and

sequencing variables.

Average time savings of model 2 over model 1: 69.67 seconds

Average time savings of LP heuristic over model 1: 95.06 seconds

Average time savings of LP heuristic over model 2: 25.39 seconds

No. | Description | Dataset | Model | Objective Value | LP Value | Gap (%) | Time (secs) | # MIP simplex iter | # B&B nodes

Model 1 1783.18 1538.92 579.67 2784247 343783 3 4/10/8/5/3 440 Model 2 1783.18 1538.95 867.02 4210954 455537

LP

Heuristic 1783.18 0 786.713 3948245 478005 4401 Model 1 1798.81 1661.4 610.83 2821073 331212 Model 2 1798.81 1661.53 395.08 1670877 200203

LP

Heuristic 1800.36 0.086 403.33 1905913 259478 4402 Model 1 3879.46 3764.79 546.038 1995230 348838 Model 2 3879.46 3764.87 191.221 777508 119574

LP

Heuristic 3879.46 0 297.133 1345499 237171 4403 Model 1 2203.88 2117.19 22.469 110699 13617 Model 2 2203.88 2117.4 61.969 312846 37140

LP

Heuristic 2203.88 0 57.079 386213 46795 Model 1 2122.9 2033.35 340.333 1501458 225698 4404 Model 2 2122.9 2033.57 181.159 1080815 119518

LP

Heuristic 2123.38 0.023 18.953 100194 15524 Model 1 3252.64 3174.57 77.313 400049 52468 4405 Model 2 3252.64 3175.11 39.125 158733 28856

LP

Heuristic 3253.29 0.020 11.56 72137 10256 Model 1 2560.8 2467.58 98.438 540952 64642 4406 Model 2 2560.8 2467.91 51.813 304078 33865

LP

Heuristic 2560.8 0 34.86 176150 34865


Table 3.20: Performance of Models 1, 2 and the LP heuristic for dataset 3/6/8/5/5

* Model 1 uses CPLEX and a big H value of 1000 in the deterministic equivalent. * Model 2 uses CPLEX and a conservative (operation due-date based) estimate of big H. * The LP Heuristic uses a conservative estimate of big H and relies on the partial fixing of assignment and

sequencing variables.

Average time savings of model 2 over model 1: -0.02 seconds

Average time savings of LP heuristic over model 1: 0.114 seconds

Average time savings of LP heuristic over model 2: 0.134 seconds

No. | Description | Dataset | Model | Objective Value | LP Value | Gap (%) | Time (secs) | # MIP simplex iter | # B&B nodes

Model 1 1483.58 1470.02 0.094 219 15 4 3/6/8/5/5 318S1 Model 2 1483.58 1470.15 0.094 215 18

LP

Heuristic 1483.58 0 0.063 136 11 3181S1 Model 1 1575.72 1550.19 0.141 411 43 Model 2 1575.72 1550.43 0.156 387 35

LP

Heuristic 1575.72 0 0.11 312 41 3182S1 Model 1 973.79 940.16 0.297 1110 177 Model 2 973.79 940.52 0.281 814 121

LP

Heuristic 973.79 0 0.14 627 115 3183S1 Model 1 618.19 600.95 0.047 194 7 Model 2 618.19 600.99 0.047 186 7

LP

Heuristic 618.75 0.091 0.031 154 11 Model 1 973.11 963.45 0.063 182 7 3184S1 Model 2 973.11 963.45 0.063 187 7

LP

Heuristic 973.11 0 0.063 154 6 Model 1 963.5 945.42 0.094 276 33 3185S1 Model 2 963.5 945.42 0.11 418 50

LP

Heuristic 964.67 0.121 0.078 207 29 Model 1 1235.08 1191.47 0.922 5904 907 3186S1 Model 2 1235.08 1191.47 1.047

LP

Heuristic 1235.08 0 0.375


Table 3.21: Performance of Models 1, 2 and the LP heuristic for dataset 3/10/8/5/5

No. | Description | Dataset | Model | Objective Value | LP Value | Gap (%) | Time (secs) | # MIP simplex iter | # B&B nodes

Model 1 3239.09 3185.69 2.5 12495 1758 5 3/10/8/5/5 330S1 Model 2 3239.09 3185.93 2.985 17601 2177

LP

Heuristic 3239.09 0 1.312 9474 1244 3301S1 Model 1 1762.28 1707.16 1.547 6765 642 Model 2 1762.28 1707.16 3.031 17887 2108

LP

Heuristic 1762.28 0 0.781 2701 344 3303S1 Model 1 1939.41 1875.2 1.297 4298 677 Model 2 1939.41 1875.36 0.922 2179 276

LP

Heuristic 1939.41 0 0.469 1222 134 3304S1 Model 1 1743.12 1674.63 1.453 5172 740 Model 2 1743.12 1674.65 0.985 2712 387

LP

Heuristic 1743.12 0 1.656 8530 1282 Model 1 1694.32 1664.17 1.39 5836 802 3305S1 Model 2 1694.32 1664.36 1.063 2440 347

LP

Heuristic 1696.07 0.103 1.5 8264 1236 Model 1 3230.26 3176.27 3.969 22370 2941 3306S1 Model 2 3230.26 3176.46 2.375 12423 1481

LP

Heuristic 3230.26 0 0.5 2339 377

* Model 1 uses CPLEX and a big H value of 1000 in the deterministic equivalent. * Model 2 uses CPLEX and a conservative (operation due-date based) estimate of big H. * The LP Heuristic uses a conservative estimate of big H and relies on the partial fixing of assignment and

sequencing variables.

Average time savings of model 2 over model 1: 0.1325 seconds

Average time savings of LP heuristic over model 1: 0.989 seconds

Average time savings of LP heuristic over model 2: 0.857 seconds


Table 3.22: Performance of Models 1, 2 and the LP heuristic for dataset 4/10/8/5/5

* Model 1 uses CPLEX and a big H value of 1000 in the deterministic equivalent. * Model 2 uses CPLEX and a conservative (operation due-date based) estimate of big H. * The LP Heuristic uses a conservative estimate of big H and relies on the partial fixing of assignment and

sequencing variables.

Average time savings of model 2 over model 1: 16.199 seconds

Average time savings of LP heuristic over model 1: 45.51 seconds

Average time savings of LP heuristic over model 2: 36.63 seconds

No. | Description | Dataset | Model | Objective Value | LP Value | Gap (%) | Time (secs) | # MIP simplex iter | # B&B nodes

Model 1 2515.71 2449.56 17.297 60489 6946 6 4/10/8/5/5 440S1 Model 2 2515.71 2449.80 11.313 33045 4038 LP Heuristic 2515.71 0 4.016 14370 1924 4401S1 Model 1 2179.24 2105.62 34.438 143591 15624 Model 2 2179.24 2105.71 29.485 133944 12361 LP Heuristic 2179.24 0 10.297 52628 6651 4403S1 Model 1 1491.03 1402.11 156.674 759427 68158 Model 2 1491.03 1402.44 175.97 762803 75798 LP Heuristic 1491.03 0 30.595 119820 15913 Model 1 3052.97 2974.59 104.876 419508 46256 4404S1 Model 2 3052.97 2974.77 123.28 497686 50782 LP Heuristic 3052.97 0 19.594 77671 9799 Model 1 1192.04 1137.38 43.45 154907 20409 4405S1 Model 2 1192.04 1137.38 49.625 202499 22688 LP Heuristic 1195.07 0.254 13.547 62225 8404 4406S1 Model 1 3011.09 2904.06 740.75 3062091 316623 Model 2 3011.09 2904.08 654.54 2826744 282577

LP Heuristic 3011.09 0 746.36 3984427 392452


Table 3.23: Performance of Models 1, 2 and the LP heuristic for dataset 3/6/8/5/7

* Model 1 uses CPLEX and a big H value of 1000 in the deterministic equivalent. * Model 2 uses CPLEX and a conservative (operation due-date based) estimate of big H. * The LP Heuristic uses a conservative estimate of big H and relies on the partial fixing of assignment and

sequencing variables.

Average time savings of model 2 over model 1: -0.0105 seconds.

Average time savings of LP heuristic over model 1: 0.036 seconds.

Average time savings of LP heuristic over model 2: 0.046 seconds.

No. | Description | Dataset | Model | Objective Value | LP Value | Gap (%) | Time (secs) | # MIP simplex iter | # B&B nodes

7 3/6/8/5/7 3181S2 Model 1 441.2 395.36 0.203 1331 175 Model 2 441.2 395.47 0.156 648 60

LP

Heuristic 441.2 0 0.219 1379 157 3182S2 Model 1 487.86 461.37 0.125 301 17 Model 2 487.86 461.37 0.172 523 55

LP

Heuristic 487.86 0 0.219 592 49 3183S2 Model 1 712.19 700.62 0.156 265 44 Model 2 712.19 700.72 0.156 251 43

LP

Heuristic 712.19 0 0.11 223 41 Model 1 930.07 905.35 0.344 2461 276 3184S2 Model 2 930.07 905.35 0.422 2860 312

LP

Heuristic 930.07 0 0.172 397 53 Model 1 1095.96 1080.65 0.094 290 28 3185S2 Model 2 1095.96 1080.9 0.11 292 28

LP

Heuristic 1095.96 0 0.047 208 10 Model 1 985.73 944.36 0.172 754 79 3186S2 Model 2 985.73 944.36 0.141 637 50

LP

Heuristic 991.57 0.592 0.11 439 52


* Model 1 uses CPLEX and a big H value of 1000 in the deterministic equivalent. * Model 2 uses CPLEX and a conservative (operation due-date based) estimate of big H. * The LP Heuristic uses a conservative estimate of big H and relies on the partial fixing of assignment and

sequencing variables.

Average time savings of model 2 over model 1: 0.258 seconds

Average time savings of LP heuristic over model 1: 0.711 seconds

Average time savings of LP heuristic over model 2: 0.453 seconds

Table 3.24: Performance of Models 1, 2 and the LP heuristic for dataset 3/10/8/5/7

No. | Description | Dataset | Model | Objective Value | LP Value | Gap (%) | Time (secs) | # MIP simplex iter | # B&B nodes

Model 1 919.17 851.66 3.594 8 3/10/8/5/7 330S2 Model 2 919.17 851.66 2.703 9009 921

LP

Heuristic 919.26 0.010 1.859 7234 788 3301S2 Model 1 1543.38 1489.00 2.954 11212 1138 Model 2 1543.38 1489.05 2.547 8606 879

LP

Heuristic 1546.68 0.214 0.953 4276 395 3303S2 Model 1 1309.14 1268.56 2.75 10624 1151 Model 2 1309.14 1268.68 2.125 7066 768

LP

Heuristic 1309.14 0 1.078 4368 380 3304S2 Model 1 1891.38 1858.08 1.547 4495 474 Model 2 1891.38 1858.28 2.36 9163 1052

LP

Heuristic 1892.29 0.048 1.609 8201 755 Model 1 886.95 837.58 1.938 5252 665 3305S2 Model 2 886.95 837.59 1.906 4819 602

LP

Heuristic 886.95 0 1.266 3960 484 Model 1 863.57 798.39 2.344 5342 865 3306S2 Model 2 863.57 798.44 1.938 3388 598

LP

Heuristic 863.57 0 4.094 13846 1877


Table 3.25 Performance of Models 1, 2 and the LP heuristic for dataset 4/10/8/5/7

* Model 1 uses CPLEX and a big H value of 1000 in the deterministic equivalent. * Model 2 uses CPLEX and a conservative (operation due-date based) estimate of big H. * The LP Heuristic uses a conservative estimate of big H and relies on the partial fixing of assignment and

sequencing variables.

Average time savings of model 2 over model 1: 436.61 seconds

Average time savings of LP heuristic over model 1: 254.42 seconds

Average time savings of LP heuristic over model 2: -182.19 seconds

No. | Description | Dataset | Model | Objective Value | LP Value | Gap (%) | Time (secs) | # MIP simplex iter | # B&B nodes

Model 1 1726.73 1654.15 622.617 2550370 179847 9 4/10/8/5/7 440S2 Model 2 1726.73 1654.21 223.3 943929 65373

LP

Heuristic 1728.62 0.109 594.74 2146451 199139 4401S2 Model 1 2130.89 2053.63 677.5 2447245 213967 Model 2 2130.89 2053.75 113.73 457622 32341

LP

Heuristic 2134.51 0.170 71.11 3032211 26771 4402S2 Model 1 2065.06 1994.02 286.415 908122 92632 Model 2 2065.06 1994.25 64.158 218381 17913

LP

Heuristic 2065.06 0 29.672 113999 10291 4403S2 Model 1 1512.73 1427.41 1372.41 4940355 464593 Model 2 1512.73 1427.60 907.464 3493693 301133

LP

Heuristic 1513.25 0.034 1311.39 5106039 474074 Model 1 1804.31 1736.26 1562.02 5401002 485245 4404S2 Model 2 1804.31 1736.6 630.56 1972942 197032

LP

Heuristic 1808.12 0.211 74.939 216001 31774 Model 1 1748.1 1679.05 28.61 80788 9254 4405S2 Model 2 1748.1 1679.13 10.031 20893 2380

LP

Heuristic 1748.1 0 60.31 12560 1348 4406S2 Model 1 1205.46 1097.86 853.413 3517473 230247 Model 2 1205.46 1097.88 397.416 1551036 109264

LP

Heuristic 1205.46 0 1479.85 6249914 481059


Table 3.26: Performance of Models 1, 2 and the LP heuristic for dataset 4/10/12/5/7

* Model 1 uses CPLEX and a big H value of 1000 in the deterministic equivalent. * Model 2 uses CPLEX and a conservative (operation due-date based) estimate of big H. * The LP Heuristic uses a conservative estimate of big H and relies on the partial fixing of assignment and

sequencing variables Average time savings of model 2 over model 1: 453.45 seconds

Average time savings of LP heuristic over model 1: 719.784 seconds

Average time savings of LP heuristic over model 2: 266.73 seconds

No. | Description | Dataset | Model | Objective Value | LP Value | Gap (%) | Time (secs) | # MIP simplex iter | # B&B nodes

Model 1 1772.84 1711.40 116.06 255386 16230 10 4/10/12/5/7 D1 Model 2 1772.84 1711.49 61.797 119532 8494

4403S2IF LP

Heuristic 1772.84 0 25.656 84406 7191 D2 Model 1 2055.84 1999.29 193.72 406168 45168 4402S2IF Model 2 2055.84 1999.29 74.298 138005 15172

LP

Heuristic 2055.84 0 130.549 411252 47826 D3 Model 1 1954.97 1869.68 1519.3 3482869 278642 4404S2IF Model 2 1954.97 1869.77 348.16 802717 63050

LP

Heuristic 1966.48 0.589 353.301 1421252 109005 D4 Model 1 1600.7 1532.97 272.94 578902 42352 4405S2IF Model 2 1600.7 1533.22 101.72 216644 17918

LP

Heuristic 1622.36 1.353 18.875 59842 6083 Model 1 1290.92 1199.52 381.67 942448 72092 D5 Model 2 1290.92 1199.52 179.471 436233 35791

4401S2IF1 LP

Heuristic 1290.92 0 31.688 126460 11218 Model 1 1261.08 1167.92 1099.69 2424590 226632 D6 Model 2 1261.08 1168.04 162.611 364891 33531

4402S2IF1 LP

Heuristic 1261.38 0.024 67.282 213389 20026 D7 Model 1 1349.16 1249.66 969.84 2423274 174795 4403S2IF1 Model 2 1349.16 1249.69 1339.19 3422957 237982

LP

Heuristic 1353.8 0.344 55.532 174314 21159

D7 Model 1 1569.71 1452.30 1897.56 4493124 430832 4404S2IF1 Model 2 1569.71 1452.32 559.13 1188318 132946

LP

Heuristic 1571.02 0.083 9.625 23095 2808


Table 3.27: Performance of Models 1, 2 and the LP heuristic for dataset 5/10/12/7/3

* Model 1 uses CPLEX and a big H value of 1000 in the deterministic equivalent. * Model 2 uses CPLEX and a conservative (operation due-date based) estimate of big H. * The LP Heuristic uses a conservative estimate of big H and relies on the partial fixing of assignment and

sequencing variables.

Average time savings of model 2 over model 1: 236.48 seconds

Average time savings of LP heuristic over model 1: 927.72 seconds

Average time savings of LP heuristic over model 2: 691.238 seconds

No. | Description | Dataset | Model | Objective Value | LP Value | Gap (%) | Time (secs) | # MIP simplex iter | # B&B nodes

Model 1 2317.56 2170.18 22.61 63639 11881 11 5/10/12/7/3 D1 Model 2 2317.56 2170.29 18.094 24840 4801

550_1 LP

Heuristic 2317.56 0 30.906 68280 13709 D2 Model 1 1912.16 1791.21 33.62 122697 19103 5501_1 Model 2 1912.16 1791.31 51.922 160633 21942

LP

Heuristic 1913.63 0.077 49.53 194116 22293 D3 Model 1 1921.05 1774.15 6559.76 25584998 2838187 5502_1 Model 2 1921.05 1774.2 5372.75 15143570 1637904

LP

Heuristic 1921.05 0 451.94 1387620 198328 D4 Model 1 2474.87 2327.13 1454.52 5805567 697132 5503_1 Model 2 2474.87 2327.13 1072.11 2509875 405846

LP

Heuristic 2474.87 0 1688.79 5598950 791587 Model 1 1531.14 1392.45 468.16 1654428 271716 D5 Model 2 1531.14 1392.45 212.68 552465 87409

5504_1 LP

Heuristic 1541.57 0.681 132.42 379486 71081 Model 1 2687.3 2558.06 425.78 2178493 194480 D6 Model 2 2687.3 2558.45 346.73 1042877 121054

5505_1 LP

Heuristic 2687.3 0 93.99 391234 53904 D7 Model 1 2442.54 2325.92 95.142 355820 52591 5506_1 Model 2 2442.54 2325.92 329.906 1083379 108320

LP

Heuristic 2442.54 0 117.95 454353 68024


Table 3.28: Performance of Models 1, 2 and the LP heuristic for dataset 5/10/12/7/7

* Model 1 uses CPLEX and a big H value of 1000 in the deterministic equivalent. * Model 2 uses CPLEX and a conservative (operation due-date based) estimate of big H. * The LP Heuristic uses a conservative estimate of big H and relies on the partial fixing of assignment and

sequencing variables.

Average time savings of model 2 over model 1: 2966.10 seconds

Average time savings of LP heuristic over model 1: 4154.71 seconds

Average time savings of LP heuristic over model 2: 1188.60 seconds

No. | Description | Dataset | Model | Objective Value | LP Value | Gap (%) | Time (secs) | # MIP simplex iter | # B&B nodes

Model 1 1938.91 1724.57 89.626 202867 24763 12 5/10/12/7/7 D1 Model 2 1938.91 1724.57 88.936 112714 15471

550_1S2 LP

Heuristic 1938.91 0 88.171 136370 18192 D2 Model 1 2851.24 2655.73 447.615 760015 126423 5501_1S2 Model 2 2851.24 2655.87 452.807 562889 80519

LP

Heuristic 2851.24 0 922.86 1381610 209829 D3 Model 1 2506.081 2269.37 10065.6 26184135 2269219 5502_1S2 Model 2 2506.08 2269.37 5439.81 7269986 785102

LP

Heuristic 2506.08 0 4682.97 7887918 704224 D4 Model 1 3425.68 3317.86 87.68 222155 25379 5503_1S2 Model 2 3425.68 3318.36 85.593 128727 16569

LP

Heuristic 3425.68 0 53.57 100797 12264 Model 1 2883.79 2734.97 120.658 289687 32641 D5 Model 2 2883.79 2735.23 111.35 181651 19377

5504_1S2 LP

Heuristic 2884.04 0.009 48.32 96665 10378 Model 1 1753.83 1619.85 2855.49 6042713 732493 D6 Model 2 1753.83 1619.89 439.61 666524 76156

5505_1S2 LP

Heuristic 1753.83 0 232.091 397983 43907 D7 Model 1 1906.53 1763.19 21844.4 69875869 4764965 5506_1S2 Model 2 1880.78 1763.36 8130.22 14614472 1148217

LP

Heuristic 1880.78 0 400.104 707805 88192


The above experimental results clearly indicate that Model 2 outperforms Model 1 in almost all the cases considered. Only for two of the cases (3 jobs, 6 operations/job, 8 machines, 5 workcenters, 5 scenarios; and 3 jobs, 6 operations/job, 8 machines, 5 workcenters, 7 scenarios) is it marginally outperformed by Model 1. Moreover, as the

problem size increases, the savings in computational time rapidly increase when model 2

is used over model 1. For the largest problems solved in this experiment (5 jobs, 10

operations/job, 12 machines, 7 workcenters, 7 scenarios), the average savings amount to

as much as 50 minutes. These observations indicate that, for flexible job-shop scheduling

problems of the kind considered in this study, it might be advantageous to generate an

estimate as conservative as possible for the value of big H. Although the LP relaxations of Models 1 and 2 differ only trivially at the root node, it is possible that, upon further branching, the LP relaxation values at the descendant nodes differ significantly between the two models, with Model 2 yielding a tighter relaxation and hence faster pruning and lower computational times.
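As a simple illustration of what such a conservative, job-operation-dependent estimate might look like, the sketch below (a hypothetical stand-in only; the actual bound is given by Expression (3.121), which is not reproduced here) bounds the completion time of operation (i, j) by the worst-case work of its own predecessors plus the worst-case work of all other jobs, ignoring transport and setup considerations.

    # Hypothetical illustration, not Expression (3.121): a per-operation completion-time
    # bound built from worst-case processing times over eligible machines and scenarios.
    def conservative_big_h(proc_time):
        """proc_time[(i, j)][m][s] = processing time of operation (i, j) on machine m in scenario s.
        Returns H[(i, j)], an upper bound on the completion time of operation (i, j)."""
        worst = {op: max(max(by_s.values()) for by_s in by_m.values())
                 for op, by_m in proc_time.items()}
        H = {}
        for (i, j) in worst:
            own = sum(t for (k, l), t in worst.items() if k == i and l <= j)   # (i, j) and its predecessors
            others = sum(t for (k, l), t in worst.items() if k != i)           # worst-case interference
            H[(i, j)] = own + others
        return H

A bound of this form can be much smaller than a flat value of 1000 whenever processing times are short, which is consistent with the behaviour of Model 2 observed above.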

Another clear outcome from the above experiments is that the LP heuristic outperforms both Model 1 and Model 2 in all of the cases except the one involving 4 jobs, 10 operations/job, 8 machines, 5 workcenters and 7 scenarios; for that case, while it performs better than Model 1, it falls behind Model 2. Once again, with larger problem

sizes, the savings in time increase. For the largest problems solved in this experiment (5

jobs, 10 operations/job, 12 machines, 7 workcenters, 7 scenarios), the average savings

amount to as much as 70 minutes over model 1 and 20 minutes over model 2.

Interestingly, the LP heuristic produces the optimal solution in a majority of cases. The

maximum gap observed between the LP heuristic solution and the solution obtained

from model 1/model 2 over all the data sets in the above experiment is 1.35%. The LP

heuristic, therefore, offers a powerful alternative to solving these problems to near-

optimality with a very low computational burden.


3.5 Experimentation on the Value of Stochastic Solution (VSS):

This parameter measures the difference between the expected value solution and the

recourse problem (stochastic) solution. It underscores the importance of using a solution

that is robust over all scenarios versus a solution that approximates the randomness

through expected values. Table 3.29 presents the results of such a comparison for several

data instances. The first column provides a description of the dataset, the second column

indicates the instance of the dataset solved, the third column depicts the number of

processing time scenarios considered, the fourth column highlights the probabilities of

occurrence of the various scenarios, the fifth and the sixth columns present the

stochastic and expected value solutions respectively, the seventh column provides the

difference between the sixth and fifth columns and quantifies the value of the stochastic

solution (VSS). Finally, the eighth column gives the percentage difference. The stochastic

solution is obtained by solving Expressions (3.1-3.15) where only the first two terms

pertaining to completion times and budget surplus are considered in the objective

function (Expression (3.1)). To obtain the expected value solution, Expressions (3.1-3.15) are first solved in the absence of any scenarios by replacing the random processing time of each job-operation with its expected value. The resulting assignment and sequencing decisions are then fixed, and the problem (Expressions (3.1-3.15)) is solved once more in the presence of all scenarios. The resulting objective values under each scenario, when weighted by their probabilities of occurrence and summed together, give the expected value solution.
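To make this computation concrete, the following minimal sketch (in Python) assembles one row of Table 3.29 once the recourse-problem objective and the per-scenario objectives of the mean-value schedule are available. The function name and the illustrative numbers below are hypothetical and are not taken from the experiments; they merely mirror the scale of instance 1.

def value_of_stochastic_solution(rp_objective, ev_scenario_costs, probabilities):
    # Expected value solution: probability-weighted sum of the scenario objectives
    # obtained with the mean-value assignment/sequencing decisions held fixed.
    ev_objective = sum(p * c for p, c in zip(probabilities, ev_scenario_costs))
    vss = ev_objective - rp_objective            # VSS = EV solution - RP solution
    return ev_objective, vss, 100.0 * vss / ev_objective

# Hypothetical illustration with two equally likely scenarios:
ev_obj, vss, pct = value_of_stochastic_solution(
    rp_objective=618.08,
    ev_scenario_costs=[640.00, 652.88],
    probabilities=[0.5, 0.5],
)
print(round(ev_obj, 2), round(vss, 2), round(pct, 2))   # 646.44 28.36 4.39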


Table 3.29: Value of stochastic solution

Dataset description | Instance | No. of scenarios | Distribution of probabilities | Stochastic solution ($) | Expected value solution ($) | VSS ($) | Difference
30 operations, 8 machines, 5 workcenters | 1 | 20 | Equally probable | 618.08 | 646.44 | 28.36 | 4.38%
30 operations, 8 machines, 5 workcenters | 2 | 40 | Equally probable | 888.99 | 971.26 | 82.27 | 8.47%
30 operations, 8 machines, 5 workcenters | 3 | 40 | 0.7~(21-40)*, 0.3~rest* | 1109.65 | 1159.95 | 50.3 | 4.34%
30 operations, 8 machines, 5 workcenters | 4 | 40 | 0.7~(11-30), 0.3~rest | 792.39 | 843.16 | 50.77 | 6.02%
30 operations, 8 machines, 5 workcenters | 5 | 40 | 0.6~(21-40), 0.4~rest | 999.87 | 1041.44 | 41.57 | 3.99%
30 operations, 8 machines, 5 workcenters | 6 | 40 | 0.75~(11-35), 0.25~rest | 877.59 | 962.30 | 84.71 | 8.8%
30 operations, 8 machines, 5 workcenters | 7 | 40 | 0.8~(11-30), 0.2~rest | 744.08 | 812.97 | 68.89 | 8.47%
30 operations, 8 machines, 5 workcenters | 8 | 40 | 0.8~(16-40), 0.2~rest | 1050.25 | 1095.34 | 45.09 | 4.11%
30 operations, 8 machines, 5 workcenters | 9 | 40 | 0.6~(6-20), 0.4~rest | 691.87 | 728.54 | 36.67 | 5.03%
30 operations, 8 machines, 5 workcenters | 10 | 40 | 0.8~(6-25), 0.2~rest | 618.99 | 658.07 | 39.08 | 5.93%
30 operations, 8 machines, 5 workcenters | 11 | 40 | 0.6~(16-25), 0.3~(26-40), 0.1~(1-15) | 876.37 | 927.17 | 50.8 | 5.47%
50 operations, 12 machines, 7 workcenters | 1 | 11 | Equally probable | 2096.77 | 2164.93 | 68.16 | 3.14%
50 operations, 12 machines, 7 workcenters | 2 | 32 | Equally probable | 1239.01 | 1315.03 | 76.02 | 5.78%
50 operations, 12 machines, 7 workcenters | 3 | 32 | 0.8~(11-32), 0.2~rest | 1341.46 | 1444.05 | 102.59 | 7.10%
50 operations, 12 machines, 7 workcenters | 4 | 32 | 0.75~(6-20), 0.25~rest | 1033.88 | 1063.02 | 29.14 | 2.74%
50 operations, 12 machines, 7 workcenters | 5 | 32 | 0.8~(1-20), 0.2~rest | 1023.98 | 1056.88 | 32.9 | 3.11%

* Indicates that a probability of 0.7 is distributed equally amongst scenarios 21 through 40 and a probability of 0.3 is distributed equally amongst the rest of the scenarios.


3.6 Experimentation on the Impact of Customer Priorities for Completion Time

and Budget Surplus:

Tables 3.30-3.34 depict, for three different datasets, the impact of differing penalty factors on the completion time and budget surplus values for 5 jobs, 50 operations and 12

machines under 7 distinct processing time scenarios. The objective function in all the

instances minimizes the expected weighted sum of completion times and budget surplus

for all the jobs. The first column gives the dataset, the second column specifies the

penalty factor for completion time t. Columns 3 – 9 give the completion time values in

seconds under each scenario corresponding to each penalty factor. Column 10 specifies

the penalty factor for budget surplus, and finally, columns 11-17 give the budget surplus

values under each scenario. In each dataset the penalty factors for the completion time

and budget surplus, as specified in columns 2 and 10, remain the same across all

customers. All instances have been solved using model 2 described under Section 3.4.

As the penalty for completion time increases, in most cases, we notice a decrease in completion time for the jobs across all scenarios. In some cases, however, despite an increase in penalty, the completion time values either stay the same or increase for some jobs. This may happen in instances where the completion times of one or more jobs are elevated in order to reduce the completion times of the remaining jobs so as to achieve a minimum objective function value. The same reasoning holds for the budget surplus values depicted in Tables 3.30-3.34.
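As a point of reference, the sketch below (Python) shows how the expected weighted cost of a single job is evaluated for given penalty factors t and y; in the full model this quantity is summed over all jobs and minimized. The function name and the data values are illustrative assumptions, not the implementation used in this study.

def expected_weighted_cost(completion_times, budget_surpluses, probabilities, t, y):
    # Probability-weighted sum of (t * completion time + y * budget surplus)
    # across scenarios, for one job.
    return sum(p * (t * c + y * b)
               for p, c, b in zip(probabilities, completion_times, budget_surpluses))

# Hypothetical three-scenario illustration:
print(expected_weighted_cost(
    completion_times=[100.0, 110.0, 120.0],
    budget_surpluses=[0.0, 50.0, 200.0],
    probabilities=[0.3, 0.4, 0.3],
    t=0.5, y=0.5,
))  # 0.3*50.0 + 0.4*80.0 + 0.3*160.0 = 95.0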


Table 3.30: Impact of penalty factors on the completion time and budget surplus values for job J1.

Penalty t | Completion times under scenarios S1-S7 | Penalty y | Budget surplus values under scenarios S1-S7

Dataset D1:
0 142.61 147.93 153.25 158.57 163.9 169.22 179.89 1.0 0 0 0 0 300.47 677.05 1432.54

0.25 144.59 149.29 153.98 159.35 164.76 170.16 181.45 0.75 0 0 0 0 300.47 677.05 1432.54

0.5 118.79 123.13 127.45 131.77 136.11 141.37 152.19 0.5 0 0 0 0 339.68 721.61 1488.48

0.75 99.19 104.63 110.07 115.51 121.25 128.26 142.4 0.25 0 0 0 26.25 416.28 805.63 1587.63

1.0 83.7 89.27 94.87 100.44 106.03 111.61 122.81 0 0 0 0 240.42 651.01 1060.8 1883.25

Dataset D2:
0 321.23 355.61 402.13 418.92 379.84 394.59 424.69 1.0 0 0 0 0 0 46.84 844.32

0.25 144.51 151.96 159.88 167.86 175.81 183.77 199.67 0.75 0 0 0 0 0 46.84 844.32

0.5 108.96 115.37 121.78 128.21 134.64 141.06 154.34 0.5 0 0 0 0 0 140.48 955.51

0.75 108.96 115.37 121.78 128.21 134.64 141.06 154.34 0.25 0 0 0 0 0 140.48 955.51

1.0 108.96 115.37 121.78 128.21 134.64 141.06 154.34 0 0 0 0 0 0 140.48 955.51

Dataset D3:
0 294.69 308.91 323.08 337.31 351.52 365.69 464.5 1.0 0 315.43 641.01 968.75 1298.39 1624.12 2278.77

0.25 106.12 111.87 117.59 123.34 129.09 134.81 146.61 0.75 0 315.43 641.01 968.75 1298.39 1624.12 2278.77

0.5 97.97 103.65 109.31 114.99 120.67 126.33 137.99 0.5 0 321.30 649.23 978.71 1310.07 1637.86 2295.96

0.75 97.97 103.65 109.31 114.99 120.67 126.33 137.99 0.25 0 321.30 649.23 978.71 1310.07 1637.86 2295.96

1.0 87.85 93.67 99.46 105.75 112.12 118.48 131.21 0 53.77 395.68 733.49 1073.58 1415.48 1753.47 2432.32


Table 3.31: Impact of penalty factors on the completion time and budget surplus values for job J2.

Penalty t | Completion times under scenarios S1-S7 | Penalty y | Budget surplus values under scenarios S1-S7

Dataset D1:
0 428.26 441.49 454.87 468.16 481.36 494.65 521.2 1.0 0 0 0 159.22 441.84 720.0 1278.89

0.25 131.47 135.62 139.78 143.93 148.1 152.25 165.66 0.75 0 0 0 159.22 441.84 720.0 1278.89

0.5 121.15 125.33 129.53 133.72 137.91 142.1 150.48 0.5 0 0 0 171.35 454.26 735.0 1297.72

0.75 114.79 119.68 124.58 129.45 134.31 139.18 148.96 0.25 0 0 0 184.62 470.21 752.54 1319.53

1.0 101.91 106.27 110.65 115.01 119.36 123.77 132.86 0 0 0 108.97 412.61 717.33 1020.98 1629.59

Dataset D2:
0 240.92 251.87 314.83 327.83 280.28 291.7 314.47 1.0 0 0 0 0 150.06 505.28 1216.64

0.25 133.29 139.6 146.39 153.22 160.03 166.85 180.48 0.75 0 0 0 0 155.51 511.774 1226.43

0.5 124.91 130.57 136.23 141.91 147.55 153.21 164.56 0.5 0 0 0 0 155.43 512.11 1227.41

0.75 129.01 134.67 140.52 147.55 154.85 162.16 176.76 0.25 0 0 0 0 155.43 512.11 1227.41

1.0 129.01 134.67 140.52 147.55 154.85 162.16 176.76 0 0 0 0 0 155.43 512.11 1227.41

Dataset D3:
0 368.41 385.99 403.51 421.1 438.67 465.18 561.7 1.0 0 54.46 362.74 669.19 978.91 1284.59 1899.03

0.25 98.1 103.54 108.97 114.4 119.84 125.25 136.11 0.75 0 54.46 362.74 669.19 978.91 1284.59 1899.03

0.5 98.1 103.54 108.97 114.4 119.84 125.25 136.11 0.5 0 54.46 362.74 669.19 978.91 1284.59 1899.03

0.75 98.1 103.54 108.97 114.4 119.84 125.25 136.11 0.25 0 54.46 362.74 669.19 978.91 1284.59 1899.03

1.0 84.66 89.74 94.8 99.88 104.96 110.03 121.25 0 36.44 382.34 726.96 1076.12 1422.86 1767.51 2459.99


Table 3.32: Impact of penalty factors on the completion time and budget surplus values for job J3.

Penalty t | Completion times under scenarios S1-S7 | Penalty y | Budget surplus values under scenarios S1-S7

Dataset D1:
0 276.39 288.74 301.09 313.45 325.79 338.12 362.84 1.0 0 0 0 0 0 289.66 1143.56

0.25 123.27 127.9 132.51 137.82 144.04 151.47 166.3 0.75 0 0 0 0 0 330.02 1194.95

0.5 126.72 131.69 136.64 141.59 146.57 152.46 166.3 0.5 0 0 0 0 0 330.02 1194.95

0.75 101 107.62 114.26 120.89 127.48 134.1 147.45 0.25 0 0 0 0 0 431.47 1315.56

1.0 105.47 112.03 118.64 125.2 131.79 138.36 151.55 0 0 0 0 0 0 431.47 1315.56

Dataset D2:
0 418.3 434.89 454.45 468.01 484.51 501.04 534.19 1.0 0 0 0 0 0 0 762.17

0.25 108.48 113.52 118.54 124.13 130.15 136.17 148.17 0.75 0 0 0 0 0 0 762.17

0.5 91.99 97.27 102.52 108.44 115.1 121.73 135.01 0.5 0 0 0 0 0 46.49 830.27

0.75 94.64 99.78 104.89 110.44 117.19 123.92 137.38 0.25 0 0 0 0 0 56.98 842.25

1.0 94.64 99.78 104.89 110.44 117.19 123.92 137.38 0 0 0 0 0 0 56.98 842.25

Dataset D3:
0 149.72 157.08 164.4 171.76 179.1 186.42 201.14 1.0 0 47.22 383.69 721.20 1057.62 1393.19 2069.55

0.25 138.46 144.59 150.69 156.82 162.94 169.03 181.47 0.75 0 46.29 383.48 722.09 1059.23 1395.52 2075.27

0.5 138.46 144.59 150.69 156.82 162.94 169.03 181.47 0.5 0 46.29 383.48 722.09 1059.23 1395.52 2075.27

0.75 130.21 136.38 142.53 148.71 154.87 161.01 173.53 0.25 0 64.79 405.34 747.32 1087.49 1427.14 2112.94

1.0 131.46 137.58 144.25 150.95 157.62 164.29 177.66 0 0 103.95 448.92 795.40 1139.67 1483.82 2178.15


Table 3.33: Impact of penalty factors on the completion time and budget surplus values for job J4.

Penalty t | Completion times under scenarios S1-S7 | Penalty y | Budget surplus values under scenarios S1-S7

Dataset D1:
0 353.81 345.15 360.74 376.32 391.87 407.43 438.6 1.0 0 0 0 0 241.64 598.22 1313.68

0.25 105.05 109.52 114 118.47 123.84 130.44 143.59 0.75 0 0 0 0 241.64 598.22 1313.68

0.5 105.05 109.52 114 118.47 123.84 130.44 143.59 0.5 0 0 0 0 241.64 598.22 1313.68

0.75 80.62 85.17 89.75 94.32 98.84 103.42 112.55 0.25 0 0 0 0 362.66 732.52 1470.77

1.0 80.62 85.17 89.75 94.32 98.84 103.42 112.55 0 0 0 0 0 362.66 732.52 1470.77

Dataset D2:
0 223.96 232.97 241.98 250.99 259.97 268.98 287.01 1.0 0 0 0 0 0 345.84 1116.27

0.25 148.93 154.71 160.49 166.26 172.03 178.49 193.93 0.75 0 0 0 0 0 345.84 1116.27

0.5 150.08 157.24 164.42 171.59 178.76 185.93 200.27 0.5 0 0 0 0 0 345.84 1116.27

0.75 130.89 138.17 145.48 152.77 160.07 167.36 181.95 0.25 0 0 0 0 92.85 489.73 1285.1

1.0 128.62 134.63 140.65 146.65 152.66 159.39 174.65 0 0 0 0 0 309.06 723.73 1556.44

Dataset D3:
0 247.13 259.27 271.4 283.55 338.65 351.91 448.9 1.0 0 0 0 0 0 0 0

0.25 105.85 111.05 116.21 121.41 126.61 131.79 143.36 0.75 0 0 0 0 0 0 0

0.5 104.27 109.52 114.74 119.99 125.24 130.47 142.15 0.5 0 0 0 0 0 0 0

0.75 104.27 109.52 114.74 119.99 125.24 130.47 142.15 0.25 0 0 0 0 0 0 0

1.0 108.44 113.85 119.25 124.66 130.07 135.47 146.28 0.25 0 0 0 0 0 0 0


Table 3.34: Impact of penalty factors on the completion time and budget surplus values for job J5.

Penalty t | Completion times under scenarios S1-S7 | Penalty y | Budget surplus values under scenarios S1-S7

Dataset D1:
0 339.32 354.2 369.07 383.95 398.81 413.66 443.42 1.0 0 0 0 0 0 0 0

0.25 122.97 127.74 132.48 137.23 142.9 149.76 163.47 0.75 0 0 0 0 0 0 0

0.5 118.86 123.78 128.69 133.58 138.5 145.25 159.11 0.5 0 0 0 0 0 0 131.77

0.75 117.13 122.17 127.21 132.25 137.28 142.31 152.81 0.25 0 0 0 0 0 0 131.77

1.0 116.5 121.59 126.68 131.76 136.84 141.92 152.81 0 0 0 0 0 0 0 104.34

Dataset D2:
0 324.38 339.06 405.89 422.98 384.21 399.26 429.98 1.0 0 0 0 0 0 34.22 677.23

0.25 136.49 141.92 147.54 153.17 158.82 164.45 176.88 0.75 0 0 0 0 0 34.22 677.23

0.5 154.67 161.64 168.84 177.01 185.19 193.35 210.14 0.5 0 0 0 0 0 34.22 677.23

0.75 133.24 138.76 144.47 151.35 158.53 165.59 180.01 0.25 0 0 0 0 0 268.60 947.25

1.0 133.24 138.76 144.47 151.35 158.53 165.59 180.01 0 0 0 0 0 0 268.60 947.25

Dataset D3:
0 324.7 338.57 352.4 366.28 380.12 393.96 492.06 1.0 0 0 0 0 0 0 0

0.25 115.31 121.03 126.77 132.49 138.21 143.93 155.38 0.75 0 0 0 0 0 0 0

0.5 115.31 121.03 126.77 132.49 138.21 143.93 155.38 0.5 0 0 0 0 0 0 0

0.75 115.31 121.03 126.77 132.49 138.21 143.93 155.38 0.25 0 0 0 0 0 0 0

1.0 99.76 104.01 108.28 112.53 116.78 121.03 129.55 0 0 0 0 0 0 0 0


Chapter 4

A DYNAMIC STOCHASTIC SCHEDULING APPROACH

4.1 Introduction:

In this chapter, we present a dynamic stochastic scheduling approach for the NMJS

problem. Recall that the processing times of job operations in the NMJS environment are not deterministic. In the previous chapter, a two-stage stochastic linear

program with recourse was described wherein stochasticity in processing times is

modeled through a finite number of plausible scenarios. The underlying approach

generates assignment and sequencing choices that are well-hedged against future

contingencies so as to achieve the overall objectives of minimizing job completion times

and budget surplus. Note that, in a real-life implementation that is faithful to the two-

stage procedure, assignment (routing) and sequencing decisions will be made for all the

operations of all the jobs at the outset and these will be followed through regardless of

the actual processing times realized for individual operations. It may be possible to refine

this procedure if information on actual processing time realizations for completed

operations could be utilized so that assignment and sequencing decisions for impending

operations are adjusted based on the evolving scenario (which may be very different

from the scenarios modeled) while still hedging against future uncertainty. In this

chapter, we present the details for such a refinement wherein the NMJSP model (Section

3.2) is solved at each decision point in a rolling horizon fashion while incorporating the

actual processing times realized for the operations that have been completed.

4.2 The DSSP Approach:

The proposed Dynamic Stochastic Scheduling Problem approach (DSSP) embeds within

it the augmented two-stage stochastic program with recourse (ATSSP) which is solved at

each decision point using the LP heuristic described in Section 3.4. A decision point is

reached at the completion of every operation of the jobs.


The ATSSP is an extension of the TSSP wherein indirect precedence variables and

associated logical constraints are included so as to precisely model preexisting conditions

on the shop floor at each decision point, as follows:

ATSSP: Minimize (3.1) subject to (3.2)-(3.15), (3.47)-(3.51), (3.56), and (3.60).

Expression (3.1) is modified to minimize the expected weighted sum of completion times

and budget surplus for all jobs.

At the outset, there exists a set of N jobs whose ready times are known. All machines

are assumed to be available. The following two decisions must be made:

1 Assignment of the first operations of all the jobs

2 Sequence in which to process the first operations of all the jobs.

The above decision problem is solved at time zero and is referred to as DP^1_(i,j), where DP indicates a decision problem, the subscript (i,j) denotes a generic job-operation pair, and the superscript '1' indicates that it is the 'first' decision problem to be solved. DP^1_(i,j) is solved using the LP heuristic. Although the assignment and sequencing decisions for all the operations of all the jobs are determined, only the first operations of the jobs are actually assigned to their respective machines at this point. Note that it is possible for more than one first operation of the jobs to be assigned to the same machine. In that case, the first job-operation to begin processing on that machine will depend upon the sequence obtained from DP^1_(i,j). Once processing begins, we wait until one of the first operations, say (i,1), finishes processing on some machine m, where (i,1) denotes the first operation of job i. A new decision point is reached and DP^2_(i,1) must now be solved. Before solving it, however, the assignment decisions for the first operations of all the jobs currently in process will be required from the output of the previous decision problem DP^1_(i,j). The assignment and sequencing variable values for all these operations, including (i,1), are fixed as specified by DP^1_(i,j). We reiterate that the assignments and


sequences of the first operations of only those jobs that are currently in process or have

completed processing are fixed. This is done to ensure that, in DP^2_(i,1), these job-

operations are not reassigned to other machines and also to guarantee that they are

placed ahead in the sequence before all other operations that could possibly be assigned

to their respective machines where they are currently in process. In general, at any

decision point p, the following information will be required from the shop-floor.

1 Current status of all the jobs in the system classified according to whether a job is

currently in process or is in transit or has arrived at the destination machine but is

waiting in queue.

2 The sequence of job-operations processed so far on all the machines.

3 The realized processing times for job-operations processed so far.

Using the foregoing information, additional constraints are added to DP^p_(i,j) so as to

model previous decisions and existing conditions on the shop-floor because these are

unalterable. For instance, at a decision point p, if a job is in transit or has arrived at its

destination machine but is waiting in queue, the assignment of this job-operation to the

destination machine is fixed. However, its sequence with respect to the other operations

that the machine is capable of processing or has already processed is not fixed unless the

job-operation is already in process or has completed processing. Also, the decision

problem at p must incorporate the realized processing time for all job-operations that

have been completed and the scenario-dependent processing time estimates for all

operations currently in process as well as future operations that await processing. At the

decision point p, DP^p_(i,j) is solved using the LP heuristic in order to determine:

1. The best possible assignment and tentative sequencing decision for (i, j+1).
2. The next tentative job-operation to process on machine m where (i,j) just finished processing.

The sequencing decision for (i, j+1) is labeled 'tentative' in order to allow for the fact that, should another decision problem be solved in the time that it takes (i,j) to transit


from machine m to, say, machine n to which (i, j+1) has been assigned, the new decision problem may deem it optimal to alter the sequence on machine n as determined by the previous decision problem. However, the new decision problem is not permitted to revise the assignment of (i, j+1). This is because (i, j+1) may already be in transit or may have arrived at machine n, and altering its destination may not be practical. For similar reasons, the job-operation to be processed next on machine m is also labeled 'tentative'. In general, if the operation deemed next in sequence on some machine m, where an operation (i,j) has just finished processing, arrives at that machine in the time between DP^p_(i,j) and the next DP, then that operation is started; the same applies to a job-operation deemed next in sequence that is already present in queue, having arrived earlier.

Job-operations that arrive at a machine between two decision points may either find the

machine busy or idle. If the machine is busy, the jobs must wait until this machine (or

some other machine) becomes idle at which point a decision problem is solved and a

new sequence of job-operations on different machines is redetermined although the

assignments of job-operations to machines are fixed. If the machine is idle and one of

the job operations in queue is assigned to be processed next (as determined from the last

DP solved) then processing shall commence; else the machine continues to lie idle. Thus,

while assignments are more rigidly fixed, the sequencing is more flexible and is allowed to

vary at each decision point in order to optimally adjust to the evolving scenario. This

procedure of solving the two-stage stochastic program with recourse while adding

constraints to account for past decisions is repeated until all job-operations are assigned

and sequenced to optimality under unfolding conditions in this rolling horizon

framework.


The DSSP Algorithm:

Step 1: Begin by solving the ATSSP DP^1_(i,j) at time zero using the LP heuristic.

Step 2: Using the output of DP^1_(i,j), assign the first operations of the jobs to their respective machines for processing. All job ready times must be honored. In the event that multiple first operations of jobs are assigned to the same machine, begin processing the first amongst these operations as determined from the sequence prescribed by the output of DP^1_(i,j).

Step 3: Wait until one of the first operations, say (i,1) of job i, finishes processing. Using information from the shop floor on job-operations currently in process, those in transit, and those that have arrived at their destination machines but are waiting in queue, fix the necessary assignment and sequencing variables (as outlined in Section 4.2) and solve DP^2_(i,1) using the LP heuristic.

Step 4: Determine the best possible assignment and tentative sequencing decision for (i,2), and also the next tentative job-operation to process on the machine corresponding to (i,1), as well as other job-operations that are to be imminently assigned to machines presently idle.

Step 5: Route (i,2) to the machine determined in Step 4 and, if the next job to be processed on the machine that just finished processing (i,1) arrives in the time between the completion of (i,1) and the next decision point, process this job. Likewise, based on the solution to DP^2_(i,1), process any jobs that arrive at idle machines before the next decision point.

Step 6: Continue this procedure of solving the ATSSP at the completion of each new

operation (whilst fixing assignment and sequencing variables to model past decisions and

existing conditions on the shop floor) until all job-operations are routed and sequenced

under unfolding conditions in this rolling horizon framework.
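The steps above can be summarized as the rolling-horizon loop sketched below (Python). The loop structure follows the algorithm as described; the callables solve_dp and shop_floor are hypothetical stand-ins for the LP heuristic of Section 3.4 and for the shop-floor information interface listed in Section 4.2, not part of the actual implementation.

def dssp_loop(solve_dp, shop_floor):
    # solve_dp(fixed, realized) -> AS chart: solves the ATSSP (via the LP heuristic)
    #   with the given variable fixes and realized processing times.
    # shop_floor: hypothetical interface reporting completions, job status, etc.
    fixed = {}       # assignment/sequencing variables fixed by past decisions
    realized = {}    # realized processing times of completed operations

    # Steps 1-2: solve DP^1 at time zero and release the first operations.
    as_chart = solve_dp(fixed, realized)
    shop_floor.release_first_operations(as_chart)

    # Steps 3-6: re-solve at every decision point (completion of an operation).
    while not shop_floor.all_operations_done():
        op, machine, proc_time = shop_floor.wait_for_next_completion()
        realized[(op, machine)] = proc_time

        # Operations completed, in process, or in transit keep their machine;
        # operations already started also keep their sequence position.
        fixed.update(shop_floor.unalterable_decisions())

        as_chart = solve_dp(fixed, realized)   # DP^p via the LP heuristic
        shop_floor.act_on(as_chart)            # route/start tentative choices

    return as_chart   # final assignment-and-sequencing (AS) chart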


4.3 An Illustrative Example:

Next, we illustrate the implementation of the above algorithm through an example

problem. Consider the partial data file shown below for a case involving two jobs with

four operations each and six machines segregated into three workcenters.

Set of Jobs:
set JOB := 1 2;

Set of operations for each job:
set OPER[1] := 0 1 2 3 4;
set OPER[2] := 0 1 2 3 4;

Set of machines:
set MACH := M0 M1 M2 M3 M4 M5 M6;

Set of scenarios:
set SCEN := S1 S2 S3;

Set of alternative machines for each job-operation:
set M[1,0] := M0;
set M[1,1] := M3 M4;
set M[1,2] := M1 M2;
set M[1,3] := M3 M4;
set M[1,4] := M1 M2;
set M[2,0] := M0;
set M[2,1] := M1 M2;
set M[2,2] := M5 M6;
set M[2,3] := M1 M2;
set M[2,4] := M5 M6;

Capabilities of each machine:
set Z[M0] := (1,0) (2,0);
set Z[M1] := (1,2) (1,4) (2,1) (2,3);
set Z[M2] := (1,2) (1,4) (2,1) (2,3);
set Z[M3] := (1,1) (1,3);
set Z[M4] := (1,1) (1,3);
set Z[M5] := (2,2) (2,4);
set Z[M6] := (2,2) (2,4);


Travel time between machines in hours:

Parameter T:
      M0    M1    M2    M3    M4    M5    M6
M0   0.0   0.0   0.0   0.0   0.0   0.0   0.0
M1   0.0   0.0   0.4  12.0  12.0   9.0   9.0
M2   0.0   0.4   0.0  12.0  12.0   9.0   9.0
M3   0.0  12.0  12.0   0.0   0.4   7.0   7.0
M4   0.0  12.0  12.0   0.4   0.0   7.0   7.0
M5   0.0   9.0   9.0   7.0   7.0   0.0   0.4
M6   0.0   9.0   9.0   7.0   7.0   0.4   0.0;

Budget and ready times (hours) for each job:

Parameter:
      B         r
1   1273.08   0.00
2   1592.78   1.01;

Scenario probabilities:

Parameter probability: S1 0.3  S2 0.4  S3 0.3;

From the above data, we have two jobs whose ready times are known. All machines are

assumed to be available. At the outset, the following two decisions must be made:

1. Assignment of the first operations of both the jobs.
2. Sequence in which to process the first operations of both the jobs.

The above decision problem, which must be solved at time zero, is referred to as DP^1_(i,j).

We solve it using the LP heuristic and obtain the following assignment and sequence

(AS) chart depicting the output in the form of assignment and sequencing decisions for

all operations (see Figure 4.1). (Notice that the AS chart is different from the Gantt chart

in that it does not present a schedule but rather an assignment of operations to machines and their sequences on those machines.) This is because, from the output, only the

assignment and sequencing variable values are of interest because the completion time

variables are scenario dependent.


Figure 4.1: AS chart at decision point 1.

According to the data file, the operation (1,1) arrives at time t=0 and the operation (2,1)

arrives at t = 1.01. We begin processing (1,1) on M4 at t=0 (shown in Figure 4.1 with a

dashed border) and wait until the next decision point is reached, which is determined

when the operation (1,1) finishes processing. In the meantime, the operation (2,1) arrives at t = 1.01 and begins processing on M2 (shown in Figure 4.1 with a dashed border). Let us say that the operation (1,1) completes processing at t_(1,1) = 1.2 (the actual, or realized, processing time for each job-operation has been generated using a uniform distribution between the lowest and highest scenario processing time estimates for that operation), at which time the operation (2,1) is still in process on M2. We must now solve DP^2_(1,1), in which the following conditions must be incorporated to reflect existing shop conditions (a compact sketch of these variable fixes is given after the list of conditions).

Job-operation (1,1):

1. M(1,1) = {M3, M4}. The operation (1,1) has been assigned to M4. Therefore, machine M3 should no longer be considered as a possible candidate for assigning (1,1). So,
   x^{M4}_(1,1) = 1,  x^{M3}_(1,1) = 0.


2. Z^{M4} = {(1,1), (1,3)}. The operation (1,1) is the first operation processed on M4. Therefore,
   y^{M4}_(1,3,1,1) = 0,  g^{M4}_(1,3,1,1) = 0.

3. Z^{M3} = {(1,1), (1,3)}. Since the operation (1,1) is not assigned to M3, its y and g variables with respect to all other operations on this machine must be zero. That is,
   y^{M3}_(1,3,1,1) = 0,  g^{M3}_(1,3,1,1) = 0,  y^{M3}_(1,1,1,3) = 0,  g^{M3}_(1,1,1,3) = 0.

4. Set the scenario-dependent processing times for the operation (1,1) to its actual realized value:
   p^{(M4,s1)}_(1,1) = p^{(M4,s2)}_(1,1) = p^{(M4,s3)}_(1,1) = 1.2.

Job-operation (2,1):

5. M(2,1) = {M1, M2}. The operation (2,1) has been assigned to M2. Therefore, machine M1 should no longer be considered as a possible candidate for assigning the operation (2,1). So,
   x^{M2}_(2,1) = 1,  x^{M1}_(2,1) = 0.

6. Z^{M2} = {(1,2), (1,4), (2,1), (2,3)}. The operation (2,1) is the first operation processed on M2. Therefore,
   y^{M2}_(1,4,2,1) = g^{M2}_(1,4,2,1) = 0,  y^{M2}_(1,2,2,1) = g^{M2}_(1,2,2,1) = 0,  y^{M2}_(2,3,2,1) = g^{M2}_(2,3,2,1) = 0.

7. Z^{M1} = {(1,2), (1,4), (2,1), (2,3)}. Since the operation (2,1) is not assigned to M1, its y and g variables with respect to all other operations on this machine must be set to zero:
   y^{M1}_(1,4,2,1) = g^{M1}_(1,4,2,1) = 0,  y^{M1}_(1,2,2,1) = g^{M1}_(1,2,2,1) = 0,  y^{M1}_(2,3,2,1) = g^{M1}_(2,3,2,1) = 0,
   y^{M1}_(2,1,1,4) = g^{M1}_(2,1,1,4) = 0,  y^{M1}_(2,1,1,2) = g^{M1}_(2,1,1,2) = 0,  y^{M1}_(2,1,2,3) = g^{M1}_(2,1,2,3) = 0.
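For illustration, the conditions listed above can be collected into simple lookup tables before DP^2_(1,1) is handed to the LP heuristic. The sketch below (Python) uses hypothetical data structures chosen only for readability; the actual implementation may represent these fixes differently.

# Variable fixes implied by the shop-floor conditions at decision point 2
# (hypothetical representation; keys are (machine, operation) or
# (machine, preceding operation, succeeding operation)).
fixed_x = {                      # assignment variables x
    ("M4", (1, 1)): 1, ("M3", (1, 1)): 0,    # item 1: (1,1) stays on M4
    ("M2", (2, 1)): 1, ("M1", (2, 1)): 0,    # item 5: (2,1) stays on M2
}

fixed_y_g = {                    # sequencing variables y and g forced to zero
    ("M4", (1, 3), (1, 1)): 0,               # item 2: (1,1) is first on M4
    ("M3", (1, 3), (1, 1)): 0,               # item 3: (1,1) is not on M3 at all
    ("M3", (1, 1), (1, 3)): 0,
    # ... analogous zero-fixes for (2,1) on M2 and M1 (items 6 and 7)
}

realized_p = {                   # realized processing times replace the
    ((1, 1), "M4"): 1.2,         # scenario-dependent estimates (item 4)
}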


We now solve DP^2_(1,1) using the LP heuristic and obtain the following AS chart depicting

the output in the form of assignment and sequencing decisions for all the operations.

Figure 4.2: AS chart at decision point 2.

The following decisions become clear from the above chart:

1. Job 1 must traverse to M1 to begin its next operation (1,2). The travel time from M4 to M1 is 12 hrs.
2. M4 will stay idle because no other operation has been assigned to it at the moment.

Since job 1 begins its travel to M1, it is expected to get there at time t = 1.2 + 12 = 13.2 hrs. In the meantime, say the operation (2,1) finishes processing on M2 and the realized processing time on M2 for the operation (2,1) is 9.5 hrs. Therefore, the completion time of the operation (2,1) is t_(2,1) = 1.01 + 9.5 = 10.51 hrs. We now reach a decision point and must solve DP^3_(2,1). The following conditions must be incorporated to reflect the existing shop floor conditions.


Operation (1,1):

1. The operation (1,1) is in transit and is headed towards M1, and this cannot be altered. Therefore, we fix its assignment:
   x^{M1}_(1,2) = 1.

Operation (1,2):

2. M(1,2) = {M1, M2}. Since the operation (1,2) must be processed on M1, we must have
   x^{M2}_(1,2) = 0.

3. Z^{M2} = {(1,2), (1,4), (2,1), (2,3)}. Therefore, set
   y^{M2}_(1,4,1,2) = g^{M2}_(1,4,1,2) = 0,  y^{M2}_(2,1,1,2) = g^{M2}_(2,1,1,2) = 0,  y^{M2}_(2,3,1,2) = g^{M2}_(2,3,1,2) = 0,
   y^{M2}_(1,2,1,4) = g^{M2}_(1,2,1,4) = 0,  y^{M2}_(1,2,2,1) = g^{M2}_(1,2,2,1) = 0,  y^{M2}_(1,2,2,3) = g^{M2}_(1,2,2,3) = 0.

4. Set the scenario-dependent processing times for the operation (2,1) to its actual realized value:
   p^{(M2,s1)}_(2,1) = p^{(M2,s2)}_(2,1) = p^{(M2,s3)}_(2,1) = 9.5.

We now solve DP^3_(2,1) using the LP heuristic, and the solution obtained is the same as that depicted in Figure 4.2.

Observe that job 2 must be shipped to M6 and M2 must lie idle because no other

operation is scheduled to be processed on it. Therefore, job 2 begins its transit towards

M6 at t = 10.51 hrs and will reach M6 at 10.51 + 9 = 19.51 hrs. At t = 13.2 hrs, when job 1 reaches M1, its operation (1,2) begins processing because, at the last decision point DP^3_(2,1), the sequence determined was to process the operation (1,2) on M1 first. Let us say the operation (1,2) continues processing for 8 hrs. In that case, its completion time will be t_(1,2) = 13.2 + 8 = 21.2 hrs.

Meanwhile, job 2 reaches M6 at 19.51 hrs, where it begins processing as per the sequencing decisions determined from the last DP solved (DP^3_(2,1)). Let us say that the


operation (2,2) keeps the machine busy from t = 19.51 hrs to t = 19.51 + 5.6 = 25.11

hrs. Since the operation (1,2) finishes processing at 21.2 hrs, we reach a decision point

DP^4_(1,2). At this point, we must incorporate the following conditions to reflect existing

shop floor conditions.

Operation (2,2):

1. x^{M6}_(2,2) = 1,  x^{M5}_(2,2) = 0.

2. Z^{M5} = {(2,2), (2,4)}. Therefore,
   y^{M5}_(2,2,2,4) = g^{M5}_(2,2,2,4) = 0,  y^{M5}_(2,4,2,2) = g^{M5}_(2,4,2,2) = 0.

3. Z^{M6} = {(2,2), (2,4)}, and since the operation (2,2) is first in sequence and is in process on M6, we must have
   y^{M6}_(2,4,2,2) = 0,  g^{M6}_(2,4,2,2) = 0.

Operation (1,2):

4. The operation (1,2) has been sequenced first on M1. Therefore, from Z^{M1} = {(1,2), (1,4), (2,1), (2,3)}, we must have
   y^{M1}_(1,4,1,2) = g^{M1}_(1,4,1,2) = 0,  y^{M1}_(2,1,1,2) = g^{M1}_(2,1,1,2) = 0,  y^{M1}_(2,3,1,2) = g^{M1}_(2,3,1,2) = 0.

5. Set the scenario-dependent processing times for the operation (1,2) to its actual realized value:
   p^{(M1,s1)}_(1,2) = p^{(M1,s2)}_(1,2) = p^{(M1,s3)}_(1,2) = 8.0.

We now solve DP^4_(1,2) using the LP heuristic and obtain the same AS chart as shown in

Figure 4.2.

Observe that job 1 must leave for M3 to begin its next operation (1,3) while (2,3) is the

next one in sequence to be processed on M1. Therefore, job 1 begins its travel towards

M3 at t= 21.2 hrs and will reach M3 at t = 21.2 + 12 = 33.2 hrs. Meanwhile, at t=25.11

hrs, the operation (2,2) finishes processing on M6 and we reach a new decision point


given by DP^5_(2,2). At this point, we must incorporate the following to reflect the existing

shop floor conditions.

Operation (1,2):

1. Job 1 is in transit and moving towards M3 for its next operation (1,3). Therefore, we fix its assignment:
   x^{M3}_(1,3) = 1,  x^{M4}_(1,3) = 0.

2. Z^{M4} = {(1,1), (1,3)}. Therefore,
   y^{M4}_(1,3,1,1) = g^{M4}_(1,3,1,1) = 0,  y^{M4}_(1,1,1,3) = g^{M4}_(1,1,1,3) = 0.

Operation (2,2):

3. Set the scenario-dependent processing times for the operation (2,2) to its actual realized value:
   p^{(M6,s1)}_(2,2) = p^{(M6,s2)}_(2,2) = p^{(M6,s3)}_(2,2) = 5.6.

We solve DP^5_(2,2) using the LP heuristic and obtain the same AS chart as shown in Figure 4.2.

Note that job 2 must proceed to M1 for its next operation (2,3). Therefore, it begins its

transit at t = 25.11 hrs and reaches M1 at t = 25.11 + 9 = 34.11 hrs. M1 stays idle between t = 21.2 and t = 34.11 hrs until job 2 arrives to begin its next operation. Also,

there are no other operations to be processed on M6 as per the chart. Therefore, M6 lies

idle.

Meanwhile, job 1 reaches M3 at 33.2 hrs and, as per the last decision problem DP^5_(2,2), it

begins its processing on M3. Assuming that the realized processing time on M3 is 1 hr,

the operation (1,3) will finish processing on M3 at t = 33.2+1 = 34.2 hrs.

Now, job 2 arrives at M1 to begin processing its operation (2,3) at t = 34.11 hrs. It finds

M1 idle at this point, and therefore, begins processing. Note that, as per the sequence

determined at the last decision point DP^5_(2,2), it must follow (1,2) on M1. Let us say that the operation (2,3) keeps M1 busy from t = 34.11 to t = 34.11 + 9 = 43.11 hrs.


Since (1,3) finishes processing on M3 at 34.2 hrs, we reach yet another decision point

given by DP^6_(1,3). At this point, we must incorporate the following conditions to reflect

the existing shop floor conditions.

Operation (2,3):

1. x^{M1}_(2,3) = 1,  x^{M2}_(2,3) = 0.

2. Since the operation (2,3) follows the operation (1,2) on M1, we must have
   y^{M1}_(1,2,2,3) = 1,  g^{M1}_(1,2,2,3) = 1.

3. Z^{M1} = {(1,2), (1,4), (2,1), (2,3)}. Therefore,
   y^{M1}_(1,4,2,3) = g^{M1}_(1,4,2,3) = 0,  y^{M1}_(2,1,2,3) = g^{M1}_(2,1,2,3) = 0.

4. Z^{M2} = {(1,2), (1,4), (2,1), (2,3)}. Therefore,
   y^{M2}_(1,4,2,3) = g^{M2}_(1,4,2,3) = 0,  y^{M2}_(2,1,2,3) = g^{M2}_(2,1,2,3) = 0,  y^{M2}_(1,2,2,3) = g^{M2}_(1,2,2,3) = 0,
   y^{M2}_(2,3,1,4) = g^{M2}_(2,3,1,4) = 0,  y^{M2}_(2,3,2,1) = g^{M2}_(2,3,2,1) = 0,  y^{M2}_(2,3,1,2) = g^{M2}_(2,3,1,2) = 0.

Operation (1,3):

5. Set the scenario-dependent processing times for the operation (1,3) to its actual realized value:
   p^{(M3,s1)}_(1,3) = p^{(M3,s2)}_(1,3) = p^{(M3,s3)}_(1,3) = 1.0.

We solve DP^6_(1,3) using the LP heuristic and obtain the same AS chart as shown in

Figure 4.2.

Observe that job 1 must be shipped to M1 and M3 must lie idle because no other

operation is scheduled to be processed on it. Therefore, the operation (1,3) begins its

transit towards M1 at t= 34.2 hrs and will reach M1 at 34.2 + 12 = 46.2 hrs.


Meanwhile the operation (2,3) on M1 completes processing at t = 43.11 hrs. We arrive at

a decision point given by DP^7_(2,3). We must incorporate the following shop-floor

conditions.

1. Job 1 is in transit towards M1 to accomplish operation (1,4). Therefore, the corresponding assignment must be fixed:
   x^{M1}_(1,4) = 1,  x^{M2}_(1,4) = 0.

2. Z^{M2} = {(1,2), (1,4), (2,1), (2,3)}. Therefore,
   y^{M2}_(1,2,1,4) = g^{M2}_(1,2,1,4) = 0,  y^{M2}_(2,1,1,4) = g^{M2}_(2,1,1,4) = 0,  y^{M2}_(2,3,1,4) = g^{M2}_(2,3,1,4) = 0.

3. Set the scenario-dependent processing times for (2,3) to its actual realized value:
   p^{(M1,s1)}_(2,3) = p^{(M1,s2)}_(2,3) = p^{(M1,s3)}_(2,3) = 9.0.

We solve DP^7_(2,3) using the LP heuristic and obtain the same AS chart as shown in

Figure 4.2.

Observe that job 2 must be shipped to M5 for its next operation (2,4); it begins its transit towards M5 at t = 43.11 hrs and will reach M5 at 43.11 + 9 = 52.11 hrs.

Meanwhile, job 1 arrives at M1 at t = 46.2 hrs and begins processing as determined in the previous DP (DP^7_(2,3)). Assuming that the operation (1,4) keeps M1 busy for a duration of 8.5 hrs, it will finish processing at t = 46.2 + 8.5 = 54.7 hrs.

Also at t=52.11 hrs, job 2 arrives at M5 to begin processing its operation (2,4), and

assuming that the actual processing time incurred is 6 hrs, it will complete processing at

t=52.11 + 6 = 58.11 hrs.

Next, we compare the performance of the DSSP approach, which attempts to refine the

augmented two-stage stochastic solution by incorporating the actual (or realized)

processing times at each decision point to create assignment and sequencing choices that

can optimally adjust to the evolving scenario, with the augmented two-stage stochastic

solution (ATSSP), which was obtained by solving the problem at time zero. The results


of this comparison for the above example as well as results from eight distinct

simulations for the same dataset are described in Table 4.1. Each simulation (case)

corresponds to a unique set of processing time realizations for each job operation on the

machines that are capable of processing it. The processing time realizations have been

generated randomly between the lowest and the highest scenario estimates for each

operation. The DSSP objective value for each case as shown in column 2 of Table 4.1 is

obtained by evaluating the final AS chart prescribed by the DSSP approach under the

unique set of processing time realizations that characterize the case. The ATSSP

objective value for each case shown in column 3 of Table 4.1 is obtained by evaluating

the AS chart prescribed by the ATSSP approach at time zero (DP^1_(i,j)) under a unique

set of processing time realizations corresponding to the case.
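A minimal sketch of this sampling scheme is given below (Python); the function name and dictionary layout are assumptions made for illustration and are not the code used to generate the results.

import random

def realize_processing_times(scenario_estimates, seed=None):
    # scenario_estimates: {(operation, machine): [p_s1, p_s2, ..., p_sK]}
    # Draw one realized time per key, uniformly between the lowest and the
    # highest scenario estimate, as described above.
    rng = random.Random(seed)
    return {key: rng.uniform(min(times), max(times))
            for key, times in scenario_estimates.items()}

# Example: operation (1,1) on M4 with three hypothetical scenario estimates
print(realize_processing_times({((1, 1), "M4"): [1.0, 1.3, 1.6]}, seed=7))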

Table 4.1: A comparison of solutions obtained by the DSSP and ATSSP approaches

Case No. | DSSP objective | ATSSP objective

1 111.9 114.42

2 109.15 112.55

3 112.87 114.23

4 109.59 111.94

5 114.83 117.18

6 111.81 113.62

7 113.59 113.25

8 112.38 114.99

9 108.14 111.74

In almost all of the cases above, we note a decrease in the objective function value when

the DSSP approach is used. This trend is seen generally because in the DSSP approach

assignment and sequencing decisions are undertaken at the completion of every job

operation in light of updated information regarding realized processing times for past

operations and other shop floor conditions that may be present. On the contrary, the


ATSSP approach generates a one-time assignment and sequence chart in light of

estimated processing times for all operations. The ATSSP approach does not permit a

revision of the assignment and sequences generated if the realized processing times for

operations happen to deviate significantly from their estimated values. In certain cases, however, the ATSSP approach may generate an objective value that is lower than that of the DSSP approach. This is possible if the assignments and sequences recommended by the

ATSSP approach happen to realize very optimistic processing times and those

recommended by the DSSP approach realize very pessimistic processing times.

We present below a comparison of the AS charts using the ATSSP and the DSSP

approaches for each of the nine cases above. Note that the AS chart using the ATSSP

approach remains the same in all nine cases. Also recall that the AS chart merely reflects

assignment and sequencing choices and does not reflect start or finish times for job-

operations.

Figure 4.3: AS chart – ATSSP approach.
Figure 4.4: AS chart for case 1 – DSSP approach.

Figure 4.5: AS chart for case 2 – DSSP approach.
Figure 4.6: AS chart for case 3 – DSSP approach.
Figure 4.7: AS chart for case 4 – DSSP approach.
Figure 4.8: AS chart for case 5 – DSSP approach.

Figure 4.9: AS chart for case 6 – DSSP approach.
Figure 4.10: AS chart for case 7 – DSSP approach.
Figure 4.11: AS chart for case 8 – DSSP approach.
Figure 4.12: AS chart for case 9 – DSSP approach.

4.4 The DSSP Approach Applied to a Large Problem Instance:

In this section, we present the results of the DSSP approach as applied to two larger

problem instances; the first instance involves 40 operations, 12 machines and 3

processing time scenarios, and the second instance involves 40 operations, 12 machines

and 7 processing time scenarios. Table 4.2 presents a comparison of the objective

function values realized for the DSSP, the optimal and the ATSSP approaches for

instance 1. Column 2 provides a description of the dataset and columns 3, 4 and 5

indicate the DSSP (LP heuristic), optimal (CPLEX) and ATSSP objective function values

respectively.

Instance 1:

Table 4.2: A comparison of solutions obtained by the DSSP, the optimal and the ATSSP

approaches for a large problem instance.

No. | Description | DSSP (LP) objective | DSSP (CPLEX) objective | ATSSP objective
1 | 40 operations, 12 machines, 3 processing time scenarios | 1557.31 | 1557.31 | 1585.47

To determine the quality of the solutions obtained by the LP heuristic, we compare these

with the solutions that are obtained by using an optimum seeking algorithm at every

decision point of the DSSP. The optimal solutions were obtained by solving Model 1 (see Section 3.4) using CPLEX. These results are presented in Table 4.3.


Table 4.3: A comparison of time and solutions obtained by the LP heuristic and an

optimal algorithm at each decision point for a large problem instance.

Decision point | CPU time (s): LP heuristic | CPU time (s): CPLEX | Objective value: LP heuristic | Objective value: CPLEX

1 2868.98 10890.7 2694.82 2694.82

2 1372.2 26818.3 2646.26 2646.26

3 56.42 22775.6 2558.51 2558.51

4 303.38 16603.2 2499.25 2498.84

5 285.99 95651 2453.34 2452.92

6 250.24 2036.85 2425.55 2425.20

7 60.42 128.06 2413.69 2413.69

8 29.37 240.48 2322.03 2322.03

9 10.21 9295.32 2255.46 2255.46

10 26.26 485.69 2061.01 2061.01

11 11.13 35.40 2061.81 2061.81

12 2.84 12.15 2011.04 2011.04

13 2.53 19.84 2011.15 2011.15

14 2.188 3.95 2011.13 2011.13

15 2.17 10 1986.71 1986.71

16 1.203 1.84 1858.84 1858.84

17 1.312 1.31 1778.35 1778.35

18 0.86 0.83 1777.42 1777.42

19 0.218 0.55 1755.99 1755.99

20 0.234 0.64 1754.77 1754.77

21 0.203 0.59 1743.89 1743.89

22 0.188 0.33 1730.05 1730.05

23 0.203 0.34 1659.01 1659.01


Table 4.3: A comparison of time and solutions obtained by the LP heuristic and an

optimal algorithm at each decision point for a large problem instance (continued)

Decision point | CPU time (s): LP heuristic | CPU time (s): CPLEX | Objective value: LP heuristic | Objective value: CPLEX

24 0.172 0.33 1659.75 1659.75

25 0.14 0.25 1659.70 1659.70

26 0.141 0.125 1697.46 1697.46

27 0.062 0.125 1690.48 1690.48

28 0.094 0.109 1689.73 1689.73

29 0.078 0.109 1834.03 1834.03

30 0.063 0.094 1833.16 1833.16

31 0.172 0.094 1762.13 1762.13

32 0.062 0.094 1761.89 1761.89

33 0.047 0.063 1761.71 1761.71

34 0.047 0.047 1760.56 1760.56

35 0.016 0.016 1682.94 1682.94

36 0.031 0.015 1743.85 1743.85

37 0.031 0.015 1743.57 1743.57

38 0.031 0.016 1677.99 1677.99

39 0.016 0.015 1657.20 1657.20

40 0.016 0.0 1620.15 1620.15

41 0.0 0.0 1557.31 1557.31

It is clear from Table 4.3 that in only 3 of the 41 instances does CPLEX give a better

solution. However, the improvement obtained over the heuristic solution is less than

0.02%. This is depicted in Figure 4.13 which essentially shows no difference in the

objective function values generated by the heuristic and CPLEX. On the other hand, the


CPU times required by CPLEX are almost always larger than those required by the

heuristic procedure, and especially at the earlier decision points where the problem size is

larger. This is pictorially depicted in Figure 4.14.

Figure 4.13: A plot comparing the solutions of the LP heuristic versus an optimal algorithm at each decision point for instance 1 (objective value in dollars vs. decision point; series: LP heuristic solution, optimal solution).

Figure 4.14: A plot comparing the CPU times of the LP heuristic versus those required by CPLEX at each decision point for instance 1 (CPU time in seconds vs. decision point).


The final solutions obtained by the ATSSP and DSSP approaches are shown in Figures

4.15 and 4.16, respectively.

Figure 4.15: AS chart determined by the ATSSP approach for instance 1.

Figure 4.16: Final AS chart determined by the DSSP approach for instance 1


Instance 2:

We present below a comparison of the objective function values realized for the DSSP,

the optimal and the ATSSP approaches for instance 2. Column 2 provides a description

of the dataset and columns 3, 4 and 5 indicate the DSSP (LP heuristic), optimal (CPLEX)

and ATSSP objective function values respectively.

Table 4.4: A comparison of solutions obtained by the DSSP, optimal and ATSSP

approaches for a large problem instance.

No. | Description | DSSP (LP) objective | DSSP (CPLEX) objective | ATSSP objective
2 | 40 operations, 12 machines, 7 processing time scenarios | 1329.71 | 1306.13 | 1420.39

Table 4.5 shows the CPU time and the solution value obtained at each decision point

when the LP heuristic is used to solve the decision problem. Since the solutions

generated by CPLEX at the decision points differ from those generated by the LP

heuristic, these are depicted in Table 4.6.

Comparing Tables 4.5 and 4.6, we see that a huge savings in computational time (98% when summed over the first seven decision points) is realized at the initial decision points when using the LP heuristic over the optimal algorithm (CPLEX), although the objective function values generated remain the same. The optimal solution, however, diverges from the heuristic solution after decision point 12. Table 4.4 presents the final objective function values

where we notice that although the optimal algorithm offers the best solution (1.77%

lower than the heuristic solution), the LP heuristic nevertheless improves the ATSSP

solution significantly (6.38%). The final assignments and sequences (AS charts) are


reflected in Figures 4.19, 4.20 and 4.23 for the ATSSP, DSSP and the optimal approaches

respectively. Figures 4.17 and 4.18 depict the objective function value and the CPU

times required by the LP heuristic at each decision point. Likewise, Figures 4.21 and 4.22

present the objective function value and CPU times required by the optimal algorithm at

each decision point.

Table 4.5: Times and solutions obtained by the LP heuristic at each decision point for a

large problem instance.

Decision point | CPU time (s): LP heuristic | Objective function value: LP heuristic

1 101.34 1244.95

2 580.97 1241.74

3 163.03 1228.33

4 197.21 1225.08

5 10.93 1224.74

6 12.62 1208.28

7 10.43 1226.04

8 2.56 1226.10

9 2.5 1219.19

10 1.609 1204.5

11 1.187 1201.71

12 1.59 1270.40

13 2.0 1263.72

14 1.344 1319.15

15 1.328 1319.15

16 1.265 1319.15

17 0.875 1297.98

18 5.953 1286.49


Table 4.5: Times and solutions obtained by the LP heuristic at each decision point for a

large problem instance (continued)

Decision point | CPU time (s): LP heuristic | Objective function value: LP heuristic

19 0.937 1286.21

20 0.656 1377.42

21 0.859 1380.03

22 0.812 1397.82

23 0.547 1397.66

24 0.672 1397.62

25 0.547 1468.51

26 0.406 1468.65

27 0.578 1555.71

28 0.578 1525.03

29 0.327 1483.15

30 0.14 1483.08

31 0.125 1477.60

32 0.187 1351.33

33 0.14 1351.48

34 0.062 1350.17

35 0.031 1317.73

36 0.031 1317.08

37 0.031 1270.65

38 0.14 1361.81

39 0.031 1338.29

40 0.031 1333.23

41 0.031 1329.71


Figure 4.17: A plot depicting the LP heuristic solution value at each decision point for instance 2 (objective value vs. decision point).

Figure 4.18: A plot of the CPU times required by the LP heuristic at each decision point for instance 2 (CPU time in seconds vs. decision point).


Figure 4.19: AS chart determined by the ATSSP approach for instance 2.

Figure 4.20: AS chart determined by the DSSP approach for instance 2.

[Each AS chart sequences the (job, operation) pairs on machines M1 through M8.]


Table 4.6: Time and solutions obtained by CPLEX at each decision point for a large problem instance.

Decision point    CPU time (seconds)    Objective function value
 1              52691.5                 1244.95
 2                680.175               1241.74
 3               1145.54                1228.33
 4                185.15                1225.08
 5                318.45                1224.74
 6                 14.25                1208.28
 7                176.37                1226.04
 8                 27.82                1226.10
 9                 18.23                1219.19
10                 24.57                1204.50
11                  6.625               1201.71
12                 39.39                1243.03
13                 16.73                1235.20
14                  9.125               1285.7
15                  3.281               1290.13
16                  2.782               1288.67
17                  2.578               1306.78
18                  2.64                1286.37
19                  3.84                1281.90
20                  3.266               1368.96
21                  2.437               1460.16
22                  1.515               1418.21
23                  1.39                1418.21
24                  1.25                1420.82
25                  0.937               1491.71
26                  1.484               1491.85
27                  0.953               1365.73
29                  0.718               1335.05
30                  0.687               1335.05
31                  0.515               1302.72
32                  0.265               1296.99
33                  0.079               1296.99
34                  0.078               1248.96
35                  0.047               1225.43
36                  0.031               1224.05
37                  0.031               1315.29
38                  0.047               1314.63
39                  0.031               1314.71
40                  0.047               1309.65
41                  0.031               1306.13


Figure 4.21: A plot depicting the optimal solution value, in $, at each decision point for instance 2 (Objective Value vs. Decision Point).

Figure 4.22: A plot of the CPU times, in seconds, required by the optimal algorithm at each decision point for instance 2 (Time vs. Decision Point).


Figure 4.23: AS chart determined by the optimal algorithm for instance 2.

[The AS chart sequences the (job, operation) pairs on machines M1 through M8.]

In the following section we present a flowchart for the generalized DSSP approach in order to accommodate jobs whose arrival times may not be known in advance. The flowchart also provides a scheme to discard completed operations from the system and roll the reference time line forward as the horizon advances.

4.5 Flowchart for the generalized DSSP Approach

The flowchart for the generalized DSSP approach comprises the following steps.

Begin (decision problem at time zero): Solve the decision problem using the LP heuristic and determine the new AS chart. Then:
1. Determine the assignment of impending operations.
2. Ship operations to their destination machines.
3. Determine the next operations to process on machines that are presently idle.
4. Process these operations if they are already in queue or if they arrive before the next decision point.
Wait until an operation finishes processing or a new job arrives; a decision point is then reached and control passes to A (operation completion) or B (new job arrival).

A. Decision point due to operation completion: Examine the current shop status. Determine the operations that have finished processing and those currently in process, the operations in transit, and the operations in queue at destination machines. Fix the assignment and sequence of operations that have finished processing and those currently in process; fix only the assignment of operations currently in transit and those in queue at destination machines. If a new reference time line is required, proceed to C; otherwise, re-solve the decision problem and continue.

B. Decision point due to new job arrival: Examine the current shop status. Determine the operations that have finished processing and those currently in process, the operations in transit, and the operations in queue at destination machines. Fix the assignment and sequence of operations that have finished processing and those currently in process with respect to the new operations; fix only the assignment of operations currently in transit and those in queue at destination machines. If a new reference time line is required, proceed to C; otherwise, re-solve the decision problem and continue.

C. Establish a new reference time line: Discard all operations completed by the new reference time line and renumber the remaining operations of all jobs. Determine the operations currently in process, those in transit and those waiting in queue at destination machines. Revise the ready times of all jobs in the system with respect to the new time line and determine estimates of the remaining processing times of the job-operations currently in process. Fix the assignment and sequences of the job-operations currently in process, and fix the assignment of the job-operations in transit and those waiting in queue at destination machines. Then re-solve the decision problem and continue.
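To make the control flow above concrete, a minimal event-driven sketch of the generalized DSSP loop is given below. Every name in it (shop, solve_decision_problem, wait_for_next_event, needs_new_time_line, roll_time_line, horizon_done) is a hypothetical placeholder standing in for the corresponding box in the flowchart, not a routine from this dissertation.

    def generalized_dssp(shop, solve_decision_problem, wait_for_next_event,
                         needs_new_time_line, roll_time_line, horizon_done):
        """Event-driven skeleton of the generalized DSSP approach (sketch only)."""
        # Decision problem at time zero: solve with the LP heuristic to obtain
        # the new AS chart (assignments and sequences).
        as_chart = solve_decision_problem(shop)

        while not horizon_done(shop):
            # Assign impending operations, ship them to their destination
            # machines, and start the next operations on presently idle machines.
            shop.dispatch(as_chart)

            # Block until an operation completes or a new job arrives.
            event = wait_for_next_event(shop)       # 'completion' or 'arrival'

            # Fix what can no longer change: assignment and sequence of finished
            # and in-process operations; assignment only for operations in transit
            # or queued at destination machines (on an arrival, sequences are
            # fixed relative to the newly arrived operations as well).
            shop.fix_decisions(event)

            # When a new reference time line is required, discard completed
            # operations, renumber the rest, revise ready times and remaining
            # processing-time estimates, and roll the time line forward.
            if needs_new_time_line(shop):
                roll_time_line(shop)

            # Re-solve the decision problem with the LP heuristic.
            as_chart = solve_decision_problem(shop)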


Chapter 5

CONCLUSIONS AND FUTURE RESEARCH

5.1 Conclusions

In this dissertation we have presented a stochastic programming based approach for the

NMJS (Network of MEMS job shops) problem. This problem arises in a distributed

fabrication environment that has recently emerged to serve the evolving needs of the

high investment, low volume MEMS industry. The problem is modeled as a two-stage

stochastic program with recourse where the uncertainty in processing times is captured

using scenarios. The first-stage routing and sequencing variables are binary whereas the

second-stage completion time and budget surplus variables are continuous. Of the two

approaches presented to solve the model, the first involves using the L-shaped method

to decompose the problem into the first and the second stage problems. Since the NMJS

problem lacks relatively complete recourse, the first-stage solution can be infeasible to

the second-stage problem, in that the first-stage solution may either violate the reentrant flow conditions or create a deadlock. In order to alleviate these infeasibilities, we develop feasibility cuts which, when appended to the master problem, eliminate the infeasible solution. Alternatively, we develop constraints that address these infeasibilities directly within the master problem. We show that a deadlock involving two or three machines arises if and only if a certain relationship between operations and a certain sequence amongst them exists, and we generalize this argument to the case of m machines, which forms the basis for our deadlock prevention constraints. Computational results at

the end of Chapter 3 compare the relative merits of a model which relies solely on

feasibility cuts with models that incorporate reentrant flow and deadlock prevention

constraints within the master problem. Experimental evidence reveals that the latter

offers appreciable time savings over the former. Moreover, in a majority of instances we see that models that carry deadlock prevention constraints in addition to the reentrant flow constraints perform on par with, or better than, those that carry only the reentrant flow constraints.
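To make the decomposition loop concrete, the sketch below outlines, under stated assumptions, how feasibility and optimality cuts interact in an L-shaped scheme of this kind. All names (solve_master, check_first_stage_feasibility, solve_subproblems, build_feasibility_cut, build_optimality_cut) are illustrative placeholders rather than routines from this dissertation.

    def l_shaped(solve_master, check_first_stage_feasibility, solve_subproblems,
                 build_feasibility_cut, build_optimality_cut,
                 tol=1e-6, max_iters=200):
        """Schematic L-shaped loop with feasibility and optimality cuts."""
        cuts = []                       # cuts accumulated in the master problem
        x, theta = None, None
        for _ in range(max_iters):
            # Solve the master problem over the binary routing/sequencing variables.
            x, theta = solve_master(cuts)

            # If the candidate first-stage solution violates the reentrant flow
            # conditions or creates a deadlock, append a feasibility cut and repeat.
            violation = check_first_stage_feasibility(x)
            if violation is not None:
                cuts.append(build_feasibility_cut(x, violation))
                continue

            # Otherwise evaluate the second stage over all processing-time scenarios.
            expected_recourse, scenario_duals = solve_subproblems(x)

            # Stop when the recourse approximation theta is accurate enough.
            if expected_recourse <= theta + tol:
                return x

            # Append a single aggregated optimality cut and iterate.
            cuts.append(build_optimality_cut(x, scenario_duals))
        return x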


We next compare the performances of different models by using the standard and alternative optimality cuts developed in this work. Also included in this comparison is the multicut method, which adds as many optimality cuts as there are scenarios per major iteration so as to secure a better representation of the recourse function.

Furthermore, we also examine the impact of using a conservative upper bound for

operation completion times as the value of big H. Experimentation indicates that Model 2, which uses the standard optimality cut in conjunction with the conservative estimate for big H, almost always outperforms Model 1, which also uses the standard optimality cut but with a fixed value of 1000 for big H. Model 3, which employs the alternative optimality cut with the conservative estimate for big H, requires the fewest iterations to converge to the optimum, but it also incurs the largest premium in terms of computational time. This is because the alternative optimality cut adds to the

complexity of the problem in that it appends additional variables and constraints to the

master problem as well as the subproblems. In the case of Model 4, the segregated optimality cuts accurately reflect the shape of the recourse function, resulting in fewer overall iterations, but the large number of these cuts accumulates over the iterations and makes the master problem sluggish; consequently, this model exhibits variable performance across the various datasets. These experiments reveal that a compact master problem and a

conservative estimate for big H positively impact the run time performance of a model.

A similar trend in the performance of the above four models is observed when the first-stage objective function is changed from a time-based to a cost-based objective. Since

the first stage variables are binary, the L-shaped method can be naturally extended to

incorporate a branch and bound scheme, and in view of this, we conclude approach 1

with a framework for an integer L-shaped algorithm for the NMJS problem.
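For reference, the textbook forms of the aggregated (single) optimality cut and its per-scenario (multicut) disaggregation are shown below in standard L-shaped notation; this is the generic form rather than the dissertation's own cut derivation. Here $p_k$ is the probability of scenario $k$, $\pi_k^{l}$ collects the optimal duals of scenario subproblem $k$ at iteration $l$, and $(h_k, T_k)$ are the scenario right-hand side and technology matrix:

\theta \;\ge\; \sum_{k=1}^{K} p_k \,(\pi_k^{l})^{\top}\,(h_k - T_k x) \qquad \text{(single aggregated cut)}

\theta_k \;\ge\; (\pi_k^{l})^{\top}\,(h_k - T_k x), \quad k = 1,\dots,K \qquad \text{(multicut, with } \theta = \textstyle\sum_k p_k\theta_k \text{ in the master objective)}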

A second approach for solving the stochastic NMJS problem relies on the tight LP

relaxation observed for the deterministic equivalent of the model. We describe a heuristic

wherein the LP relaxation of the deterministic equivalent is first solved and certain binary

assignment variables are fixed, following which additional logical constraints fix the values for some of the binary sequencing variables. The resulting problem is then solved using CPLEX. Experimental results comparing the performance of the above LP heuristic


procedure with optimal algorithms that directly use CPLEX over the generated test

instances illustrate the effectiveness of the heuristic procedure. For the largest problems

(5 jobs, 10 operations/job, 12 machines, 7 workcenters, 7 scenarios) solved in this

experiment, an average savings of as much as 4154 seconds and 1188 seconds was

recorded in a comparison with Models 1 and 2 respectively. Both of these models solve

the deterministic equivalent of the stochastic NMJS problem using CPLEX but differ in

that Model 1 uses a big H value of 1000 whereas Model 2 uses the conservative upper

bound for big H developed in this dissertation. The maximum optimality gap observed

for the LP heuristic over all the data instances solved was 1.35%. Experimental

observations also indicate an increase in computational time savings as the size of the

problem increases. The LP heuristic, therefore, offers a powerful alternative to solving

these problems to near-optimality with a very low computational burden.
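A minimal sketch of the relax-and-fix idea underlying the LP heuristic is given below. It assumes hypothetical helper routines (solve_lp_relaxation, logical_sequencing_fixes, solve_mip) and a simple near-integrality threshold; the actual variable-fixing rules developed in this dissertation are more elaborate.

    def lp_relax_and_fix(model, assign_vars, seq_vars,
                         solve_lp_relaxation, logical_sequencing_fixes, solve_mip,
                         tol=1e-6):
        """Sketch of the LP heuristic: relax, fix near-integral assignments,
        derive implied sequencing fixes, then solve the reduced MIP."""
        relaxed = solve_lp_relaxation(model)     # LP-relaxation values, in [0, 1]

        # Fix binary assignment variables whose relaxed values are essentially 0 or 1.
        fixed_assign = {}
        for var in assign_vars:
            value = relaxed[var]
            if value <= tol:
                fixed_assign[var] = 0
            elif value >= 1.0 - tol:
                fixed_assign[var] = 1

        # Logical constraints then imply values for some sequencing variables
        # (for example, no relative sequencing decision is needed between two
        # operations fixed to different machines).
        fixed_seq = logical_sequencing_fixes(fixed_assign, seq_vars)

        # Solve the remaining (much smaller) mixed-integer problem with CPLEX.
        return solve_mip(model, {**fixed_assign, **fixed_seq})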

Chapter 4 outlines the dynamic stochastic scheduling approach (DSSP) in which the

stochastic programming model for the NMJS problem is solved at each decision point (a

decision point is reached at the completion of a job-operation) using the LP heuristic in a

rolling horizon fashion while incorporating constraints that model existing conditions on the shop floor and the actual processing times realized for the operations that have been

completed. This approach is applied to two large problem instances and its performance

is evaluated against (1) A DSSP approach that uses an optimal algorithm at each decision

point, and (2) The two-stage stochastic programming approach. Results from the

experimentation indicate that the DSSP approach that relies on using the LP heuristic at

each decision point generates better assignment and sequencing decisions than the

two-stage stochastic programming approach and provides solutions that are near-optimal

with a very low computational burden. For the first instance, which involves 40

operations, 12 machines and 3 processing time scenarios, the DSSP approach using the

LP heuristic yields the same solution as the optimal algorithm with a total time savings of

71.4% and also improves upon the two-stage stochastic programming solution by 1.7%.

In the second instance, the DSSP approach that relies on using the LP heuristic yields a

solution with an optimality gap of 1.77% and a total time savings of 98% over the

optimal algorithm. In this case, the DSSP approach with the LP heuristic improves upon

the two-stage stochastic programming solution by 6.38%. Finally, we present a framework


for the implementation of the DSSP approach that extends the basic DSSP algorithm to

accommodate jobs whose arrival times may not be known in advance. The framework

also provides a scheme to discard completed operations from the system and roll the

reference time line forward as the horizon advances.

5.2 Future Research

Exploration of the proposed branch and bound approach for the NMJS problem and its

application to the real-life problem scenario appears to be a promising direction for

future research.

For a more realistic rendering of the problem, extensions to the two-stage stochastic

programming model proposed in section 3.2 could include stochasticity in travel times,

job arrival times and job compositions.

Currently, it is assumed that, upon completion of an operation, a job will immediately

begin its travel to the next machine. In a real-life scenario it may be more economical to

wait for a few more jobs destined for the same location and ship them as a batch. Lot sizing

issues become relevant in such cases.

Quality is yet another important issue because the wafer travel between fabs could lead to

contamination problems. In such cases, one might wish to restrict the number of

possible wafer transfers. In the NMJS model (Section 3.2), this can be achieved by

placing a cap on the number of transfers possible for a job.
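As an illustration only, with notation that is hypothetical rather than taken from Section 3.2: if $y_{jk} \in \{0,1\}$ indicates that operation $k$ of job $j$ is processed at a different fab than operation $k-1$, and $n_j$ is the number of operations of job $j$, such a cap could take the form

\sum_{k=2}^{n_j} y_{jk} \;\le\; K_{\max} \qquad \text{for every job } j,

where $K_{\max}$ is the maximum number of inter-fab transfers allowed per job.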

Also, in our present treatment of the problem, we have assumed all machines to be serial

processing machines. Since batch processors are very common in the semiconductor and

MEMS environment, characteristics relevant to these processors could be incorporated

in our modeling of the problem.

From an algorithmic point of view, the regularized decomposition method credited to

Ruszczyński may be worth investigating as a possible alternative to the L-shaped method.

Hybrid methodologies that incorporate constraint programming and mixed integer

programming techniques may be explored for solving the deterministic equivalent of the


problem. Metaheuristics such as tabu search have been known to perform well on

flexible job-shop scheduling problems and the use of these techniques to solve the real-

life NMJS problem may also be worth exploring.


REFERENCES

1. Ultimate MEMS market analysis. April 2005, Yole Development.

2. Albornoz, V.M., P. Benario, and M.E. Rojas, A two-stage stochastic integer programming

model for a thermal power system expansion. International Transactions in Operational Research, 2004, 11: p. 243-257.

3. Alcaide, D., A. Rodriguez-Gonzalez, and J. Sicilia, An approach to solve the minimum

expected makespan flow-shop problem subject to breakdowns. European Journal of Operational Research, 2002, 140: p. 384-398.

4. Allahverdi, A. and J. Mittenthal, Scheduling on a two-machine flowshop subject to random

breakdowns with a makespan objective function. European Journal of Operational Research, 1995, 81: p. 376-387.

5. Allahverdi, A. and Y. Sotskov, Two machine flowshop minimum-length scheduling problem

with random and bounded processing times. International Transactions in Operational Research, 2003, 10: p. 65-76.

6. Alonso-Ayuso, A., L.F. Escudero, A. Garin, M.T. Ortuno, and G. Pérez, An approach for

strategic supply chain planning under uncertainty based on stochastic 0-1 programming. Journal of Global Optimization, 2003, 26: p. 97-124.

7. Alvarez-Valdez, R., A. Fuertes, J.M. Tamarit, G. Giménez, and R. Ramos, A heuristic to

schedule flexible job-shop in a glass factory. European Journal of Operational Research, 2005, 165, No.2: p. 525-534.

8. Bagga, P.C., n jobs, 2 machines sequencing problems with stochastic service times. Operations

Research, 1970, 7: p. 184-197.
9. Balasubramanian, J. and I.E. Grossman, A Novel Branch and Bound Algorithm for scheduling

flowshop plants with uncertain processing times. Computers and Chemical Engineering, 2002, 26, No. 1: p. 41-57.

10. Balasubramanian, J. and I.E. Grossman, Scheduling optimization under uncertainty.

Computers and Chemical Engineering, 2002, 27: p. 469-490.
11. Barbarosoglu, G. and Y. Arda, A two-stage stochastic programming framework for

transportation planning in disaster response. Journal of Operational Research Society, 2004, 55: p. 43-53.

12. Baykasoglu, A., Linguistic-based meta-heuristic optimization model for flexible job shop

scheduling. International Journal of Production Research, 2002, 40, No. 17: p. 4523-4543.
13. Baykasoglu, A., L. Ozbakir, and A.I. Sonmez, Using multiple objective tabu search and

grammars to model and solve multi-objective flexible job shop scheduling problems. Journal of Intelligent Manufacturing, 2004, 15, No. 6: p. 777-785.


14. Benard, W.L., Workflow control for the MEMS job shop. Unpublished Ph.D. Thesis, Dept of

Electrical and Computer Engineering, Carnegie Mellon University, Pittsburgh, Pennsylvania, 2003.

15. Brandimarte, P., Routing and scheduling in a flexible job shop by tabu search. Annals of

Operations Research, 1993, 41: p. 157-183.
16. Chambers, J.B. and J.W. Barnes, Flexible job shop scheduling by Tabu Search. Technical

Report Series ORP 96-06, Graduate Program in Operations Research and Industrial Engineering, Dept of Mechanical Engineering, The University of Texas at Austin., 1996b.

17. Chen, Z.L., S. Li, and D. Tirupati, A scenario based stochastic programming approach for

technology and capacity planning. Computers and Operations Research, 2002, 29: p. 781-806.
18. Cunningham, A.A. and S.K. Dutta, Scheduling jobs with exponentially distributed processing

times on two machines of a flow shop. Naval Research Logistics Quarterly, 1973, 16: p. 69-81.

19. Darby-Dowman, K., S. Barker, E. Audsley, and D. Parsons, A two-stage stochastic

programming with recourse model for determining robust planting plans in horticulture. Journal of Operational Research Society, 2000, 51, No.1: p. 83-89.

20. Dauzère-Pérès, S. and J. Paulli, An integrated approach for modeling and solving the general

multiprocessor job-shop scheduling problem using tabu search. Annals of Operations Research, 1997, 70: p. 281-306.

21. Elmaghraby, S.E. and K.A. Thoney, The two machine stochastic flowshop problem with

arbitrary processing time distributions. IIE Transactions, 1999, 31: p. 467-477.
22. Engell, S., A. Märkert, G. Sand and R. Schultz, Aggregated scheduling of a multiproduct batch

plant by two-stage stochastic integer programming. Optimization and Engineering, 2003, 5: p. 335-359.

23. Foley, R.D. and S. Suresh, Stochastically minimizing the makespan in flow shops. Naval

Research Logistics Quarterly, 1984, 31: p. 551-557.
24. Frickinger, J., et al., Flying Wafer – A standardized methodology for Multisite processing of

300 mm wafers at Research and Development-sites. 12th IFAC Symposium on Information Control Problems in Manufacturing, INCOM 2006, 17-19 May, Saint-Etienne, France.

25. Forst, F.G., Minimizing total expected costs in the two machine, stochastic flow shop.

Operations Research Letters, 1983, 2: p. 58-61.
26. Gascon, A., P. Lefrancois, and L. Cloutier, Computer-assisted multi-item, multi-machine and

multi-site scheduling in a hardwood flooring factory. Computers in Industry, 1998, 36: p. 231-244.

27. Ginzburg, D.G., S. Kesler, and Z. Landsman, Industrial job-shop scheduling with random

operations and different priorities. International Journal of Production Economics, 1995, 40: p. 185-195.


28. Gnoni, M.G., R. Iavagnilio, G. Mossa, G. Mummolo, and A. Di Leva, Production planning of multi-site manufacturing system by hybrid modelling: A case study from the automotive industry. International Journal of Production Economics, 2003, 85: p. 251-262.

29. Golenko-Ginzburg, D. and A. Gonik, Using "look-ahead" techniques in job-shop scheduling

with random operations. International Journal of Production Economics, 1997, 50: p. 13-22.
30. Golenko-Ginzburg, D. and A. Gonik, Optimal job-shop scheduling with random operations and cost objectives. International Journal of Production Economics, 2002, 76: p. 147-157.
31. Golenko-Ginzburg, D., S. Kesler, and Z. Landsman, Industrial job-shop scheduling with

random operations and different priorities. International Journal of Production Economics, 1995, 40: p. 185-195.

32. Gourgand, M., N. Grangeon, and S. Norre, A contribution to the stochastic flowshop

scheduling problem. European Journal of Operational Research, 2003, 151: p. 415-433.
33. Graham, R.L., E.L. Lawler, J.K. Lenstra, and A.H.G. Rinnooy Kan, Optimization and

approximation in deterministic sequencing and scheduling: a survey. Annals of Discrete Mathematics, 1979, 5: 287-326

34. Graves, S., A review of production scheduling. Operations Research, 1981, 29: p. 646-675.
35. Guinet, A., Multi-site planning: A transshipment problem. International Journal of Production Economics, 2001, 74: p. 21-32.
36. Haneveld, W.K.K. and M.H. van der Vlerk, Optimizing electricity distribution using two-stage

integer recourse models. Stochastic Programming E-Print Series, ed. J.L. Higle and W. Romisch. Vol. 12. 2001.

37. Harjunkoski, I. and I.E. Grossman, Decomposition techniques for multistage scheduling

problems using mixed-integer and constraint programming methods. Computers and Chemical Engineering, 2002, 26: p. 1533-1552.

38. Ho, N.B. and J.C. Tay, GENACE: An efficient cultural algorithm for solving the flexible job-

shop problem. Proceedings of the 2004 Congress on Evolutionary Computation, 2004, 2: p. 1759-1766.

39. Hutchison, J., K. Leong, D. Snyder, and P. Ward, Scheduling approaches for random job

shop flexible manufacturing systems. International Journal of Production Research, 1991, 28, No. 11.

40. J. Cole, S., A.J. Schaefer, and J.W. Yen, A stochastic integer programming approach to

solving a synchronous optical network design problem. Networks, 2004, 44, No. 1: p. 12-26.
41. Birge, J.R. and F.V. Louveaux, A multicut algorithm for two-stage stochastic linear programs. European Journal of Operational Research, 1988, 34: p. 384-392.
42. Jain, V. and I.E. Grossman, Algorithms for hybrid MILP/CP models for a class of

optimization problems. INFORMS Journal of Computing, 2001. 13: p. 258-276.


43. Jia, C., Minimizing variation in a stochastic flowshop. Operations Research Letters, 1998, 23: p. 109-111.

44. Jia, H.Z., A. Y. C. Nee, J. Y. H. Fuh, and Y.F. Zhang, A modified genetic algorithm for

distributed scheduling problems. Journal of Intelligent Manufacturing, 2003, 14: p. 351-362.
45. Jones, A. and L.C. Rabelo, Survey of job shop scheduling techniques. NISTIR, National Institute of Standards and Technology, Gaithersburg, MD, 1998.
46. Kacem, I., S. Hammadi, and P. Borne, Approach by localization and multiobjective

evolutionary optimization for flexible job-shop scheduling problems. IEEE Transactions on Systems, Man and Cybernetics - Part C: Applications and Reviews, 2002, 32, No. 1: p. 1-13.

47. Kamburowski, J., Stochastically minimizing the makespan in two-machine flowshops without

blocking. European Journal of Operational Research, 1999, 112: p. 304-309.
48. Kamburowski, J., On three machine flowshops with random job processing times. European Journal of Operational Research, 2000, 125: p. 440-449.
49. Kenyon, A.S. and D.P. Morton, Stochastic vehicle routing with random travel times. Transportation Science, Feb 2003, 37, No. 1: p. 69-83.
50. Kim, Y.-D., A comparison of dispatching rules for job shops with multiple identical jobs and

alternative routeings. International Journal of Production Research, 1990, 28, No. 5: p. 953-962.

51. Ku, P.S. and S.C. Niu, On Johnson's two-machine flowshop with random processing times.

Operations Research, 1986, 34: p. 130-136.
52. Kutanoglu, E. and I. Sabuncuoglu, Experimental investigation of iterative simulation-based scheduling in a dynamic and stochastic job shop. Journal of Manufacturing Systems, 2001, 20, No. 4: p. 264-279.

53. Lai, T.-C., Y.N. Sotskov, N. Sotskova, and F. Werner, Mean flow time minimization with

given bounds of processing times. European Journal of Operational Research, 2004, 159: p. 558-573.

54. Laporte, G. and F. Louveaux, The integer L-shaped method for stochastic integer programs

with complete recourse. Operations Research Letters, 1993, 13: p. 133-142.
55. Lee, C.-Y. and Z.-L. Chen, Machine scheduling with transportation considerations. Journal of Scheduling, 2001, 4: p. 3-24.
56. Luh, P.B., D. Chen, and L.S. Thakur, An effective approach for job-shop scheduling with

uncertain processing requirements. IEEE Transactions on Robotics and Automation, 1999, 15, No. 2: p. 328-339.

57. Makino, T., On a scheduling problem. Journal of Operations Research Society of Japan, 1965,

8: p. 32-44.
58. Mittal, B.S. and P.C. Bagga, A priority problem in sequencing with stochastic service times.

Operations Research, 1977, 14: p. 19-28.


59. Mohring, R.H., A.S. Schulz, and M. Uetz, Approximation in stochastic scheduling: The power

of LP-based priority policies.
60. Morton, D. and E. Popova, A Bayesian stochastic programming approach to an employee scheduling problem. IIE Transactions, 2004, 36, No. 2: p. 155-167.
61. Nasr, N. and E.A. Elsayed, Job shop scheduling with alternative machines. International Journal of Production Research, 1990, 28, No. 9: p. 1595-1609.
62. Nowak, M.P. and W. Römisch, Stochastic Lagrangian Relaxation Applied to Power Scheduling

in a Hydro-Thermal System under Uncertainty. Annals of Operations Research, 2000, 100 (1-4): p. 251-272.

63. Nurnberg, R. and W. Römisch, A two-stage planning model for power scheduling in a hydro-

thermal system under uncertainty. Optimization and Engineering, 2002, 3: p. 355-378.
64. Philip, D., Scheduling reentrant flexible job shops with sequence dependent set up times. 2005, Masters Thesis, Montana State University.
65. Pinedo, M., Minimizing the expected makespan in stochastic flow shops. Operations

Research, 1982, 30: p. 148-162.
66. Prasad, V.R., n x 2 flowshop sequencing problem with random processing times. Operations Research, 1981, 18: p. 1-14.
67. Roux, W., S. Dauzere-Peres, and J.B. Lasserre, Planning and scheduling in a multi-site environment. Production Planning and Control, 1999, 10, No. 1: p. 19-28.
68. Sand, G., S. Engell, A. Märkert, R. Schultz, and C. Schultz, Approximation of an ideal online

scheduler for a multiproduct batch plant. Computers and Chemical Engineering, 2000, 24: p. 361-367.

69. Sauer, J., T. Freese, and T. Teschke, Towards Agent Based Multi-Site scheduling. 1998.
70. Schaefer, L.A. and A.J. Schaefer, Locating hybrid fuel cell-turbine power generation units under uncertainty. Annals of Operations Research, 2004, 132: p. 301-322.
71. Scrich, C.R., V.A. Armentano, and M. Laguna, Tardiness minimization in a flexible job shop:

A tabu search approach. Journal of Intelligent Manufacturing, 2004, 15, No. 1: p. 103-115.
72. Silva, E.L., et al., Transmission constrained maintenance scheduling of generating units. IEEE Transactions on Power Systems, May 1995, 10, No. 4: p. 695-701.
73. Singer, M., Forecasting policies for scheduling a stochastic due date job shop. International Journal of Production Research, 2000, 38, No. 15: p. 3623-3637.
74. Subramaniam, V., G.K. Lee, T. Ramesh, G.S. Hong, and Y. S. Wong, Machine selection rules

in a dynamic job shop. International Journal of Advanced Manufacturing Technology, 2000, 16: p. 902-908.


75. Talwar, T.T., A note on sequencing problems with uncertain job times. Journal of Operations Research Society of Japan, 1967, 9: p. 93-97.

76. Tanev, I.T., T. Uozumi, and Y. Morotome, Hybrid evolutionary algorithm-based real-world

flexible job shop scheduling problem: Application service provider approach. Applied soft computing, 2004, 5, No.1: p. 87-100.

77. Tavakkoli-Moghaddem, R., F. Jolai, F. Vaziri, P.K. Ahmed, and A. Azaron, A hybrid method

for solving stochastic job shop scheduling problems. Applied mathematics and computation, 2005, 170: p. 185-206.

78. van Slyke, R. and R.J.-B. Wets, L-shaped linear programs with applications to optimal control

and stochastic programming. SIAM Journal on Applied Mathematics, 1969, 17: p. 638-663.
79. Wang, L. and L.L. Hee, A simulation study on release, synchronization and dispatching in

MEMS fabrication. Proceedings of the 2002 Winter Simulation Conference, 2002: p. 1392-1400.

80. Wang, L., L. Zhang, and D.Z. Zheng, A class of hypothesis-test based genetic algorithm for

flow shop scheduling with stochastic processing times. International Journal of Advanced Manufacturing Technology, 2005, 25, No.1-2: p. 1157-1163.

81. Weiss, G., Multiserver stochastic scheduling. Deterministic and stochastic scheduling, 1982: p.

157-179.
82. Wu, Z. and M.X. Weng, Multiagent scheduling method with earliness and tardiness objectives

in flexible job shops. IEEE Transactions on Systems, Man and Cybernetics - Part B: Cybernetics, 2005. 35, No.2: p. 293-301.

83. Xia, W.-J, and Z.-M. Wu, Hybrid particle swarm optimization approach for multi-objective

flexible job-shop scheduling problems. Control and Decision, 2005, 20, No. 2: p. 137-141.
84. Yoshitomi, Y., A genetic algorithm approach to solving stochastic job-shop scheduling

problems. International Transactions in Operational Research, 2002, 9: p. 479-495.