Genetic Algorithm and Particle Swarm Optimization Techniques for Solving Multi-Objectives on Single Machine Scheduling Problem

In this paper, two local search algorithms, the genetic algorithm (GA) and particle swarm optimization (PSO), are used to schedule a number of products (n jobs) on a single machine so as to minimize a multi-objective function: the sum of the total completion time, total tardiness, total earliness, and total late work, ∑(C_j + T_j + E_j + V_j). A branch and bound (BAB) method is used to compare the results for n jobs ranging from 5 to 18. The results show that the two algorithms find optimal and near-optimal solutions in an appropriate time.

neighborhood functions with conditions of reordering items locally to obtain a neighboring solution [1]. Since the introduction of local search techniques into combinatorial optimization, specialists have used them to solve NP-hard problems; in scheduling, they allow us to test problems with a large number of jobs. Minimizing the total late work on a single machine was treated by Chin-Chia Wu [2], who proposed three genetic algorithms and combined them into a fourth; his computational results showed that the three GA variants reach a stable state as n becomes larger. Tariq and Doha [3] solved a related sum-objective problem using simulated annealing (SA) and the descent method (DM) for instances of 75 up to 30000 jobs, showing that SA gives reasonable results for small n and that the running times of DM and SA are comparable. Tariq and Faez [4] proposed PSO and GA as heuristic methods for finding approximate solutions to a multi-objective sum problem, and found that these local search algorithms solve the problem in reasonable time. The PSO algorithm was applied by Hanan [5], who proposed a new style of development steps to achieve good convergence in application; her comparison between PSO and GA showed that the results of PSO are better than those of GA. Tariq and Hafed used three local search techniques, the descent method (DM), simulated annealing (SA), and tabu search (TS), to solve a related multi-objective sum problem; they evaluated the algorithms on a large set of test problems and showed that SA and TS are better than DM, with preference to SA, and that all three algorithms find optimal or near-optimal solutions in reasonable time [6].

2. Problem Representation
A multi-objective problem is considered, with the following formal description: n jobs are to be scheduled on a single machine that is always available, where the machine can process only one job at a time and no job may be executed simultaneously with another.

For each job j (j = 1, …, n), let p_j and d_j denote the processing time and the due date of job j, respectively. A schedule defines a completion time C_j for every job j, together with its tardiness T_j = max{C_j − d_j, 0} and earliness E_j = max{d_j − C_j, 0}. The late work V_j of job j is the amount of its processing time that is performed after the due date d_j, where V_j = 0 if C_j ≤ d_j, V_j = C_j − d_j if d_j < C_j < d_j + p_j, and V_j = p_j if C_j ≥ d_j + p_j [7]. Every job is ready to be processed at time zero, no preemption is allowed, and our objective is to find a feasible schedule that minimizes the multi-objective function ∑(C_j + T_j + E_j + V_j).
Using the standard scheduling problem classification notation, our problem is denoted by 1 // ∑(C_j + T_j + E_j + V_j) and formulated as:

min ∑_{j=1}^{n} (C_j + T_j + E_j + V_j) over all feasible sequences … (A)
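As an illustration, the multi-objective cost of a given job sequence can be evaluated directly from the definitions above. The following is a minimal sketch; the function and variable names are our own, not the paper's.

```python
def schedule_cost(sequence, p, d):
    """Sum of completion time, tardiness, earliness and late work
    over a job sequence on a single machine (jobs ready at time 0)."""
    t = 0        # completion time of the last scheduled job
    total = 0
    for j in sequence:
        t += p[j]                      # C_j: completion time
        tard = max(t - d[j], 0)        # T_j: tardiness
        early = max(d[j] - t, 0)       # E_j: earliness
        late_work = min(tard, p[j])    # V_j: processing done after d_j
        total += t + tard + early + late_work
    return total
```

For example, with processing times p = [2, 3, 1] and due dates d = [3, 4, 2], `schedule_cost([2, 0, 1], p, d)` evaluates the cost of processing job 2 first, then jobs 0 and 1.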

Heuristics for the Problem
In this section, we describe the heuristics used for problem (A). Two simple, well-known heuristics serve as initializations for exploring feasible solutions in the search space.

First Heuristic (H1):
This heuristic is obtained by applying the shortest processing time (SPT) rule, i.e. sorting the jobs in non-decreasing order of their processing times p_j.

Second Heuristic (H2):
This heuristic is obtained by applying the earliest due date (EDD) rule, i.e. sorting the jobs in non-decreasing order of their due dates d_j.
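The two seeding heuristics amount to single sorts; a minimal sketch (function names are ours):

```python
def spt_order(p):
    """H1: shortest processing time first (non-decreasing p_j)."""
    return sorted(range(len(p)), key=lambda j: p[j])

def edd_order(d):
    """H2: earliest due date first (non-decreasing d_j)."""
    return sorted(range(len(d)), key=lambda j: d[j])
```

Both run in O(n log n) time and return a job sequence usable as an initial upper bound or as a seed chromosome.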

(BAB) Method
The (BAB) method is based on a systematic enumeration of the search space, pruned by bounds. It consists of two procedures: branching and bounding. The branching procedure divides a large problem into two or more sub-problems, while the bounding procedure calculates a lower bound on the optimal solution's value for every sub-problem [8].
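The branch-and-bound scheme can be sketched as follows. This is not the authors' exact procedure: for illustration we use a trivial lower bound (the cost of the already-sequenced prefix, which is valid because every cost term is non-negative), and the names are our own.

```python
def cost(seq, p, d):
    """Multi-objective cost sum(C_j + T_j + E_j + V_j) of a (partial) sequence."""
    t = total = 0
    for j in seq:
        t += p[j]
        tard = max(t - d[j], 0)
        total += t + tard + max(d[j] - t, 0) + min(tard, p[j])
    return total

def prefix_lb(prefix, p, d):
    # Trivial lower bound: all terms are non-negative, so the cost of the
    # scheduled prefix never decreases as more jobs are appended.
    return cost(prefix, p, d)

def bab(p, d):
    n = len(p)
    best = [cost(list(range(n)), p, d), list(range(n))]  # initial upper bound
    def branch(prefix, remaining):
        if not remaining:
            c = cost(prefix, p, d)
            if c < best[0]:
                best[0], best[1] = c, prefix[:]
            return
        for j in sorted(remaining):          # branching: fix the next job
            if prefix_lb(prefix + [j], p, d) < best[0]:  # bounding: prune
                branch(prefix + [j], remaining - {j})
    branch([], set(range(n)))
    return best[0], best[1]
```

A sharper lower bound (such as the decomposition bound of Section 4) prunes far more nodes; the skeleton stays the same.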

Upper Bounds:
To initialize the search in the BAB tree, the two heuristics from (3.1) and (3.2) are used as the upper bounds of our problem in this paper.

Lower Bound:
To derive a lower bound, the problem is decomposed into two sub-problems. For the first sub-problem, the lower bound applied by Hussam Abid Ali [9] is used to obtain the first lower bound (LB1). For the second sub-problem, the bound of Theorem (4.1) is used to obtain the second lower bound (LB2). The following lemma then allows us to use the sum of the two bounds as a lower bound for the main problem.

Lemma 1 [10]: If LB1 and LB2 are lower bounds of the two sub-problems respectively, then LB = LB1 + LB2 is a lower bound of the main problem (A).

(LB) procedure
For N = S ∪ S̄, where N represents the set of all jobs, S the set of scheduled jobs, and S̄ the set of unscheduled jobs, the procedure is:
1. Start with an empty set of scheduled jobs (i.e. S = ∅) and add the jobs to S one by one; the sequenced part is solved by the complete enumeration method (CEM). At every step we calculate the cost ∑(C_j + T_j + E_j + V_j) over the jobs in S.
2. For the set S̄, the jobs are sorted by two rules to calculate the costs of the two sub-problems, as follows:
Step (1): Sort the jobs in S̄ by the SPT rule and calculate the corresponding cost using equation (1).
Step (2): Sort the jobs in S̄ by the EDD rule and calculate the cost of the second sub-problem.
Step (3): Calculate the total cost as the cost of S plus the two bounds from Steps (1) and (2).

Local Search Methods
When using local search methods there is no guarantee of optimality, but they may give solutions that are near optimal. Local search methods are therefore considered a second choice for solving NP-hard problems. In this paper, two such methods are applied: the genetic algorithm (GA) and particle swarm optimization (PSO).

Genetic Algorithm (GA):
The genetic algorithm (GA) is an evolutionary search technique used in scheduling to obtain near-optimal solutions for complex problems [11]. It begins with a randomly generated population of chromosomes (feasible solutions) and iteratively replaces this population with a new one. The GA requires a good representation of the problem and a fitness function that measures chromosome quality. Regeneration depends on selecting ancestors (parents) and recombining them by crossover to produce successors (children), then applying mutation to change them locally in order to obtain better results [12].
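The loop described above can be sketched generically as follows. This is a minimal skeleton under our own assumptions (the mutation rate, generation count, and operator signatures are illustrative, not the paper's settings); the concrete operators are discussed in the subsections below.

```python
import random

def genetic_algorithm(init_pop, fitness, select, crossover, mutate,
                      generations=50, mutation_rate=0.1):
    """Generic GA loop for a minimisation problem over job sequences.
    `select`, `crossover` and `mutate` are supplied operator callables."""
    pop = list(init_pop)
    best = min(pop, key=fitness)               # track the best chromosome seen
    for _ in range(generations):
        new_pop = []
        while len(new_pop) < len(pop):
            parent1 = select(pop, fitness)
            parent2 = select(pop, fitness)
            child = crossover(parent1, parent2)
            if random.random() < mutation_rate:  # occasional local change
                child = mutate(child)
            new_pop.append(child)
        pop = new_pop                          # replace the population
        candidate = min(pop, key=fitness)
        if fitness(candidate) < fitness(best):
            best = candidate
    return best
```

Because `best` is only replaced by a strictly better chromosome, the returned solution is never worse than the best member of the initial population.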

(GA) Operators
The (GA) has a number of operators: 1. Representation: In this paper, a chromosome is represented by a sequence of jobs, where every gene refers to a job [13].

Initialization:
The initial population can be obtained either by introducing heuristics or by random arrangement [13]. In this paper we choose the first way (introducing heuristics) and take 50 chromosomes as the size of our initial population, built as follows: choose five parent chromosomes, three of them selected randomly and the remaining two obtained by applying the earliest due date (EDD) and shortest processing time (SPT) rules (i.e. seeding good parents); these parents are then mated to produce the initial population of 50 chromosomes.

Selection:
We use the roulette wheel selection method, where a chromosome with a lower fitness value (since we are minimizing) has a higher probability of contributing one or more children to the new generation.
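A minimal sketch of roulette-wheel selection for a minimisation problem follows. Turning low fitness into a large wheel slice requires an inversion of the fitness; the weight 1/(1 + f) used here is our own assumption, not the paper's formula.

```python
import random

def roulette_select(pop, fitness):
    """Pick one chromosome; lower fitness -> larger slice of the wheel."""
    weights = [1.0 / (1.0 + fitness(c)) for c in pop]   # assumes fitness >= 0
    r = random.uniform(0, sum(weights))
    acc = 0.0
    for chrom, w in zip(pop, weights):
        acc += w
        if acc >= r:
            return chrom
    return pop[-1]   # guard against floating-point rounding
```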

Fitness function:
The fitness function assigns a value reflecting the quality (goodness or badness) of a chromosome [14]. Here, the considered function is ∑(C_j + T_j + E_j + V_j). 5. Crossover: The 1-point legitimate crossover (LEGX) [15] is used. The cut probability is 0.2, and for large instances the cut is made every 1000 jobs. In parent 1 the cut is taken at the end, while in parent 2 it is taken at the beginning. The new population is generated by mating each chromosome from step 2 with all five parent chromosomes; every parent chromosome produces 10 children, so the resulting new population consists of 50 new chromosomes.
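One plausible reading of a 1-point crossover that keeps the child a legitimate job sequence (no job duplicated or lost) is sketched below; the exact LEGX mechanics of [15] may differ.

```python
def one_point_crossover(parent1, parent2, cut):
    """Keep parent1's genes up to the cut point, then fill the remaining
    positions with parent2's genes in their order, skipping duplicates,
    so the child remains a valid permutation of the jobs."""
    head = parent1[:cut]
    tail = [g for g in parent2 if g not in head]
    return head + tail
```

For example, crossing [0, 1, 2, 3] with [3, 2, 1, 0] at cut point 2 keeps the head [0, 1] and appends parent 2's remaining genes in their order.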

Mutation:
To obtain better results, the mutation operator perturbs the sequence in an attempt to find an improvement [16]. For our problem, a random swap between two jobs is applied.
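The random-swap mutation can be sketched as follows (a minimal version; the function name is ours):

```python
import random

def swap_mutation(chrom):
    """Swap the jobs at two distinct random positions in the sequence."""
    i, j = random.sample(range(len(chrom)), 2)   # two distinct indices
    child = chrom[:]                             # leave the parent intact
    child[i], child[j] = child[j], child[i]
    return child
```

Since only two positions are exchanged, the child is always a valid permutation of the same jobs.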

Termination:
The stopping criterion is 600 seconds.

The Problems Instances:
The performance of the (BAB) procedure is compared on 5 problem instances for each size n ∈ [5, 18]. The problem instances were generated randomly: for each job j, the processing time p_j was uniformly generated in [1, 10], while the due date d_j was uniformly generated in the interval [P(1 − TF − RDD/2), P(1 − TF + RDD/2)], as has been done in the literature [20], where P = ∑ p_j and the two parameters TF and RDD are the tardiness factor and the relative range of due dates, respectively.
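This generation scheme can be sketched as follows. The interval formula follows the standard TF/RDD convention from the scheduling literature; the function name and the clamping of the lower endpoint at zero are our own assumptions.

```python
import random

def generate_instance(n, tf, rdd, seed=None):
    """Random instance: integer p_j ~ U[1, 10]; d_j drawn uniformly from
    [P(1 - TF - RDD/2), P(1 - TF + RDD/2)] with P = sum of processing times."""
    rng = random.Random(seed)
    p = [rng.randint(1, 10) for _ in range(n)]
    total_p = sum(p)
    low = max(0, round(total_p * (1 - tf - rdd / 2)))   # clamp at 0 (assumption)
    high = round(total_p * (1 - tf + rdd / 2))
    d = [rng.randint(low, high) for _ in range(n)]
    return p, d
```

Larger TF pushes due dates earlier (more tardy jobs), while larger RDD widens their spread.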

Computational Results
In this subsection, the computational results are given in tables. Table 1 presents the comparison among BAB, GA and PSO for n = [5, 18]. Table 2 reports the number of (*) and the number of (#). Note: the symbol (*) refers to the minimum value average, and the symbol (#) refers to the minimum time average for each n.

The Tables of Results
In Table 1, the results of applying BAB, GA and PSO are shown for n = [5, 18] jobs; for each n, 5 different examples are tested. These results show that the value averages of BAB and GA are equal for n = {5, 6, 7, 8, 9, 10, 11, 12, 15}, while the averages of PSO are larger by small differences. The execution time averages show the priority of GA among them (i.e. GA is the fastest). In Table 2, for each n there are 5 problem examples for testing; the table covers n = 50(50)100, 100(100)1000, 1000(1000)5000, 5000(5000)20000. The results show that the value averages of GA are better than those of PSO except when n = 600 jobs, while the execution time averages show that PSO is faster than GA in all tested problems.

Conclusions
In solving problem (A), two local search methods (GA and PSO) were used. The results showed that the two methods are efficient in finding optimal or near-optimal solutions compared with the BAB method: GA reached the optimal solutions as often as BAB, while PSO showed an advantage in execution time, being faster than GA in solving the problems.
Table 2 contains the value and time averages for n = [50, 20000]; for each n, 5 problem examples are tested. The symbols used in the tables are: n: the number of jobs; BAB: the branch and bound method; GA: the genetic algorithm; PSO: the particle swarm optimization; the average of the value; the average execution time of the problem (in seconds); Best: the best (value & time) average; No.: the number of (*) and (#).

Table 1: Comparison among BAB, GA and PSO.

Table 2: The averages of the values and times of GA and PSO.