Solution Concept in TSP Problem Applied to Genetic Algorithm
J. Vasavi
Assistant Professor, Department of Computer Applications, SRM University, Chennai,
Tamilnadu, India.
*Corresponding Author E-mail: vasnathan2008@gmail.com
ABSTRACT:
The main purpose of this study is to propose a new chromosome representation based on a binary matrix, together with new fittest criteria, as a method for finding an optimal solution to the TSP. The proposed method builds on the genetic algorithm from artificial intelligence, used here as a search algorithm to find near-optimal solutions. We introduce new fittest criteria for crossover, apply the algorithm to both symmetric and asymmetric TSP, and present the asymmetric problem in a new and different way.
KEYWORDS: Genetic algorithm, fittest criteria, asymmetric travelling salesman problem.
INTRODUCTION:
Within artificial intelligence, the genetic algorithm is an optimization technique based on natural evolution, that is, change over a long period of time. Genetic algorithms (GAs) have been used as a search technique for many NP-hard problems. They have been applied successfully to many different types of problems, though several factors limit the success of a GA on a specific function. Problems that require good, but not necessarily optimal, solutions are well suited to GAs. The manner in which points in the search space are represented is an important consideration, an acceptable performance measure or fitness value must be available, and it must be feasible to test many potential solutions. In nature the fittest individuals are the most likely to survive and reproduce, so the next generation should be fitter and healthier because it was bred from healthy parents. The same idea can be applied to the TSP: first find different solutions, then combine the fittest among them to create a new, healthy solution that should be optimal or near-optimal for the problem.
The TSP is a well-known problem of finding an optimal tour and can be solved by various methods. Many algorithms have been developed for the TSP; here we use the concept of the genetic algorithm. Other approximation techniques for finding near-optimal TSP solutions based on heuristics have been proposed in the literature, such as genetic local search [1], simulated annealing [2], ant colonies [3], and genetic algorithms (GAs) [4], [5]. John Holland's pioneering book Adaptation in Natural and Artificial Systems (1975, 1992) showed how the evolutionary process can be applied to solve a wide variety of problems using a highly parallel technique that is now called the genetic algorithm. Genetic algorithms have been applied to a large number of real-world problems. One of the first such applications was a gas pipeline control system [6]; [7] mentions several GA applications including message routing, scheduling, and automated design. Entire conferences have been devoted to applications of genetic algorithms and evolutionary techniques to specific disciplines, such as image analysis, signal processing, and telecommunications [8]. The genetic algorithm proposed in this paper builds on much work done by previous researchers [4], but introduces additional improvements: it handles symmetric as well as asymmetric TSP [11], implements the new fittest criteria and a new representation of the asymmetric matrix, and improves the solution by applying crossover and mutation repeatedly in order to reach the optimal solution.
II. GENETIC ALGORITHMS:
The Genetic Algorithm involves the following basic steps.
• Evaluation
• Cross over
• Mutation
Here we describe the main functions used by the algorithm to find the shortest closed path through a group of two-dimensional positions. The genetic algorithm transforms a population of individual objects, each with an associated fitness value, into a new generation of the population using the Darwinian principle of reproduction and survival of the fittest and naturally occurring genetic operations such as crossover (recombination) and mutation [12]. Each individual in the population represents a possible solution to a given problem. The genetic algorithm attempts to find a very good or best solution to the problem by genetically breeding the population of individuals.
A. Evaluation:
The evaluation function plays a very important role in a genetic algorithm. Here we use a fitness function to decide how good a chromosome is. It also helps in selecting better chromosomes for the subsequent crossover step when the initial population is relatively large.
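As an illustration only (not the authors' exact implementation), a minimal evaluation routine for a TSP chromosome encoded as a permutation of city indices could compute the total tour length and use its inverse as the fitness, so that shorter tours receive higher fitness:

# Hypothetical sketch: fitness of a TSP tour encoded as a permutation of city indices.
# 'distance' is an n x n matrix of inter-city distances (assumed given).
def tour_length(tour, distance):
    # Sum the distances along the closed path, returning to the start city.
    return sum(distance[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def fitness(tour, distance):
    # Shorter tours get higher fitness; the inverse of the length is one common choice.
    return 1.0 / tour_length(tour, distance)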
B. Cross Over:
After the selection process is complete, the chromosomes chosen to be parents for the next generation are recombined to form children, which are new chromosomes [13]. This combination can take many forms, and the crossover operator can greatly affect the result of the search.
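As a hedged illustration of one common form of recombination (the paper's own fittest-criteria crossover on a binary-matrix encoding is not reproduced here), a single-point crossover on binary chromosomes can be sketched as:

import random

# Illustrative single-point crossover on binary chromosomes (lists of 0/1).
def single_point_crossover(parent1, parent2):
    point = random.randint(1, len(parent1) - 1)   # cut point, never at the ends
    child1 = parent1[:point] + parent2[point:]
    child2 = parent2[:point] + parent1[point:]
    return child1, child2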
C. Mutation:
The mutation operation allows new individuals to be created. It begins by selecting an individual from the population based on its fitness. A point along the string is selected at random and the character at that point is randomly changed; the altered individual is then copied into the next generation of the population.
Mutation is performed after crossover by randomly choosing a chromosome in the new generation to mutate. We randomly choose a point to mutate and switch that point. Many types of mutation operators exist. Here we use the bit-flip method, which is used only with a binary chromosome representation and changes a particular point on the chromosome to its opposite.
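A minimal sketch of the bit-flip mutation described above, for a binary chromosome, could look like the following:

import random

# Bit-flip mutation as described above: pick one random position in a binary
# chromosome and switch it to its opposite value.
def bit_flip_mutation(chromosome):
    point = random.randrange(len(chromosome))
    mutated = list(chromosome)
    mutated[point] = 1 - mutated[point]
    return mutated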
Outline of Basic Genetic Algorithm:
Generate random population of n genes (suitable solutions for the problem)
Evaluate the fitness f(x) of each gene x in the population
Create a new population by repeating the following steps until the new population is complete:
1. Selection: Select two parent genes from the population according to their fitness.
2. Crossover: With a crossover probability, cross over the parents to form new offspring (children). If no crossover is performed, the offspring are exact copies of the parents.
3. Mutation: With a given mutation probability, mutate the new offspring.
4. Accepting: Place the new offspring in the new population.
Use the newly generated population for a further run of the algorithm [16]. If the end condition is satisfied, stop and return the best solution in the current population; otherwise return to the evaluation step. A code sketch of this loop is given below.
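Under the assumption of a binary chromosome encoding, the outline can be assembled into the following minimal loop. It is illustrative only: the population size, probabilities, and fitness function are placeholders, and it reuses the single_point_crossover and bit_flip_mutation sketches given earlier.

import random

# Relies on single_point_crossover and bit_flip_mutation from the sketches above.
def genetic_algorithm(fitness, chrom_len, pop_size=500, p_cross=0.1,
                      p_mut=0.005, generations=100):
    # Generate a random population of binary chromosomes.
    population = [[random.randint(0, 1) for _ in range(chrom_len)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        # Evaluate the fitness of each chromosome once per generation.
        weights = [fitness(c) for c in population]
        new_population = []
        while len(new_population) < pop_size:
            # Selection: fitness-proportionate choice of two parents.
            parents = random.choices(population, weights=weights, k=2)
            # Crossover with probability p_cross; otherwise copy the parents.
            if random.random() < p_cross:
                offspring = single_point_crossover(parents[0], parents[1])
            else:
                offspring = (parents[0][:], parents[1][:])
            # Mutation with probability p_mut, then accept into the new population.
            for child in offspring:
                if random.random() < p_mut:
                    child = bit_flip_mutation(child)
                new_population.append(child)
        population = new_population[:pop_size]
    # Return the best chromosome in the final population.
    return max(population, key=fitness)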
A typical flowchart of a genetic algorithm is shown in Figure 1. One iteration of the algorithm is referred to as a generation.
Figure 1. Flow chart of Genetic Algorithm
III. GENETIC OPERATOR RESULT EXTRACTION:
Although existing GAs were not specifically designed for itinerary planning but rather as global search algorithms, they offer a set of advantages for machine learning. Several methodologies for machine learning are based on searching for a good model inside the space of possible models. In this sense GAs are very flexible, because the same genetic algorithm can be used with different representations. Combinatorial versions of genetic learning processes conceal different levels of difficulty according to the structural changes produced by the TS algorithm [18], from the simplest case of parameter optimization to the highest level of complexity of learning the rule set of a rule-based system via the COPE approach and the cooperation or competition between genes.
IV. GENETIC ALGORITHMS FOR SOLVING OPERATORS' OPTIMAL ITINERARIES:
Mining optimal itineraries is not an easy task when there is a large set of POIs; it poses challenges in algorithmic complexity. The number of candidate combinations grows exponentially with the number of POIs, but this complexity can be tackled with recent algorithms that efficiently prune the search space. A second problem is extracting results from rules, i.e., picking optimal results from the set of outputs.
In general, the main motivation for using GAs in itinerary selection is that they perform a global search and cope better with POI interaction than the greedy rule-induction algorithms often used in data mining. The use of COPE in itinerary planning helps to predict an optimal package based on the given POIs. This section discusses several aspects of GAs for itinerary planning; the main areas of discussion include the individual representation of POIs, the genetic operators involved, and the choice of fitness function.
In COPE_GA, an initial population consisting of a set of solutions is chosen and the solutions are evaluated. Relatively more effective solutions are selected to have more offspring which are, in some way, related to the original solutions. If the genetic operators are chosen well, the final population will contain better solutions, since the genetic operator selects optimal genes over multiple iterations. A GA improves the whole population, whereas TS aims at producing one best solution; for the TS we therefore require several good initial solutions.
The COPE_GA:
1. Choose initial population I from TS
2. Evaluate the fitness of each individual in the population using the fitness function defined below.
3. Repeat until termination: (time limit or sufficient fitness achieved)
a. Select the top-ranking individuals to reproduce
b. Breed new generation through crossover and/or mutation (genetic operations) and give origin to offspring
c. Evaluate the individual fitnesses of the offspring
d. Replace the worst-ranked part of the population with the offspring (see the sketch after this outline)
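A hedged sketch of one COPE_GA generation follows. The function names, the number of parents, and the crossover/mutation callables are illustrative assumptions, not the authors' implementation.

# Illustrative single generation of the COPE_GA loop (steps a-d above).
def cope_ga_generation(population, fitness, crossover, mutate, n_parents=10):
    # a. Select the top-ranking individuals to reproduce.
    ranked = sorted(population, key=fitness, reverse=True)
    parents = ranked[:n_parents]
    # b. Breed a new generation through crossover and mutation.
    offspring = []
    for i in range(0, len(parents) - 1, 2):
        for child in crossover(parents[i], parents[i + 1]):
            offspring.append(mutate(child))
    # c./d. Offspring are evaluated implicitly by the fitness ranking of the next call;
    # the worst-ranked part of the population is replaced by the offspring.
    survivors = ranked[:len(population) - len(offspring)]
    return survivors + offspring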
The Algorithm can be depicted as:
1. C_TS generates n initial solutions using C_GA. It runs C_GA for a fixed number of iterations, t.
a. Choose initial population of fixed size and set j=1
b. While (j<=t)
Begin
i. Apply the operator to two parent schedules chosen at random to produce two offspring, and replace the parents with the best two of the four schedules (see the code sketch after this algorithm).
ii. j=j+1
End
2. C_TS sends the m best chosen solutions to the m POIs.
3. Each worker node runs the C_TS algorithm using the initial state it received.
4. Upon receiving a converged result from the C_TS, the system stops execution.
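Step 1.b.i above can be read as a small tournament: two randomly chosen parent schedules produce two offspring, and the best two of the four replace the parents. A minimal sketch, with the crossover and fitness functions as assumed callables, could be:

import random

# Illustrative version of step 1.b.i: replace two random parents with the
# best two of {parents, offspring}.
def replace_parents_with_best(population, fitness, crossover):
    # Choose two parent schedules at random.
    i, j = random.sample(range(len(population)), 2)
    parent_a, parent_b = population[i], population[j]
    # Produce two offspring and keep the best two of the four schedules.
    child_a, child_b = crossover(parent_a, parent_b)
    best_two = sorted([parent_a, parent_b, child_a, child_b],
                      key=fitness, reverse=True)[:2]
    population[i], population[j] = best_two
    return population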
Fitness function:
Let a rule be of the form:
IF A THEN C,
where A is the antecedent (a set of POIs) and C is the consequent (the predicted package). The predictive performance of the rule can be summarized by the confusion matrix shown in Table 1.
Table 1. Predictive performance of the result (confusion matrix for a classification rule)

                            Actual class
                            C          Not C
Predicted class   C         TP         FP
                  Not C     FN         TN
The labels in each cell of the table have the following meaning:
TP = True Positives = Number of results satisfying A and C
FP = False Positives = Number of results satisfying A but not C
FN = False Negatives = Number of results not satisfying A but satisfying C
TN = True Negatives = Number of results satisfying neither A nor C
Clearly, the higher the values of TP and TN, and the lower the values of FP and FN, the better the rule.
Belief Factor: BF = TP / (TP + FP)    (1)
We now measure the predictive accuracy of a package by taking into account not only its BF but also a measure of how "complete" the rule is, i.e., the proportion of results having the predicted class C that are actually covered by the rule antecedent.
The result measure, denoted Res, is computed by the formula:
Res = TP / (TP + FN)    (2)
In order to combine the BF and Res measures one can define a fitness function such as:
Fitness = BF × Res    (3)
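Equations (1)-(3) translate directly into code. A minimal sketch, assuming the confusion-matrix counts are already available (TN is not needed by these formulas), is:

def rule_fitness(tp, fp, fn):
    # Eq. (1): Belief Factor -- proportion of predictions of C that are correct.
    bf = tp / (tp + fp) if (tp + fp) > 0 else 0.0
    # Eq. (2): Result measure -- proportion of actual C cases covered by the rule.
    res = tp / (tp + fn) if (tp + fn) > 0 else 0.0
    # Eq. (3): combined fitness.
    return bf * res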
V. RESULTS:
The Intel Lab sensor dataset is used to evaluate the proposed GA. The genetic parameters are given in Table 2.
Table 2. Genetic Algorithm Parameters

GA population size        500
Selection rule            From COPE_TS
Crossover probability     0.1
Mutation probability      0.005
Fitness function          As defined above
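For illustration, the Table 2 settings could be expressed as a simple configuration dictionary passed to a GA run; the key names below are placeholders, not the authors' implementation.

# Illustrative configuration mirroring Table 2 (names are placeholders).
ga_params = {
    "population_size": 500,
    "selection_rule": "COPE_TS",
    "crossover_probability": 0.1,
    "mutation_probability": 0.005,
    "fitness_function": "BF * Res (Eq. 3)",
}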
Figure 2 shows the relation between the number of packages obtained and the size of the POI (Point of Interest) set. The maximum number of results is obtained when the POI size is 2, and the number decreases thereafter as the number of POIs increases. The figure presents the analysis of the results for the packages obtained, showing how the number of results for itinerary selection changes as the number of POIs increases.
Figure 2. Number of POIs vs. size of result (number of packages)
An Intel i3 2.2 GHz processor with 2 GB RAM was used to measure execution time and memory usage.
VI. CONCLUSION:
In this paper we have presented a very effective procedure for the TSP using the genetic algorithm from artificial intelligence. We also provide the fittest criteria; the proposed algorithm handles symmetric as well as asymmetric TSP, with a different representation for the asymmetric TSP that makes the problem easier to solve. Furthermore, whereas the earlier algorithm is for the symmetric TSP and the offspring are produced by an OR operation, in the proposed algorithm we introduce the fittest criteria and apply them to both the symmetric and the asymmetric TSP.
VII. REFERENCES:
1. B. Freisleben and P. Merz, New Genetic Local Search Operators for the Travelling Salesman Problem, Conference on Parallel Problem Solving From Nature, pp. 890–899, 1996.
2. P. van Laarhoven and E. H. L. Aarts, Simulated Annealing: Theory and Applications, Kluwer Academic, 1987.
3. M. Dorigo and L. M. Gambardella, Ant Colony System: A Cooperative Learning Approach to the Travelling Salesman Problem, IEEE Transactions on Evolutionary Computation, Vol. 1, pp. 53–66, 1997.
4. D. E. Goldberg, Genetic Algorithms in Search, Optimization and Machine Learning, Addison-Wesley, 1989.
5. Naef Taher Al Rahedi and Jalal Atoum, Solving TSP Problem Using New Operator in Genetic Algorithms, American Journal of Applied Sciences, 6(8): 1586–1590, 2009.
6. Goldberg, D., Computer-Aided Pipeline Operations Using Genetic Algorithms and Rule Learning. Part I: Genetic Algorithms in Pipeline Optimization, Engineering with Computers, 3, 1987.
7. Grefenstette, J., Genetic Algorithms Made Easy, 1991.
8. Poli, R., et al. (Eds), Evolutionary Image Analysis, Signal Processing and Telecommunications, Springer, 1999.
9. Tsai, H. K., Yang, J. M., Tsai, Y. F., and Kao, C. Y. (2004). An evolutionary algorithm for large traveling salesman problems. Systems, Man, and Cybernetics, Part B: Cybernetics, IEEE Transactions on, 34(4), 1718-1729
10. G. Clarke and J.W. Wright, “Scheduling of vehicles from a central depot to a number of delivery points,” Oper. Res., vol. 12, pp. 568–581, 1964.
11. D. Whitley, T. Starkweather, and D. Fuquay, “Scheduling problems and traveling salesman: The genetic edge recombination operator,” in Proc. 3rd Int. Conf. Genetic Algorithms, 1989, pp. 133–140.
12. S. Kirkpatrick, C. D. Gelatt Jr, and M. P. Vecchi, “Optimization by simulated annealing,” Science, vol. 220, pp. 498–516, 1983.
13. F. Alizadeh, R. M. Karp, L. A. Newberg, and D. K. Weisser, “Physical mapping of chromosomes: A combinatorial problem in molecular biology,” in Proc. 4th ACM-SIAM Symp. Discrete Algorithms (SODA), 1993, pp. 52–76.
14. C. Korostensky and G. H. Gonnet, “Using traveling salesman problem algorithms for evolutionary tree construction,” Bioinformatics, vol. 16, no. 7, pp. 619–627, 2000.
15. H. J. Bremermann, M. Rogson, and S. Salaff, “Search by evolution,” in Biophysics and Cybernetic Systems. Washington, DC: Spartan, 1965, pp. 157–167.
16. M. Oliver, D. J. Smith, and J. R. C. Holland, “A study of permutation crossovers on the TSP,” in Proc. 2nd Int. Conf. Genetic Algorithm Their Applications, 1987, pp. 224–230.
17. Handbook of Genetic Algorithms, L. Davis, Ed., Van Nostrand, New York, 1991, pp. 332–349.
18. H. Mühlenbein, M. Gorges-Schleuter, and O. Krämer, “Evolution algorithms in combinatorial optimization,” Parallel Comput., vol. 7, pp. 65–85, 1988.
19. J. J. Grefenstette, “Incorporating problem specific knowledge into genetic algorithms,” Genetic Algorithms Simulated Annealing, pp. 42–60, 1987.
20. B. Freisleben and P. Merz, “New genetic local search operators for the traveling salesman problem,” in Proc. Parallel Problem Solving Nature IV (PPSN IV), 1996, pp. 890–899.
Received on 12.08.2017 Modified on 24.08.2017
Accepted on 13.09.2017 © RJPT All rights reserved
Research J. Pharm. and Tech. 2018; 11(1): 255-258.
DOI: 10.5958/0974-360X.2018.00048.3