Simulated Annealing (Meta-heuristics).pptx

Uploaded by afar1111, Jul 02, 2024 · 22 slides

Slide Content

Simulated Annealing
Dr. Ahmed Fouad Ali, Faculty of Computers and Informatics, Suez Canal University

Meta-heuristics techniques

Outline
1. What is annealing?
2. Motivation
3. Simulated annealing (SA) (background)
4. SA (main concepts)
5. SA algorithm
6. SA control parameters
7. SA example
8. SA applications

What is annealing? Annealing is a physical process used to harden metals, glass, and similar materials. It starts at a high temperature and cools slowly: the temperature is lowered gradually so that, at each temperature, the atoms can move just enough to begin adopting their most stable orientation.

What is annealing? (Cont.) The temperature determines how much mobility the atoms have, so how slowly the glass is cooled is critical. If the glass is cooled slowly enough, the atoms are able to "relax" into their most stable orientation. A familiar example is the glass surrounding a hockey rink.

Motivation: [Figure: a cost landscape showing a starting point and a descent direction; local search descends into a local minimum and cannot reach the global minimum because of a barrier between them.]

Simulated annealing (SA) (background) The method was independently described by Scott Kirkpatrick, C. Daniel Gelatt, and Mario P. Vecchi in 1983, and by Vlado Černý in 1985. SA is probably the most widely used meta-heuristic for combinatorial optimization problems. It was motivated by the analogy between the physical annealing of metals and the process of searching for the optimal solution of a combinatorial optimization problem. The main objective of SA is to escape from local optima and thereby delay convergence.

Simulated annealing (main concepts) SA proceeds over several iterations from an initial solution x. At each iteration, a random neighbor solution x' is generated. A neighbor that improves the cost function is always accepted. Otherwise, the neighbor is accepted with a probability that depends on the current temperature T and the amount of degradation ΔE of the objective function, where ΔE is the difference in objective value between the current solution x and the generated neighbor x'.

Simulated annealing (main concepts) (Cont.) The acceptance probability follows, in general, the Boltzmann distribution:

P(accept x') = exp(−ΔE / T)

Many trial solutions are generated, as an exploration process, at each temperature level. The temperature is then updated, until a stopping criterion is satisfied.
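The acceptance rule described above can be sketched as a small Python function; the function name and signature are my own illustration, not part of the slides:

```python
import math
import random

def accept_move(delta_e, temperature, rng=random.random):
    """Boltzmann acceptance rule: improving moves (delta_e <= 0) are always
    taken; degrading moves pass with probability exp(-delta_e / temperature)."""
    if delta_e <= 0:
        return True
    return rng() < math.exp(-delta_e / temperature)
```

At high T the exponential is close to 1, so almost any degrading move is accepted; as T falls, only small degradations survive the test.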

Simulated annealing algorithm [Figure: pseudocode of SA, with an inner loop of trial moves at a fixed temperature and an outer loop that lowers the temperature.]
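The two-loop structure of the algorithm (an inner loop sampling neighbors at a fixed temperature, wrapped in an outer cooling loop) can be sketched in Python. The function name, parameter defaults, and the toy demo below are illustrative assumptions, not taken from the slides:

```python
import math
import random

def simulated_annealing(cost, neighbor, x0, t0=10.0, alpha=0.9,
                        moves_per_temp=20, t_final=1e-3, seed=0):
    """Generic SA sketch: geometric cooling, Boltzmann acceptance."""
    rng = random.Random(seed)
    x, fx = x0, cost(x0)
    best, f_best = x, fx
    t = t0
    while t > t_final:                      # outer loop: cooling schedule
        for _ in range(moves_per_temp):     # inner loop: explore at fixed T
            y = neighbor(x, rng)
            fy = cost(y)
            delta = fy - fx
            if delta <= 0 or rng.random() < math.exp(-delta / t):
                x, fx = y, fy               # accept the move
                if fx < f_best:
                    best, f_best = x, fx    # track the best solution seen
        t *= alpha                          # geometric temperature decrement
    return best, f_best

# Toy demo: minimize (x - 3)^2 with a random-step neighborhood.
best, f_best = simulated_annealing(
    lambda x: (x - 3.0) ** 2,
    lambda x, rng: x + rng.uniform(-1.0, 1.0),
    x0=0.0,
)
```

Tracking the best-so-far solution separately from the current one is a common practical refinement, since SA may wander away from a good solution late in the run.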

SA control parameters
Initial temperature: choosing too high a temperature makes the search computationally expensive, while too low a temperature excludes the possibility of ascent steps and thus loses the global-optimization feature of the method. We have to balance between these two extremes.

SA control parameters (Cont.)
Choice of the temperature reduction strategy: if the temperature is decreased slowly, better solutions are obtained, but at a more significant computational cost. A fast decrement rule increases the probability of being trapped in a local minimum.
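The trade-off between slow and fast cooling can be made concrete by counting the temperature levels a geometric schedule visits; this is a minimal sketch with illustrative parameter values:

```python
def geometric_schedule(t0, alpha, t_final):
    """Yield temperatures t0, t0*alpha, t0*alpha**2, ... while above t_final."""
    t = t0
    while t > t_final:
        yield t
        t *= alpha

# A slow schedule (alpha close to 1) visits far more temperature levels,
# allowing more exploration, at a proportionally higher computational cost.
slow_levels = sum(1 for _ in geometric_schedule(10.0, 0.99, 0.01))
fast_levels = sum(1 for _ in geometric_schedule(10.0, 0.50, 0.01))
```

With these values the slow schedule runs hundreds of levels while the fast one finishes in ten, which is exactly the cost/quality trade-off the slide describes.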

SA control parameters (Cont.)
Equilibrium state: in order to reach an equilibrium state at each temperature, a sufficient number of transitions (moves) must be applied. The number of iterations must be set according to the size of the problem instance, and in particular proportionally to the neighborhood size.

SA control parameters (Cont.)
Stopping criterion: concerning the stopping condition, theory suggests a final temperature equal to 0. In practice, one can stop the search when the probability of accepting a move becomes negligible, or when a final temperature TF is reached.

SA example: single-machine total weighted tardiness, 1 || Σ wj Tj

Job j | 1  | 2  | 3  | 4
wj    | 4  | 5  | 3  | 5
pj    | 12 | 8  | 15 | 9
dj    | 16 | 26 | 25 | 27
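The objective in this example is total weighted tardiness, G(S) = Σ wj · max(0, Cj − dj), where Cj is the completion time of job j in sequence S. A short Python check of the instance above (the function name is my own):

```python
def weighted_tardiness(sequence, w, p, d):
    """G(S) = sum over jobs of w_j * max(0, C_j - d_j) for a single machine
    processing the jobs in the given order."""
    t = 0
    total = 0
    for j in sequence:
        t += p[j]                        # C_j: completion time of job j
        total += w[j] * max(0, t - d[j])
    return total

# Instance from the slide, jobs indexed 1..4.
w = {1: 4, 2: 5, 3: 3, 4: 5}
p = {1: 12, 2: 8, 3: 15, 4: 9}
d = {1: 16, 2: 26, 3: 25, 4: 27}
```

For instance, `weighted_tardiness((1, 3, 2, 4), w, p, d)` returns 136, matching G(S1) in the first iteration below.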

SA example: Iteration 1
Step 1. S = S1 = (1,3,2,4), G(S1) = 136. Let T = 10 and a = 0.9, so b1 = 9.
Step 2. Select randomly which jobs to swap; suppose a Uniform(0,1) random number is V1 = 0.24, so swap the first two jobs: Sc = (3,1,2,4), G(Sc) = 174, P(Sk, Sc) = 1.5%. U1 = 0.91, so reject Sc.
Step 3. Let k = 2.

SA example: Iteration 2
Step 2. Select randomly which jobs to swap; suppose V2 = 0.46, so swap the 2nd and 3rd jobs: Sc = (1,2,3,4), G(Sc) = 115. S3 = S0 = Sc.
Step 3. Let k = 3.

SA example: Iteration 3
Step 2. V3 = 0.88, so swap the jobs in the 3rd and 4th positions: Sc = (1,2,4,3), G(Sc) = 67. S4 = S0 = Sc.
Step 3. Let k = 4.

SA example: Iteration 4
Step 2. V4 = 0.49, so swap the jobs in the 2nd and 3rd positions: Sc = (1,4,2,3), G(Sc) = 72, b4 = 10(0.9)^4 ≈ 6.6. P(Sk, Sc) = 47%, U4 = 0.90, so S5 = S4.
Step 3. Let k = 5.

SA example: Iteration 5
Step 2. V5 = 0.11, so swap the 1st and 2nd jobs: Sc = (2,1,4,3), G(Sc) = 83, b5 = 10(0.9)^5 ≈ 5.9. P(Sk, Sc) = 7%, U5 = 0.61, so S6 = S5.
Step 3. Let k = 6.
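The accept/reject decisions in the iterations above can be replayed numerically. This minimal sketch (the function name is my own) assumes the temperature schedule bk = T · a^k used in the example:

```python
import math

def acceptance_probability(g_current, g_candidate, beta):
    """P = exp(-(G(Sc) - G(Sk)) / beta_k); improving moves get probability 1."""
    return min(1.0, math.exp(-(g_candidate - g_current) / beta))

# Iteration 1: b1 = 10 * 0.9 = 9, objective worsens 136 -> 174.
p1 = acceptance_probability(136, 174, 10 * 0.9)      # about 1.5%; U1 = 0.91 rejects
# Iteration 4: b4 = 10 * 0.9**4 ~ 6.6, objective worsens 67 -> 72.
p4 = acceptance_probability(67, 72, 10 * 0.9 ** 4)   # about 47%; U4 = 0.90 rejects
```

Note how the same rule yields a tiny probability for the large degradation at high beta and a near-coin-flip for the small degradation later, which is the behavior the acceptance rule is designed to produce.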

SA applications
Scheduling, quadratic assignment, frequency assignment, car pooling, capacitated p-median, resource-constrained project scheduling (RCPSP), vehicle routing problems, graph coloring, retrieval, layout problems, maximum clique, traveling salesman problems, database systems, nurse rostering, neural nets, grammatical inference, knapsack problems, SAT, constraint satisfaction problems, network design, telecommunication networks, global optimization.

References
E.-G. Talbi, Metaheuristics: From Design to Implementation, Wiley, 2009 (University of Lille – CNRS – INRIA).
S. Kirkpatrick, C. D. Gelatt, M. P. Vecchi, "Optimization by simulated annealing", Science 220 (1983), 671–680.