Simulated Annealing

About This Presentation

SA is a global optimization technique.
It distinguishes between different local optima.
It is a memoryless algorithm: it does not use any information gathered earlier in the search.
SA is motivated by an analogy to annealing in solids.
It is an iterative improvement algorithm.


Slide Content

Term Paper on Global Optimization Search Techniques (AI – Soft Computing Domain). Presented by Joy Dutta, Roll 91/CSE/101006, B.Tech 6th Semester, Computer Science & Engineering, Calcutta University.

Main problem -> Optimization. Techniques include Tabu Search, Greedy Approach, Steepest Descent, etc., as well as Simulated Annealing, Particle Swarm Optimization (PSO), Gradient Descent, etc.

Difficulty in Searching for Global Optima: [diagram of a cost landscape showing the starting point, the descent direction, local minima, the global minimum, and the barrier to local search]

Simulated Annealing (SA): SA is a global optimization technique. SA distinguishes between different local optima. SA is a memoryless algorithm: it does not use any information gathered earlier in the search. SA is motivated by an analogy to annealing in solids. Simulated Annealing is an iterative improvement algorithm.

Consequences of the Occasional Ascents: they help in escaping the local optima (desired effect), but the search might pass the global optimum after reaching it (adverse effect).

Background: Annealing. Simulated annealing is so named because of its analogy to the process of physical annealing of solids. A crystalline solid is heated and then allowed to cool very slowly until it achieves its most regular possible crystal lattice configuration (i.e., its minimum lattice energy state), and thus is free of crystal defects. If the cooling schedule is sufficiently slow, the final configuration is a solid with superior structural integrity. Simulated annealing establishes the connection between this type of thermodynamic behaviour and the search for global minima of a discrete optimization problem.

Background (cont.): The solid is heated to its melting point, a high-energy, high-entropy state, which removes defects and irregularities. The temperature is then very slowly reduced; recrystallization into a regular structure occurs, giving a new internal state of diffused atoms. Fast cooling instead induces a fragile structure.

Control of the Annealing Process. Acceptance of a search step (Metropolis criterion): assume the performance (cost) change in the search direction is ΔC. Always accept a descending step, i.e. accept whenever ΔC ≤ 0. Accept an ascending step (ΔC > 0) only if it passes a random test: accept when random(0,1) < exp(-ΔC / T).
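
A minimal sketch of this acceptance rule, not taken from the slides; the names metropolis_accept, delta_c and temperature are illustrative assumptions:

    import math
    import random

    def metropolis_accept(delta_c, temperature):
        # Metropolis criterion: always accept improvements; accept a
        # worsening step only with probability exp(-delta_c / temperature).
        if delta_c <= 0:
            return True
        return random.random() < math.exp(-delta_c / temperature)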

Relationship between Physical Annealing and Simulated Annealing (thermodynamic simulation ↔ combinatorial optimization):
System states ↔ Solutions
Energy ↔ Cost
Change of state ↔ Neighbouring solutions
Temperature ↔ Control parameter T
Frozen state ↔ Heuristic solution

Stopping Criterion: a given minimum value of the temperature has been reached; a certain number of iterations (or temperature levels) has passed without acceptance of a new solution; or a specified number of total iterations has been executed.

Flow Chart: Start with an initial solution; apply a new random change at a random position to obtain a candidate solution. If it is an improvement, accept the new solution; otherwise accept it only if P(Δ) > rand(0,1), where rand(0,1) is a uniform random number. Then test the stopping criteria and either stop or repeat. Note that P(Δ) → 1 when the control parameter (temperature) is very high, and P(Δ) → 0 when it is very small.
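
As a quick numeric check of that note (with an assumed cost increase of 5), the acceptance probability exp(-Δ/T) approaches 1 at high temperature and 0 at low temperature:

    import math

    delta = 5.0                          # assumed cost increase of a candidate step
    for T in (100.0, 10.0, 1.0, 0.1):    # temperature from high to low
        print(T, round(math.exp(-delta / T), 4))
    # prints roughly 0.9512, 0.6065, 0.0067 and 0.0 (about 2e-22)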

Simulated Annealing Algorithm: its parameters are the initial temperature (TI); the temperature length (TL), i.e. the number of iterations at a given temperature; the cooling ratio (function f), i.e. the rate at which the temperature is reduced, f(T) = aT, where a is a constant with 0.8 ≤ a ≤ 0.99 (most often closer to 0.99); and a stopping criterion.
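
As a small illustrative calculation (the values TI = 10, T_min = 0.001 and a = 0.95 are assumptions, not from the slides), the geometric schedule f(T) = aT needs roughly log(T_min/TI)/log(a) reductions to reach the target temperature:

    import math

    TI, T_min, a = 10.0, 1e-3, 0.95   # assumed initial temperature, final temperature, cooling ratio
    levels = math.ceil(math.log(T_min / TI) / math.log(a))
    print(levels)                     # 180 temperature levels for these values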

Simulated Annealing Algorithm:
construct initial solution x0; xnow = x0
set initial temperature T = TI
repeat
    for i = 1 to TL do
        generate randomly a neighbouring solution x′ ∈ N(xnow)
        compute the change of cost ΔC = C(x′) − C(xnow)
        if ΔC ≤ 0 then
            xnow = x′ (accept the new state)
        else
            generate q = random(0,1)
            if q < exp(−ΔC / T) then xnow = x′
        end if
    end for
    set new temperature T = f(T)
until the stopping criterion is met
return the solution corresponding to the minimum cost
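
A minimal, self-contained Python sketch of the pseudocode above, not taken from the slides; the names simulated_annealing, cost, neighbour and the default parameter values are illustrative assumptions:

    import math
    import random

    def simulated_annealing(x0, cost, neighbour, T_init=10.0, TL=100, a=0.95, T_min=1e-3):
        # Geometric cooling f(T) = a*T, TL iterations per temperature level,
        # Metropolis acceptance, stop when the temperature falls below T_min.
        x_now, c_now = x0, cost(x0)
        best, best_cost = x_now, c_now
        T = T_init
        while T > T_min:
            for _ in range(TL):
                x_new = neighbour(x_now)              # random neighbouring solution
                delta_c = cost(x_new) - c_now         # change of cost
                if delta_c <= 0 or random.random() < math.exp(-delta_c / T):
                    x_now, c_now = x_new, c_now + delta_c   # accept the new state
                    if c_now < best_cost:
                        best, best_cost = x_now, c_now
            T = a * T                                 # cooling schedule f(T) = a*T
        return best, best_cost

    # Example: minimising an assumed multimodal 1-D function f(x) = x^2 + 10*sin(x)
    f = lambda x: x * x + 10.0 * math.sin(x)
    step = lambda x: x + random.uniform(-1.0, 1.0)
    x_best, c_best = simulated_annealing(random.uniform(-10.0, 10.0), f, step)
    print(round(x_best, 3), round(c_best, 3))   # typically near the global minimum at x ≈ -1.31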

Convergence of simulated annealing: [illustrative figure]

Implementation of Simulated Annealing. Understand the result: this is a stochastic algorithm, so the outcome may differ between trials. Convergence to the global optimum can only be realized in an asymptotic sense.
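
To illustrate this stochastic behaviour, an assumed usage sketch that reuses simulated_annealing, f and step from the sketch above; different seeds (i.e., different trials) can end at different solutions:

    # Repeated trials of a stochastic search may give different outcomes.
    for seed in (0, 1, 2):
        random.seed(seed)
        x_best, c_best = simulated_annealing(random.uniform(-10.0, 10.0), f, step)
        print(seed, round(x_best, 3), round(c_best, 3))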

Qualitative Analysis: Simulated annealing is a randomized local search. Is simulated annealing greedy? It uses controlled greed. Is a greedy algorithm better? Where is the difference? This is explained with the ball-on-terrain example.

Ball on terrain example – Simulated Annealing vs. Greedy Algorithms: The ball is initially placed at a random position on the terrain. From the current position, the ball should be fired such that it can only move one step left or right. What algorithm should we follow for the ball to finally settle at the lowest point on the terrain?

Ball on terrain example – SA vs. Greedy Algorithms: [illustration]

Jigsaw puzzles – Intuitive usage of Simulated Annealing: given a jigsaw puzzle, one has to obtain the final shape using all the pieces together. Starting with a random configuration, the human brain unconditionally chooses certain moves that tend towards the solution. However, certain moves that may or may not lead to the solution are accepted or rejected with a certain small probability. The final shape is obtained as the result of a large number of iterations.

Applications: circuit partitioning and placement; hardware/software partitioning; graph partitioning; VLSI placement and routing; image processing; strategy scheduling for capital products with complex product structure; umpire scheduling in the US Open tennis tournament; event-based learning situations; etc.

Advantages: SA can deal with arbitrary systems and cost functions; it statistically guarantees finding an optimal solution; it is relatively easy to code, even for complex problems; and it generally gives a "good" solution. This makes annealing an attractive option for optimization problems where heuristic (specialized or problem-specific) methods are not available.

Disadvantages: Repeated annealing with a 1/log k schedule is very slow, especially if the cost function is expensive to compute. For problems where the energy landscape is smooth, or where there are few local minima, SA is overkill; simpler, faster methods (e.g., gradient descent) will work better, but one generally does not know what the energy landscape is for a particular problem. Heuristic methods, which are problem-specific or take advantage of extra information about the system, will often be better than general methods, although SA is often comparable to heuristics. The method cannot tell whether it has found an optimal solution; some other, complementary method (e.g., branch and bound) is required to do this.

Conclusions: Simulated annealing algorithms are usually better than greedy algorithms when it comes to problems that have numerous locally optimal solutions. Simulated annealing is not the best solution for circuit partitioning or placement; a network flow approach to these problems works much faster. Simulated annealing guarantees convergence when run for a sufficiently large number of iterations.

References: P.J.M. van Laarhoven and E.H.L. Aarts, Simulated Annealing: Theory and Applications, Kluwer Academic Publishers, 1987. A.A. Zhigljavsky, Theory of Global Random Search, Kluwer Academic Publishers, 1991.

Thank You!