AOT3 Multivariable Optimization Algorithms.pdf


Slide Content

Advanced Optimization Theory
MED573
Multi Variable Optimization Algorithms
Dr. Aditi Sengupta
Department of Mechanical Engineering
IIT (ISM) Dhanbad
Email: [email protected]

Introduction

• Methods used to optimize functions having multiple design variables.
• Some single-variable optimization algorithms are used to perform a unidirectional search along a desired direction.
• Broadly classified into two categories:
  (i) Direct search methods
  (ii) Gradient-based methods

Optimality Criteria

• The optimality criteria differ from those of single-variable optimization.
• In multi-variable optimization, the gradient of a function is not a scalar quantity – it is a vector quantity.
• Let us assume that the objective function is a function of N variables represented by x_1, x_2, …, x_N.
• The gradient at any point x^(t) is represented by ∇f(x^(t)), which is an N-dimensional vector given by

  ∇f(x^(t)) = ( ∂f/∂x_1, ∂f/∂x_2, …, ∂f/∂x_N )  evaluated at x^(t)

• The second-order derivatives form a matrix ∇²f(x^(t)), known as the Hessian matrix, given by

  ∇²f(x^(t)) = [ ∂²f/∂x_1²       ∂²f/∂x_1∂x_2    …   ∂²f/∂x_1∂x_N
                 ∂²f/∂x_1∂x_2    ∂²f/∂x_2²       …   ∂²f/∂x_2∂x_N
                 ⋮               ⋮               ⋱   ⋮
                 ∂²f/∂x_N∂x_1    ∂²f/∂x_N∂x_2    …   ∂²f/∂x_N²    ]

• Now that we have the derivatives, we are ready to define the optimality criteria.
• A point x̄ is a stationary point if ∇f(x̄) = 0.
• The point is a minimum, a maximum, or an inflection point if ∇²f(x̄) is positive definite, negative definite, or otherwise.

• The matrix ∇²f(x^(t)) is defined to be positive definite if, for any point y in the search space, the quantity y^T ∇²f(x^(t)) y ≥ 0.
• The matrix ∇²f(x^(t)) is defined to be negative definite if, for any point y in the search space, the quantity y^T ∇²f(x^(t)) y ≤ 0.
• If at some point y+ in the search space (y+)^T ∇²f(x^(t)) y+ ≥ 0, and for some other point y−, (y−)^T ∇²f(x^(t)) y− ≤ 0, then the matrix is neither positive definite nor negative definite.
• Other tests for positive definiteness – all eigenvalues are positive, or all principal determinants are positive (see the sketch below).
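
A minimal Python sketch of the eigenvalue test (added for illustration, not part of the original slides; the function name and tolerance are assumptions) that classifies a symmetric matrix as positive definite, negative definite, or neither:

```python
import numpy as np

def classify_definiteness(H, tol=1e-10):
    """Eigenvalue test: all eigenvalues > 0 -> positive definite (minimum),
    all eigenvalues < 0 -> negative definite (maximum), otherwise neither."""
    eigvals = np.linalg.eigvalsh(np.asarray(H, dtype=float))  # H assumed symmetric
    if np.all(eigvals > tol):
        return "positive definite (minimum)"
    if np.all(eigvals < -tol):
        return "negative definite (maximum)"
    return "neither positive definite nor negative definite"

# Hessian of f(x1, x2) = (x1 - 10)^2 + (x2 - 10)^2 (the example used later) is constant:
H = [[2.0, 0.0],
     [0.0, 2.0]]
print(classify_definiteness(H))   # -> positive definite (minimum)
```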

Unidirectional Search

• Successive unidirectional searches are used to find the minimum along a search direction.
• A unidirectional search is a one-dimensional search performed by comparing function values only along a specified direction.
• Points that lie on a line (in N-dimensional space) passing through x^(t) and oriented along s^(t) are considered in the search, expressed as

  x(α) = x^(t) + α s^(t)    Eq. (1)

  where α is a scalar quantity.

• We can rewrite the multivariable objective function in terms of the single variable α by substituting x with x(α).
• Then, using a single-variable search method, the minimum is found.
• Once the optimum value α* is obtained, the corresponding point can also be found.

Unidirectional Search: Example

Minimize f(x_1, x_2) = (x_1 – 10)² + (x_2 – 10)².
The minimum lies at the point (10, 10)^T, as seen from the contour plot of the function. Here, the function value = 0.
Let the point of interest be x^(t) = (2, 1)^T, and we are interested in finding the minimum and the corresponding function value in the search direction s^(t) = (2, 5)^T.
From the right-angled triangle shown by the dotted line, the optimal point obtained is x* = (6.207, 11.517)^T.
Let us see if we can obtain the solution by performing a unidirectional search along s^(t).

We will use a bracketing algorithm to enclose the optimum point and then use a single-variable optimization method.
First, let us use the bounding phase method. Assume an initial guess of α^(0) = 0 and an increment Δ = 0.5.
The bounds for α are obtained as (0.5, 3.5) with 6 function evaluations. The corresponding bracketing points are (3, 3.5)^T and (9, 18.5)^T.
Next, we use the golden section search method to find the optimum point. Let us use a = 0.5 and b = 3.5. We obtain α* = 2.1035 as the minimum.
Substituting α* = 2.1035, x^(t) = (2, 1)^T and s^(t) = (2, 5)^T in Eq. (1), we obtain x* = (6.207, 11.517)^T.
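
A minimal Python sketch of this unidirectional search (added for illustration, not from the slides). It takes the bracket (0.5, 3.5) reported above as given and applies a golden section search to f(x(α)) along s^(t) = (2, 5)^T from x^(t) = (2, 1)^T; the tolerance and helper names are assumptions.

```python
import math

def f(x1, x2):
    """Objective from the slides: f(x1, x2) = (x1 - 10)^2 + (x2 - 10)^2."""
    return (x1 - 10.0) ** 2 + (x2 - 10.0) ** 2

def f_alpha(alpha, x_t=(2.0, 1.0), s_t=(2.0, 5.0)):
    """f restricted to the line x(alpha) = x^(t) + alpha * s^(t), Eq. (1)."""
    return f(x_t[0] + alpha * s_t[0], x_t[1] + alpha * s_t[1])

def golden_section(func, a, b, tol=1e-5):
    """Minimize a unimodal 1-D function on [a, b] by golden section search."""
    inv_phi = (math.sqrt(5.0) - 1.0) / 2.0       # ~0.618
    c, d = b - inv_phi * (b - a), a + inv_phi * (b - a)
    while (b - a) > tol:
        if func(c) < func(d):
            b, d = d, c
            c = b - inv_phi * (b - a)
        else:
            a, c = c, d
            d = a + inv_phi * (b - a)
    return 0.5 * (a + b)

# Bracket (0.5, 3.5) taken from the bounding phase result on the slide.
alpha_star = golden_section(f_alpha, 0.5, 3.5)
x_star = (2.0 + alpha_star * 2.0, 1.0 + alpha_star * 5.0)
print(alpha_star, x_star)   # ~2.1034, (~6.207, ~11.517)
```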

Direct Search Methods

• In single-variable optimization, there are only two search directions in which a point can be modified – either the positive x-direction or the negative x-direction.
• In multi-variable optimization, each variable can be modified in either the positive or the negative direction, leading to 2^N ways of modification.
• One-variable-at-a-time algorithms usually cannot solve functions having nonlinear interactions between design variables.
• Thus, we need to completely eliminate the concept of a search direction and instead manipulate a set of points to create a better set of points (e.g., the simplex search method).

Simplex Search Method

• For N variables, (N + 1) points are to be used in the initial simplex.
• To avoid a zero-volume hypercube for an N-variable function, the (N + 1) points in the simplex should not lie along the same line.
• At each iteration, the worst point in the simplex (x_h) is found first.
• Then, a new simplex is formed from the old simplex by some fixed rules that "steer" the search away from the worst point in the simplex.

Four situations may arise depending on the function values of the simplex (see the sketch after this list):
1. First, the centroid (x_c) of all but the worst point is determined.
2. The worst point in the simplex is reflected about x_c and a new point x_r is found.
3. If the function value at x_r is better than the best point of the initial simplex, the reflection is considered to have taken the simplex to a "good region" in the search space.
4. An expansion along the line joining x_c and x_r is then performed, controlled by a factor γ.
5. If the function value at x_r is worse than the worst point of the initial simplex, the reflection is considered to have taken the simplex to a "bad region" in the search space.
6. Thus, a contraction along the line joining x_c and x_r is made, controlled by a factor β.
7. Finally, if the function value at x_r is better than the worst point but worse than the next-to-worst point in the simplex, a contraction is made with β > 0.
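
The Python sketch below (an added illustration, not the slides' exact algorithm) implements the reflection/expansion/contraction rules just described for the example objective. The parameter values γ = 2 and β = 0.5, the termination test, and the safeguard of keeping the better of the reflected and expanded points are assumptions.

```python
import numpy as np

def simplex_search(f, simplex, gamma=2.0, beta=0.5, max_iter=500, tol=1e-8):
    """Reflection/expansion/contraction simplex search.

    simplex: (N+1) x N array of starting points that do not lie on one line.
    """
    simplex = np.asarray(simplex, dtype=float)
    for _ in range(max_iter):
        values = np.array([f(p) for p in simplex])
        order = np.argsort(values)                  # best ... worst
        best, worst = order[0], order[-1]
        if values[worst] - values[best] < tol:      # simplex has collapsed
            break
        x_c = simplex[order[:-1]].mean(axis=0)      # centroid of all but the worst
        x_r = 2.0 * x_c - simplex[worst]            # reflect worst point about x_c

        if f(x_r) < values[best]:
            # "good region": expand along x_c -> x_r; keep the better of x_e and x_r
            x_e = x_c + gamma * (x_r - x_c)
            x_new = x_e if f(x_e) < f(x_r) else x_r
        elif f(x_r) >= values[worst]:
            # "bad region": contract towards the old worst point (factor beta)
            x_new = x_c - beta * (x_c - simplex[worst])
        else:
            # in between: contract on the reflected side (factor beta)
            x_new = x_c + beta * (x_r - x_c)

        simplex[worst] = x_new                      # new simplex replaces the worst point
    return simplex[np.argmin([f(p) for p in simplex])]

f = lambda x: (x[0] - 10.0) ** 2 + (x[1] - 10.0) ** 2
print(simplex_search(f, [[0.0, 0.0], [2.0, 0.0], [0.0, 2.0]]))   # ~ [10, 10]
```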

Simplex Search Method: Algorithm

Simplex Search Method: Example

(Worked example presented across slides 15–26.)

Gradient-Based Methods

First, we need to calculate the first- and second-order derivatives for gradient-based methods. This is done using central difference techniques as:

  ∂f(x)/∂x_i at x^(t) = [ f(x_i^(t) + Δx_i^(t)) − f(x_i^(t) − Δx_i^(t)) ] / (2 Δx_i^(t))

  ∂²f(x)/∂x_i² at x^(t) = [ f(x_i^(t) + Δx_i^(t)) − 2 f(x_i^(t)) + f(x_i^(t) − Δx_i^(t)) ] / (Δx_i^(t))²

  ∂²f(x)/∂x_i∂x_j at x^(t) = [ f(x_i^(t) + Δx_i^(t), x_j^(t) + Δx_j^(t)) − f(x_i^(t) + Δx_i^(t), x_j^(t) − Δx_j^(t))
                               + f(x_i^(t) − Δx_i^(t), x_j^(t) − Δx_j^(t)) − f(x_i^(t) − Δx_i^(t), x_j^(t) + Δx_j^(t)) ] / (4 Δx_i^(t) Δx_j^(t))

For the complete first-derivative vector of an N-variable function, 2N function evaluations are needed.
For the complete second-derivative vector of an N-variable function, 3N function evaluations are needed.
For the complete mixed-derivative vector of an N-variable function, 4N function evaluations are needed.
Thus, (2N² + 1) computations are needed for the Hessian matrix.
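
A small Python sketch of these central-difference formulas (added here for illustration, not from the slides); using a single step size Δx = 1e-4 for every variable and these function names are assumptions.

```python
import numpy as np

def numerical_gradient(f, x, dx=1e-5):
    """Central-difference gradient: (f(x_i + dx) - f(x_i - dx)) / (2 dx)."""
    x = np.asarray(x, dtype=float)
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x); e[i] = dx
        g[i] = (f(x + e) - f(x - e)) / (2.0 * dx)
    return g

def numerical_hessian(f, x, dx=1e-4):
    """Central-difference Hessian using the diagonal and mixed formulas above."""
    x = np.asarray(x, dtype=float)
    n = x.size
    H = np.zeros((n, n))
    fx = f(x)
    for i in range(n):
        ei = np.zeros(n); ei[i] = dx
        H[i, i] = (f(x + ei) - 2.0 * fx + f(x - ei)) / dx ** 2
        for j in range(i + 1, n):
            ej = np.zeros(n); ej[j] = dx
            H[i, j] = (f(x + ei + ej) - f(x + ei - ej)
                       - f(x - ei + ej) + f(x - ei - ej)) / (4.0 * dx ** 2)
            H[j, i] = H[i, j]                       # Hessian is symmetric
    return H

f = lambda x: (x[0] - 10.0) ** 2 + (x[1] - 10.0) ** 2
print(numerical_gradient(f, [2.0, 1.0]))   # ~ [-16, -18]
print(numerical_hessian(f, [2.0, 1.0]))    # ~ [[2, 0], [0, 2]]
```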

• By definition, the first derivative represents the direction of maximum increase of the function value.
• To reach the minimum, we should therefore search in the direction opposite to the first derivative (the negative gradient direction).
Descent Direction

• Moving along a descent direction d^(t) leads to points with a smaller function value than that at the current point x^(t).
• Thus, a search direction d^(t) that satisfies the following relation is a descent direction.
A search direction d^(t) is a descent direction at the point x^(t) if the condition ∇f(x^(t)) · d^(t) ≤ 0 is satisfied in the vicinity of the point x^(t).
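
As a quick numerical check (an added sketch, not from the slides), the search direction s^(t) = (2, 5)^T used in the earlier example satisfies this condition at x^(t) = (2, 1)^T, so it is a descent direction there.

```python
import numpy as np

# Gradient of the example f(x1, x2) = (x1 - 10)^2 + (x2 - 10)^2
grad_f = lambda x: np.array([2.0 * (x[0] - 10.0), 2.0 * (x[1] - 10.0)])

x_t = np.array([2.0, 1.0])
d_t = np.array([2.0, 5.0])
print(grad_f(x_t) @ d_t)   # -122.0 <= 0, hence d_t is a descent direction at x_t
```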

Cauchy’s Steepest Descent Method
• The search direction used in Cauchy's method is the negative of the gradient at any particular point x^(k):

  d^(k) = −∇f(x^(k))

• Since this direction gives the maximum descent in function values, it is known as the steepest descent method.
• At every iteration, the derivative is calculated at the current point, and a unidirectional search is performed along the negative of the derivative direction to find the minimum along that direction.
• The minimum point becomes the current point and the search is continued from this point.
• The algorithm continues until the gradient converges to a sufficiently small quantity (see the sketch below).
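
A minimal Python sketch of Cauchy's steepest descent applied to the example function (added for illustration; the analytical gradient, the fixed line-search bracket [0, 1], and the tolerances are assumptions, and the golden section routine stands in for any single-variable search).

```python
import numpy as np

def golden_section(func, a, b, tol=1e-6):
    """Golden section search for the 1-D minimum of func on [a, b]."""
    inv_phi = (np.sqrt(5.0) - 1.0) / 2.0
    c, d = b - inv_phi * (b - a), a + inv_phi * (b - a)
    while (b - a) > tol:
        if func(c) < func(d):
            b, d = d, c
            c = b - inv_phi * (b - a)
        else:
            a, c = c, d
            d = a + inv_phi * (b - a)
    return 0.5 * (a + b)

def steepest_descent(f, grad_f, x0, grad_tol=1e-6, max_iter=200):
    """Cauchy's method: search along d^(k) = -grad f(x^(k)) until the gradient is small."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad_f(x)
        if np.linalg.norm(g) < grad_tol:            # gradient sufficiently small
            break
        d = -g                                      # steepest descent direction
        alpha = golden_section(lambda a: f(x + a * d), 0.0, 1.0)  # unidirectional search
        x = x + alpha * d                           # minimum along d becomes current point
    return x

f = lambda x: (x[0] - 10.0) ** 2 + (x[1] - 10.0) ** 2
grad_f = lambda x: np.array([2.0 * (x[0] - 10.0), 2.0 * (x[1] - 10.0)])
print(steepest_descent(f, grad_f, [2.0, 1.0]))      # ~ [10, 10]
```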

Cauchy’s Steepest Descent Method: Algorithm

Cauchy’s Steepest Descent Method: Example
(See AOT2)
