5 Kuhn-Tucker Conditions explained for optimization
Multi-variable Optimization with Inequality Constraints
Module-5: Classical Optimization Techniques
Multi-variable NLP with inequality constraints
Problem statement: Find $X = (x_1, x_2, \dots, x_n)$ to minimize $f = f(X)$
subject to
$$g_j(X) \le 0, \qquad j = 1, 2, \dots, m$$
The inequality constraints can be transformed into equality constraints by adding non-negative slack variables $y_j^2$, as
$$g_j(X) + y_j^2 = 0, \qquad j = 1, 2, \dots, m$$
Note: Since the problem is non-linear, the slack variables are introduced with power 2 so that the added term $y_j^2$ is always non-negative.
where $X = (x_1, x_2, \dots, x_n)^{\mathsf T}$ and $Y = (y_1, y_2, \dots, y_m)^{\mathsf T}$ is the vector of slack variables.
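For instance, under the assumption of a single hypothetical constraint $g_1(X) = x_1 + x_2 - 3 \le 0$ (used here only for illustration), the transformation reads:

```latex
% Illustrative only: g_1 is a hypothetical constraint, not one defined on this slide
g_1(X) = x_1 + x_2 - 3 \le 0
\;\Longrightarrow\;
g_1(X) + y_1^2 = x_1 + x_2 - 3 + y_1^2 = 0 ,
\qquad y_1^2 = -(x_1 + x_2 - 3) \ge 0 .
```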
This problem can be solved by the method of Lagrange multipliers. For this, the Lagrange function L is
constructed as:
$$L(X, Y, \lambda) = f(X) + \sum_{j=1}^{m} \lambda_j \left[\, g_j(X) + y_j^2 \,\right]$$
where $X = (x_1, x_2, \dots, x_n)^{\mathsf T}$, $\lambda = (\lambda_1, \lambda_2, \dots, \lambda_m)^{\mathsf T}$ is the vector of Lagrange multipliers, and $Y = (y_1, y_2, \dots, y_m)^{\mathsf T}$ is the vector of slack variables.
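A minimal SymPy sketch of this construction, using a hypothetical objective $f = (x_1-2)^2 + (x_2-2)^2$ and a single hypothetical constraint $g_1 = x_1 + x_2 - 3 \le 0$ (neither is taken from these slides):

```python
import sympy as sp

# Symbolic design variables, Lagrange multiplier, and slack variable
x1, x2, lam1, y1 = sp.symbols('x1 x2 lambda1 y1', real=True)

# Hypothetical objective and constraint, for illustration only
f = (x1 - 2)**2 + (x2 - 2)**2   # f(X)
g1 = x1 + x2 - 3                # g1(X) <= 0

# Lagrange function: L = f(X) + sum_j lambda_j * (g_j(X) + y_j**2)
L = f + lam1 * (g1 + y1**2)
print(L)
```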
With the Lagrange function L defined above, the necessary conditions are:
(1) $\dfrac{\partial L}{\partial x_i} = \dfrac{\partial f}{\partial x_i} + \sum_{j=1}^{m} \lambda_j \dfrac{\partial g_j}{\partial x_i} = 0, \qquad i = 1, 2, \dots, n$
(2) $\dfrac{\partial L}{\partial \lambda_j} = g_j(X) + y_j^2 = 0, \qquad j = 1, 2, \dots, m$
(3) $\dfrac{\partial L}{\partial y_j} = 2 \lambda_j y_j = 0, \qquad j = 1, 2, \dots, m$
System: number of unknowns = n + 2m (n design variables, m Lagrange multipliers, m slack variables); number of equations = n + 2m.
Solution: solving this system gives the optimum solution vector X*, the Lagrange multiplier vector $\lambda^*$, and the slack variable vector Y*.
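Continuing the same hypothetical problem from the sketch above, the n + 2m = 4 necessary conditions can be generated and solved symbolically (a sketch of the procedure, not the slides' worked example):

```python
import sympy as sp

x1, x2, lam1, y1 = sp.symbols('x1 x2 lambda1 y1', real=True)
f = (x1 - 2)**2 + (x2 - 2)**2   # hypothetical objective
g1 = x1 + x2 - 3                # hypothetical constraint g1(X) <= 0
L = f + lam1 * (g1 + y1**2)     # Lagrange function with squared slack

# Necessary conditions: dL/dx_i = 0, dL/dlambda_j = 0, dL/dy_j = 0
unknowns = [x1, x2, lam1, y1]              # n + 2m = 2 + 2(1) unknowns
eqs = [sp.diff(L, v) for v in unknowns]    # n + 2m equations
for sol in sp.solve(eqs, unknowns, dict=True):
    print(sol)   # candidate (X*, lambda*, Y*)
```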
• This set of equations (constraints) is called the Karush-Kuhn-Tucker (KKT) conditions (or Kuhn-Tucker conditions, or KT conditions).
• These are the necessary conditions for a relative (local) minimum of a given NLPP.
• So, the KKT (or KT) conditions are first-order derivative tests (first-order necessary conditions) for a solution of an NLPP to be optimal.
Eq. (1): Optimality conditions
Eq. (2): Feasibility conditions
Eq. (3): Slackness property
Eq. (2) ensures that the given problem constraints are satisfied.
Eq. (3) implies that either $\lambda_j = 0$ or $y_j = 0$.
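Combining Eqs. (2) and (3) recovers the familiar complementary-slackness form; this is a standard restatement, not shown explicitly on these slides:

```latex
g_j(X) + y_j^2 = 0 \;\text{ and }\; \lambda_j y_j = 0
\;\Longrightarrow\;
\lambda_j\, g_j(X) = -\lambda_j\, y_j^2 = -(\lambda_j y_j)\, y_j = 0 ,
\qquad j = 1, 2, \dots, m .
```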
Consider Eq. (3). Let the local minimum be at a point X*.
Case i: $\lambda_j = 0$; then $y_j \ne 0$.
This means that the given inequality constraint is satisfied strictly, $g_j(X^*) < 0$, at the point X* of the local minimization problem.
So, for the minimization problem, if $\lambda_j = 0$ (and $y_j \ne 0$) then the constraint is not active, i.e. it is an inactive constraint.
It means that the j-th constraint is inactive and hence can be ignored; such constraints take no part in identifying the optimal solution.
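As a hedged illustration of Case i (a hypothetical problem, not taken from the slides): minimize $f = x_1^2 + x_2^2$ subject to $g_1 = x_1 + x_2 - 3 \le 0$. The unconstrained minimum already satisfies the constraint:

```latex
X^* = (0, 0), \qquad g_1(X^*) = -3 < 0, \qquad \lambda_1^* = 0, \qquad y_1^{*\,2} = 3 \ne 0 .
```

Here the constraint is inactive at $X^*$ and plays no role in locating the minimum.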
Consider Eq. (3) again, with the local minimum at a point X*.
Case ii: $\lambda_j \ne 0$; then $y_j = 0$.
This means that the given inequality constraint becomes an equality constraint, $g_j(X^*) = 0$.
It means that the j-th constraint is active and takes part in identifying the optimal solution.
So, when $\lambda_j \ne 0$ (i.e. $y_j = 0$) the associated constraint is an active constraint, and this is the case we concentrate on.
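A matching hedged illustration of Case ii (again a hypothetical problem): minimize $f = (x_1-2)^2 + (x_2-2)^2$ subject to $g_1 = x_1 + x_2 - 3 \le 0$. The unconstrained minimum $(2, 2)$ violates the constraint, so the optimum lies on the boundary:

```latex
X^* = \left(\tfrac{3}{2}, \tfrac{3}{2}\right), \qquad g_1(X^*) = 0, \qquad y_1^* = 0, \qquad \lambda_1^* = 1 \ne 0 .
```

Here the constraint is active and directly determines the optimum.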
Consider the division of the constraints into two subsets, J1 and J2, which together represent the total set of constraints:
J1 = set of active constraints at the optimum point, i.e. those with $y_j = 0$ there (the associated constraint is an active constraint), and
J2 = set of inactive constraints at the optimum point, i.e. those with $\lambda_j = 0$ there (the associated constraint is an inactive constraint, so it is not considered in the solution process).
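A small numerical sketch of this classification, assuming SciPy and reusing the hypothetical objective from the earlier sketches with one extra hypothetical constraint; the function names and tolerance below are illustrative choices:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical problem (illustration only): minimize (x1-2)^2 + (x2-2)^2
# subject to  g1 = x1 + x2 - 3 <= 0  and  g2 = -x1 <= 0.
f = lambda x: (x[0] - 2.0) ** 2 + (x[1] - 2.0) ** 2
g = [lambda x: x[0] + x[1] - 3.0,   # g1(X) <= 0
     lambda x: -x[0]]               # g2(X) <= 0

# SciPy's 'ineq' constraints mean fun(x) >= 0, so each g_j <= 0 is passed as -g_j >= 0.
cons = [{'type': 'ineq', 'fun': lambda x, gj=gj: -gj(x)} for gj in g]
res = minimize(f, x0=np.zeros(2), method='SLSQP', constraints=cons)

# Classify constraints at the optimum: J1 = active (g_j ~ 0), J2 = inactive (g_j < 0).
tol = 1e-6
J1 = [j for j, gj in enumerate(g) if abs(gj(res.x)) <= tol]
J2 = [j for j, gj in enumerate(g) if gj(res.x) < -tol]
print(res.x, J1, J2)   # expected: X* ~ (1.5, 1.5), J1 = [0], J2 = [1]
```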
Example-4
Find X = (x_1, x_2) to minimize f(X)
subject to
$g_1(X) = x_1 + x_2 - 3 \le 0$
$g_2(X) = -2x_1 + x_2 - 2 \le 0$
KKT conditions:
The Lagrange function is
$$L(x_1, x_2, \lambda_1, \lambda_2) = f(X) + \lambda_1 (x_1 + x_2 - 3) + \lambda_2 (-2x_1 + x_2 - 2)$$
The first-order derivatives are
Optimality:
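A hedged SymPy sketch of these first-order (optimality) conditions, keeping the example's objective generic as f(x1, x2) since its explicit expression is not reproduced here; the constraint expressions are the ones in the Lagrange function above:

```python
import sympy as sp

x1, x2, lam1, lam2 = sp.symbols('x1 x2 lambda1 lambda2', real=True)
f = sp.Function('f')(x1, x2)   # the example's objective, left unspecified

# Lagrange function as written on the slide (no slack terms shown there)
L = f + lam1 * (x1 + x2 - 3) + lam2 * (-2 * x1 + x2 - 2)

# Optimality conditions dL/dx_i = 0
print(sp.Eq(sp.diff(L, x1), 0))   # df/dx1 + lambda1 - 2*lambda2 = 0
print(sp.Eq(sp.diff(L, x2), 0))   # df/dx2 + lambda1 + lambda2 = 0
```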