Mastering Greedy Algorithms: Optimizing Solutions for Efficiency
22bcs058
Apr 25, 2024
About This Presentation
Greedy algorithms are fundamental techniques used in computer science and optimization problems. They belong to a class of algorithms that make decisions based on the current best option without considering the overall future consequences. Despite their simplicity and intuitive appeal, greedy algorithms can provide efficient solutions to a wide range of problems across various domains.
At the core of greedy algorithms lies a simple principle: at each step, choose the locally optimal solution that seems best at the moment, with the hope that it will lead to a globally optimal solution. This principle makes greedy algorithms easy to understand and implement, as they typically involve iterating through a set of choices and making decisions based on some criteria.
One of the key characteristics of greedy algorithms is their greedy choice property, which states that at each step, the locally optimal choice leads to an optimal solution overall. This property allows greedy algorithms to make decisions without needing to backtrack or reconsider previous choices, resulting in efficient solutions for many problems.
Greedy algorithms are commonly used in problems involving optimization, scheduling, and combinatorial optimization. Examples include finding a minimum spanning tree in a graph (Prim's and Kruskal's algorithms), finding shortest paths in a weighted graph with non-negative edge weights (Dijkstra's algorithm), and selecting the maximum number of non-overlapping tasks (interval scheduling).
Despite their effectiveness in many situations, greedy algorithms may not always produce the optimal solution for a given problem. In some cases, a greedy approach can lead to suboptimal solutions that are not globally optimal. This occurs when the greedy choice property does not guarantee an optimal solution at each step, or when there are conflicting objectives that cannot be resolved by a greedy strategy alone.
To mitigate these limitations, it is essential to carefully analyze the problem at hand and determine whether a greedy approach is appropriate. In some cases, greedy algorithms can be augmented with additional techniques or heuristics to improve their performance or guarantee optimality. Alternatively, other algorithmic paradigms such as dynamic programming or divide and conquer may be better suited for certain problems.
Overall, greedy algorithms offer a powerful and versatile tool for solving optimization problems efficiently. By understanding their principles and characteristics, programmers and researchers can leverage greedy algorithms to tackle a wide range of computational challenges and design elegant solutions that balance simplicity and effectiveness.
Knapsack Problem using Greedy Method
Two main kinds of Knapsack Problems
0-1 Knapsack:
• N items (can be the same or different)
• Have only one of each
• Must leave or take (i.e., 0-1) each item (ingots of gold)
• DP works; greedy does not, since items cannot be broken
Fractional Knapsack:
• N items (can be the same or different)
• Can take a fractional part of each item (bags of gold dust)
• Both greedy and DP algorithms work
Knapsack Problem using Greedy method
Greedy approach for Fractional Knapsack Problem
Steps to solve the Fractional Knapsack Problem (a Python sketch follows the list):
• Calculate the ratio (value/weight) for each item.
• Sort all the items in decreasing order of the ratio.
• Initialize res = 0, curr_cap = given_cap.
• Do the following for every item in the sorted order:
  • If the weight of the current item is less than or equal to the remaining capacity, add the value of that item to the result.
  • Else add as much of the current item as we can (a fraction) and break out of the loop.
• Return res.
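The steps above can be sketched in Python roughly as follows; the function name fractional_knapsack and the (value, weight) tuple representation are illustrative choices, not from the slides.

def fractional_knapsack(items, capacity):
    # items: list of (value, weight) pairs; capacity: knapsack capacity W
    # Sort by value/weight ratio in decreasing order.
    items = sorted(items, key=lambda vw: vw[0] / vw[1], reverse=True)
    res, curr_cap = 0.0, capacity
    for value, weight in items:
        if weight <= curr_cap:                 # item fits completely
            res += value
            curr_cap -= weight
        else:                                  # take only the fraction that fits
            res += value * (curr_cap / weight)
            break
    return res

# Example from the slides: w = (5, 10, 20, 30, 40), v = (30, 20, 100, 90, 160), W = 60
print(fractional_knapsack([(30, 5), (20, 10), (100, 20), (90, 30), (160, 40)], 60))  # 270.0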
Fractional Knapsack Problem
Given a set of items, each with a weight and a value/profit, determine a subset of items to include in a collection so that the total weight is less than or equal to a given limit and the total value/profit is as large as possible.
Greedy approach for Fractional Knapsack Problem
Example: Consider 5 items along with their respective weights and values:
I = (I1, I2, I3, I4, I5)
w = (5, 10, 20, 30, 40)
v = (30, 20, 100, 90, 160)
The capacity of the knapsack W = 60
The greedy algorithm:
Step 1: Sort vi/wi into non-increasing order.
Step 2: Put the objects into the knapsack according to the sorted sequence, taking as much of each as possible.
Fractional Knapsack Problem
Greedy approach for Fractional Knapsack Problem
Example: Consider the 5 items above with their respective weights and values.
The capacity of the knapsack W = 60.
First, we choose item I1, whose weight is 5.
Then we choose item I3, whose weight is 20.
Now the total weight in the knapsack is 5 + 20 = 25.
The next item is I5, and its weight is 40, but only 35 units of capacity remain, so we take the fractional part of it (35/40).
So we arrange the items by V/W in decreasing order, as given in the table, and this gives the solution of the problem: total profit = 30 + 100 + 160 × (35/40) = 270.
The algorithm sorts the items, which takes O(n log n) time, and then loops through each item, which takes O(n). Summing up gives a time complexity of O(n log n + n) = O(n log n). If the items are already sorted, it takes O(n) time.
Items in the given order:
I    I1   I2   I3   I4   I5
W    5    10   20   30   40
V    30   20   100  90   160
V/W  6    2    5    3    4

Items sorted by V/W in decreasing order:
I    I1   I3   I5   I4   I2
W    5    20   40   30   10
V    30   100  160  90   20
V/W  6    5    4    3    2
Fractional Knapsack Problem
Find the Minimum Number of Coins
Suppose we want change for M Rs. and we have an infinite supply of each of the denominations in Indian currency, i.e., an infinite supply of {1, 2, 5, 10, 20, 50, 100, 500, 1000} valued coins/notes. What is the minimum number of coins and/or notes needed to make the change?
Input: M = 70
Output: 2
We need a 50 Rs note and a 20 Rs note.
Input: M = 121
Output: 3
We need a 100 Rs note, a 20 Rs note and a 1 Rs coin.
Solution: Greedy approach.
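A minimal Python sketch of this greedy choice, assuming the Indian denomination set listed above; the function name make_change is illustrative.

def make_change(amount, denominations=(1000, 500, 100, 50, 20, 10, 5, 2, 1)):
    # Greedily take the largest denomination that still fits.
    picked = []
    for coin in denominations:                 # denominations in decreasing order
        while amount >= coin:
            amount -= coin
            picked.append(coin)
    return len(picked), picked

print(make_change(70))    # (2, [50, 20])
print(make_change(121))   # (3, [100, 20, 1])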
Minimum Spanning Tree
How many spanning trees will be possible for the above graph?
Total number of vertices (V) = 4
Total number of edges (e) = 4. A spanning tree can be created by taking V - 1 edges out of the e edges.
So, there will be 4C3 spanning trees.
(Graph on vertices 1, 2, 3, 4)
Suppose the above graph contains one more edge, between 1 and 3. Then the formula above is modified as follows:
5C3 - a
where a is the number of cycles formed. For the above graph, a = 2.
(Graph on vertices 1, 2, 3, 4, with the extra edge between 1 and 3)
Minimum Spanning Tree
There are two famous algorithms for finding the Minimum Spanning Tree:
• Kruskal's Algorithm
• Prim's Algorithm
Minimum Spanning Tree
Use Prim’s algorithm to find an MST in the following graph.
The state at the start of the algorithm:
In the above diagram, the red text is the key values of the vertices (i.e., v.key) and the green text is the predecessor vertex.
Minimum Spanning Tree
The state at the start of the algorithm:
In the above diagram, the red text is the key values of the vertices (i.e., v.key) and the green text is the predecessor vertex.
First the algorithm picks an arbitrary starting vertex and updates its key value to 0. Here we arbitrarily choose A as our starting vertex.
Vertex       A    B    C    D    E    F
Key          0    ∞    ∞    ∞    ∞    ∞
Predecessor  NIL  NIL  NIL  NIL  NIL  NIL
Minimum Spanning Tree
First the algorithm picks an arbitrary starting vertex and updates its key value to 0. Here we arbitrarily choose A as our starting vertex.
Then A is extracted from the queue, and the keys of its neighbours are updated:
Vertex colours: blue: current vertex; green: vertices added to the tree.
Vertex       A    B    C    D    E    F
Key          0    1    1    ∞    ∞    ∞
Predecessor  NIL  A    A    NIL  NIL  NIL
Minimum Spanning Tree
Then D is extracted from the queue. Finally, F is extracted from the queue and the algorithm is complete.
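A compact Python sketch of Prim's algorithm using a binary heap, following the key/predecessor bookkeeping shown in the tables above; the adjacency-list dictionary format and the name prim_mst are assumed for illustration.

import heapq

def prim_mst(graph, start):
    # graph: dict {vertex: [(neighbour, weight), ...]} for an undirected graph.
    # Returns dict {vertex: predecessor} describing the MST edges.
    key = {v: float('inf') for v in graph}     # v.key in the slides
    pred = {v: None for v in graph}            # predecessor (NIL initially)
    key[start] = 0
    in_tree = set()
    pq = [(0, start)]
    while pq:
        _, u = heapq.heappop(pq)               # EXTRACT-MIN
        if u in in_tree:
            continue
        in_tree.add(u)
        for v, w in graph[u]:
            if v not in in_tree and w < key[v]:
                key[v], pred[v] = w, u         # DECREASE-KEY via lazy re-insertion
                heapq.heappush(pq, (w, v))
    return pred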
Dijkstra Algorithm
Start with a weighted graph.
Choose a starting vertex and assign infinity path values to all other vertices.
Go to each vertex and update its path length.
If the path length of the adjacent vertex is already less than the new path length, don't update it.
Dijkstra Algorithm
Avoid updating the path lengths of already visited vertices.
After each iteration, we pick the unvisited vertex with the least path length. So, we choose 5 before 7.
Notice how the rightmost vertex has its path length updated twice.
Repeat until all the vertices have been visited.
Dijkstra Algorithm
Pseudo-code

Dijkstra(G, W, s)                  // uses priority queue Q
  Initialize(G, s)
  S ← ∅
  Q ← V[G]                         // insert into Q: V times
  while Q ≠ ∅                      // V log V times
    do u ← EXTRACT-MIN(Q)          // deletes u from Q: V log V times
       S ← S ∪ {u}
       for each vertex v in Adj[u]
         do RELAX(u, v, w)         // DECREASE-KEY operation: V² log V times
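The pseudo-code can be realised in Python with heapq playing the role of the priority queue Q; a lazy skip of already-settled vertices stands in for an explicit DECREASE-KEY. The graph representation and names are illustrative.

import heapq

def dijkstra(graph, s):
    # graph: dict {u: [(v, w), ...]}; returns the shortest distance from s to every vertex.
    dist = {v: float('inf') for v in graph}
    dist[s] = 0
    S = set()                                  # settled vertices
    Q = [(0, s)]                               # priority queue keyed on current distance
    while Q:
        d, u = heapq.heappop(Q)                # EXTRACT-MIN
        if u in S:
            continue                           # stale entry; vertex already settled
        S.add(u)
        for v, w in graph[u]:                  # RELAX each edge (u, v)
            if d + w < dist[v]:
                dist[v] = d + w
                heapq.heappush(Q, (dist[v], v))
    return dist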
Job Sequencing with Deadlines: Algorithm
• Sort the jobs based on decreasing order of profit and find the largest deadline.
• Create a Gantt chart of maximum deadline size.
• Iterate through the jobs and perform the following (a Python sketch follows below):
  • Choose a slot i if:
    • Slot i isn't previously selected.
    • Slot i is closest to the job's deadline.
    • i is as large as possible.
  • If no such slot exists, ignore the job and continue.
Greedy approach
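A small Python sketch of the slot-filling strategy just described; jobs are assumed to be (profit, deadline) pairs and the name job_sequencing is illustrative.

def job_sequencing(jobs):
    # jobs: list of (profit, deadline) pairs. Returns (total_profit, schedule).
    jobs = sorted(jobs, reverse=True)          # decreasing order of profit
    max_deadline = max(d for _, d in jobs)
    slots = [None] * (max_deadline + 1)        # Gantt chart, slots 1..max_deadline
    total = 0
    for profit, deadline in jobs:
        for i in range(deadline, 0, -1):       # latest free slot i with i <= deadline
            if slots[i] is None:
                slots[i] = profit
                total += profit
                break                          # if no slot is free, the job is skipped
    return total, slots[1:]

# Illustrative data (profit, deadline):
print(job_sequencing([(100, 2), (19, 1), (27, 2), (25, 1), (15, 3)]))   # (142, [27, 100, 15])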
Job Sequencing with Deadlines: Analysis
• Sort the jobs based on decreasing order of profit and find the largest deadline --- n log n
• Create a Gantt chart of maximum deadline size --- k
• Iterate through the jobs and perform the following: --- n
  • Choose a slot i if:
    • Slot i isn't previously selected. --- n × m
    • Slot i is closest to the job's deadline.
    • i is as large as possible.
  • If no such slot exists, ignore the job and continue.
Since the size of the Gantt chart array will be m, and the maximum number of comparisons to place an item may be m, for n jobs this will be n × m.
Time complexity = n log n + n + n × m + k = O(n × m) = O(n²) if n == m
Let us consider the given files f1, f2, f3, f4 and f5 with 20, 30, 10, 5 and 30 elements, respectively.
Optimal Merge Pattern
If merge operations are performed according to the provided sequence, then
M1 = merge f1 and f2 => 20 + 30 = 50
M2 = merge M1 and f3 => 50 + 10 = 60
M3 = merge M2 and f4 => 60 + 5 = 65
M4 = merge M3 and f5 => 65 + 30 = 95
Hence, the total number of operations is
50 + 60 + 65 + 95 = 270
Now the question arises: is there any better solution?
Let us consider the given files f1, f2, f3, f4 and f5 with 20, 30, 10, 5 and 30 elements, respectively.
Optimal Merge Pattern
Sorting the numbers according to their size in ascending order, we get the following sequence:
f4, f3, f1, f2, f5
Hence, merge operations can be performed on this sequence:
M1 = merge f4 and f3 => 5 + 10 = 15
M2 = merge M1 and f1 => 15 + 20 = 35
M3 = merge M2 and f2 => 35 + 30 = 65
M4 = merge M3 and f5 => 65 + 30 = 95
Therefore, the total number of operations is 15 + 35 + 65 + 95 = 210, which is less than the first one.
So, we need to find a merge pattern that merges the elements with the least number of comparisons.
Algorithm OPTIMAL_MERGE_PATTERNS(S)    // S is a set of sequences
  Create min heap H from S
  while H.length > 1 do
    min1 ← minDel(H)                   // returns the minimum element from H and deletes it
    min2 ← minDel(H)
    NewNode.Data ← min1 + min2
    NewNode.LeftChild ← min1
    NewNode.RightChild ← min2
    Insert(NewNode, H)                 // insert node NewNode into heap H
  end
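The pseudo-code maps directly onto Python's heapq module; this sketch tracks only the sizes (not the tree nodes), which is enough to compute the total merge cost. The name optimal_merge_cost is illustrative.

import heapq

def optimal_merge_cost(sizes):
    # sizes: list of file sizes. Returns the minimum total merge cost.
    H = list(sizes)
    heapq.heapify(H)                           # create min heap H from S
    total = 0
    while len(H) > 1:
        min1 = heapq.heappop(H)                # minDel(H)
        min2 = heapq.heappop(H)
        merged = min1 + min2                   # NewNode.Data
        total += merged
        heapq.heappush(H, merged)              # Insert(NewNode, H)
    return total

print(optimal_merge_cost([20, 30, 10, 5, 30]))   # 205, matching the worked example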
Optimal Merge Pattern: Greedy approach
Hence, the greedy solution takes 15 + 35 + 60 + 95 = 205 comparisons.
Optimal Merge Pattern: Greedy approach
Complexity Analysis
The main loop in this algorithm is executed n - 1 times. In every iteration, two delete-minimum operations and one insert operation are performed; each heap operation takes O(log n) time. The total running time of this algorithm is O(n log n).
T(n) = (n - 1) * max(O(find min), O(insert))
Steps to build Huffman Tree:
Select two minimum elements from Q and add them to the tree using step 3.
Huffman Coding
For each non-leaf node, assign 0 to the left edge and 1 to the right edge
Steps to build Huffman Tree:
Assign 0 to the left edge and 1 to the right edge.
Huffman Coding
For sending the string over a network, we have to send the tree as well as the compressed code. The total size is given by the table below.
Huffman Coding
Character        Frequency   Code   Size
A                5           11     5*2 = 10
B                1           100    1*3 = 3
C                6           0      6*1 = 6
D                3           101    3*3 = 9
4 * 8 = 32 bits  15 bits            28 bits
Without encoding, the total size of the string was 120 bits. After encoding, the size is reduced to 32 + 15 + 28 = 75 bits.
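A short Python sketch of Huffman tree construction and code assignment using heapq; it reproduces the A/B/C/D example above (up to arbitrary left/right tie-breaking). The helper names are illustrative.

import heapq
from itertools import count

def huffman_codes(freq):
    # freq: dict {character: frequency}. Returns dict {character: code string}.
    tie = count()                              # breaks ties between equal frequencies
    heap = [(f, next(tie), ch) for ch, f in freq.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)      # two minimum-frequency nodes
        f2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, next(tie), (left, right)))
    codes = {}
    def walk(node, code):
        if isinstance(node, tuple):            # internal node: 0 on the left, 1 on the right
            walk(node[0], code + "0")
            walk(node[1], code + "1")
        else:
            codes[node] = code or "0"          # single-character alphabets get "0"
    walk(heap[0][2], "")
    return codes

print(huffman_codes({"A": 5, "B": 1, "C": 6, "D": 3}))
# {'C': '0', 'B': '100', 'D': '101', 'A': '11'}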
Travelling Sales Person Problem
The travelling salesman problem involves a salesman and a set of cities. The salesman has to visit every one of the cities, starting from a certain one (e.g., the hometown), and return to the same city. The challenge of the problem is that the travelling salesman needs to minimize the total length of the trip.
Example: A newspaper agent daily drops the newspaper in the area assigned to him, in such a manner that he has to cover all the houses in the respective area with minimum travel cost. Compute the minimum travel cost.
The area assigned to the agent where he has to drop the newspaper is shown in the figure.
Travelling Sales Person Problem
The TSP describes a scenario where a salesman is required to travel between n cities. He wishes to travel to all locations exactly once and he must finish at his starting point. The order in which the cities are visited is not important, but he wishes to minimize the distance traveled.
The greedy algorithm goes as follows (a Python sketch follows the steps):
1. Sort all of the edges in the network.
2. Select the shortest edge and add it to our tour if it does not create a cycle with fewer than n edges and does not increase the degree of any node (city) to more than 2.
3. If we have n edges in our tour, stop; if not, repeat step 2.
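A Python sketch of this edge-selection greedy (steps 1-3), using a small union-find structure to detect premature cycles; the (weight, u, v) edge representation and the name greedy_tsp are assumptions for illustration.

def greedy_tsp(n, edges):
    # n: number of cities (0..n-1); edges: list of (weight, u, v) for an undirected graph.
    # Returns (total_weight, chosen_edges) for the greedy edge-selection tour.
    parent = list(range(n))
    def find(x):                               # union-find, used to detect cycles
        while parent[x] != x:
            x = parent[x]
        return x
    degree = [0] * n
    tour, total = [], 0
    for w, u, v in sorted(edges):              # step 1: consider edges in increasing weight
        if degree[u] >= 2 or degree[v] >= 2:
            continue                           # would give a city more than 2 roads
        if find(u) == find(v) and len(tour) < n - 1:
            continue                           # would close a cycle with fewer than n edges
        parent[find(u)] = find(v)
        degree[u] += 1
        degree[v] += 1
        tour.append((u, v, w))
        total += w
        if len(tour) == n:                     # step 3: stop once the tour has n edges
            break
    return total, tour

# Second example from the slides, with A=0, B=1, C=2, D=3:
print(greedy_tsp(4, [(2, 1, 3), (4, 0, 1), (5, 2, 3), (6, 0, 3), (7, 0, 2), (8, 2, 1)]))
# (18, [(1, 3, 2), (0, 1, 4), (2, 3, 5), (0, 2, 7)])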
Travelling Sales Person Problem
Find the optimal path for the following graph.
The weights of the edges of the graph are as follows:
A <-> B = 2.4
A <-> D = 4.7
A <-> C = 6.8
B <-> D = 5.1
C <-> B = 6
C <-> D = 5.9
Travelling Sales Person Problem
Find the optimal path for the following graph.
The final answer is A -> B -> D -> C -> A = 2.4 + 5.1 + 5.9 + 6.8 = 20.2
Travelling Sales Person Problem
Find the optimal path for the following graph.
Use the algorithm and find the path.
We can add the routes B <-> D, B <-> A and C <-> D to our tour without a problem.
Path weights in ascending order:
B <-> D = 2
A <-> B = 4
C <-> D = 5
A <-> D = 6
A <-> C = 7
C <-> B = 8
We cannot add A <-> D to our tour, as it would create a cycle between the nodes A, B and D and increase the degree of node D to 3.
We therefore skip this edge and ultimately add edge A <-> C to the tour, completing it.
The resulting tour is A -> B -> D -> C -> A with total weight 4 + 2 + 5 + 7 = 18.
Travelling Sales Person Problem
Complexity: This algorithm searches for local optima and builds on the locally best choice in the hope of finding the global optimum.
• It begins by sorting all the edges and then selects the edge with the minimum cost.
• It continuously selects the best next choice, given the condition that no loops are formed.
The computational complexity of the greedy algorithm is O(N² log2(N)), and there is no guarantee that a globally optimal solution is found.