ch06-03-30-2018.ppt

A. Levitin “Introduction to the Design & Analysis of Algorithms,” 3rd ed., Ch. 6 ©2012 Pearson Education, Inc. Upper Saddle River, NJ. All Rights Reserved.

Transform and Conquer

This group of techniques solves a problem by a transformation to

•a simpler/more convenient instance [e.g. sorted] of the same problem (instance simplification)
•a different representation of the same instance [e.g. a different data structure] (representation change)
•a different problem for which an algorithm is already available (problem reduction)

Instance simplification - Presorting

Solve a problem's instance by transforming it into another simpler/easier instance of the same problem.

Presorting: many problems involving lists are easier when the list is sorted, e.g.
•searching
•computing the median (selection problem)
•checking if all elements are distinct (element uniqueness)

Also:
•Topological sorting helps solving some problems for dags.
•Presorting is used in many geometric algorithms [e.g. closest pair].

How fast can we sort?

Efficiency of algorithms involving sorting depends on the efficiency of sorting.

Theorem (see Sec. 11.2): ⌈log₂ n!⌉ ≈ n log₂ n comparisons are necessary in the worst case to sort a list of size n by any comparison-based algorithm.

Note: About n log₂ n comparisons are also sufficient to sort an array of size n (by mergesort).

Searching with presorting

Problem: Search for a given K in A[0..n-1]

Presorting-based algorithm:
Stage 1: Sort the array by an efficient sorting algorithm
Stage 2: Apply binary search

Efficiency: Θ(n log n) + O(log n) = Θ(n log n)

Good or bad?
Why do we have our dictionaries, telephone directories, etc. sorted?
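
The slides give no code; as a hedged illustration, here is a minimal Python sketch of the two-stage presorting-based search. The function name presort_search and the sample list are made up for this example, and Python's built-in sort is Timsort rather than mergesort, but it is still Θ(n log n).

```python
def presort_search(a, k):
    """Search for key k by first sorting a copy of the list (Theta(n log n)),
    then running binary search (O(log n)). Total: Theta(n log n)."""
    s = sorted(a)                 # Stage 1: sort
    lo, hi = 0, len(s) - 1
    while lo <= hi:               # Stage 2: binary search
        mid = (lo + hi) // 2
        if s[mid] == k:
            return True
        if s[mid] < k:
            lo = mid + 1
        else:
            hi = mid - 1
    return False

print(presort_search([3, 1, 10, 12, 7, 9, 5], 7))   # True
```

The sorting cost only pays off when many searches are performed on the same list, which is the point of the dictionary/telephone-directory question above.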

Element Uniqueness with presorting

Presorting-based algorithm
Stage 1: sort by an efficient sorting algorithm (e.g. mergesort)
Stage 2: scan the array to check pairs of adjacent elements
Efficiency: Θ(n log n) + O(n) = Θ(n log n)

Brute force algorithm
Compare all pairs of elements
Efficiency: O(n²)

Another 2 algorithms? Hashing – frequently useful
•Closest pair
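
For comparison, a hedged Python sketch of the approaches mentioned above; the function names are invented for this example, and the hashing variant simply uses a built-in set (average-case linear time).

```python
def unique_presort(a):
    """Presorting-based: sort, then compare adjacent elements.
    Theta(n log n) + O(n) = Theta(n log n)."""
    s = sorted(a)
    return all(s[i] != s[i + 1] for i in range(len(s) - 1))

def unique_brute_force(a):
    """Brute force: compare all pairs of elements. O(n^2)."""
    return all(a[i] != a[j]
               for i in range(len(a)) for j in range(i + 1, len(a)))

def unique_hashing(a):
    """Hashing-based check: average-case O(n) using a set."""
    return len(set(a)) == len(a)

print(unique_presort([2, 9, 7, 6, 5, 8]), unique_hashing([2, 9, 7, 2]))  # True False
```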

Instance simplification – Gaussian Elimination

Given: A system of n linear equations in n unknowns with an arbitrary coefficient matrix.

Transform to: An equivalent system of n linear equations in n unknowns with an upper triangular coefficient matrix.

Solve the latter by substitutions, starting with the last equation and moving up to the first one.

a11x1 + a12x2 + … + a1nxn = b1        a'11x1 + a'12x2 + … + a'1nxn = b'1
a21x1 + a22x2 + … + a2nxn = b2                a'22x2 + … + a'2nxn = b'2
  ⋮                                             ⋮
an1x1 + an2x2 + … + annxn = bn                          a'nnxn = b'n

Gaussian Elimination (cont.) SKIP

The transformation is accomplished by a sequence of elementary operations on the system's coefficient matrix (which don't change the system's solution):

for i ← 1 to n-1 do
    replace each of the subsequent rows (i.e., rows i+1, …, n) by a difference between that row and an appropriate multiple of the i-th row to make the new coefficient in the i-th column of that row 0

Example of Gaussian Elimination

Solve  2x1 - 4x2 + x3 = 6
       3x1 -  x2 + x3 = 11
        x1 +  x2 - x3 = -3

Gaussian elimination:

2  -4   1  |  6
3  -1   1  | 11    row2 - (3/2)*row1
1   1  -1  | -3    row3 - (1/2)*row1

2  -4   1    |  6
0   5  -1/2  |  2
0   3  -3/2  | -6   row3 - (3/5)*row2

2  -4   1    |  6
0   5  -1/2  |  2
0   0  -6/5  | -36/5

Backward substitution:
x3 = (-36/5) / (-6/5) = 6
x2 = (2 + (1/2)*6) / 5 = 1
x1 = (6 - 6 + 4*1) / 2 = 2

Pseudocode and Efficiency of Gaussian Elimination SKIP

Stage 1: Reduction to an upper-triangular matrix
for i ← 1 to n-1 do
    for j ← i+1 to n do
        for k ← i to n+1 do
            A[j, k] ← A[j, k] - A[i, k] * A[j, i] / A[i, i]   //improve!

Stage 2: Back substitutions
for j ← n downto 1 do
    t ← 0
    for k ← j+1 to n do
        t ← t + A[j, k] * x[k]
    x[j] ← (A[j, n+1] - t) / A[j, j]

Efficiency: Θ(n³) + Θ(n²) = Θ(n³)
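
A minimal Python sketch of the same two stages, assuming the system is passed as an n×(n+1) augmented matrix of floats and that no pivot encountered is zero (the slide's pseudocode has no pivoting either); the function name gauss_solve is made up.

```python
def gauss_solve(A):
    """Solve a linear system given as an n x (n+1) augmented matrix.
    No pivoting: assumes every A[i][i] encountered is nonzero."""
    n = len(A)
    # Stage 1: reduce to upper-triangular form
    for i in range(n - 1):
        for j in range(i + 1, n):
            factor = A[j][i] / A[i][i]        # the "improve!": compute the ratio once
            for k in range(i, n + 1):
                A[j][k] -= A[i][k] * factor
    # Stage 2: back substitutions
    x = [0.0] * n
    for j in range(n - 1, -1, -1):
        t = sum(A[j][k] * x[k] for k in range(j + 1, n))
        x[j] = (A[j][n] - t) / A[j][j]
    return x

# The slide's example: 2x1 - 4x2 + x3 = 6, 3x1 - x2 + x3 = 11, x1 + x2 - x3 = -3
print(gauss_solve([[2, -4, 1, 6], [3, -1, 1, 11], [1, 1, -1, -3]]))  # [2.0, 1.0, 6.0]
```

Hoisting the ratio out of the k-loop is exactly the "improve!" hint: in the raw pseudocode A[j, i] is zeroed at k = i and would then be reused.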

Searching Problem

Problem: Given a (multi)set S of keys and a search key K, find an occurrence of K in S, if any.

Searching must be considered in its context [context determines the best data structure]:
•file size (internal vs. external)
•dynamics of data (static vs. dynamic)

Dictionary operations (dynamic data) [data structure determines performance]:
•find (search)
•insert
•delete

Taxonomy of Searching Algorithms

•List searching …
•Tree searching …
•Hashing …

Taxonomy of Searching Algorithms

List searching:
•sequential search
•binary search
•interpolation search [SKIP]

Taxonomy of Searching Algorithms

Tree searching:
•binary search tree
•balanced binary trees:
–AVL trees (SKIP)
–red-black trees (consider after 2-3-4 trees)
•multiway trees [balanced]:
–2-3 trees (Chap 6)
–2-3-4 trees, B-trees (Chap 7)

Taxonomy of Searching Algorithms

Hashing:
•open hashing (separate chaining)
•closed hashing (open addressing)

Binary Search Tree

Arrange keys in a binary tree with the binary search tree property: for a node with key K, all keys in its left subtree are < K and all keys in its right subtree are > K.

Example: 5, 3, 1, 10, 12, 7, 9
(Which could/should be the root? Best/Worst case?)

Dictionary Operations on Binary Search Trees

Searching – straightforward
Insertion – search for the key, insert at the leaf where the search terminated
Deletion – 3 cases:
•deleting a key at a leaf
•deleting a key at a node with a single child
•deleting a key at a node with two children

Efficiency depends on the tree's height: ⌊log₂ n⌋ ≤ h ≤ n-1, with average height (random files) about 3 log₂ n.
Maximum number of nodes if the height is k?

Thus all three operations have
•worst-case efficiency: Θ(n)
•average-case efficiency: Θ(log n)

Bonus: inorder traversal produces a sorted list
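
A hedged Python sketch of a plain (unbalanced) binary search tree supporting search and insertion; deletion is omitted, and the class and function names are invented for this example.

```python
class Node:
    def __init__(self, key):
        self.key, self.left, self.right = key, None, None

def bst_insert(root, key):
    """Insert key at the leaf position where an unsuccessful search ends."""
    if root is None:
        return Node(key)
    if key < root.key:
        root.left = bst_insert(root.left, key)
    elif key > root.key:
        root.right = bst_insert(root.right, key)
    return root                      # ignore duplicates

def bst_search(root, key):
    while root is not None and root.key != key:
        root = root.left if key < root.key else root.right
    return root is not None

def inorder(root):
    """The slide's bonus: inorder traversal yields the keys in sorted order."""
    return [] if root is None else inorder(root.left) + [root.key] + inorder(root.right)

root = None
for k in [5, 3, 1, 10, 12, 7, 9]:    # the slide's example keys
    root = bst_insert(root, k)
print(bst_search(root, 7), inorder(root))   # True [1, 3, 5, 7, 9, 10, 12]
```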

Balanced Search Trees

The attractiveness of the binary search tree is marred by its bad (linear) worst-case efficiency. Two ideas to overcome it (i.e. keep the BST close to balanced) are:

•to restructure the BST to maintain balance when an insertion makes the tree "too unbalanced"
–AVL trees
–red-black trees [come back to after 2-3-4 trees]

•to allow more than one key per node of a search tree
–2-3 trees [Chapter 6]
–2-3-4 trees [Chapter ?]
–B-trees [Chapter 7] (a red-black tree = a 2-3-4 tree represented as a binary tree)

Balanced trees: AVL trees

Definition: An AVL tree is a binary search tree in which, for every node, the difference between the heights of its left and right subtrees, called the balance factor, is at most 1 (with the height of an empty tree defined as -1).

[Figure: two binary search trees with the balance factor marked at each node.]
Tree (a) is an AVL tree; tree (b) is not an AVL tree.

Rotations

If a key insertion violates the balance requirement at some node, the subtree rooted at that node is transformed via one of the four rotations. (The rotation is always performed for a subtree rooted at an "unbalanced" node closest to the new leaf.)

[Figures: (a) a single R-rotation and (c) a double LR-rotation, each rebalancing a three-node subtree.]

General case: Single R-rotation [SKIP]

General case: Double LR-rotation [SKIP]

AVL tree construction - an example [SKIP]

Construct an AVL tree for the list 5, 6, 8, 3, 2, 4, 7.

[Figure: the insertions of 5, 6, 8, 3, 2, with an L-rotation about 5 after inserting 8 and an R-rotation about 5 after inserting 2.]

AVL tree construction - example (cont. SKIP)

[Figure: the remaining insertions of 4 and 7, requiring a double LR-rotation about 6 and a double RL-rotation about 6; the final AVL tree has root 5 with subtrees over {2, 3, 4} and {6, 7, 8}.]

Analysis of AVL trees

h < 1.4404 log₂(n + 2) - 1.3277
average height: 1.01 log₂ n + 0.1 for large n (found empirically)

Search and insertion are O(log n)

Deletion is more complicated but is also O(log n)

Disadvantages:
•frequent rotations
•complexity

A similar idea: red-black trees (the heights of subtrees are allowed to differ by up to a factor of 2)

Multiway Search Trees

Definition: A multiway search tree is a search tree that allows more than one key in tree nodes.

Definition: A node of a search tree is called an n-node if it contains n-1 ordered keys (which divide the entire key range into n intervals pointed to by the node's n links to its children):

    k1 < k2 < … < kn-1, with child subtrees covering the ranges < k1, [k1, k2), …, ≥ kn-1

Note: Every node in a classical binary search tree is a 2-node.

2-3 Tree

Definition: A 2-3 tree is a search tree that
•may have 2-nodes and 3-nodes (How many keys per node?)
•is height-balanced (all leaves are on the same level!!)

A 2-3 tree is constructed by successive insertions of the keys given, with a new key always inserted into a leaf of the tree. If the leaf is a 3-node, it's split into two with the middle key promoted to the parent. Splitting can be repeated up the tree, as needed.

[Figure: a 2-node holds one key K1 with subtrees for keys < K1 and > K1; a 3-node holds two keys K1 < K2 with subtrees for keys < K1, between K1 and K2, and > K2.]

2-3 tree construction – an example

Construct a 2-3 tree for the list 9, 5, 8, 3, 2, 4, 7.

[Figure: the tree after each insertion. Inserting 9 and 5 gives the leaf (5, 9); 8 overflows it, promoting 8 to a new root with leaves 5 and 9; 3 and 2 fill and split the left leaf, promoting 3 into the root (3, 8); 4 and 7 overflow the middle leaf, promoting 5, which in turn splits the root; the final tree has root 5, children 3 and 8, and leaves 2, 4, 7, 9.]

LEAVES?

Analysis of 2-3 trees

Consider:
•the shortest tree with 8 nodes (add 1, 4, 10 to [3,8 / 2-5-9])
•the tallest tree with 7 nodes [which tree?]

log₃(n + 1) - 1 ≤ h ≤ log₂(n + 1) - 1
•Does this work for the examples above? What is h?

Search, insertion, and deletion are in Θ(log n)

The idea of the 2-3 tree can be generalized by allowing more keys per node:
•2-3-4 trees
•B-trees

Heaps and Heapsort

Definition: A heap is a binary tree with keys at its nodes (one key per node) such that:

•It is essentially complete, i.e., all its levels are full except possibly the last level, where only some rightmost keys may be missing

•The key at each node is ≥ all keys in its children (and descendants)

Illustration of the heap's definition

[Figure: three binary trees; the first is a heap, the other two are not (one violates the shape requirement, the other the parental dominance requirement).]

Note: A heap's elements are ordered top down (along any path down from its root), but they are not ordered left to right.

Some Important Properties of a Heap

•Given n, there exists a unique binary tree (structure) with n nodes that is essentially complete, with h = ⌊log₂ n⌋

•The root contains the largest key

•Every subtree rooted at every node of a heap is also a heap

•A heap can easily be represented as an array (and usually is). Look at the following example, and then explain why the array representation always works.

Heap's Array Representation

Store the heap's elements in an array (whose elements are indexed, for convenience, 1 to n) in top-down, left-to-right order.

Example: the heap with keys 9, 5, 3, 1, 4, 2:

index: 1 2 3 4 5 6
value: 9 5 3 1 4 2

•Left child of node j is at 2j
•Right child of node j is at 2j+1
•Parent of node j is at ⌊j/2⌋
•Parental (i.e. interior) nodes are in the first ⌊n/2⌋ locations
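
A tiny Python sketch of this 1-based index arithmetic, using the slide's example array; the helper names left/right/parent are made up, and index 0 is simply left unused.

```python
# 1-based heap stored with a dummy slot at index 0, as on the slide.
heap = [None, 9, 5, 3, 1, 4, 2]
n = len(heap) - 1

def left(j):   return 2 * j          # left child of node j
def right(j):  return 2 * j + 1      # right child of node j
def parent(j): return j // 2         # parent of node j

for j in range(2, n + 1):            # every non-root key is <= its parent's key
    assert heap[j] <= heap[parent(j)]
print([heap[j] for j in range(1, n // 2 + 1)])   # parental (interior) nodes: [9, 5, 3]
```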

Heap Construction (bottom-up)

Step 0: Initialize the structure (i.e. the array) with the keys in the order given
Step 1: Starting with the last (rightmost) parental (i.e. interior) node, fix the heap rooted at it, if it doesn't satisfy the heap condition: keep exchanging it with its largest child until the heap condition holds
Step 2: Repeat Step 1 for the preceding parental node

Bottom up: adding nodes to the heap from bottom to top, pushing elements down as needed
Step 1 is called Heapify.

Example of Heap Construction [Bottom Up]

Construct a heap for the list 2, 9, 7, 6, 5, 8. Insert the elements into the array. Process the interior nodes from R to L. Why? Before and after each step? Heapify all the way down at each step. Bottom up?
Worst case? Number of comparisons for a node?

[Figure: the tree 2 / 9, 7 / 6, 5, 8 is transformed step by step into the heap 9 / 6, 8 / 2, 5, 7.]

Pseudocode of bottom-up heap construction
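
The pseudocode on this slide did not survive the export; below is a hedged Python sketch in the spirit of Levitin's bottom-up HeapBottomUp construction, assuming a 1-based array with an unused slot at index 0.

```python
def heap_bottom_up(H):
    """Build a max-heap in place from H[1..n] (H[0] unused) by fixing the
    subtrees rooted at the parental nodes from right to left."""
    n = len(H) - 1
    for i in range(n // 2, 0, -1):           # last parental node down to the root
        k, v = i, H[i]
        heap = False
        while not heap and 2 * k <= n:
            j = 2 * k
            if j < n and H[j + 1] > H[j]:    # pick the larger child
                j += 1
            if v >= H[j]:
                heap = True
            else:
                H[k] = H[j]                  # sift the larger child up
                k = j
        H[k] = v
    return H

print(heap_bottom_up([None, 2, 9, 7, 6, 5, 8]))   # [None, 9, 6, 8, 2, 5, 7]
```

The result on the slides' example list 2, 9, 7, 6, 5, 8 is the heap 9, 6, 8, 2, 5, 7, matching the bottom-up construction figure.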

Insert a New Element into a Heap

•Basic operation in building a heap in top-down order
•Insert the new element at the last position in the heap (size++)
•Compare the new element with its parent and, if it violates the heap condition, exchange them
•Continue comparing the new element with nodes up the tree until the heap condition is satisfied

Example: Insert key 10 into a heap of size 6.

[Figure: 10 is appended as the rightmost leaf of the heap 9 / 6, 8 / 2, 5, 7, then swapped with 8 and with 9, ending as the new root.]

What order is this? Efficiency: O(log n)
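
A hedged Python sketch of this sift-up insertion, again assuming a 1-based array with an unused slot at index 0; the function name heap_insert is made up.

```python
def heap_insert(H, key):
    """Append key at the last position and sift it up until the heap
    condition holds. O(log n) comparisons."""
    H.append(key)
    j = len(H) - 1
    while j > 1 and H[j // 2] < H[j]:
        H[j // 2], H[j] = H[j], H[j // 2]   # swap with the parent
        j //= 2
    return H

print(heap_insert([None, 9, 6, 8, 2, 5, 7], 10))   # [None, 10, 6, 9, 2, 5, 7, 8]
```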

Example of Heap Construction [Top Down]

Construct a heap for the list 2, 9, 7, 6, 5, 8. [Start with the elements in the array. Process interior nodes from L to R. Add nodes from top to bottom. Add each node, and push it up as needed.]

2
2 9
9/ 2
9/ 2 7
9/ 2 7/ 6
9/ 6 7/ 2;  9/ 6 7/ 2 5;  9/ 6 7/ 2 5, 8;  9/ 6 8/ 2 5, 7

Is this a heap??

Heapsort

Stage 1: Construct a heap for a given list of n keys
Stage 2: Repeat the operation of root removal n-1 times:
–Exchange the keys in the root and in the last (rightmost) leaf
–Decrease the heap size by 1
–If necessary, repeatedly swap the new root (node) with its larger child until the heap condition again holds
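
A hedged Python sketch of both stages on a plain 0-based list, so the child indices are 2i+1 and 2i+2 rather than the slides' 1-based 2j and 2j+1; the function names are invented for this example.

```python
def heapsort(a):
    """Stage 1 builds a max-heap bottom-up; Stage 2 repeatedly swaps the root
    with the last leaf, shrinks the heap, and re-heapifies."""
    n = len(a)

    def sift_down(i, size):
        while 2 * i + 1 < size:
            j = 2 * i + 1
            if j + 1 < size and a[j + 1] > a[j]:
                j += 1                      # larger child
            if a[i] >= a[j]:
                break
            a[i], a[j] = a[j], a[i]
            i = j

    for i in range(n // 2 - 1, -1, -1):     # Stage 1: heap construction
        sift_down(i, n)
    for end in range(n - 1, 0, -1):         # Stage 2: n-1 root removals
        a[0], a[end] = a[end], a[0]
        sift_down(0, end)
    return a

print(heapsort([2, 9, 7, 6, 5, 8]))   # [2, 5, 6, 7, 8, 9]
```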

Example of Sorting by Heapsort

Sort the list 2, 9, 7, 6, 5, 8 by heapsort.

Stage 1 (heap construction)      Stage 2 (root/max removal)
2 9 7 6 5 8                      9 6 8 2 5 7
2 9 8 6 5 7                      7 6 8 2 5 | 9
2 9 8 6 5 7                      8 6 7 2 5 | 9
9 2 8 6 5 7                      5 6 7 2 | 8 9
9 6 8 2 5 7                      7 6 5 2 | 8 9
                                 2 6 5 | 7 8 9
                                 6 2 5 | 7 8 9
                                 5 2 | 6 7 8 9
                                 5 2 | 6 7 8 9
                                 2 | 5 6 7 8 9

Analysis of Heapsort

Stage 1: Build a heap for a given list of n keys (of height h)
worst-case C(n) = Σ_{i=0}^{h-1} 2(h-i) 2^i = 2(n - log₂(n + 1)) ∈ Θ(n)
    (2^i = # nodes at level i)

Stage 2: Repeat the operation of root removal n-1 times (fix heap)
worst-case C(n) = Σ_{i=1}^{n-1} 2 log₂ i ∈ Θ(n log n)

Both worst-case and average-case efficiency: Θ(n log n)
In-place: yes
Stability: no (e.g., apply heapsort to 1 1)

Priority Queue

A priority queue is the ADT of a set of elements with numerical priorities, with the following operations:
•find the element with the highest priority
•delete the element with the highest priority
•insert an element with an assigned priority (see below)

A heap is a very efficient way of implementing priority queues.

Two ways to handle a priority queue in which highest priority = smallest number.
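
As one possible realization (not one of the slides' own "two ways"): Python's standard heapq module maintains a min-heap, which directly matches the "highest priority = smallest number" convention; the tasks below are made-up sample data.

```python
import heapq

pq = []                                   # min-heap: smallest number = highest priority
for priority, task in [(3, "write report"), (1, "fix outage"), (2, "review PR")]:
    heapq.heappush(pq, (priority, task))  # insert element with assigned priority

print(pq[0])                              # find element with highest priority: (1, 'fix outage')
print(heapq.heappop(pq))                  # delete element with highest priority
```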

Horner's Rule For Polynomial Evaluation

Given a polynomial of degree n

    p(x) = aₙxⁿ + aₙ₋₁xⁿ⁻¹ + … + a₁x + a₀

and a specific value of x, find the value of p at that point.

Two brute-force algorithms:

p ← 0                              p ← a₀; power ← 1
for i ← n downto 0 do              for i ← 1 to n do
    power ← 1                          power ← power * x
    for j ← 1 to i do                  p ← p + aᵢ * power
        power ← power * x          return p
    p ← p + aᵢ * power
return p

Horner's Rule

Example: p(x) = 2x⁴ - x³ + 3x² + x - 5
            = x(2x³ - x² + 3x + 1) - 5
            = x(x(2x² - x + 3) + 1) - 5
            = x(x(x(2x - 1) + 3) + 1) - 5

Substitution into the last formula leads to a faster algorithm.

The same sequence of computations is obtained by simply arranging the coefficients in a table and proceeding as follows:

coefficients    2    -1    3    1    -5
x = 3

Horner's Rule pseudocode

Efficiency of Horner's Rule: # multiplications = # additions = n

Synthetic division of p(x) by (x - x₀)

Example: Let p(x) = 2x⁴ - x³ + 3x² + x - 5. Find p(x) : (x - 3)
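
A hedged Python sketch of Horner's rule, checked on the slide's example; the intermediate accumulator values 2, 5, 18, 55 are the quotient coefficients of the synthetic division by (x - 3), and the final value 160 is the remainder, i.e. p(3).

```python
def horner(coeffs, x):
    """Evaluate p(x) given coefficients [a_n, ..., a_1, a_0] with n
    multiplications and n additions."""
    p = coeffs[0]
    for a in coeffs[1:]:
        p = p * x + a
    return p

# p(x) = 2x^4 - x^3 + 3x^2 + x - 5 at x = 3
print(horner([2, -1, 3, 1, -5], 3))   # 160
```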

Computing aⁿ (revisited)

Left-to-right binary exponentiation

Initialize the product accumulator by 1.
Scan n's binary expansion from left to right and do the following:
If the current binary digit is 0, square the accumulator (S); if the binary digit is 1, square the accumulator and multiply it by a (SM).

Example: Compute a¹³. Here, n = 13 = 1101₂.

binary rep. of 13:   1           1            0            1
                     SM          SM           S            SM
accumulator:         1²·a = a    a²·a = a³    (a³)² = a⁶    (a⁶)²·a = a¹³

(computed left-to-right)

Efficiency: b ≤ M(n) ≤ 2b, where b = ⌊log₂ n⌋ + 1

Computing aⁿ (cont.)

Right-to-left binary exponentiation

Scan n's binary expansion from right to left and compute aⁿ as the product of the terms a^(2^i) corresponding to 1's in this expansion.

Example: Compute a¹³ by right-to-left binary exponentiation. Here, n = 13 = 1101₂.

   1      1      0      1
   a⁸     a⁴     a²     a      : a^(2^i) terms
   a⁸  *  a⁴         *  a      : product
(computed right-to-left)

Efficiency: same as that of left-to-right binary exponentiation
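
A hedged Python sketch of the right-to-left scheme; the function name is made up.

```python
def power_right_to_left(a, n):
    """Right-to-left binary exponentiation: accumulate the product of the
    a^(2^i) terms that correspond to 1-bits of n."""
    term, product = a, 1          # term holds a^(2^i)
    while n > 0:
        if n & 1:                 # current (rightmost) bit is 1
            product *= term
        term *= term              # a^(2^i) -> a^(2^(i+1))
        n >>= 1
    return product

print(power_right_to_left(2, 13))   # 8192 = 2^8 * 2^4 * 2^1
```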

Problem Reduction

This variation of transform-and-conquer solves a problem by transforming it into a different problem for which an algorithm is already available.

Reduce P to Q = use a solution of Q as part of a solution to P.
To solve P, first solve Q, then use the solution to Q in the solution to P.
OR: To solve P, write an algorithm that calls Q.

To be of practical value, the combined time of the transformation and of solving the other problem should be smaller than solving the problem as given by another method [e.g. by brute force].

VERY IMPORTANT FOR P=NP (last topic in the course)

Examples of Solving Problems by Reduction

•computing lcm(m, n) via computing gcd(m, n) !!

… Let's examine LCM before looking at more examples

Problem Reduction

Reduce P to Q = use a solution of Q as part of a solution to P.
To solve P, first solve Q, then use the solution to Q in the solution to P.
OR equivalently: To solve P, write an algorithm that calls Q.

P = LCM, Q = GCD
Reduce P to Q: reduce LCM to GCD
LCM(x, y) = (x * y) / GCD(x, y)
LCM(60, 42) = (60 * 42) / GCD(60, 42) = 2520 / 6 = 420
60 = 2*2*3*5.  42 = 2*3*7.  GCD = 2*3.  LCM = 2*2*3*5*7.

Assume * and / are constant time (i.e. Θ(1)).
Compare the performance of the best algorithms for P & Q.
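
A minimal Python sketch of this reduction; math.gcd is part of the standard library, and the lcm wrapper name is made up here.

```python
from math import gcd

def lcm(x, y):
    """Reduce LCM to GCD: lcm(x, y) = x * y // gcd(x, y)."""
    return x * y // gcd(x, y)

print(lcm(60, 42))   # 2520 // 6 = 420
```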

Problem Reduction: Notation

Reduce LCM to GCD:
LCM(x, y) = (x * y) / GCD(x, y)

Notation: P ≤ Q means P reduces to Q.
LCM ≤ GCD.
Does this notation seem backwards?

Problem Reduction: Best Algorithm Performance

P = LCM, Q = GCD. Reduce P to Q: reduce LCM to GCD
LCM(x, y) = (x * y) / GCD(x, y)
LCM(60, 42) = (60 * 42) / GCD(60, 42) = 2520 / 6 = 420

Assume the reduction steps are faster than Q (e.g. * and / are Θ(1)).
Assume the best algorithm for GCD is used. Are these statements true or false?
- The best algorithm for LCM can be faster than the best algorithm for GCD.
- The best algorithm for GCD can be faster than the best algorithm for LCM.
- Examples:
    Best LCM: O(n) and best GCD: O(n²)
    Best GCD: O(n) and best LCM: O(n²)

Notation: LCM ≤ GCD.

More Examples of Reduction

•Reduce P = lcm(m, n) to Q = gcd(m, n) … we've seen this

•Reduce finding the median to ??, followed by ??
–Performance of the best algorithm for median? Performance of the best algorithm for ??
–Median ≤ Sort.

•Reduce finding the minimum to ??, followed by ??
–Performance of the best algorithm for min? Performance of the best algorithm for ??
–Min ≤ Sort.

More Examples of Reduction

•reduction to graph problems (e.g., solving puzzles via state-space graphs)

•transforming a maximization problem to a minimization problem and vice versa

SKIP:
•linear programming
•counting the number of paths of length n in a graph by raising the graph's adjacency matrix to the n-th power