Algorithms Analysis & Design - Lecture 2


About This Presentation

Introduction to the Design and Analysis of Algorithms


Slide Content

Algorithms
By Mohamed Gamal
© Mohamed Gamal 2024

What to Analyze?
–An algorithm can require different times to solve different problems of the same size.
–Worst-Case Analysis (Big O): The maximum amount of time that an algorithm requires to solve a problem of size n.
–This gives an upper bound on the time complexity of an algorithm.
–Normally, we try to find the worst-case behavior of an algorithm.
–Best-Case Analysis (Big Ω): The minimum amount of time that an algorithm requires to solve a problem of size n.
–The best-case behavior of an algorithm is NOT so useful.
–Average-Case Analysis (Big Θ): The average amount of time that an algorithm requires to solve a problem of size n.
–Sometimes, it is difficult to find the average-case behavior of an algorithm.
–We have to look at all possible data organizations of a given size n and the probabilities with which these organizations occur.
Worst-case analysis is more common than average-case analysis.
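A quick way to see the difference between these cases (an added illustration, not part of the original slides) is sequential search: it stops after one comparison when the key is in the first position, and only after n comparisons when the key is absent. A minimal Python sketch:

def sequential_search(a, key):
    """Return (index of key or -1, number of comparisons performed)."""
    comparisons = 0
    for i, value in enumerate(a):
        comparisons += 1
        if value == key:
            return i, comparisons      # found: best case stops after 1 comparison
    return -1, comparisons             # not found: worst case makes n comparisons

data = [7, 3, 9, 4, 1]
print(sequential_search(data, 7))   # best case:  (0, 1)
print(sequential_search(data, 8))   # worst case: (-1, 5)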

Asymptotic Notations
O notation: asymptotic "less than"
•Big O
•f(n) = O(g(n)) implies f(n) ≤ c·g(n) in the limit.
•c is a constant.
•Provides an upper bound of running time.
•Used in worst-case analysis.

Asymptotic Notations (Cont.)
Θ notation: asymptotic "equality"
•Big Theta
•f(n) = Θ(g(n)) implies f(n) = c·g(n) in the limit.
•c is a constant.
•Provides a tight bound of running time.
•Used in average-case analysis.

Asymptotic Notations (Cont.)
Ω notation: asymptotic "greater"
•Big Omega
•f(n) = Ω(g(n)) implies f(n) ≥ c·g(n) in the limit.
•c is a constant.
•Provides a lower bound of running time.
•Used in best-case analysis.
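These three bounds can be checked numerically for a concrete function (an added sketch using an example of my own, f(n) = 3n^2 + 5n): the ratio f(n)/n^2 settles near the constant 3, so f(n) is Θ(n^2); the ratio against n^3 goes to 0, so O(n^3) also holds but is loose; and the ratio against n grows without bound, so Ω(n) also holds but is loose.

def f(n):
    return 3 * n**2 + 5 * n

for n in (10, 100, 1_000, 10_000):
    # ratios against n^2, n^3 and n: bounded and nonzero, vanishing, and unbounded
    print(n, f(n) / n**2, f(n) / n**3, f(n) / n)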

Orders of growth from the lowest to highest.
O(1) (Best)
O(log n)
O(log^2 n)
O(n)
O(n log n)
O(n^2)
O(n^3)
O(2^n)
O(n!) (Worst)
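To get a feel for how quickly these classes separate (an added illustration, not from the slides), evaluating each of them at a few small input sizes already shows the gap:

import math

for n in (10, 20, 30):
    print(f"n={n}: log n = {math.log2(n):.1f}, n log n = {n * math.log2(n):.0f}, "
          f"n^2 = {n**2}, n^3 = {n**3}, 2^n = {2**n}, n! = {math.factorial(n)}")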

Common orders of magnitude

Big-O Notation – Examples
fA(n) = 30n + 8 is O(n), or, more loosely, O(n^2)
fB(n) = n^2 + 1 is O(n^2), or, more loosely, O(n^4)
fC(n) = 10n^3 + 2n^2 is O(n^3), or, more loosely, O(n^4)
fD(n) = n^3 − n^2 is O(n^3), or, more loosely, O(n^5)
fE(n) = 1273 is O(1), or, more loosely, O(n)
But it is important to use as "tight" a bound as possible!
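One way to connect these examples back to the definition of Big O (an added check of my own, using witness constants that are not in the slides) is to exhibit a constant c and a threshold n0 with f(n) ≤ c·g(n) for all n ≥ n0; for example c = 38 works for fA and c = 2 works for fB, both from n0 = 1 onward:

fA = lambda n: 30 * n + 8      # claimed O(n):   30n + 8 <= 38n    for all n >= 1
fB = lambda n: n**2 + 1        # claimed O(n^2): n^2 + 1 <= 2*n^2  for all n >= 1

assert all(fA(n) <= 38 * n for n in range(1, 100_000))
assert all(fB(n) <= 2 * n**2 for n in range(1, 100_000))
print("witness constants hold on the tested range")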

Big-O Notation – Examples (Cont.)
List the following functions according to their order of growth, from the lowest to the highest:
(n−2)!, 5 lg(n+100)^10, 2^(2n), 0.001n^4 + 3n^3 + 1, ln^2 n, ∛n, 3^n
Solution:
5 lg(n+100)^10, ln^2 n, ∛n, 0.001n^4 + 3n^3 + 1, 3^n, 2^(2n), (n−2)!
Note that 5 lg(n+100)^10 = 50 lg(n+100) grows logarithmically, ln^2 n grows faster than any constant multiple of lg n, any root of n grows faster than any power of a logarithm, 2^(2n) = 4^n outgrows 3^n, and (n−2)! dominates every exponential function.

Question 1

ALGORITHM FindCommonElements(A[0..n−1], B[0..m−1])
// Input: Two sorted lists A and B
// Output: List of common elements in both A and B
i ← 0
j ← 0
result ← empty list
while i < n AND j < m do
    if A[i] = B[j] then
        result.append(A[i])
        i ← i + 1
        j ← j + 1
    else if A[i] < B[j] then
        i ← i + 1
    else
        j ← j + 1
return result
a)

b) The maximum number of comparisons when the two lists are sorted occurs when the lists have no common elements and the algorithm examines elements from both lists until one is exhausted; this takes m + n − 1 comparisons.
For unsorted lists, each element of one list would have to be compared with every element of the other list, so the number of comparisons would be m × n.
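The pseudocode above translates directly into Python; the sketch below (my own rendering, with an added comparison counter) makes it easy to confirm the m + n − 1 worst case on disjoint sorted lists:

def find_common_elements(a, b):
    """Two-pointer scan of two sorted lists; returns (common elements, comparison count)."""
    i, j, comparisons = 0, 0, 0
    result = []
    while i < len(a) and j < len(b):
        comparisons += 1                 # one comparison of A[i] with B[j] per iteration
        if a[i] == b[j]:
            result.append(a[i])
            i += 1
            j += 1
        elif a[i] < b[j]:
            i += 1
        else:
            j += 1
    return result, comparisons

print(find_common_elements([1, 3, 5, 7], [2, 3, 4, 7]))   # ([3, 7], 6)
print(find_common_elements([1, 3, 5, 7], [2, 4, 6, 8]))   # ([], 7): n + m - 1 comparisons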

Question 2

a) Trace on the array A = [60, 35, 81, 98, 14, 47], showing the Count array after each comparison:

First iteration, i = 0:
Count = [1, 0, 0, 0, 0, 0]   (60 > 35, so Count[0] is incremented)
Count = [1, 0, 1, 0, 0, 0]   (60 < 81, so Count[2] is incremented)
Count = [1, 0, 1, 1, 0, 0]   (60 < 98, so Count[3] is incremented)
Count = [2, 0, 1, 1, 0, 0]   (60 > 14, so Count[0] is incremented)
Count = [3, 0, 1, 1, 0, 0]   (60 > 47, so Count[0] is incremented)

Second iteration, i = 1:
Count = [3, 0, 2, 1, 0, 0]   (35 < 81)
Count = [3, 0, 2, 2, 0, 0]   (35 < 98)
Count = [3, 1, 2, 2, 0, 0]   (35 > 14)
Count = [3, 1, 2, 2, 0, 1]   (35 < 47)

Third iteration, i = 2:
Count = [3, 1, 2, 3, 0, 1]   (81 < 98)
Count = [3, 1, 3, 3, 0, 1]   (81 > 14)
Count = [3, 1, 4, 3, 0, 1]   (81 > 47)

Fourth iteration, i = 3:
Count = [3, 1, 4, 4, 0, 1]   (98 > 14)
Count = [3, 1, 4, 5, 0, 1]   (98 > 47)

Last iteration, i = 4:
Count = [3, 1, 4, 5, 0, 2]   (14 < 47)

The final Count array gives, for each index i, the position of A[i] in the sorted output:

Index:        0   1   2   3   4   5
Array A:      60  35  81  98  14  47
Array Count:  3   1   4   5   0   2
Sorted Array: 14  35  47  60  81  98

Stability of algorithms
–An algorithm is considered stable if it preserves the relative order of duplicate elements in its output as they appeared in its input.

b) The algorithm is not stable. Consider, as a counterexample, the result of its application to 1, 1.
c) The algorithm is not in-place because it uses two extra arrays of size n (i.e., Count and S).
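The algorithm being traced is not reproduced in this transcript, but the Count array above and the answers in b) and c) are consistent with a comparison counting sort: for every pair of positions, the count of the larger element is incremented, and Count[i] then gives the final position of A[i] in an auxiliary output array S. A Python sketch under that assumption:

def comparison_counting_sort(a):
    """Assumed reconstruction: Count[i] ends up as the final position of a[i].
    When a[i] == a[j] with i < j, the earlier element's count is incremented,
    so duplicates end up in reverse order: the sort is not stable."""
    n = len(a)
    count = [0] * n
    for i in range(n - 1):
        for j in range(i + 1, n):
            if a[i] < a[j]:
                count[j] += 1          # a[j] is larger: one more element precedes it
            else:
                count[i] += 1          # a[i] is larger or equal: one more precedes it
    s = [0] * n                        # extra output array of size n: not in-place
    for i in range(n):
        s[count[i]] = a[i]
    return s, count

print(comparison_counting_sort([60, 35, 81, 98, 14, 47]))
# ([14, 35, 47, 60, 81, 98], [3, 1, 4, 5, 0, 2]), matching the trace above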

Thank You!