Analysis of Algorithms, recurrence relation, solving recurrences

MinaksheePatil · 37 slides · Jun 11, 2024

About This Presentation

Computer Algorithm


Slide Content

Introduction

Agenda: Introduction; Characteristics of an algorithm; Pseudocode conventions; Recursive algorithms; Performance analysis; Performance measurement

Algorithm An algorithm is a finite set of instructions that, if followed, accomplishes a particular task. The characteristics of an algorithm are: input, output, definiteness, finiteness, and effectiveness.

Pseudocode Conventions: comments (//); blocks of statements ({ and }); identifiers; assignment of values to variables; Boolean values; multidimensional arrays; looping statements; conditional statements; input and output; procedures.

Recursive Algorithms A recursive function is a function that is defined in terms of itself. An algorithm is said to be recursive if the same algorithm is invoked in its body. An algorithm that calls itself is directly recursive. Algorithm A is said to be indirectly recursive if it calls another algorithm that in turn calls A.
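A minimal Python sketch of the two kinds of recursion; the function names countdown, is_even, and is_odd are illustrative choices of mine, not taken from the slides.

def countdown(n):              # direct recursion: countdown invokes itself
    if n == 0:
        return
    print(n)
    countdown(n - 1)

def is_even(n):                # indirect recursion: is_even calls is_odd ...
    return True if n == 0 else is_odd(n - 1)

def is_odd(n):                 # ... and is_odd calls is_even back
    return False if n == 0 else is_even(n - 1)

countdown(3)                   # prints 3, 2, 1
print(is_even(10), is_odd(7))  # True True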

Example Write an algorithm to print the Fibonacci series for a given number. Write an algorithm to print the factorial of a given number. Write an algorithm to print the sum of array elements. (Possible solutions are sketched below.)
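The slides leave these as exercises; here is one possible set of solutions, sketched in Python (the function names and choice of iterative vs. recursive style are mine).

def fibonacci_series(count):
    """Print the first `count` Fibonacci numbers."""
    a, b = 0, 1
    for _ in range(count):
        print(a, end=" ")
        a, b = b, a + b
    print()

def factorial(n):
    """Recursive factorial: n! = n * (n-1)!, with 0! = 1."""
    return 1 if n <= 1 else n * factorial(n - 1)

def array_sum(arr):
    """Sum of the elements of an array."""
    total = 0
    for x in arr:
        total += x
    return total

fibonacci_series(8)             # 0 1 1 2 3 5 8 13
print(factorial(5))             # 120
print(array_sum([1, 2, 3, 4]))  # 10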

Performance Analysis The aim of this course is to develop skills for making evaluative judgments about algorithms. There are two criteria for judging algorithms that have a more direct relationship to performance: their computing time and their storage requirements (time complexity and space complexity). The space complexity of an algorithm is the amount of memory it needs to run to completion. The time complexity of an algorithm is the amount of computer time it needs to run to completion. Performance evaluation can be loosely divided into two major phases: (1) a priori estimates and (2) a posteriori testing. We refer to these as performance analysis and performance measurement, respectively.

Space Complexity The space complexity of an algorithm is the amount of memory it needs to run to completion. The space needed by an algorithm is the sum of the following components: 1. A fixed part that is independent of the characteristics (e.g., number, size) of the inputs and outputs. This part typically includes the instruction space (i.e., space for the code), space for simple variables, space for constants, and so on. 2. A variable part that consists of the space needed by component variables whose size depends on the particular problem instance being solved, the space needed by referenced variables, and the recursion stack space.
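A small illustration of the fixed vs. variable parts (my own sketch, not from the slides): an iterative sum needs only a constant amount of extra space, while a recursive sum also consumes recursion-stack space that grows with the input size.

def sum_iterative(arr):
    # Extra space is a few scalar variables, independent of n -> S(n) = O(1) beyond the input.
    total = 0
    for x in arr:
        total += x
    return total

def sum_recursive(arr, i=0):
    # Each call adds a stack frame, so the recursion stack grows with n -> S(n) = O(n) beyond the input.
    return 0 if i == len(arr) else arr[i] + sum_recursive(arr, i + 1)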

Space Complexity

Time Complexity The time complexity of an algorithm is the amount of computer time it needs to run to completion. The time T(P) taken by a program P is the sum of its compile time and its run (or execution) time.
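Run time is often estimated by counting the steps executed as a function of the input size n. A rough illustration (my own sketch, assuming a simple array-sum routine):

def array_sum(arr):        # step counts, with n = len(arr)
    total = 0              # 1 step
    for x in arr:          # loop control executes n + 1 times
        total += x         # n steps
    return total           # 1 step
# Total step count is roughly 2n + 3, so the time complexity is Θ(n).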

Example

Asymptotic Notations Analysis of an algorithm is the process of analyzing its problem-solving capability in terms of the time and space required (the amount of memory needed during execution). However, the main concern of analysis of algorithms is the required running time. Generally, we perform the following types of analysis: Worst case − the maximum number of steps taken on any instance of size n. Best case − the minimum number of steps taken on any instance of size n. Average case − the average number of steps taken over instances of size n.
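A concrete illustration of the three cases (my own example, assuming linear search in an unsorted array):

def linear_search(arr, target):
    """Return the index of target in arr, or -1 if absent."""
    for i, x in enumerate(arr):
        if x == target:
            return i
    return -1

# For an array of size n:
#   best case    - target is the first element: 1 comparison
#   worst case   - target is last or absent:    n comparisons
#   average case - target equally likely anywhere: about n/2 comparisons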

Asymptotic Analysis When analyzing the running time or space usage of programs, we usually try to estimate the time or space as a function of the input size. For example, when analyzing the worst-case running time of a function that sorts a list of numbers, we are concerned with how long it takes as a function of the length of the input list. For example, standard insertion sort takes time T(n) = c*n^2 + k for some constants c and k, whereas merge sort takes time T′(n) = c′*n*log_2(n) + k′.
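To make the contrast concrete, here is a small sketch that tabulates both running-time estimates; the numeric constants are arbitrary assumptions of mine, since the slides do not fix c, k, c′, or k′.

import math

c, k = 1.0, 10.0          # insertion sort:  T(n)  = c*n^2 + k
c2, k2 = 4.0, 10.0        # merge sort:      T'(n) = c2*n*log2(n) + k2

for n in (10, 100, 1000, 10_000):
    insertion = c * n**2 + k
    merge = c2 * n * math.log2(n) + k2
    print(f"n={n:>6}  insertion≈{insertion:>12.0f}  merge≈{merge:>12.0f}")
# For large n the n^2 term dominates, so merge sort wins despite its larger constant factor.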

Asymptotic Complexity Running time of an algorithm as a function of the input size n, for large n. Describes the behavior of the function in the limit. Written using asymptotic notation.

O-notation (Big-O): f(n) = O(g(n)) if there exist positive constants c and n0 such that f(n) <= c*g(n) for all n >= n0. Here g(n) is an asymptotic upper bound for f(n).

Examples 3n + 2 = O(n): 3n + 2 <= c*n, and indeed 3n + 2 <= 4n for all n >= 2. 100n + 6 = O(n): 100n + 6 <= 101n for all n >= 6. Exercise: show that 3n^3 = O(n^4) for appropriate c and n0.
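One possible answer to the exercise (the constants are my choice): for all n >= 1 we have 3n^3 <= 3n^4, so taking c = 3 and n0 = 1 gives 3n^3 <= c*n^4 for all n >= n0, hence 3n^3 = O(n^4).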

Θ-notation (Theta): f(n) = Θ(g(n)) if there exist positive constants c1, c2, and n0 such that c1*g(n) <= f(n) <= c2*g(n) for all n >= n0. Here g(n) is an asymptotically tight bound for f(n).

Example 3n + 2 = Θ(n): we need c1*n <= 3n + 2 <= c2*n. What constants n0, c1, and c2 will work? Choose c1 no larger than the leading coefficient and c2 a little bigger: c1 = 3, c2 = 4, and n0 = 2 work, since 3n <= 3n + 2 <= 4n for all n >= 2. Exercises: prove that 10n^2 - 3n = Θ(n^2); prove that n^2/2 - 3n = Θ(n^2).
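A possible worked answer to the first exercise (constants are my choice): for all n >= 1, 3n <= 3n^2, so 10n^2 - 3n >= 7n^2; also 10n^2 - 3n <= 10n^2. Taking c1 = 7, c2 = 10, and n0 = 1 gives 7n^2 <= 10n^2 - 3n <= 10n^2 for all n >= 1, hence 10n^2 - 3n = Θ(n^2).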

Ω-notation (Omega): f(n) = Ω(g(n)) if there exist positive constants c and n0 such that f(n) >= c*g(n) for all n >= n0. Here g(n) is an asymptotic lower bound for f(n).

Example 3n + 2 = Ω(n): 3n + 2 >= c*n, and indeed 3n + 2 >= 3n for all n >= 0, so c = 3 works. Exercise: show that n = Ω(log n) by choosing appropriate c and n0.
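A possible answer (my choice of constants, taking logarithms base 2; any fixed base changes only the constant): for all n >= 1, log_2(n) <= n, so taking c = 1 and n0 = 1 gives n >= c*log_2(n) for all n >= n0, hence n = Ω(log n).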

Relations Between Θ, O, and Ω: f(n) = Θ(g(n)) if and only if f(n) = O(g(n)) and f(n) = Ω(g(n)).

Little-oh (o): f(n) = o(g(n)) if for every constant c > 0 there exists n0 > 0 such that f(n) < c*g(n) for all n >= n0; intuitively, f grows strictly slower than g (e.g., 2n = o(n^2), but 2n^2 ≠ o(n^2)).

Little-omega (ω): f(n) = ω(g(n)) if for every constant c > 0 there exists n0 > 0 such that f(n) > c*g(n) for all n >= n0; intuitively, f grows strictly faster than g (e.g., n^2/2 = ω(n), but n^2/2 ≠ ω(n^2)).

Ordered functions by their growth rates (slowest to fastest): 1 < log n < √n < n < n log n < n^2 < n^3 < 2^n < n!

Recurrence Recurrences arise when an algorithm contains recursive calls to itself. What is the actual running time of the algorithm? We need to solve the recurrence: find an explicit formula for the expression, or bound the recurrence by an expression that involves n.

Example Recurrences T(n) = T(n-1) + n is Θ(n^2): a recursive algorithm that loops through the input to eliminate one item. T(n) = T(n/2) + c is Θ(log n): a recursive algorithm that halves the input in one step. T(n) = T(n/2) + n is Θ(n): a recursive algorithm that halves the input but must examine every item in the input. T(n) = 2T(n/2) + 1 is Θ(n): a recursive algorithm that splits the input into two halves and does a constant amount of other work.
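As an illustrative check (my own sketch, not from the slides), the first and third recurrences can be evaluated directly and compared against their claimed orders of growth.

def t_subtract(n):                 # value of T(n) = T(n-1) + n with T(1) = 1, i.e. 1 + 2 + ... + n
    total = 0
    for k in range(1, n + 1):
        total += k
    return total

def t_halve(n):                    # value of T(n) = T(n/2) + n with T(1) = 1
    return 1 if n <= 1 else t_halve(n // 2) + n

for n in (2**10, 2**14, 2**18):
    print(n, t_subtract(n) / n**2, t_halve(n) / n)
# The first ratio approaches 1/2 (so T(n) = Θ(n^2)); the second approaches 2 (so T(n) = Θ(n)).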

Methods for Solving Recurrences: the substitution method, the recursion-tree method, and the master method.

The substitution method: guess a solution, then use induction to prove that the guess works.

Example T(n) = 2T(n/2) + 1. We guess that the solution is T(n) = O(n). The substitution method requires us to prove that T(n) <= c*n for an appropriate choice of the constant c > 0. We start by assuming that this bound holds for all positive m < n, in particular for m = n/2, so T(n/2) <= c*n/2. Substituting gives T(n) <= 2(c*n/2) + 1 = c*n + 1, which does not imply T(n) <= c*n. The fix is to strengthen the guess by subtracting a lower-order term: guess T(n) <= c*n - b for some constant b >= 1. Then T(n) <= 2(c*n/2 - b) + 1 = c*n - 2b + 1 <= c*n - b whenever b >= 1, which completes the inductive step.
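A quick numerical check of the strengthened guess (my own sketch), using c = 2 and b = 1:

from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):                          # T(n) = 2*T(n//2) + 1, with T(1) = 1
    return 1 if n <= 1 else 2 * T(n // 2) + 1

assert all(T(2**k) <= 2 * 2**k - 1 for k in range(1, 20))
print(T(1024))                     # 2047 = 2*1024 - 1, so the bound c*n - b is tight for powers of two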

The recursion-tree method Convert the recurrence into a tree: each node represents the cost of a single subproblem somewhere in the recursion; sum the costs within each level of the tree, then sum the per-level costs over all levels.

Example 1 W(n) = 2W(n/2) + n^2. Subproblem size at level i is n/2^i; the subproblem size hits 1 when 1 = n/2^i, i.e., i = lg n. Cost of a subproblem at level i is (n/2^i)^2, and the number of nodes at level i is 2^i. Total cost: W(n) = O(n^2).
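Writing out the sum behind that total cost: W(n) = Σ_{i=0}^{lg n} 2^i * (n/2^i)^2 = n^2 * Σ_{i=0}^{lg n} (1/2)^i <= n^2 * Σ_{i=0}^{∞} (1/2)^i = 2n^2, so W(n) = O(n^2).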

Example 2 T(n) = 3T(n/4) + c*n^2. Subproblem size at level i is n/4^i; the subproblem size hits 1 when 1 = n/4^i, i.e., i = log_4 n. Cost of a node at level i is c*(n/4^i)^2, and the number of nodes at level i is 3^i; the last level has 3^(log_4 n) = n^(log_4 3) nodes. Total cost: T(n) = O(n^2).
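Again writing out the sum: T(n) = Σ_{i=0}^{log_4 n - 1} 3^i * c*(n/4^i)^2 + Θ(n^(log_4 3)) = c*n^2 * Σ_{i=0}^{log_4 n - 1} (3/16)^i + Θ(n^(log_4 3)) <= c*n^2 * 1/(1 - 3/16) + o(n^2) = (16/13)*c*n^2 + o(n^2), so T(n) = O(n^2).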

Master’s method A “cookbook” for solving recurrences of the form T(n) = a*T(n/b) + f(n), where a >= 1, b > 1, and f(n) is asymptotically positive for all sufficiently large n.

Master’s method Idea: compare f(n) with n^(log_b a). Case 1: if f(n) = O(n^(log_b a - ε)) for some ε > 0, then T(n) = Θ(n^(log_b a)). Case 2: if f(n) = Θ(n^(log_b a)), then T(n) = Θ(n^(log_b a) * log n). Case 3: if f(n) = Ω(n^(log_b a + ε)) for some ε > 0, and if a*f(n/b) <= c*f(n) for some c < 1 and all sufficiently large n (the regularity condition), then T(n) = Θ(f(n)).
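The special case where f(n) is a plain polynomial, f(n) = Θ(n^d), is easy to mechanize; here is a minimal sketch (the function name and structure are my own, not from the slides). For polynomial f(n), the Case 3 regularity condition holds automatically, since a*(n/b)^d = (a/b^d)*n^d with a/b^d < 1 when d > log_b(a).

import math

def master(a, b, d):
    """Classify T(n) = a*T(n/b) + Θ(n^d) with the master method."""
    crit = math.log(a, b)              # the critical exponent log_b(a)
    if math.isclose(d, crit):
        return f"Θ(n^{d} * log n)"     # Case 2: f(n) has the same order as n^log_b(a)
    if d < crit:
        return f"Θ(n^{crit:.3g})"      # Case 1: f(n) grows slower than n^log_b(a)
    return f"Θ(n^{d})"                 # Case 3: f(n) dominates

print(master(2, 2, 1))   # T(n) = 2T(n/2) + n    -> Θ(n^1 * log n)
print(master(2, 2, 2))   # T(n) = 2T(n/2) + n^2  -> Θ(n^2)
print(master(3, 4, 2))   # T(n) = 3T(n/4) + n^2  -> Θ(n^2)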

Examples T(n) = 2T(n/2) + n: a = 2, b = 2, log_2 2 = 1. Compare n^(log_2 2) = n with f(n) = n: f(n) = Θ(n), so Case 2 applies and T(n) = Θ(n log n).

Examples T(n) = 2T(n/2) + n^2: a = 2, b = 2, log_2 2 = 1. Compare n with f(n) = n^2: f(n) = Ω(n^(1+ε)), so Case 3 applies. Verify the regularity condition a*f(n/b) <= c*f(n): 2*(n^2/4) <= c*n^2 holds with c = 1/2 (c < 1), so T(n) = Θ(n^2).