Complexity Analysis of Recursive Function

MeghajKumarMallick · 19 slides · Apr 13, 2020

About This Presentation

This is a PPT on Data Structures. It covers the topic "Complexity Analysis of Recursive Functions".


Slide Content

'Complexity Analysis of Recursive Functions' PRESENTED BY: Prashi Jain, ROLL NO.: MCA/25022/18

Recursive algorithm A recursive function is a function that is defined in terms of itself. Similarly, an algorithm is said to be recursive if the same algorithm is invoked in its own body. Direct recursion occurs when an algorithm calls itself (A calls A). Indirect recursion occurs when a function calls another function, eventually resulting in the original function being called again (A calls B, and B calls A back). A small sketch contrasting the two follows.
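A minimal Python sketch (not from the slides) contrasting direct and indirect recursion; the function names countdown, is_even, and is_odd are illustrative only.

    # Direct recursion: countdown calls itself.
    def countdown(n):
        if n == 0:           # base case
            return
        countdown(n - 1)     # direct recursive call

    # Indirect recursion: is_even calls is_odd, which calls is_even again.
    def is_even(n):
        if n == 0:
            return True
        return is_odd(n - 1)

    def is_odd(n):
        if n == 0:
            return False
        return is_even(n - 1)

    print(is_even(10))  # True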

Properties A recursive function can run forever, like an infinite loop. To avoid infinite running of a recursive function, there are two properties that a recursive function must have: Base criteria – there must be at least one base criterion or condition such that, when it is met, the function stops calling itself recursively. Progressive approach – the recursive calls should progress in such a way that each recursive call comes closer to the base criteria.
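As a sketch (an assumed example, not from the slides), a recursive sum in Python showing both properties: the condition n == 0 is the base criteria, and each call shrinks n by one, progressing toward it.

    def rec_sum(n):
        if n == 0:                  # base criteria: stops the recursion
            return 0
        return n + rec_sum(n - 1)   # progressive approach: n moves toward 0

    print(rec_sum(5))  # 15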

Analysis of Recursion One may ask why to use recursion at all, since the same task can be done with iteration. The first reason is that recursion makes a program more readable, and on today's enhanced CPU systems a recursive solution can be comparably efficient to an iterative one. There are mainly two measures for the analysis of any algorithm: 1. Time complexity 2. Space complexity

Time Complexity Time complexity is the amount of time it takes to run an algorithm. In the case of iteration, we count the number of iterations to estimate the time complexity. Likewise, in the case of recursion, assuming the work per call is constant, we try to figure out the number of times a recursive call is made. A single call to a function costs O(1); hence, if a recursive call is made n times, the recursive function costs O(n).
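A small Python sketch (assumed, not from the slides) that counts recursive calls to confirm the linear growth described above; the counter variable calls is illustrative.

    calls = 0

    def factorial(n):
        global calls
        calls += 1            # one O(1) call per level of recursion
        if n == 0:
            return 1
        return n * factorial(n - 1)

    factorial(10)
    print(calls)  # 11 calls for n = 10, i.e. the call count grows as O(n)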

SPACE COMPLEXITY: Space complexity is the amount of extra space required for a module to execute. In the case of iteration, hardly any extra space is required; the same variables are simply updated on each iteration. In the case of recursion, the system needs to store an activation record (stack frame) each time a recursive call is made. Hence, the space complexity of a recursive function is generally higher than that of the equivalent iterative function.
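An assumed illustration in Python (not from the slides): the recursive version keeps one activation record per pending call, which is why deep recursion can exhaust the call stack, while the iterative version uses constant extra space.

    import sys

    def sum_recursive(n):            # O(n) extra space: one stack frame per call
        if n == 0:
            return 0
        return n + sum_recursive(n - 1)

    def sum_iterative(n):            # O(1) extra space: variables are reused
        total = 0
        for i in range(1, n + 1):
            total += i
        return total

    print(sum_iterative(10**6))      # runs fine
    # sum_recursive(10**6) would exceed the default recursion limit:
    print(sys.getrecursionlimit())   # typically 1000 in CPython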

Finding complexity using TFC We can find the complexity of an iterative algorithm using the total frequency count (TFC). For example:

    for (i = 1; i < n; i++)          // n times
    {
        for (j = 1; j < n; j++)      // (n-1)n times
        {
            statement;               // n^2 times
        }
    }

The total frequency count for the above algorithm is 4n^2 - 1, so the complexity of the algorithm is O(n^2).
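As a rough check (an assumed sketch, not from the slides), the nested-loop structure above can be instrumented in Python to confirm that the innermost statement runs on the order of n^2 times.

    def frequency_count(n):
        count = 0
        for i in range(1, n):        # outer loop
            for j in range(1, n):    # inner loop
                count += 1           # the innermost statement
        return count

    print(frequency_count(100))  # 9801 = (n-1)^2, which grows as O(n^2)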

Why recurrence relations? The TFC (total frequency count) method is useful for analysing many algorithms. But for many recursive algorithms, such as Tower of Hanoi, selection sort, binary search, Fibonacci, and many more, finding the complexity using TFC is not possible, because you cannot exactly find out how many times a statement will execute in a recursive function. To solve this problem, we use a recurrence relation to find the complexity of a recursive algorithm.

What is a recurrence relation? When analysing a recursive function for its step count (running time), we often obtain a recursive formula. These recursive formulas are referred to as recurrence relations, which are solved by the repeated substitution method. A recurrence relation, T(n), is a recursive function of an integer variable n. Example: T(n) = 1 if n = 0, and T(n) = 1 + T(n - 1) for n > 0. The portion of the definition that does not contain T is called the base case of the recurrence relation. The portion that contains T is called the recurrent or recursive case.
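A small Python check (assumed, not from the slides) of the example recurrence above: evaluating T(n) = 1 + T(n - 1) with T(0) = 1 confirms that it grows linearly, i.e. T(n) = n + 1 = O(n).

    def T(n):
        if n == 0:
            return 1
        return 1 + T(n - 1)

    print([T(n) for n in range(6)])  # [1, 2, 3, 4, 5, 6], i.e. T(n) = n + 1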

Steps for finding the complexity of a recursive function: 1. forming a recurrence relation, 2. solving the recurrence relation.

Forming a recurrence relation Example:

    function factorial(n)
        if (n == 0) then
            factorial = 1
        else
            factorial = n * factorial(n - 1)
    end factorial

The base case is reached when n == 0. The method performs one comparison; thus, the number of operations when n == 0, T(0), is some constant a. When n > 0, the method performs two basic operations and then calls itself, using ONE recursive call, with parameter n - 1. Therefore the recurrence relation is: T(0) = a for some constant a; T(n) = b + T(n - 1) for some constant b.
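A quick Python check (with assumed constants a and b, not from the slides) that this factorial recurrence T(0) = a, T(n) = b + T(n - 1) solves to T(n) = a + b·n, i.e. O(n).

    a, b = 1, 2    # illustrative constants for the base case and the per-call work

    def T(n):
        if n == 0:
            return a
        return b + T(n - 1)

    print(T(10), a + b * 10)  # both 21: T(n) = a + b*n, which is O(n)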

Solving Recurrence Relations Steps: 1. Expand the recurrence relation. 2. Express the expansion as a summation by plugging the recurrence back into itself until you see a pattern. 3. Evaluate the summation.

For example: the Tower of Hanoi problem.

Algorithm: Tower of Hanoi (move n disks from peg x to peg y, using peg z as auxiliary)

    TOH(n, x, y, z)
    {
        if (n >= 1)
        {
            TOH(n - 1, x, z, y)    // put the top (n-1) disks on z, using y
            Move: x --> y          // move the largest disk to its right place
            TOH(n - 1, z, y, x)    // put the (n-1) disks in their right place
        }
    }
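A runnable Python version of the same algorithm (a sketch following the pseudocode above; the function and peg names are illustrative).

    def toh(n, x, y, z):
        """Move n disks from peg x to peg y, using peg z as auxiliary."""
        if n >= 1:
            toh(n - 1, x, z, y)                    # move the top n-1 disks out of the way
            print(f"Move disk {n} from {x} to {y}")
            toh(n - 1, z, y, x)                    # move the n-1 disks onto the largest disk

    toh(3, "A", "B", "C")  # prints 2^3 - 1 = 7 moves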

Solution: The recurrence relation for the running time of the method hanoi is: T(n) = a if n = 1; T(n) = 2T(n - 1) + b if n > 1.

Expanding:
T(1) = a                                                                  (1)
T(n) = 2T(n - 1) + b if n > 1                                             (2)
     = 2[2T(n - 2) + b] + b = 2^2 T(n - 2) + 2b + b                       by substituting T(n - 1) in (2)
     = 2^2 [2T(n - 3) + b] + 2b + b = 2^3 T(n - 3) + 2^2 b + 2b + b       by substituting T(n - 2) in (2)
     = 2^3 [2T(n - 4) + b] + 2^2 b + 2b + b = 2^4 T(n - 4) + 2^3 b + 2^2 b + 2b + b   by substituting T(n - 3) in (2)
     = ...
     = 2^k T(n - k) + b[2^(k-1) + 2^(k-2) + ... + 2^1 + 2^0]
The base case is reached when n - k = 1, i.e. k = n - 1. We then have:
T(n) = 2^(n-1) T(1) + b[2^(n-2) + ... + 2 + 1] = 2^(n-1) a + (2^(n-1) - 1) b
Therefore the complexity of Tower of Hanoi is O(2^n).
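A quick empirical check in Python (assumed, not from the slides) that the number of moves made by the algorithm matches the 2^n - 1 growth implied by the O(2^n) bound.

    def count_moves(n):
        if n < 1:
            return 0
        return 2 * count_moves(n - 1) + 1   # two recursive calls plus one move

    for n in range(1, 6):
        print(n, count_moves(n), 2**n - 1)  # the two counts agree: 1, 3, 7, 15, 31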

References: Book: Data Structures and Algorithms (G. A. V. Pai). Website: www.tutorialspoint.com

Thank you