Analysis of Algorithms - Undergraduate Class Slides

HanumatGSastry, 33 slides, Mar 12, 2025

About This Presentation

Analysis of Algorithms - Undergraduate Class Slides


Slide Content

Analysis of Algorithms

Computation Models: the Turing Machine model and the Random Access Machine (RAM) model.

Analysis of Algorithms: Present-day algorithm analysis is based on the RAM (Random Access Machine) model. In the RAM model, instructions execute one after another, with no concurrent operations.

Analysis of Algorithms

Analysis of Algorithms: Worst Case Complexity, Average Case Complexity and Best Case Complexity.

Worst Case Complexity: The worst-case complexity of an algorithm is the function defined by the maximum number of steps taken on any instance of size n.

Best Case Complexity: The best-case complexity of an algorithm is the function defined by the minimum number of steps taken on any instance of size n.

Average Case Complexity: The average-case complexity of an algorithm is the function defined by the average number of steps taken over all instances of size n.
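As an illustration of the three cases (an addition, not from the original slides), consider sequential search: the key may be found at the first position (best case, 1 comparison), at the last position or not at all (worst case, n comparisons), or, on average, about halfway through. A minimal Python sketch:

def sequential_search(a, key):
    # returns (index or -1, number of comparisons performed)
    comparisons = 0
    for i, x in enumerate(a):
        comparisons += 1          # one key comparison per inspected element
        if x == key:
            return i, comparisons
    return -1, comparisons

data = [7, 3, 9, 4, 1]
print(sequential_search(data, 7))   # best case: found immediately, 1 comparison
print(sequential_search(data, 1))   # worst case: found last, n = 5 comparisons
print(sequential_search(data, 8))   # worst case: not found, n = 5 comparisons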

Graphical representation of Worst Case, Average Case and Best Case Complexity

Performance Evaluation of Algorithms: Performance evaluation can be divided into two major phases: 1) a priori estimates (performance analysis) and 2) a posteriori testing (performance measurement).

Performance Analysis: The efficiency of an algorithm is decided by measuring its performance. The performance of an algorithm depends upon two factors: 1) the amount of time required by the algorithm to execute (known as time complexity) and 2) the amount of storage required by the algorithm (known as space complexity).

Space Complexity: The space complexity of an algorithm is the amount of memory it needs to run to completion.

Computing Space Complexity: The space requirement S(P) of any algorithm P may be written as S(P) = c + S_p, where c is a constant and S_p depends on the instance characteristics.

Two Factors of Space Complexity: Two factors are involved in computing space complexity: a constant part and an instance-dependent part. The constant part (c) denotes the space for the inputs and outputs; it is the space taken by instructions, variables and identifiers. The instance characteristic (S_p) is the space that depends on the particular problem instance.

Addition of Three Numbers - Space Complexity
Algorithm Add(a, b, c)
{
  // Problem Description: This algorithm computes the addition of three elements
  // Input: a, b and c are of floating type
  // Output: The addition is returned
  return a + b + c;
}

The space requirement for the addition-of-three-numbers algorithm is S(P) = c + S_p. The problem instance is characterized by the specific values of a, b and c. Assuming a, b and c each occupy one word, the total fixed size comes to 3. The space needed by a, b and c is independent of the instance characteristics; consequently S_p (instance characteristics) = 0, and S(P) = 3.

Sum of ‘n’ Numbers
Algorithm Sum(a, n)
{
  S <- 0.0;
  for i <- 1 to n do
    S <- S + a[i];
  return S;
}
The space requirement for the sum-of-n-numbers algorithm is S(P) >= (n + 3): n words for the array a[], one word for n, one for i and one for S.

Sum of ‘n’ Numbers Using Recursion
Algorithm Rsum(a, n)
{
  if (n <= 0) then return 0.0;
  else return Rsum(a, n-1) + a[n];
}
The space requirement is S(P) >= 3(n + 1). The internal stack used for recursion holds, for each call, the formal parameters, local variables and the return address. Each call to Rsum therefore requires at least three words (space for the value of n, the return address and a pointer to a[]). The depth of recursion is n + 1 (the calls for n, n-1, ..., 1, 0), so the recursion stack space is >= 3(n + 1).
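A runnable Python version of the two summation algorithms above (a sketch added for reference, not part of the slides); the space notes in the comments restate the slide's word counts rather than measured memory:

def sum_iterative(a):
    # space: n words for a[], plus one word each for s and the loop index
    s = 0.0
    for i in range(len(a)):
        s = s + a[i]
    return s

def rsum(a, n):
    # each recursive call keeps n, a return address and a reference to a[] on the stack;
    # the recursion depth is n + 1, matching the S(P) >= 3(n + 1) estimate above
    if n <= 0:
        return 0.0
    return rsum(a, n - 1) + a[n - 1]   # Python lists are 0-based, hence a[n - 1]

print(sum_iterative([1.0, 2.0, 3.0]))  # 6.0
print(rsum([1.0, 2.0, 3.0], 3))        # 6.0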

Time Complexity: The time complexity of an algorithm is the amount of computer time it requires to run to completion. The time T(P) taken by a program P is the sum of its compile time and its run (or execution) time. The compile time does not depend on the instance characteristics, and a compiled program can be run several times without recompilation.

Run-Time Complexity: The run-time complexity of a program is determined by t_p (instance characteristics). Run-time complexity depends on many factors.

Issues in Time Complexity: It is difficult to express time complexity in terms of physically clocked time. For instance, in a multi-user system the execution time depends on many factors, such as: system load, the number of other programs running, the instruction set used, and the speed of the underlying hardware.

Frequency Count: The time complexity is therefore given in terms of a frequency count, which denotes the number of times a statement is executed. Time efficiency is analyzed by determining the number of repetitions of the basic operation as a function of input size.

Sum of ‘n’ Numbers - Time Complexity

Statement                      Steps per execution   Frequency   Total steps
Algorithm Sum(a, n)            -                     -           -
{                              -                     -           -
  S <- 0.0;                    1                     1           1
  for i <- 1 to n do           1                     n+1         n+1
    S <- S + a[i];             1                     n           n
  return S;                    1                     1           1
}                              -                     -           -
Total                                                            2n+3
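The step counts in the table can also be checked empirically. The following Python sketch (an addition, not from the slides) instruments the summation loop and reproduces the 2n + 3 total:

def sum_with_step_count(a):
    steps = 0
    s = 0.0; steps += 1           # S <- 0.0 executes once
    i = 0
    while True:
        steps += 1                # loop-control test executes n + 1 times
        if i >= len(a):
            break
        s = s + a[i]; steps += 1  # loop body executes n times
        i += 1
    steps += 1                    # return executes once
    return s, steps

print(sum_with_step_count([1.0] * 10))  # (10.0, 23) and 2n + 3 = 23 for n = 10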

Sum of ‘n’ Numbers Using Recursion - Time Complexity

Statement                           Steps per    Frequency        Total steps
                                    execution    n=0     n>0      n=0     n>0
Algorithm RSum(a, n)                -            -       -        -       -
{                                   -            -       -        -       -
  if (n <= 0) then                  1            1       1        1       1
    return 0.0;                     1            1       0        1       0
  else return Rsum(a, n-1) + a[n];  1+x          0       1        0       1+x
}                                   -            -       -        -       -
Total                                                             2       2+x

Here x = t_Rsum(n-1). Solving the recurrence t_Rsum(n) = 2 + t_Rsum(n-1) with t_Rsum(0) = 2 gives t_Rsum(n) = 2n + 2.

Basic Operation: The basic operation is the core operation of the algorithm; it generally resides in the innermost loop. For example, in a sorting algorithm the basic operation is comparing elements and placing them in the appropriate position.
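A small sketch (not from the slides) that counts the basic operation of selection sort, namely the element comparison in the inner loop; for n elements it executes n(n-1)/2 times:

def selection_sort_with_count(a):
    comparisons = 0
    n = len(a)
    for i in range(n - 1):
        min_idx = i
        for j in range(i + 1, n):
            comparisons += 1                 # basic operation: compare two elements
            if a[j] < a[min_idx]:
                min_idx = j
        a[i], a[min_idx] = a[min_idx], a[i]  # place the minimum at position i
    return a, comparisons

print(selection_sort_with_count([5, 2, 8, 1, 9]))  # comparisons = 5*4/2 = 10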

Input Size: One of the instance characteristics governing the run-time complexity of an algorithm is the input size. Usually, a larger input makes the algorithm run longer. The input size for the problem of summing an array of ‘n’ elements is n + 1 (n for the ‘n’ elements plus 1 for the value of ‘n’).

Input size and basic operation examples

Problem                                    Input size measure                               Basic operation
Searching for a key in a list of n items   Number of the list's items, i.e. n               Key comparison
Multiplication of two matrices             Matrix dimensions or total number of elements    Multiplication of two numbers
Checking primality of a given integer n    Size of n = number of digits (in binary)         Division
Typical graph problem                      Number of vertices and/or edges                  Visiting a vertex or traversing an edge

Order of Growth: Measuring the performance of an algorithm in relation to the input size ‘n’ is called its order of growth.

Order of growth for varying input size ‘n’:

n     log n   n log n   n^2     2^n
1     0       0         1       2
2     1       2         4       4
4     2       8         16      16
8     3       24        64      256
16    4       64        256     65,536
32    5       160       1,024   4,294,967,296
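The table above can be reproduced with a few lines of Python (an illustrative addition):

import math

# prints n, log n, n log n, n^2 and 2^n for the input sizes used in the table
for n in [1, 2, 4, 8, 16, 32]:
    log_n = int(math.log2(n))
    print(n, log_n, n * log_n, n ** 2, 2 ** n)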

Growth Rate of Common Functions

Measuring Running Time: T(n) = c_op * C(n), where c_op is the running time of the basic operation (the time taken by the basic operation to execute) and C(n) is the number of times the operation needs to be executed.
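For example (the numbers here are assumed for illustration, not taken from the slide): if the basic operation costs c_op = 10^-9 seconds and an algorithm performs C(n) = n(n-1)/2 basic operations, the formula gives a rough running-time estimate and shows that doubling n roughly quadruples the time:

c_op = 1e-9                       # assumed cost of one basic operation, in seconds

def C(n):
    return n * (n - 1) / 2        # example operation count for a quadratic algorithm

print(c_op * C(10_000))           # T(10000) is roughly 0.05 seconds
print(C(20_000) / C(10_000))      # ratio is about 4: doubling n quadruples the time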

General Plan for Analyzing the Time Efficiency of a Non-Recursive Algorithm:
1) Decide on a parameter (or parameters) indicating an input's size.
2) Identify the algorithm's basic operation.
3) Check whether the number of times the basic operation is executed depends only on the size of the input. If it also depends on some additional property, the worst-case, average-case and, if necessary, best-case efficiencies have to be investigated separately.
4) Set up a sum expressing the number of times the algorithm's basic operation is executed.
5) Using standard formulas and rules of sum manipulation, either find a closed-form formula for the count or, at the very least, establish its order of growth.
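As a worked illustration of the plan (an addition, using the common example of finding the maximum element): the input size is n, the basic operation is the comparison inside the loop, its count depends only on n, and the sum is C(n) = n - 1, which is Θ(n). A Python sketch with the count made explicit:

def max_element(a):
    comparisons = 0
    largest = a[0]
    for i in range(1, len(a)):
        comparisons += 1           # basic operation: one comparison per iteration
        if a[i] > largest:
            largest = a[i]
    return largest, comparisons    # comparisons = n - 1

print(max_element([3, 41, 5, 2, 17]))  # (41, 4) for n = 5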

Asymptotic Notations: The asymptotic running time of an algorithm is defined in terms of functions. Asymptotic notation is useful for describing the running time of an algorithm; it characterizes the time complexity as "fastest possible", "slowest possible" or "average" time. The Big Oh (O), Omega (Ω) and Theta (Θ) notations are used to represent the asymptotic complexity of algorithms.
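The standard formal definitions, not spelled out on this slide, are:
1) f(n) = O(g(n)) if there exist constants c > 0 and n0 >= 1 such that f(n) <= c*g(n) for all n >= n0 (upper bound: f grows no faster than g).
2) f(n) = Ω(g(n)) if there exist constants c > 0 and n0 >= 1 such that f(n) >= c*g(n) for all n >= n0 (lower bound: f grows at least as fast as g).
3) f(n) = Θ(g(n)) if f(n) is both O(g(n)) and Ω(g(n)) (tight bound).
For example, 3n^2 + 10n = O(n^2) with c = 4 and n0 = 10, since 3n^2 + 10n <= 4n^2 whenever n >= 10.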