Data Structures: Asymptotic Notations
Asymptotic Notations
Asymptotic notations
Asymptotic notations are mathematical tools for representing the time complexity of algorithms in asymptotic analysis. They measure an algorithm's efficiency in a way that does not depend on machine-specific constants and does not require the algorithms to be implemented and their programs' running times to be compared.
Asymptotic notations
- Big-O (O)
- Big-Omega (Ω)
- Big-Theta (Θ)
- Small-O (o)
- Small-Omega (ω)
Big-O (O)
f(N) = O(g(N)) iff there are positive constants c and n0 such that f(N) <= c*g(N) when N >= n0.
The growth rate of f(N) is less than or equal to the growth rate of g(N); g(N) is an upper bound on f(N). Big-O represents an UPPER BOUND on an algorithm's running time.
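A worked example (not on the original slide): take f(N) = 3N^2 + 10N and g(N) = N^2. Choosing c = 4 and n0 = 10, we have 3N^2 + 10N <= 4N^2 exactly when 10N <= N^2, i.e. for all N >= 10, so 3N^2 + 10N = O(N^2).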
Big-O (O): growth-rate graph [figure not included in this text version]
Big-O (O)
When considering the growth rate of a function using Big-O:
- Ignore the lower-order terms and the coefficient of the highest-order term.
- There is no need to specify the base of a logarithm: changing the base from one constant to another changes the value of the logarithm by only a constant factor.
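For example, log2 N = log10 N / log10 2 ≈ 3.32 * log10 N, so O(log2 N) and O(log10 N) differ only by a constant factor and denote the same class, written simply O(log N).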
Common Growth Rates
Time complexity   Name          Example
O(1)              constant      Adding to the front of a linked list
O(log N)          logarithmic   Finding an entry in a sorted array
O(N)              linear        Finding an entry in an unsorted array
O(N log N)        n-log-n       Sorting n items by divide-and-conquer
O(N^2)            quadratic     Shortest path between two nodes in a graph
O(N^3)            cubic         Simultaneous linear equations
O(2^N)            exponential   The Towers of Hanoi problem
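To make two rows of this table concrete, here is a minimal Python sketch (not part of the original slides): linear search over unsorted data is O(N), while binary search over sorted data is O(log N).

    def linear_search(data, target):
        # O(N): may examine every element of unsorted data.
        for i, value in enumerate(data):
            if value == target:
                return i
        return -1

    def binary_search(sorted_data, target):
        # O(log N): halves the search range on each step; requires sorted data.
        lo, hi = 0, len(sorted_data) - 1
        while lo <= hi:
            mid = (lo + hi) // 2
            if sorted_data[mid] == target:
                return mid
            elif sorted_data[mid] < target:
                lo = mid + 1
            else:
                hi = mid - 1
        return -1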
Big-Omega (Ω)
f(N) = Ω(g(N)) iff there are positive constants c and n0 such that f(N) >= c*g(N) when N >= n0.
The growth rate of f(N) is greater than or equal to the growth rate of g(N); g(N) is a lower bound on f(N).
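Continuing the earlier example: 3N^2 + 10N = Ω(N^2) with c = 3 and n0 = 1, since 3N^2 + 10N >= 3N^2 for all N >= 1.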
Big-Omega (Ω): growth-rate graph [figure not included in this text version]
Big-Theta (Θ)
f(N) = Θ(g(N)) iff there are positive constants c1, c2, and n0 such that c1*g(N) <= f(N) <= c2*g(N) when N >= n0.
The growth rate of f(N) equals the growth rate of g(N); equivalently, f(N) = Θ(g(N)) iff f(N) = O(g(N)) and f(N) = Ω(g(N)).
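Combining the two worked examples above: 3N^2 + 10N = Θ(N^2), with c1 = 3, c2 = 4, and n0 = 10.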
Big-Theta (Θ): growth-rate graph [figure not included in this text version]
Small-O (o)
f(N) = o(g(N)) iff for every positive constant c there is an n0 such that f(N) < c*g(N) when N >= n0.
The growth rate of f(N) is strictly less than the growth rate of g(N).
Small-Omega (ω)
f(N) = ω(g(N)) iff for every positive constant c there is an n0 such that f(N) > c*g(N) when N >= n0.
The growth rate of f(N) is strictly greater than the growth rate of g(N).
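An example (not on the original slides): 10N = o(N^2), because for any c > 0 we have 10N < c*N^2 once N > 10/c; equivalently, N^2 = ω(N). Unlike Big-O, small-o rules out matching growth rates: f(N) = o(g(N)) implies f(N) is not Θ(g(N)).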
Complexity of Algorithm
Complexity of algorithm
It is a measure of the performance of an algorithm: the function f(n) that gives the running time and/or space requirement in terms of the input size n.
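As a minimal sketch of where f(n) comes from (the function and the exact operation counts are illustrative, not from the slides), f(n) can be obtained by counting the basic operations an algorithm performs:

    def sum_list(items):
        total = 0                # 1 assignment
        for x in items:          # loop body runs n times
            total += x           # 1 addition + 1 assignment per iteration
        return total             # 1 return

Counting gives roughly f(n) = 2n + 2 basic operations, a linear function, so the running time is O(n); the exact constants vary with the machine and compiler, but the growth rate does not.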
Need for Calculating Complexity of algorithm
- To determine resource consumption: CPU time and memory space
- To compare different methods for solving the same problem before actually implementing them
- To find an efficient algorithm
Factors affecting Complexity of algorithm
An algorithm's performance depends on external and internal factors.
External factors:
- Speed of the computer on which it is run
- Quality of the compiler
- Size of the input to the algorithm
Internal factors:
- Memory space required to run
- Time required to run
Factors affecting Complexity of algorithm
If two or more solutions to a problem are executed on the same machine with the same data set, the external factors remain the same. The storage space required by an algorithm is typically a simple multiple of the data size n. Therefore, the term 'complexity' here refers to the running time of the algorithm.
Cases of complexity
- Best case: the minimum running time of an algorithm
- Average case: the expected running time of an algorithm
- Worst case: the maximum running time of an algorithm
Examples of worst-case inputs: searching for an element that does not exist in the data; sorting a list into ascending order when the list is in descending order.
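To make the sorting example concrete, here is a minimal Python sketch (illustrative, not from the slides) of insertion sort, whose best and worst cases match the slide's description:

    def insertion_sort(a):
        # Best case: `a` already ascending -> the while loop never runs, O(n).
        # Worst case: `a` descending -> element i shifts past all i previous
        # elements, totalling n(n-1)/2 shifts, O(n^2).
        for i in range(1, len(a)):
            key = a[i]
            j = i - 1
            while j >= 0 and a[j] > key:
                a[j + 1] = a[j]    # shift larger elements one slot right
                j -= 1
            a[j + 1] = key
        return a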
Time-Space Tradeoff
By increasing the amount of space used to store the data, one may be able to reduce the time needed to process it, or vice versa. Example: the comparison of linear and binary search.
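A minimal Python sketch of that example (the class and method names are hypothetical): spending O(N) extra space on a sorted copy of the data turns each repeated membership test from an O(N) linear scan into an O(log N) binary search.

    import bisect

    class FastLookup:
        """Trades O(N) extra space for O(log N) membership queries."""

        def __init__(self, data):
            # Extra space: a sorted copy, built once in O(N log N) time.
            self.sorted_copy = sorted(data)

        def contains(self, target):
            # Binary search via the standard-library bisect module: O(log N).
            i = bisect.bisect_left(self.sorted_copy, target)
            return i < len(self.sorted_copy) and self.sorted_copy[i] == target

Usage: lookup = FastLookup(data), then lookup.contains(42). The tradeoff pays off when the data is queried many times, so the one-time sorting cost is amortized across the fast lookups.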