BCSE202L – Data Structures and Algorithms

shivapatil54 40 views 49 slides Sep 22, 2024


Slide Content

BCSE202L – Data Structures and Algorithms
Faculty Name: Dr. A. ILAVENDHAN
School of Computer Science and Engineering (SCOPE)

COURSE OBJECTIVES
- To impart basic concepts of data structures and algorithms.
- To differentiate linear and non-linear data structures and their operations.
- To comprehend the necessity of time-complexity analysis of algorithms.

COURSE OUTCOME
- Understand the fundamental analysis and time complexity for a given problem.
- Articulate linear and non-linear data structures and the legal operations permitted on them.
- Identify and apply suitable algorithms for searching and sorting.
- Discover various tree and graph traversals.

TOPICS TO BE DISCUSSED IN MODULE 1: Algorithm Analysis
Importance of algorithms and data structures - Fundamentals of algorithm analysis: space and time complexity of an algorithm, types of asymptotic notations and orders of growth - Algorithm efficiency: best case, worst case, average case - Analysis of non-recursive and recursive algorithms - Asymptotic analysis for recurrence relations: Iteration Method, Substitution Method, Master Method and Recursion Tree Method.

Importance of algorithms and data structures
Algorithms and data structures are fundamental concepts in computer science that play a crucial role in the design and efficiency of software systems.
Efficient Problem Solving: algorithms provide step-by-step instructions to solve specific problems. Efficient algorithms help in solving complex tasks more quickly, reducing processing time and improving overall system performance.
Resource Management: data structures enable efficient organization and management of data in memory. Choosing the right data structure can significantly impact memory usage, storage requirements, and execution speed.

Importance of algorithms and data structures
Optimized Performance: well-designed algorithms and data structures can dramatically enhance the performance of software applications, making them more responsive and user-friendly.
Scalability: scalable solutions ensure that applications can handle increasing amounts of data without sacrificing performance.
Code Reusability: standard algorithms and data structures encourage code reuse and facilitate collaboration across different projects.
Innovation and Optimization: developing new algorithms and data structures is a significant area of research in computer science, aimed at optimizing existing systems.

Importance of algorithms and data structures
Reduced Development Costs: efficient algorithms and data structures can reduce hardware requirements and optimize server loads, leading to cost savings in infrastructure and development.
Correctness and Reliability: established algorithms and data structures have been thoroughly tested and analyzed, making them more reliable and less prone to errors.
Interoperability: standard algorithms and data structures are widely recognized and used across various programming languages and platforms. This promotes interoperability and ensures that software components can work together seamlessly.

Fundamentals of algorithm analysis
We analyze algorithms to understand their efficiency and performance characteristics. Here are some key points about the fundamentals of algorithm analysis:
Time Complexity: measures the amount of time an algorithm takes to execute as a function of the input size. It helps in understanding how the algorithm's performance scales with increasing input data.
The total time taken by a program comprises two parts:
- Compilation time
- Run time

Fundamentals of algorithm analysis

Fundamentals of algorithm analysis
Space Complexity: evaluates the amount of memory an algorithm needs to execute as a function of the input size. It is essential for analyzing memory usage and understanding resource requirements.
There are standard analysis techniques for both time and space complexity.

Fundamentals of algorithm analysis
Big O Notation: a mathematical notation used to describe the upper bound, or worst-case time complexity, of an algorithm. It provides a way to classify algorithms based on how their running time increases with the input size. For a problem of size N:
- A constant-time function/method is "order 1": O(1)
- A linear-time function/method is "order N": O(N)
- A quadratic-time function/method is "order N squared": O(N²)

Fundamentals of algorithm analysis
Big O Notation: let f and g be functions from the set of natural numbers to itself. The function f is said to be O(g) (read "big-oh of g") if there exist a constant c > 0 and a natural number n0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n0.

Fundamentals of algorithm analysis
Omega Notation: apart from Big O notation, Omega notation (Ω) is used to describe the lower bound, or best-case time complexity: c·g(n) ≤ f(n) for all n ≥ n0.

Fundamentals of algorithm analysis
Theta notation (Θ) represents both the upper and lower bounds, giving a tight bound on the running time: c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0.

Fundamentals of algorithm analysis
Asymptotic Analysis: focuses on analyzing the performance of algorithms for large input sizes. It simplifies complexity analysis by considering the growth rates of functions as the input size approaches infinity.
Best Case, Worst Case, and Average Case Analysis: algorithms may exhibit different performance characteristics depending on the input data. Best-case analysis looks at the minimum time required, worst-case analysis examines the maximum time required, and average-case analysis studies the expected performance over all possible inputs.

Fundamentals of algorithm analysis
Time Complexity Classes: algorithms are categorized into complexity classes based on their time-complexity growth rates. Common classes include:
- O(1) – constant time
- O(log n) – logarithmic time
- O(n) – linear time
- O(n log n) – linearithmic time
- O(n²) – quadratic time
and more.

Fundamentals of algorithm analysis
Space Complexity Classes: similar to time complexity, space complexity classes classify algorithms based on their space requirements, such as:
- O(1) – constant space
- O(n) – linear space
- O(n²) – quadratic space
etc.

Constant time complexity: if an algorithm requires a fixed amount of time for all input values, it has constant time complexity.
Example:
int sum(int a, int b)
{
    return a + b;
}

Linear time complexity: if the input size increases, the running time increases with it. Typical step counts:
- comment = 0 steps
- assignment statement = 1 step
- condition check = 1 step
- loop condition tested for n iterations = n + 1 steps
- loop body = n steps

Example:
int sum(int A[], int n)
{
    int sum = 0, i;              // cost 1, repeated 1 time        -> 1
    for (i = 0; i < n; i++)      // cost 1 + (n+1) + n             -> 2n+2
        sum = sum + A[i];        // cost 1, repeated n times       -> n
    return sum;                  // cost 1, repeated 1 time        -> 1
}
Total: 1 + (2n + 2) + n + 1 = 3n + 4 steps

Big O Notation
Example: f(n) = 3n + 2 and g(n) = n
Condition: f(n) ≤ c·g(n) for all n ≥ n0, with c > 0.
Try c = 4: is 3n + 2 ≤ 4n?
n = 1: 5 ≤ 4 — false
n = 2: 8 ≤ 8 — true
For all n ≥ 2 with c = 4, f(n) ≤ c·g(n), i.e. 3n + 2 ≤ 4n. The condition is satisfied, so f(n) = O(n).
Big O notation bounds the maximum amount of time taken to execute, so it describes worst-case complexity.

Omega Notation
Example: f(n) = 3n + 2
Condition: f(n) ≥ c·g(n) for all n ≥ n0, with c > 0.
Try c = 1, g(n) = n: is 3n + 2 ≥ n?
n = 1: 5 ≥ 1 — true
For all n ≥ 1 with c = 1 the condition holds, so f(n) = Ω(g(n)) = Ω(n).

Theta notation
Example: f(n) = 3n + 2
Condition: c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0.
Try c1 = 1, c2 = 4, g(n) = n: is n ≤ 3n + 2 ≤ 4n?
n = 1: 1 ≤ 5 ≤ 4 — false
n = 2: 2 ≤ 8 ≤ 8 — true
n = 3: 3 ≤ 11 ≤ 12 — true
For all n ≥ 2 the condition is satisfied, so f(n) = Θ(n).

Problem: find upper-bound, lower-bound and tight-bound ranges for the function f(n) = 2n + 5.
Solution: given f(n) = 2n + 5, take g(n) = n. Lower bound = 2n, upper bound = 3n.
1. Big-oh notation (O): by definition f(n) ≤ c·g(n). For Big O we use the upper bound, so f(n) = 2n + 5, g(n) = n and c = 3, i.e. 2n + 5 ≤ 3n.
Put n = 1: 7 ≤ 3 — false
Put n = 2: 9 ≤ 6 — false
Put n = 3: 11 ≤ 9 — false
Put n = 4: 13 ≤ 12 — false
Put n = 5: 15 ≤ 15 — true
For all n ≥ 5 the condition is satisfied, with c = 3 and n0 = 5.

2. Big-omega notation (Ω): f(n) ≥ c·g(n). Omega is the lower-bound notation, so take c = 2, g(n) = n: 2n + 5 ≥ 2n.
Put n = 1: 7 ≥ 2 — true. For all n ≥ 1 with c = 2 the condition is satisfied.
3. Theta notation (Θ): by definition c1·g(n) ≤ f(n) ≤ c2·g(n), i.e. 2n ≤ 2n + 5 ≤ 3n, which holds for all n ≥ 5.

Techniques to calculate Time Complexity
Here are some tricks to calculate complexities:
Drop the constants: anything of the form O(kn) (where k is a constant) is O(n) as well, since the constant k does not affect the growth rate for large values of n.

Techniques to calculate Time Complexity
Drop the non-dominant terms: anything you represent as O(n² + n) can be written as O(n²), since the non-dominant terms are negligible for large values of n.
Consider all variables provided as input: complexities such as O(mn) and O(mnq) exist for some cases. In most cases we express the runtime in terms of all the inputs, which can be more than one in number.
For example, the time taken to paint a park of dimension m × n: O(kmn) → O(mn).

How do we translate code into time complexity?
Sequential Statements
If we have statements with basic operations (comparisons, assignments, reading a variable), we can assume each takes constant time, O(1).
statement1;
statement2;
...
statementN;
The total time is the sum: total = time(statement1) + time(statement2) + ... + time(statementN).
Let T(n) be the total time as a function of the input size n, and t the time taken by a statement or group of statements:
T(n) = t(statement1) + t(statement2) + ... + t(statementN)

How do we translate code into time complexity?
Sequential Statements
If each statement executes a basic operation, it takes constant time, O(1). As long as there is a fixed number of operations, the total is constant time, even with 1 or 100 such statements.
Example: compute the square sum of 3 numbers.
function squareSum(a, b, c) {
  const sa = a * a;
  const sb = b * b;
  const sc = c * c;
  const sum = sa + sb + sc;
  return sum;
}
As you can see, each statement is a basic operation (arithmetic and assignment), so each line takes constant time O(1). If we add up all statements' time, it is still O(1).

How do we translate code into time complexity?
Conditional Statements
For a conditional, the worst case is the more expensive branch:
T(n) = max([t(statement1) + t(statement2)], [t(statement3)])
if (isValid) {
  statement1;
  statement2;
} else {
  statement3;
}

How do we translate code into time complexity?
Conditional Statements
if (isValid) {
  array.sort();
  return true;
} else {
  return false;
}
The if block has a runtime of O(n log n) (the common runtime for efficient sorting algorithms); the else block has a runtime of O(1). So we have O([n log n] + 1) => O(n log n). Since n log n has a higher order than a constant, the time complexity is O(n log n).

How do we translate code into time complexity?
Linear-time Loop Statements
The complexity of the loop body is multiplied by the number of times the loop iterates. Hence, for
for (i = 0; i < n; i++) {
  /* statement 1 of complexity O(1) */
  /* statement 2 of complexity O(1) */
  /* statement 3 of complexity O(1) */
}
the loop executes n times, and we get:
T(n) = n * [t(statement1) + t(statement2) + t(statement3)]
The total complexity of the logic inside the loop is O(1+1+1) = O(3) per iteration. Note that O(3) is still a constant, so the loop overall is O(n).

How do we translate code into time complexity?
Constant-time Loop Statements
If a constant number bounds the loop, say 4 (or even 400), the runtime is constant: O(4) -> O(1).
for (i = 0; i < 4; i++) {
  /* statement 1 of complexity O(1) */
  /* statement 2 of complexity O(1) */
}
It will always run statements 1 and 2 four times, regardless of the input size.

How do we translate code into time complexity?
Nested Loop Statements
If we have a nested loop like
for (i = 0; i < n; i++) {
  for (j = 0; j < n; j++) {
    /* statement 1 of complexity O(1) */
    /* statement 2 of complexity O(1) */
    /* statement 3 of complexity O(1) */
  }
}
then, by a similar analysis, the total number of statement executions is n * n = n², i.e. n squared.
Note: as the number of nested loops increases to k, the time complexity grows to n raised to the power k, O(n^k).

How do we translate code into time complexity?
Nested Loop Statements
If we have a nested loop with a different variable:
for (i = 0; i < n; i++) {
  /* statement 1 of complexity O(1) */
  for (j = 0; j < m; j++) {
    /* statement 2 of complexity O(1) */
    /* statement 3 of complexity O(1) */
  }
}
T(n) = n * [t(statement1) + m * t(statement2...3)]
Note: assuming statements 1 to 3 are O(1), the runtime is O(n * m).

Logarithmic-Time Loops
Consider the following code, which halves the search range on each iteration (binary search):
function fn1(array, target, low = 0, high = array.length - 1) {
  let mid;
  while (low <= high) {
    mid = Math.floor((low + high) / 2);
    if (target < array[mid]) high = mid - 1;
    else if (target > array[mid]) low = mid + 1;
    else break;
  }
  return mid;
}
This function divides the range at its middle point on each iteration, so the while loop executes at most the number of times array.length can be divided in half. We can calculate this with the log function: e.g. if the array's length is 8, the while loop executes at most 3 times, because log₂(8) = log₂(2³) = 3. Hence the complexity is O(log n), where the logarithm is to base 2.

Time complexity of algorithm analysis
Constant: a function without a loop – O(1). E.g. age = 22
Linear: generally a single loop – O(n). E.g. for (i = 0; i < n; i++)
Quadratic: O(n²)
Example:
int sum(int A[], int n)        // O(1)
{
    int sum = 0, i;            // O(1)
    for (i = 0; i < n; i++)    // O(n)
        sum = sum + A[i];      // O(n)
    return sum;                // O(1)
}
O(1) + O(1) + O(n) + O(n) + O(1) = O(2n + 3) → O(n)   // drop the constants

Time complexity of algorithm analysis
Example:
Addition(Mat1, Mat2, n)                          // O(1)
{
    for (i = 0; i < n; i++)                      // O(n)
    {
        for (j = 0; j < n; j++)                  // O(n²)
        {
            Mat3[i][j] = Mat1[i][j] + Mat2[i][j];  // O(n²)
        }
    }
    return Mat3;                                 // O(1)
}
O(1) + O(n) + O(n²) + O(n²) + O(1) = O(2n² + n + 2) → O(n²)   // drop constants and non-dominant terms

Practice Problems

Fundamentals of algorithm analysis
Space Complexity
There are two types of space complexity:
a) Constant space complexity
b) Linear (variable) space complexity

Fundamentals of algorithm analysis
1. Constant space complexity: the algorithm uses a fixed amount of space for all input values.
Example:
int square(int a)
{
    return a * a;
}
Here the algorithm requires a fixed amount of space for all input values.

Fundamentals of algorithm analysis
2. Linear space complexity: the space needed by the algorithm grows with the input size.
Size of the variable n = 1 word
Array of n values = n words
Loop variable = 1 word
Sum variable = 1 word
Example:
int sum(int A[], int n)        // A: n words, n: 1 word
{
    int sum = 0, i;            // sum: 1 word, i: 1 word
    for (i = 0; i < n; i++)
        sum = sum + A[i];
    return sum;
}
Answer: n + 1 + 1 + 1 = n + 3 words

Examples:
1. Algorithm sum(a, b, c)
{
    a = 10;     // a -> 1 word
    b = 20;     // b -> 1 word
    c = a + b;  // c -> 1 word
}
S(P) = C + Sp = 3 + 0 = 3 words, so the space complexity is constant, O(1).

2. Algorithm sum(a, n)
{
    total = 0;                 // total -> 1 word
    for i = 1 to n do          // i -> 1 word
        total = total + a[i];  // a -> n words
    return total;              // n -> 1 word
}
S(P) = n + 3 words, so the space complexity is linear, O(n).