Time Complexity of an algorithm/code - Part 2.pptx
About This Presentation
The time complexity of an algorithm/code is not the actual time required to execute a particular piece of code, but the number of times a statement executes.
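As a rough illustration of this idea (a minimal, hypothetical sketch not taken from the slides), the following Python snippet counts how many times the loop body executes for an input of size n, rather than measuring wall-clock time:

# Hypothetical example: count statement executions instead of timing the code.
def count_executions(n):
    count = 0
    for i in range(n):
        count += 1   # this statement runs exactly n times
    return count

print(count_executions(10))   # prints 10: the loop body executes n times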
Slide Content
Time Complexity Part 2. Instructor: Farhat Jalil
Slide courtesy
Asymptotic Complexity: The 5N+3 time bound is said to "grow asymptotically" like N. This gives us an approximation of the complexity of the algorithm; it ignores many (machine-dependent) details and concentrates on the bigger picture.
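One way to see this (a minimal sketch with assumed sample sizes): the ratio (5N + 3) / N approaches a constant as N grows, so the 5N + 3 count grows asymptotically like N.

# Assumed sample sizes; the ratio settles toward the constant 5 as N grows.
for N in [10, 100, 1000, 10000]:
    print(N, (5 * N + 3) / N)   # 5.3, 5.03, 5.003, 5.0003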
Big-Oh Notation: To denote an asymptotic upper bound, we use O-notation. For a given function g(n), we denote by O(g(n)) (pronounced "big-oh of g of n") the set of functions: O(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n0 }
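For example (a minimal sketch with constants I have assumed, not given in the slides): for f(n) = 5n + 3 and g(n) = n, choosing c = 6 and n0 = 3 satisfies the definition, so 5n + 3 is O(n). The snippet below only spot-checks a finite range; it is not a proof.

# Assumed witnesses c = 6, n0 = 3 for f(n) = 5n + 3 and g(n) = n.
c, n0 = 6, 3
assert all(0 <= 5 * n + 3 <= c * n for n in range(n0, 1000))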
Omega Notation: To denote an asymptotic lower bound, we use Ω-notation. For a given function g(n), we denote by Ω(g(n)) (pronounced "big-omega of g of n") the set of functions: Ω(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n0 }
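Continuing the same assumed example: for f(n) = 5n + 3 and g(n) = n, choosing c = 5 and n0 = 1 satisfies the lower-bound definition, so 5n + 3 is Ω(n). Again, this is only a finite spot check, not a proof.

# Assumed witnesses c = 5, n0 = 1 for the lower bound c*g(n) <= f(n).
c, n0 = 5, 1
assert all(0 <= c * n <= 5 * n + 3 for n in range(n0, 1000))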
Theta Notation: To denote an asymptotic tight bound, we use Θ-notation. For a given function g(n), we denote by Θ(g(n)) (pronounced "big-theta of g of n") the set of functions: Θ(g(n)) = { f(n) : there exist positive constants c1, c2 and n0 such that 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0 }
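Combining the two assumed bounds above: taking c1 = 5, c2 = 6 and n0 = 3 gives 5n ≤ 5n + 3 ≤ 6n for all n ≥ n0, so 5n + 3 is Θ(n). The check below is again a finite spot check over assumed values.

# Assumed witnesses c1 = 5, c2 = 6, n0 = 3 sandwiching f(n) = 5n + 3 between c1*n and c2*n.
c1, c2, n0 = 5, 6, 3
assert all(0 <= c1 * n <= 5 * n + 3 <= c2 * n for n in range(n0, 1000))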
Example: Comparing Functions: Which function is better, 10n² or n³?
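A quick numerical comparison (a minimal sketch with assumed sample sizes): n³ is smaller for small n, but 10n² becomes the smaller of the two once n > 10, so 10n² is the better bound asymptotically.

# Assumed sample sizes; the crossover point is n = 10.
for n in [5, 10, 20, 100]:
    print(n, 10 * n ** 2, n ** 3)
# n=5:   250 vs 125      (n^3 smaller)
# n=10:  1000 vs 1000    (equal)
# n=20:  4000 vs 8000    (10n^2 smaller)
# n=100: 100000 vs 1000000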
Comparing Functions: As inputs get larger, any algorithm of a smaller order will be more efficient than an algorithm of a larger order.
Big-Oh Notation: Even though it is correct to say "7n - 3 is O(n³)", a better statement is "7n - 3 is O(n)"; that is, one should make the approximation as tight as possible. Simple rule: drop lower-order terms and constant factors. 7n - 3 is O(n). 8n² log n + 5n² + n is O(n² log n).
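To see why dropping lower-order terms is justified (a minimal sketch with assumed sample sizes and base-2 logarithms), the leading term of 8n² log n + 5n² + n accounts for almost all of the total as n grows, which is why only n² log n survives in the O-bound.

# Assumed sample sizes; the ratio of the leading term to the total approaches 1.
import math
for n in [10, 1000, 100000]:
    total = 8 * n**2 * math.log2(n) + 5 * n**2 + n
    leading = 8 * n**2 * math.log2(n)
    print(n, round(leading / total, 4))   # roughly 0.84, 0.94, 0.96 and rising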