Section 1 slides for design and analysis of algorithms
Section 1 CA: [Name of CA]
Section 1 agenda:
- Review big-Oh notation
- Review Karatsuba multiplication
- Review mergesort
- Practice in groups
Asymptotic analysis (a.k.a. big-Oh notation)
In this class we will use… big-Oh notation! It gives us a meaningful way to talk about the running time of an algorithm, independent of programming language, computing platform, etc., without having to count all the operations.
Main idea: focus on how the runtime scales with n (the input size). [Table: number of operations vs. asymptotic running time for several algorithms; the algorithm whose largest term grows slowest is "asymptotically faster" than the others.] Only pay attention to the largest function of n that appears. Some examples…
Why is this a good idea? The constant factor (the 10 in the example) depends a lot on my computing platform… and the lower-order terms don't really matter as n gets large. We're just left with the n² term! That's what's meaningful.
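To make this concrete, here is a quick numeric sketch. The polynomial 10n² + 100n + 1000 is a hypothetical operation count invented to match the "constant factor of 10" mentioned above, not something from the lecture:

    def T(n):
        # Hypothetical operation count: constant factor 10, plus lower-order terms.
        return 10 * n**2 + 100 * n + 1000

    for n in [10, 100, 1000, 10000]:
        # Ratio of the full count to the leading 10*n^2 term tends to 1,
        # so for large n only the n^2 term matters.
        print(n, T(n) / (10 * n**2))

At n = 10 the ratio is 3.0, but by n = 10000 it is about 1.001: the lower-order terms have become noise.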
Pros and Cons of Asymptotic Analysis
Pros:
- Abstracts away from hardware- and language-specific issues.
- Makes algorithm analysis much more tractable.
- Allows us to meaningfully compare how algorithms will perform on large inputs.
Cons:
- Only makes sense if n is large (compared to the constant factors): 1000000000n is "better" than n²?!?!
Informal definition for O(…) (pronounced "big-oh of …" or sometimes "oh of …"): T(n) = O(g(n)) means that, once n is large enough, T(n) is at most a constant times g(n). Here, "constant" means "some number that doesn't depend on n."
[Plot: T(n) = 2n² + 10 against g(n) = n². T(n) lies above g(n) everywhere, but after scaling up to 3g(n) = 3n² the curves cross at n = 4, beyond which T(n) stays below 3g(n).]
Formal definition of O(…): T(n) = O(g(n)) if and only if there exist c, n₀ > 0 such that for all n ≥ n₀, T(n) ≤ c·g(n). ("There exists" a choice of c and n₀ such that the bound holds "for all" n ≥ n₀.)
[Same plot again, now annotated with the witnesses: c = 3, so 3g(n) = 3n², and the crossover point n₀ = 4.]
Formally: choose c = 3 and n₀ = 4. Then for all n ≥ 4, T(n) = 2n² + 10 ≤ 3n² = c·g(n), so T(n) = O(g(n)).
Alternatively: choose c = 7 and n₀ = 2. Then for all n ≥ 2, 2n² + 10 ≤ 7n². There is not a single "correct" choice of c and n₀.
One more: T(n) = n and g(n) = n². Choose c = 1 and n₀ = 1; then for all n ≥ 1, n ≤ n², so T(n) = O(g(n)). (Big-Oh is only an upper bound; it doesn't have to be tight.)
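A quick way to spot-check a claimed witness pair (c, n₀) is to test the defining inequality numerically. This is a sketch only (the helper check_witness is made up for illustration, and passing the check is evidence, not a proof):

    def check_witness(T, g, c, n0, n_max=10000):
        # Spot-check T(n) <= c*g(n) for all sampled n >= n0.
        # A failure disproves the witness; success merely suggests it works.
        return all(T(n) <= c * g(n) for n in range(n0, n_max))

    T = lambda n: 2 * n**2 + 10   # the T(n) from the slides
    g = lambda n: n**2            # the g(n) from the slides
    print(check_witness(T, g, c=3, n0=4))   # True, matching the first choice
    print(check_witness(T, g, c=7, n0=2))   # True, matching the second choice
    print(check_witness(T, g, c=2, n0=1))   # False: 2n^2 + 10 > 2n^2 always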
Ω(…) means a lower bound: T(n) = Ω(g(n)) if and only if there exist c, n₀ > 0 such that for all n ≥ n₀, T(n) ≥ c·g(n). (Same definition as O(…), but with the two sides of the inequality switched!)
Example: T(n) = n·log(n) and g(n) = 3n. Choose c = 1/3 and n₀ = 2; then for all n ≥ 2, T(n) = n·log(n) ≥ n = g(n)/3, so T(n) = Ω(g(n)).
Θ(…) means both! T(n) = Θ(g(n)) if and only if T(n) = O(g(n)) AND T(n) = Ω(g(n)).
Take-away from examples: To prove T(n) = O(g(n)), you have to come up with c and n₀ so that the definition is satisfied. To prove T(n) is NOT O(g(n)), one way is proof by contradiction: suppose (to get a contradiction) that someone gives you a c and an n₀ so that the definition is satisfied, then show that this someone must be lying to you by deriving a contradiction.
Another example: polynomials. Claim: a polynomial of degree k is O(nᵏ). Try to prove it yourself first! (Siggi the Studious Stork)
Example: polynomials (slides skipped in class; the proof, whose key step is the definition of c, is also in the reading).
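For reference, here is the standard proof sketch from the reading; the particular choice of c is my reconstruction of the "definition of c" the skipped slides refer to. Let p(n) = aₖnᵏ + aₖ₋₁nᵏ⁻¹ + … + a₀. Choose c = |aₖ| + |aₖ₋₁| + … + |a₀| and n₀ = 1. For all n ≥ 1 we have nⁱ ≤ nᵏ for every i ≤ k, so p(n) ≤ |aₖ|nᵏ + … + |a₀|nᵏ = c·nᵏ, and the definition of O(nᵏ) is satisfied.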
Yet more examples; work through these on your own! Remember that log = log₂ in this class. (Siggi the Studious Stork)
- n³ + 3n = O(n³ − n²)
- n³ + 3n = Ω(n³ − n²)
- n³ + 3n = Θ(n³ − n²)
- 3ⁿ is NOT O(2ⁿ)
- log(n) = Ω(ln(n))
- log(n) = Θ(2^(log log(n)))
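As a worked sample of the contradiction technique from the take-away slide (my own solution to the fourth example; try it before reading): suppose 3ⁿ = O(2ⁿ), so there exist c, n₀ > 0 with 3ⁿ ≤ c·2ⁿ for all n ≥ n₀. Dividing by 2ⁿ gives (3/2)ⁿ ≤ c for all n ≥ n₀. But (3/2)ⁿ grows without bound, so it exceeds any fixed constant c once n is large enough, a contradiction. Hence 3ⁿ is NOT O(2ⁿ).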
Adding big-O notation: suppose you have two functions f(n) = O(g(n)) and h(n) = O(p(n)), and consider the combined function f(n) + h(n).
- If g(n) grows faster than p(n), then f(n) + h(n) = O(g(n)).
- If p(n) grows faster than g(n), then f(n) + h(n) = O(p(n)).
- If g(n) and p(n) grow at the same rate, then f(n) + h(n) = O(g(n)), equivalently O(p(n)).
E.g., if f(n) = O(n²) and h(n) = O(n), then f(n) + h(n) = O(n²). What about subtraction and multiplication?
Multiplying big-O notation: suppose you have two functions f(n) = O(g(n)) and h(n) = O(p(n)). The combined function f(n)·h(n) is O(g(n)·p(n)). E.g., if f(n) = O(n²) and h(n) = O(n), then f(n)·h(n) = O(n³).
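Why the product rule is safe (a one-line derivation of my own, assuming f and h are non-negative): if f(n) ≤ c₁·g(n) for n ≥ n₁ and h(n) ≤ c₂·p(n) for n ≥ n₂, then for all n ≥ max(n₁, n₂), f(n)·h(n) ≤ (c₁c₂)·g(n)·p(n), so c = c₁c₂ and n₀ = max(n₁, n₂) are witnesses. The sum rule works the same way with c = c₁ + c₂. Subtraction has no such clean rule: cancellation can make f(n) − h(n) much smaller than either bound suggests.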
Some brainteasers (Ollie the Over-achieving Ostrich):
- Are there functions f, g so that NEITHER f = O(g) nor f = Ω(g)?
- Are there non-decreasing, non-negative functions f, g so that the above is true?
- Define the n'th Fibonacci number by F(0) = 1, F(1) = 1, and F(n) = F(n−1) + F(n−2) for n ≥ 2, giving 1, 1, 2, 3, 5, 8, 13, 21, 34, 55, … True or false: F(n) = O(2ⁿ)? F(n) = Ω(2ⁿ)? (See the numeric sketch below.)
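A numeric peek at the Fibonacci question (a sketch of my own; the printed ratios are evidence for the True/False answers, not a proof):

    def fib(n):
        # F(0) = F(1) = 1, as defined on the slide.
        a, b = 1, 1
        for _ in range(n):
            a, b = b, a + b
        return a

    for n in [5, 10, 20, 40]:
        # If F(n)/2^n shrinks toward 0, that suggests the O(2^n)
        # claim holds while the Ω(2^n) claim fails.
        print(n, fib(n) / 2**n)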
Recap: Asymptotic Notation. (This is my happy face!)
Divide-and-Conquer
Divide and conquer: break the problem up into smaller (easier) sub-problems. [Diagram: a big problem splits into two smaller problems, each of which splits into yet smaller problems.] [Lecture 1: Slide 76]
Divide and conquer for multiplication. Break up an n-digit integer (suppose n is even): write x = a·10^(n/2) + b and y = c·10^(n/2) + d, where a, b, c, d are (n/2)-digit numbers. Then x·y = ac·10ⁿ + (ad + bc)·10^(n/2) + bd, so one n-digit multiply becomes four (n/2)-digit multiplies: (1) ac, (2) ad, (3) bc, (4) bd.
There are n² 1-digit problems: 1 problem of size n, 4 problems of size n/2, …, 4^t problems of size n/2^t, …, n² problems of size 1. (Note: this is just a cartoon – I'm not going to draw all 4^t circles!)
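Why n²? Filling in the cartoon's arithmetic (my own step, using log = log₂ as above): the subproblem size halves at each level, so size 1 is reached at level t = log(n), and the count there is 4^(log n) = (2^(log n))² = n².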
Better divide and conquer: Karatsuba integer multiplication. Recursively compute these THREE things: ac, bd, and (a+b)(c+d). Since (a+b)(c+d) = ac + bd + ad + bc, subtracting off ac and bd gets us ad + bc. Assemble the product: x·y = ac·10ⁿ + (ad + bc)·10^(n/2) + bd.
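A runnable sketch of the recursion just described (the function name karatsuba, the base-10 digit split, and the single-digit base case are my choices, not from the slides):

    def karatsuba(x, y):
        # Base case: single-digit operands, multiply directly.
        if x < 10 or y < 10:
            return x * y
        m = max(len(str(x)), len(str(y))) // 2
        a, b = divmod(x, 10**m)   # x = a*10^m + b
        c, d = divmod(y, 10**m)   # y = c*10^m + d
        ac = karatsuba(a, c)
        bd = karatsuba(b, d)
        cross = karatsuba(a + b, c + d) - ac - bd   # = ad + bc
        return ac * 10**(2*m) + cross * 10**m + bd

    print(karatsuba(1234, 5678))   # 7006652
    print(1234 * 5678)             # 7006652, same answer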
What's the running time? 1 problem of size n, 3 problems of size n/2, …, 3^t problems of size n/2^t, …, ____ problems of size 1. (Note: this is just a cartoon – I'm not going to draw all 3^t circles!)
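To fill in that blank (my own working, by the same count as the 4^t cartoon): size 1 is again reached at level t = log(n), where there are 3^(log n) = n^(log 3) ≈ n^1.585 problems. That beats the n² base cases of the four-multiply scheme, which is the whole point of Karatsuba.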
MergeSort: a divide-and-conquer algorithm for sorting an array of numbers. [Diagram: the same big-problem/smaller-problems tree as before.]
MergeSort Pseudocode [Lecture 2: Slide 63]:
MERGESORT(A):
  If A has length 1, it is already sorted!
  Otherwise:
    Sort the left half.
    Sort the right half.
    Merge the two halves.
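A direct Python transcription of the pseudocode (a sketch: the names merge_sort and merge are mine, it returns a new sorted list rather than sorting in place, and the merge is the textbook two-pointer scan):

    def merge_sort(A):
        if len(A) <= 1:                # length 1 (or empty): already sorted
            return A
        mid = len(A) // 2
        left = merge_sort(A[:mid])     # sort the left half
        right = merge_sort(A[mid:])    # sort the right half
        return merge(left, right)      # merge the two halves

    def merge(L, R):
        # Repeatedly take the smaller front element of the two sorted halves.
        out, i, j = [], 0, 0
        while i < len(L) and j < len(R):
            if L[i] <= R[j]:
                out.append(L[i]); i += 1
            else:
                out.append(R[j]); j += 1
        return out + L[i:] + R[j:]     # one of these tails is empty

    print(merge_sort([5, 3, 8, 1, 9, 2]))   # [1, 2, 3, 5, 8, 9]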