1. Introduction to Algorithms 6.046J/18.401J
LECTURE 2
Asymptotic notation: O-, Ω-, and Θ-notation
Recurrences: substitution method, iterating the recurrence, recursion tree, master method
Prof. Erik Demaine
September 12, 2005. Copyright 2001-5 Erik D. Demaine and Charles E. Leiserson.

2-5. Asymptotic notation: O-notation (upper bounds)
We write f(n) = O(g(n)) if there exist constants c > 0, n₀ > 0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n₀.
EXAMPLE: 2n² = O(n³) (c = 1, n₀ = 2).
This is a "funny," one-way equality: O(n³) denotes a set of functions, not a value.

6-8. Set definition of O-notation
O(g(n)) = { f(n) : there exist constants c > 0, n₀ > 0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n₀ }
EXAMPLE: 2n² ∈ O(n³).
(Logicians: λn.2n² ∈ O(λn.n³), but it's convenient to be sloppy, as long as we understand what's really going on.)

9-11. Macro substitution
Convention: A set in a formula represents an anonymous function in the set.
EXAMPLE: f(n) = n³ + O(n²) means f(n) = n³ + h(n) for some h(n) ∈ O(n²).
EXAMPLE: n² + O(n) = O(n²) means: for any f(n) ∈ O(n), n² + f(n) = h(n) for some h(n) ∈ O(n²).

12-14. Ω-notation (lower bounds)
O-notation is an upper-bound notation. It makes no sense to say f(n) is at least O(n²).
Ω(g(n)) = { f(n) : there exist constants c > 0, n₀ > 0 such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n₀ }
EXAMPLE: √n = Ω(lg n) (c = 1, n₀ = 16).

15-16. Θ-notation (tight bounds)
Θ(g(n)) = O(g(n)) ∩ Ω(g(n))
EXAMPLE: (1/2)n² - 2n = Θ(n²).

17. o-notation and ω-notation
O-notation and Ω-notation are like ≤ and ≥. o-notation and ω-notation are like < and >.
o(g(n)) = { f(n) : for any constant c > 0, there is a constant n₀ > 0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n₀ }
EXAMPLE: 2n² = o(n³) (n₀ = 2/c).
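The witnessed definitions above can be spot-checked numerically. A minimal Python sketch (the helper `holds_O` is hypothetical, not part of the lecture; the constants c = 1, n₀ = 2 come from the 2n² = O(n³) example):

```python
# Numeric spot-check of the O-notation definition:
# verify 0 <= f(n) <= c*g(n) for all n0 <= n < n_max.
def holds_O(f, g, c, n0, n_max=1000):
    return all(0 <= f(n) <= c * g(n) for n in range(n0, n_max))

f = lambda n: 2 * n * n   # f(n) = 2n^2
g = lambda n: n ** 3      # g(n) = n^3

print(holds_O(f, g, c=1, n0=2))  # True: the witness (c = 1, n0 = 2) works
print(holds_O(f, g, c=1, n0=1))  # False: fails at n = 1, since 2 > 1
```

A finite scan of course cannot prove the "for all n ≥ n₀" claim; it only rules out a broken witness quickly.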
18. o-notation and ω-notation (continued)
ω(g(n)) = { f(n) : for any constant c > 0, there is a constant n₀ > 0 such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n₀ }
EXAMPLE: √n = ω(lg n) (n₀ = 1 + 1/c).

19. Solving recurrences
The analysis of merge sort from Lecture 1 required us to solve a recurrence.
Solving recurrences is like solving integrals, differential equations, etc.: learn a few tricks.
Lecture 3: applications of recurrences to divide-and-conquer algorithms.

20-21. Substitution method
The most general method:
1. Guess the form of the solution.
2. Verify by induction.
3. Solve for constants.
EXAMPLE: T(n) = 4T(n/2) + n [assume that T(1) = Θ(1)].
Guess O(n³). (Prove O and Ω separately.)
Assume that T(k) ≤ ck³ for k < n.
Prove T(n) ≤ cn³ by induction.

22. Example of substitution
T(n) = 4T(n/2) + n
     ≤ 4c(n/2)³ + n
     = (c/2)n³ + n
     = cn³ - ((c/2)n³ - n)   [desired - residual]
     ≤ cn³
whenever (c/2)n³ - n ≥ 0, for example, if c ≥ 2 and n ≥ 1.

23-24. Example (continued)
We must also handle the initial conditions, that is, ground the induction with base cases.
Base: T(n) = Θ(1) for all n < n₀, where n₀ is a suitable constant.
For 1 ≤ n < n₀, we have "Θ(1)" ≤ cn³, if we pick c big enough.
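The bound just proved can be spot-checked by computing the recurrence directly. A small sketch, assuming T(1) = 1 as a concrete stand-in for the Θ(1) base case and restricting n to powers of 2 so that n/2 stays integral:

```python
from functools import lru_cache

# T(n) = 4*T(n/2) + n on powers of 2, with the assumed base T(1) = 1.
@lru_cache(maxsize=None)
def T(n):
    return 1 if n == 1 else 4 * T(n // 2) + n

# The slides show T(n) <= c*n^3 holds for c >= 2 and n >= 1;
# spot-check the bound on powers of 2.
c = 2
print(all(T(2 ** k) <= c * (2 ** k) ** 3 for k in range(0, 15)))  # True
```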
This bound is not tight!

25-28. A tighter upper bound?
We shall prove that T(n) = O(n²).
Assume that T(k) ≤ ck² for k < n:
T(n) = 4T(n/2) + n
     ≤ 4c(n/2)² + n
     = cn² + n
     = O(n²)   Wrong! We must prove the I.H.
     = cn² - (-n)   [desired - residual]
     ≤ cn² for no choice of c > 0. Lose!

29-31. A tighter upper bound!
IDEA: Strengthen the inductive hypothesis.
Subtract a low-order term.
Inductive hypothesis: T(k) ≤ c₁k² - c₂k for k < n.
T(n) = 4T(n/2) + n
     ≤ 4(c₁(n/2)² - c₂(n/2)) + n
     = c₁n² - 2c₂n + n
     = c₁n² - c₂n - (c₂n - n)
     ≤ c₁n² - c₂n   if c₂ ≥ 1.
Pick c₁ big enough to handle the initial conditions.
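The strengthened hypothesis can be checked the same way. A sketch with T(1) = 1 assumed as the base case; c₁ = 3, c₂ = 1 are illustrative constants (c₂ ≥ 1 as required, and c₁ big enough that the base 1 ≤ c₁ - c₂ holds):

```python
from functools import lru_cache

# T(n) = 4*T(n/2) + n on powers of 2, with the assumed base T(1) = 1.
@lru_cache(maxsize=None)
def T(n):
    return 1 if n == 1 else 4 * T(n // 2) + n

# Check the strengthened inductive hypothesis T(n) <= c1*n^2 - c2*n.
c1, c2 = 3, 1
print(all(T(2 ** k) <= c1 * (2 ** k) ** 2 - c2 * 2 ** k
          for k in range(0, 15)))  # True
```

With this particular base case the recurrence solves exactly to T(n) = 2n² - n on powers of 2, which makes the Θ(n²) guess easy to believe.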
32. Recursion-tree method
A recursion tree models the costs (time) of a recursive execution of an algorithm.
The recursion-tree method can be unreliable, just like any method that uses ellipses (…).
The recursion-tree method promotes intuition, however.
The recursion-tree method is good for generating guesses for the substitution method.

33-37. Example of recursion tree
Solve T(n) = T(n/4) + T(n/2) + n²:

                  n²
             /         \
        (n/4)²        (n/2)²
        /    \        /     \
  (n/16)²  (n/8)²  (n/8)²  (n/4)²
      …       …       …       …
                 Θ(1)

38. Example of recursion tree (continued)
Solve T(n) = T(n/4) + T(n/2) + n²:

                  n²          … level sum: n²
             /         \
        (n/4)²
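The truncated slide is summing the tree's costs level by level: level i costs (5/16)^i·n² (each node's two children cost (1/16 + 1/4) = 5/16 of the parent), a geometric series, which suggests T(n) = Θ(n²). A sketch computing the recurrence directly, assuming integer-floor arguments and a base case T(n) = 1 for n < 2 (neither is specified on the slides):

```python
from functools import lru_cache

# T(n) = T(n/4) + T(n/2) + n^2 with integer-floor arguments and an
# assumed base case T(n) = 1 for n < 2.
@lru_cache(maxsize=None)
def T(n):
    if n < 2:
        return 1
    return T(n // 4) + T(n // 2) + n * n

# The level sums form a geometric series with ratio 5/16, bounding the
# total by n^2 / (1 - 5/16) = (16/11) n^2 plus leaf costs, i.e.
# T(n) = Theta(n^2). The measured ratio should hover near 16/11 ~ 1.45.
for n in (2 ** 6, 2 ** 10, 2 ** 14):
    print(n, T(n) / (n * n))
```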