
Asymptotic Analysis

Computer Programming I

Asymptotic Analysis
Chapter 3

Given several algorithms for a given problem, how do we decide which is "the best"? What do we even mean by "the best"? Do we mean the "most efficient"? And what do we mean by "the most efficient"?

Time? Space? Both? Typically time. We can actually come up with a function that essentially counts the number of operations performed, based on the size of the problem. We don't care so much about small problems; we want to know how the time for the algorithm grows as the size of the problem gets larger. We want to know its growth rate. Classic examples are searching and sorting, very common database operations!

How to determine execution time?
Empirically
Theoretically
How does execution time change as the size of the problem grows very large (or reaches a limit in the calculus sense)?
What is the cost function for an algorithm?
What is the growth rate of an algorithm?
(Empirically: run it and time it; the problem is that this is not machine-independent. Theoretically: ask how the algorithm behaves as the size of the problem gets larger.)

Common Growth Rates

     n    log2 n      n    n log2 n     n^2        n^3        2^n
     1    0           1    0            1          1          2
     2    1           2    2            4          8          4
     3    1.584963    3    4.754888     9          27         8
     4    2           4    8            16         64         16
     5    2.321928    5    11.60964     25         125        32
     6    2.584963    6    15.50978     36         216        64
     7    2.807355    7    19.65148     49         343        128
     8    3           8    24           64         512        256
     9    3.169925    9    28.52933     81         729        512
    10    3.321928   10    33.21928     100        1000       1024
    20    4.321928   20    86.43856     400        8000       1048576
    50    5.643856   50    282.1928     2500       125000     1.13E+15
   100    6.643856  100    664.3856     10000      1000000    1.27E+30

So if n is the size of the problem, these are the growth rates we compare against. (This leads into the big-Oh, big-Omega, and big-Theta functions.)

Cost function for a given sorting algorithm:
Suppose this is the cost function for a particular sorting algorithm. What happens to each term in the polynomial function as n gets larger?

Upper Bound or big-Oh Notation: O( f(n) )
For T(n) a non-negatively valued function, T(n) is in set O( f(n) ) if there exist two positive constants c and n0 such that T(n) ≤ c f(n) for all n > n0.

Lower Bound or big-Omega Notation: Ω( f(n) )
For T(n) a non-negatively valued function, T(n) is in set Ω( f(n) ) if there exist two positive constants c and n0 such that T(n) ≥ c f(n) for all n > n0.

big-Theta Notation: Θ( f(n) )
For T(n) a non-negatively valued function, T(n) is Θ( f(n) ) if T(n) is in set Ω( f(n) ) and T(n) is in set O( f(n) ).

Apply these definitions to the sorting algorithm's cost function. We want the f(n)'s to be simple! Let's look at the graph of the function. (Use Mathematica or a calculator?)
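To make these definitions concrete, here is a minimal C++ sketch of checking the big-Oh condition numerically. The slide's actual polynomial is not shown, so a hypothetical cost function T(n) = n^2/4 + n/2 is assumed; the candidate bounds f(n) = n^2 and f(n) = n, the constants, and the test range are likewise illustrative.

    #include <cmath>
    #include <cstdio>

    // Hypothetical cost function standing in for the slide's polynomial.
    double T(double n) { return 0.25 * n * n + 0.5 * n; }

    // Check whether T(n) <= c * f(n) for all tested n > n0 (the big-Oh condition).
    bool boundedAbove(double (*f)(double), double c, double n0, double nMax) {
        for (double n = n0 + 1; n <= nMax; n += 1)
            if (T(n) > c * f(n)) return false;
        return true;
    }

    double f_n2(double n) { return n * n; }   // f(n) = n^2
    double f_n (double n) { return n; }       // f(n) = n

    int main() {
        // With c = 1 and n0 = 2, T(n) <= 1 * n^2 holds for every tested n,
        // which is evidence (not proof) that T(n) is in O(n^2).
        std::printf("T(n) <= 1*n^2 for n in (2,10000]? %s\n",
                    boundedAbove(f_n2, 1.0, 2.0, 10000) ? "yes" : "no");

        // No constant c can make T(n) <= c * n hold for all large n;
        // even c = 1000 eventually fails, so T(n) is not in O(n).
        std::printf("T(n) <= 1000*n for n in (2,10000]? %s\n",
                    boundedAbove(f_n, 1000.0, 2.0, 10000) ? "yes" : "no");
        return 0;
    }

A passing check over a finite range is only evidence, not a proof: the definition requires the inequality for all n > n0.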

So T(n) is in Ω(1)

So T(n) is in Ω(log2 n)

So T(n) is in Ω(n)

So T(n) is in Ω(n^2)

So T(n) is in O(n^3)

So T(n) is in O(n^2).

Conclusion: T(n) is in both Ω(n^2) and O(n^2), so T(n) is Θ(n^2).

So, let's analyze some code segments to come up with cost functions / growth functions.

Simplification rules (p. 67)
1. If f(n) is Θ(g(n)) and g(n) is Θ(h(n)), then f(n) is Θ(h(n)).
2. If f(n) is Θ(kg(n)) for any constant k > 0, then f(n) is Θ(g(n)).
3. If f1(n) is Θ(g1(n)) and f2(n) is Θ(g2(n)), then f1(n) + f2(n) is Θ(max(g1(n), g2(n))).
4. If f1(n) is Θ(g1(n)) and f2(n) is Θ(g2(n)), then f1(n)f2(n) is Θ(g1(n)g2(n)).
(See the code sketch below for rules 3 and 4 in action.)
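As a bridge into analyzing code segments, the following sketch (a hypothetical function, not from the text) annotates a constant-time statement, a single loop, and a nested loop with their Θ costs: rule 4 (product) gives the nested loop its cost, and rule 3 (sum/max) gives the cost of the whole sequence.

    #include <cstdio>

    // Hypothetical example: the cost of each part is annotated with its Theta class.
    long work(int n) {
        long sum = 0;                      // Theta(1): one assignment

        for (int i = 0; i < n; i++)        // Theta(n): the loop body runs n times,
            sum += i;                      // each iteration costing Theta(1)

        for (int i = 0; i < n; i++)        // Outer loop: n iterations
            for (int j = 0; j < n; j++)    // Inner loop: n iterations each
                sum += (long)i * j;        // Rule 4 (product): n * n = Theta(n^2)

        // Rule 3 (sum): Theta(1) + Theta(n) + Theta(n^2)
        //             = Theta(max(1, n, n^2)) = Theta(n^2) for the whole function.
        return sum;
    }

    int main() {
        std::printf("work(100) = %ld\n", work(100));
        return 0;
    }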

Example 3.9, p. 69
    a = b;    // Θ(1)

Example 3.10, p. 69
    sum = 0;
    for (i=1; i
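The page cuts Example 3.10 off in the middle of the for statement. Below is a runnable sketch of both examples; the loop bound and body for Example 3.10 are assumptions (a single loop doing constant work per iteration, giving a Θ(n) cost function of the form c1 + c2*n) and may differ from the textbook's exact code.

    #include <cstdio>

    // Sketch of Examples 3.9 and 3.10 made runnable (loop details assumed).
    int main() {
        int n = 10;
        int a = 0, b = 42;
        int sum, i;

        // Example 3.9: a single assignment, Theta(1).
        a = b;

        // Example 3.10 (assumed completion): one loop of n constant-time
        // iterations, so the total cost is Theta(n).
        sum = 0;
        for (i = 1; i <= n; i++)
            sum += n;

        std::printf("a = %d, sum = %d\n", a, sum);
        return 0;
    }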