Chapter 2
Program Performance – Part 2
Step Counts
• Instead of accounting for the time spent on chosen operations, the step-count method accounts for the time spent in all parts of the program/function
• Program step: loosely defined to be a syntactically or semantically meaningful segment of a program for which the execution time is independent of the instance characteristics
• return a + b*c/(a - b)*4 — one program step
• x = y — one program step
3
Use a global variable to count program steps
Count = 2n + 3
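This idea can be sketched in code (a hypothetical instrumented sum function; the names are illustrative, not from the slides):

```cpp
int count = 0;  // global step counter

// Sum a[0..n-1], bumping count once per program step:
// 1 (s = 0) + 2n (loop test + addition, per iteration)
// + 1 (final loop test) + 1 (return) = 2n + 3 steps.
template <class T>
T sum(T a[], int n)
{
    T s = 0;
    count++;          // for s = 0
    for (int i = 0; i < n; i++) {
        count++;      // for the for-statement test
        s += a[i];
        count++;      // for the addition/assignment
    }
    count++;          // for the last test of the for statement
    count++;          // for the return
    return s;
}
```

Running sum on an array of 5 elements leaves count == 13, matching 2n + 3.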
Counting steps in a recursive function
• t_Rsum(n) = 2, for n = 0
• t_Rsum(n) = 2 + t_Rsum(n-1), for n > 0
•           = 2 + 2 + t_Rsum(n-2), for n > 1
• t_Rsum(n) = 2(n+1), for n >= 0
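The recursion being counted can be sketched as (hypothetical code; two steps are charged per call, one for the conditional and one for the return):

```cpp
int count = 0;  // global step counter

// Recursive sum of a[0..n-1]. Each invocation adds 2 to count,
// and there are n + 1 invocations, so count = 2(n + 1).
template <class T>
T rsum(T a[], int n)
{
    count++;          // for the if conditional
    if (n <= 0) {
        count++;      // for return 0
        return 0;
    }
    count++;          // for the return containing the recursive call
    return rsum(a, n - 1) + a[n - 1];
}
```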
Matrix Addition
Count steps in Matrix Addition
count = 2·rows·cols + 2·rows + 1
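In code, the count might be accumulated like this (a hypothetical instrumented matrix add; names illustrative):

```cpp
int count = 0;  // global step counter

// c = a + b for rows x cols matrices. Per row: the for-i test (1),
// cols iterations of the for-j test plus the assignment (2*cols),
// and the last for-j test (1). Adding the final for-i test gives
// count = 2*rows*cols + 2*rows + 1.
template <class T>
void add(T** a, T** b, T** c, int rows, int cols)
{
    for (int i = 0; i < rows; i++) {
        count++;              // for the for-i statement
        for (int j = 0; j < cols; j++) {
            count++;          // for the for-j statement
            c[i][j] = a[i][j] + b[i][j];
            count++;          // for the assignment
        }
        count++;              // for the last test of for-j
    }
    count++;                  // for the last test of for-i
}
```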
Using a Step Table – Sum
Rsum
Matrix Addition
Matrix Transpose

template <class T>
void transpose(T** a, int rows)
{
   for (int i = 0; i < rows; i++)
      for (int j = i + 1; j < rows; j++)
         swap(a[i][j], a[j][i]);
}
Matrix Transpose
Inefficient way to compute the prefix sums
for j = 0, 1, …, n-1:
   b[j] = Σ_{i=0}^{j} a[i]   (that is, a[0] + a[1] + … + a[j])

Note: the number of steps per execution (s/e) of sum() varies depending on its parameters.
Steps Per Execution
• sum(a, n) requires 2n + 3 steps
• sum(a, j + 1) requires 2(j + 1) + 3 = 2j + 5 steps
• The assignment statement b[j] = sum(a, j + 1) adds one step ==> 2j + 6 steps
• Total: Σ_{j=0}^{n-1} (2j + 6) = n(n + 5)
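The inefficient computation can be sketched as (hypothetical code; each b[j] recomputes its prefix from scratch through sum):

```cpp
// Sum of a[0..m-1]: 2m + 3 steps under the step-count model.
template <class T>
T sum(T a[], int m)
{
    T s = 0;
    for (int i = 0; i < m; i++)
        s += a[i];
    return s;
}

// Inefficient prefix sums: b[j] = a[0] + ... + a[j]. The call
// sum(a, j + 1) redoes all earlier additions, so the total cost
// is the sum over j of (2j + 6) = n(n + 5) steps -- quadratic,
// where a single left-to-right pass would be linear.
template <class T>
void prefixSums(T a[], T b[], int n)
{
    for (int j = 0; j < n; j++)
        b[j] = sum(a, j + 1);
}
```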
Prefix sums
Sequential Search - Best case
Sequential Search - Worst case
Average for successful searches
• X has equal probability of being any one element of a.
• Step count if X is a[j]: j + 4
Average for successful searches
t_AVG,SequentialSearch(n) = (1/n) Σ_{j=0}^{n-1} (j + 4) = (n + 7)/2
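The search being analyzed can be sketched as (hypothetical code; the exact constant in the per-position step count depends on how statements are charged, but the cost grows linearly in the position j of x):

```cpp
// Sequential search: return the position of x in a[0..n-1],
// or -1 if x is absent. When x == a[j], the loop advances j + 1
// times, so a successful search over a uniformly random position
// costs about n/2 probes on average.
template <class T>
int seqSearch(T a[], int n, const T& x)
{
    int i;
    for (i = 0; i < n && a[i] != x; i++)
        ;  // empty body: the loop just advances i
    if (i == n) return -1;
    return i;
}
```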
Insertion in a Sorted Array – Best Case
Insertion – Worst Case
Insertion - Average
• The step count for inserting into position j is 2n - 2j + 3.
• Average count:
  (1/(n + 1)) Σ_{j=0}^{n} (2n - 2j + 3) = (1/(n + 1)) Σ_{k=0}^{n} (2k + 3) = n + 3
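The insertion being counted can be sketched as (hypothetical code; inserting at final position j shifts n - j elements, which is where the 2n - 2j term comes from):

```cpp
// Insert x into the sorted array a[0..n-1] (a must have room for
// n + 1 elements). The n - j elements larger than x are shifted
// one place right, giving the 2n - 2j + 3 step count used above.
template <class T>
void insert(T a[], int& n, const T& x)
{
    int i;
    for (i = n - 1; i >= 0 && x < a[i]; i--)
        a[i + 1] = a[i];  // shift a larger element right
    a[i + 1] = x;
    n++;
}
```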
Asymptotic Notation
• Objectives of performance evaluation:
  – Compare the time complexities of two programs that perform the same function
  – Predict the growth in run time as the instance characteristics change
• The operation-count and step-count methods are not accurate for either objective:
  – Op count: counts some operations and ignores others
  – Step count: the definition of a step is inexact
Asymptotic Notation
• Given two programs:
  – Program A with complexity C1·n² + C2·n
  – Program B with complexity C3·n
• Program B is faster than program A for sufficiently large values of n.
• For small values of n, either could be faster, and it may not matter anyway.
• There is a break-even point for n beyond which B is always faster than A.
Asymptotic Notation
• Describes the behavior of the space and time complexities of programs for LARGE instance characteristics:
  – To establish a relative order among functions
  – To compare their relative rates of growth
• Allows us to make meaningful, though inexact, statements about the complexity of programs.
Mathematical Background
T(n) denotes the time or space complexity of a program.

Big-Oh: the growth rate of T(n) is <= that of f(n)
• T(n) = O(f(n)) iff positive constants c and n0 exist such that T(n) <= c·f(n) whenever n >= n0
• f is an upper-bound function for T.
• Example: "Algorithm A is O(n²)" means that for data sets big enough (n > n0), algorithm A executes fewer than c·n² steps (c a positive constant).
The Idea
• Example: T(n) = 1000n
  – 1000n is larger than n² for small values of n.
  – n² grows at a faster rate, so n² will eventually be the larger function.
• Here we have T(n) = 1000n, f(n) = n², n0 = 1000, and c = 1:
  – T(n) <= c·f(n) for n >= n0
• Thus we say that 1000n = O(n²).
• Note that we can get a tighter upper bound: 1000n is also O(n).
Example
• Suppose T(n) = 10n² + 4n + 2
• For n >= 2, T(n) <= 10n² + 5n
• For n >= 5, T(n) <= 11n²
• Therefore T(n) = O(n²)
Big Oh Ratio Theorem
• T(n) = O(f(n)) iff lim_{n→∞} (T(n)/f(n)) < c for some finite constant c.
• f(n) dominates T(n).
Examples
• Suppose T(n) = 10n² + 4n + 2
• T(n)/n² = 10 + 4/n + 2/n²
• lim_{n→∞} (T(n)/n²) = 10
• Therefore T(n) = O(n²)
Common Orders of Magnitude

Function   Name
1          Constant
log n      Logarithmic
log² n     Log-squared
n          Linear
n log n    Linearithmic
n²         Quadratic
n³         Cubic
2^n        Exponential
n!         Factorial
Loose Bounds
• Suppose T(n) = 10n² + 4n + 2
• 10n² + 4n + 2 <= 11n³ for n >= 2
• So T(n) = O(n³) — a correct but loose bound.
• We want the smallest upper bound, here O(n²).
Polynomials
• If T(n) = a_m·n^m + … + a_1·n + a_0, then T(n) = O(n^m).
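A short justification (a sketch, assuming n >= 1):

```latex
T(n) = \sum_{i=0}^{m} a_i n^i
     \le n^m \sum_{i=0}^{m} |a_i| \qquad (n \ge 1),
```

so taking c = Σ|a_i| and n0 = 1 satisfies T(n) <= c·n^m, i.e. T(n) = O(n^m).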
Omega Notation – Lower Bound
• T(n) = Ω(g(n)) iff positive constants c and n0 exist such that T(n) >= c·g(n) for all n >= n0
• Establishes a lower bound.
• E.g., T(n) = C1·n² + C2·n:
  – C1·n² + C2·n >= C1·n² for all n >= 1
  – T(n) >= C1·n² for all n >= 1
  – T(n) is Ω(n²)
• Note: T(n) is also Ω(n) and Ω(1). We need the largest lower bound.
Omega Ratio Theorem
• T(n) = Ω(f(n)) iff lim_{n→∞} (f(n)/T(n)) <= c for some finite constant c.
Lower Bound of Polynomials
• If T(n) = a_m·n^m + … + a_1·n + a_0, then T(n) = Ω(n^m).
• Example: T(n) = n⁴ + 3500n³ + 400n² + 1 is Ω(n⁴).
Theta Notation
Theta: when O and Ω meet, we indicate that with Θ notation.
• Definition: T(n) = Θ(h(n)) iff positive constants c1, c2 and n0 exist such that c1·h(n) <= T(n) <= c2·h(n) for all n >= n0
• Equivalently, T(n) = Θ(h(n)) iff T(n) = O(h(n)) and T(n) = Ω(h(n))
• E.g., T(n) = 3n + 8:
  – 3n <= 3n + 8 <= 11n for n >= 1, so T(n) = Θ(n)
• T(n) = 20·log₂(n) + 8 = Θ(log₂ n):
  – log₂(n) < 20·log₂(n) + 8 <= 21·log₂(n) for all n >= 256
Theta Notation (contd.)
• T(n) = 1000n
• T(n) = O(n²), but T(n) != Θ(n²) because T(n) != Ω(n²)
Theta of Polynomials
• If T(n) = a_m·n^m + … + a_1·n + a_0, then T(n) = Θ(n^m).
Little-o Notation
Little-oh: the growth rate of T(n) is strictly < that of p(n)
• T(n) = o(p(n)) iff T(n) = O(p(n)) and T(n) != Θ(p(n))
• Example: T(n) = 1000n is o(n²)
Simplifying Rules
• If f(n) is O(g(n)) and g(n) is O(h(n)), then f(n) is O(h(n)).
• If f(n) is O(k·g(n)) for any constant k > 0, then f(n) is O(g(n)).
• If f1(n) = O(g1(n)) and f2(n) = O(g2(n)), then
  (a) (f1 + f2)(n) = O(max(g1(n), g2(n)))
  (b) f1(n) * f2(n) = O(g1(n) * g2(n))
Some Points
• Do NOT include constants or low-order terms inside a Big-Oh.
• For example, T(n) = O(2n²) and T(n) = O(n² + n) are the same as T(n) = O(n²).
Examples
• Example 1: a = b;
  This assignment takes constant time, so it is Θ(1).
• Example 2:
  sum = 0;
  for (i = 0; i <= n; i++)
     sum += n;
  Time complexity is Θ(n).
Examples (contd.)
a = 0;
for (i = 1; i <= n; i++)
   for (j = 1; j <= n; j++)
      a++;
• Time complexity is Θ(n²).
Examples (contd.)
a = 0;
for (i = 1; i <= n; i++)
   for (j = 1; j <= i; j++)
      a++;
• The a++ statement executes n(n+1)/2 times.
• Time complexity is Θ(n²).
Examples (contd.)
a = 0;                         Θ(1)
for (i = 1; i <= n; i++)
   for (j = 1; j <= i; j++)
      a++;                     Θ(n²)
for (k = 1; k <= n; k++)       Θ(n)
   A[k] = k - 1;
• Time complexity is Θ(1) + Θ(n²) + Θ(n) = Θ(n²).
Examples (contd.)
• Not all doubly nested loops execute n² times:
a = 0;
for (i = 1; i <= n; i++)
   for (j = 1; j <= n; j *= 2)
      a++;
• The inner loop executes about log₂(n) times (j doubles until it exceeds n).
• The outer loop executes n times.
• Time complexity is Θ(n log₂ n).
First determine the asymptotic complexity of each statement, and then add up.
Asymptotic complexity of Rsum
Asymptotic complexity of Matrix Addition
Asymptotic complexity of Transpose
Asymptotic complexity of the inefficient prefix sums
Asymptotic complexity of Sequential Search
Binary Search
Worst-case complexity is Θ(log n)
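A standard sketch of the algorithm (assuming a is sorted in ascending order; names illustrative). Each iteration halves the live range, so at most about log₂(n) + 1 iterations run, which is where the Θ(log n) worst case comes from:

```cpp
// Binary search a sorted array a[0..n-1] for x; return its index
// or -1. The range [left, right] halves every iteration, so the
// loop runs O(log n) times.
template <class T>
int binarySearch(T a[], int n, const T& x)
{
    int left = 0, right = n - 1;
    while (left <= right) {
        int middle = left + (right - left) / 2;  // avoids overflow
        if (x == a[middle]) return middle;
        if (x > a[middle]) left = middle + 1;
        else right = middle - 1;
    }
    return -1;  // x not found
}
```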
Performance Measurement
Chapter 2 Section 6
Run time on a pseudo machine
Conclusions
• The utility of a program with exponential complexity is limited to small n (typically n <= 40).
• Programs whose complexity is a polynomial of high degree are also of limited utility.
• Linear complexity is desirable in practice.
Performance Measurement
• Obtain the actual space and time requirements of a program.
• Choosing instance sizes: decide which values of n to measure.
• Developing the test data: data that exhibits the best-, worst-, and average-case time complexity (average via randomly generated data).
• Setting up the experiment: write a program that will measure the desired run times.
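One way the measurement program might be set up (a sketch using std::chrono; the work being timed and the repetition count are placeholders — repeat enough times that the elapsed interval is well above the clock's resolution):

```cpp
#include <chrono>

// Return the average seconds per call of work(), measured over
// `repetitions` runs to dilute clock-resolution and call overhead.
template <class F>
double timePerRun(F work, long repetitions)
{
    auto start = std::chrono::steady_clock::now();
    for (long r = 0; r < repetitions; r++)
        work();
    auto stop = std::chrono::steady_clock::now();
    std::chrono::duration<double> elapsed = stop - start;
    return elapsed.count() / repetitions;
}
```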
Measuring the performance of Insertion Sort Program
Measuring the performance of Insertion Sort Program (continued)
Experimental results - Insertion Sort
Measuring with repeated runs
Do without overhead
Do without overhead (continued)
Overhead
End of Chapter 2