
©2008 DEEDS Group · Introduction to Computer Science 2 - SS 08 · Asymptotic Complexity

Introduction in Computer Science 2

Asymptotic Complexity

DEEDS Group - TU Darmstadt

Prof. Neeraj Suri

Constantin Sarbu

Brahim Ayari

Dan Dobre

Abdelmajid Khelil


Given: Array A of integers and a constant c. Question: Is c in A?

boolean contains (int [] A, int c) {
    int n = A.length;
    boolean found = false;
    for (int i = 0; i < n; i++)
        if (A[i] == c)
            found = true;
    return found;
}

Input                 c   n   Assignments  Comparisons  Array accesses  Increments
[1,4,2,7]             6   4   1+1+1+0      4+4          4               4
[2,7,6,1]             2   4   1+1+1+1      4+4          4               4
[2,1,8,4,19,7,16,3]   5   8   1+1+1+0      8+8          8               8
[4,4,4,4,4,4]         4   6   1+1+1+6      6+6          6               6

Memory Complexity (in Java): int: 4 bytes, boolean: 1 byte

Memory used: size(A) + size(c) + size(n) + size(i) + size(found)
           = 4n + 4 + 4 + 4 + 1 = 4n + 13 bytes

Time Complexity: counting operations

Remember: Sequential Search


• Time complexity
  ‣ gives a simple characterization of an algorithm's efficiency
  ‣ allows comparing it to alternative algorithms

• In the last lecture we determined exact running times, but the extra precision is usually not worth the effort of computing it

• For large input sizes, constants and lower-order terms can be ignored

• This means we are studying the asymptotic complexity of algorithms: we are interested in how the running time increases with the size of the input, in the limit

• An algorithm that is asymptotically more efficient is not necessarily the best choice for very small inputs ;)

Why Asymptotic Complexity?


• Upper Bounds: O (big O) - Notation
  ‣ Properties, proof of f ∈ O(g), sum and product rules
  ‣ Loops, conditional statements, conditions, procedures
  ‣ Examples: Sequential search, selection sort

• Lower Bounds: Ω (Omega) - Notation

• Exact Bounds: Θ (Theta) - Notation

Today: Efficiency Metrics - Complexity


[Plot: T(n) over n, showing curves f(n) and c·g(n); for some c > 0 there is an n0 such that c·g(n) ≥ f(n) for all n > n0]

Asymptotic Time Complexity: Upper Bound


• Given f: N → R+, g: N → R+

• Definition:

  O(g) = { f | ∃ n0 ∈ N, ∃ c ∈ R, c > 0: ∀ n ≥ n0: f(n) ≤ c·g(n) }

• Intuitively: O(g) = the set of all functions f that grow at most as fast as g

• One says: "If f ∈ O(g), then g is an asymptotic upper bound for f"

O-Notation (pronounce: “big-Oh”)


• O(n⁴) = { …, n, n², n·log n, n³, n⁴, 3n⁴, c·n³, … }

  ‣ n³ ∈ O(n⁴)
  ‣ n·log n ∈ O(n⁴)
  ‣ n⁴ ∈ O(n⁴)

• Generally: "slower growth ∈ O(faster growth)"

Example


• Often shortened as f = O(g) instead of f ∈ O(g)
  ‣ But: f = O(g) is not an equality in the usual sense; it can only be read from left to right!

• Normally, for the analysis of algorithms: f: N → N and g: N → N, since the argument is the size of the input data and the value is the number of elementary operations

• For average-case analysis the set R+ is also used: f: N → R+ and g: N → R+

O-Notation


Example O-Notation

• T1(n) = n + 3 ∈ O(n), because n + 3 ≤ 2n for all n ≥ 3
• T2(n) = 3n + 7 ∈ O(n)
• T3(n) = 1000n ∈ O(n)
• T4(n) = 695n² + 397n + 6148 ∈ O(n²)

The functions considered are mostly monotonically increasing and ≥ 0. Criterion for showing f ∈ O(g):

  If f(n) / g(n) ≤ c for all n ≥ n0, then f = O(g)

Example: lim (n→∞) T4(n) / n² = lim (n→∞) (695n² + 397n + 6148) / n² = 695 = c
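The limit above can be illustrated numerically; the following is a small sketch (not part of the original slides, class name hypothetical) that evaluates T4(n)/n² for growing n and shows it approaching 695:

```java
public class RatioCheck {
    public static void main(String[] args) {
        // T4(n) = 695n^2 + 397n + 6148; the ratio T4(n)/n^2 tends to 695
        for (long n : new long[] {10, 1000, 1000000}) {
            double t4 = 695.0 * n * n + 397.0 * n + 6148.0;
            double ratio = t4 / ((double) n * n);
            System.out.println("n = " + n + "  T4(n)/n^2 = " + ratio);
        }
    }
}
```

For n = 10 the ratio is 796.18; for n = 1000000 it is already 695.000397…, so the lower-order terms quickly become irrelevant.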


• The proof has two parts:
  ‣ Finding the closed form
  ‣ Solving the inequality f(n) ≤ c·g(n) from the definition

• Illustration using an example:
  ‣ A is an algorithm which sorts a set of numbers in increasing order
  ‣ Assumption: A performs f(n) = 3 + 6 + 9 + ... + 3n operations
  ‣ Proposition: A has the complexity O(n²)
  ‣ Closed form for f(n) = 3 + 6 + 9 + ... + 3n:
    • f(n) = 3(1 + 2 + 3 + ... + n) = 3n(n+1)/2

Proving that f ∈ O(g)


• Task: Find a value c for which

  3n(n+1)/2 ≤ c·n² (for all n greater than some n0)

• Try c = 3:

  3n(n+1)/2 ≤ 3n²
  ⟺ n² + n ≤ 2n²
  ⟺ n ≤ n²
  ⟺ 1 ≤ n, which holds for all n ≥ 1   Q.E.D.

Proving that f ∈ O(g)
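The closed form and the chosen constant can also be checked mechanically; a small sketch (not from the slides, class name hypothetical) that sums f(n) = 3 + 6 + ... + 3n directly and compares it with 3n(n+1)/2 and with the bound 3n²:

```java
public class ClosedFormCheck {
    public static void main(String[] args) {
        for (int n = 1; n <= 1000; n++) {
            long f = 0;
            for (int k = 1; k <= n; k++) f += 3 * k;  // f(n) = 3 + 6 + ... + 3n
            long closedForm = 3L * n * (n + 1) / 2;   // closed form from the slide
            long bound = 3L * n * n;                  // c * n^2 with c = 3
            if (f != closedForm || f > bound)
                throw new AssertionError("check failed at n = " + n);
        }
        System.out.println("f(n) = 3n(n+1)/2 <= 3n^2 verified for n = 1..1000");
    }
}
```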


• O-Notation is a simplification:

‣ It eliminates constants: O(n) = O(n/2) = O(17n)

‣ It forms an upper bound, i.e.: from f(n) ∈ O(n log₂ n) it follows that f(n) ∈ O(n²)

‣ For O-Notation the base of the logarithm is irrelevant, since

  log_b x = log_c x / log_c b = (1 / log_c b) · log_c x,

  i.e., logarithms to different bases differ only by a constant factor.

Consequences of the O-Notation


• Inclusion relations of the O-Notation:

  O(1) ⊂ O(log n) ⊂ O(n) ⊂ O(n log n) ⊂ O(n²) ⊂ O(n³) ⊂ O(2ⁿ) ⊂ O(10ⁿ)

  We try to set the bounds as tight as possible

• Rule: f ∈ O(c·g) ⟺ f ∈ O(g)

Properties of O-Notation


O(1)           constant
O(log n)       logarithmic
O(n)           linear
O(n log n)     "n log n"
O(n²)          quadratic
O(n³)          cubic
O(n^k), k ≥ 2  polynomial
O(2ⁿ)          exponential

Pronunciation


• The time complexity of a program comes from the complexity of its parts

• The complexity of the elementary operations is O(1) (elementary operations: e.g. assignment, comparison, arithmetic operations, array access, …)

• A fixed sequence of elementary operations (whose length is independent of the input size n) also has the complexity O(1)

Calculating the Time Complexity


• Given the time complexities of two algorithms T1 and T2:

  T1(n) ∈ O(g1(n)), T2(n) ∈ O(g2(n))

• Summation rule: For the execution of T1 followed by T2:

  T1(n) + T2(n) ∈ O(g1(n) + g2(n)) = O(max(g1(n), g2(n)))

• Product rule: For the nested execution of T1 and T2:

  T1(n) · T2(n) ∈ O(g1(n) · g2(n))

Sum and Product Rules


• Loops in series: (n and m are the problem sizes)

  for (int i = 0; i < n; i++) {
      operation;
  }
  for (int j = 0; j < m; j++) {
      operation;
  }

• Complexity O(n + m) = O(max(n, m)) (sum rule)

Loops in Series


• Nested loops: (n is the problem size)

‣ When the number of inner-loop iterations does not depend on the problem size, e.g.:

  for (int i = 0; i < n; i++)
      for (int j = 0; j < 17; j++)
          operation;

  Complexity O(17n) = O(n) (product rule)

‣ Otherwise:

  for (int i = 0; i < n; i++)
      for (int j = 0; j < n; j++)
          operation;

  Complexity: T_external(n) · T_internal(n) = n · n ∈ O(n²) (product rule)

  Example: reading the data of an n x n matrix -> very expensive (O(n²))!

Nested Loops


• Conditional statement:

  if B then T1
  else T2

‣ The cost of the "if" test is constant and therefore negligible
‣ T(n) = T1(n) or T(n) = T2(n)
‣ Good (if decidable): take the longer branch, i.e., the dominant operation should be used
‣ An upper-bound estimate is also possible:
  T(n) ≤ T1(n) + T2(n) ∈ O(g1(n) + g2(n))

Conditional Statement


• Loop with condition: (n is the problem size)

  for (int i = 0; i < n; i++) {
      if (i == 0)
          block1;
      else
          block2;
  }

• block1 is executed only once => not relevant
  ‣ (unless T(block1) >> n·T(block2))

• block2 is dominant

• Complexity O(n·T(block2))

Condition Example


• Procedures are analyzed separately, and their execution times are inserted for each call

• For recursive procedure calls: a recurrence relation for T(n) must be found

• Once again: Find a closed form for the recursive relation (example follows shortly)

Procedure Calls
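As a preview of the recurrence approach, here is a minimal sketch (not from the slides; class and method names hypothetical): a recursive sum whose cost satisfies T(n) = T(n-1) + c with T(0) = c', which resolves to T(n) = c·n + c', i.e. O(n):

```java
public class RecursiveSum {
    // Cost recurrence: T(0) = c', T(n) = T(n-1) + c  =>  T(n) = c*n + c' in O(n)
    public static int sum(int[] A, int n) {
        if (n == 0) return 0;            // base case: constant cost c'
        return A[n - 1] + sum(A, n - 1); // one recursive call plus O(1) work
    }
    public static void main(String[] args) {
        int[] A = {3, 1, 4, 1, 5};
        System.out.println(sum(A, A.length)); // prints 14
    }
}
```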


• Iterative algorithms (today)
  ‣ Composed of smaller parts → summation rule
  ‣ Loops → product rule

• Recursive algorithms (next lecture)
  ‣ Time factors:
    • Breaking a problem into several smaller ones
    • Solving the sub-problems
    • Recursive calls of the method for solving the problem
    • Combining the solutions of the sub-problems

Analysis of simple Algorithms


• The cost consists of a part a, and a part b which is repeated n times

• T(n) = a + b·n

• T(n) ∈ O(n)

boolean contains (int [] A, int c) {
    int n = A.length;
    boolean found = false;
    for (int i = 0; i < n; i++)
        if (A[i] == c)
            found = true;
    return found;
}

b: inside loop

a: outside loop

Example 1: Sequential Search


• The inner loop is executed i times, i < n => upper bound: c·n
• The outer loop is executed n times, constant cost: b
• Costs: n·(b + c·n) = bn + cn² => O(n²)

  void SelectionSort (int [] A) {
      int n = A.length;
      int MinPosition, temp, i, j;
      for (i = n - 1; i > 0; i--) {
          MinPosition = i;
          for (j = 0; j < i; j++)
              if (A[j] < A[MinPosition])
                  MinPosition = j;
          temp = A[i];
          A[i] = A[MinPosition];
          A[MinPosition] = temp;
      }
  }

  (b: cost of the outer-loop body, c: cost per inner-loop iteration)

Example 2: Selection Sort


• Analogous to O(f) we have:
  Ω(g) = { h | ∃ c > 0: ∃ n' > 0: ∀ n > n': h(n) ≥ c·g(n) }

• Intuitively: Ω(g) is the set of all functions that grow at least as fast as g

• One says: "If f ∈ Ω(g), then g sets a lower bound for f."

• Note: f ∈ O(g) ⟺ g ∈ Ω(f)

Ω (Omega) - Notation


[Plot: T(n) over n, showing f(n) above c2·g(n); there exist c2, n0 > 0 such that f(n) ≥ c2·g(n) for all n ≥ n0, i.e. f(n) = Ω(g(n)). "g(n) sets a lower bound for f(n)"]

Example: Ω-Notation


• With the sets O(g) and Ω(g) we can define: Θ(g) = O(g) ∩ Ω(g)

• Intuitively: Θ(g) is the set of functions that grow exactly as fast as g

• Meaning: if f ∈ O(g) and f ∈ Ω(g), then f ∈ Θ(g)

• In this case one speaks of an exact (tight) bound

Θ (Theta) - Notation
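A worked instance (not on the slide) using T2(n) = 3n + 7 from the earlier example: it lies in Θ(n), since it can be sandwiched between two linear functions:

```latex
3n \;\le\; 3n + 7 \;\le\; 4n \quad \text{for all } n \ge 7
\;\Longrightarrow\; 3n + 7 \in \Omega(n) \cap O(n) = \Theta(n)
```

So c1 = 3, c2 = 4 and n0 = 7 witness the definition with g(n) = n.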


[Plot: T(n) over n, showing f(n) between c1·g(n) and c2·g(n); there exist c1, c2, n0 > 0 such that c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0, i.e. f(n) = Θ(g(n)). "g(n) sets an exact bound for f(n)"]

Example: Θ-Notation


• Algorithms with a higher asymptotic complexity can be more efficient for smaller problem sizes

• Statements about asymptotic execution time hold only from a certain input size n onwards

• The constants do make a difference for smaller input sets

Algorithm   T(n)             Good for n = ...
A1          186182 log₂ n    n > 2048
A2          1000 n           1024 ≤ n ≤ 2048
A3          100 n log₂ n     59 ≤ n ≤ 1024
A4          10 n²            10 ≤ n ≤ 58
A5          n³               n = 10
A6          2ⁿ               2 ≤ n ≤ 9

Non-Asymptotic Execution Time
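The table's crossover points can be explored numerically; a small sketch (not from the slides; class name hypothetical) comparing two pairs of cost functions from the table:

```java
public class Crossover {
    public static void main(String[] args) {
        double log2 = Math.log(2);
        // For n = 5000 (> 2048): A1 = 186182 log2 n beats A2 = 1000 n,
        // despite A1's huge constant
        double a1 = 186182 * Math.log(5000) / log2;   // about 2.29 million
        double a2 = 1000.0 * 5000;                    // 5 million
        System.out.println("n=5000: A1 = " + a1 + ", A2 = " + a2);
        // For n = 30 (within 10..58): the quadratic A4 = 10 n^2 beats the linear A2
        double a4 = 10.0 * 30 * 30;                   // 9000
        double a2small = 1000.0 * 30;                 // 30000
        System.out.println("n=30: A4 = " + a4 + ", A2 = " + a2small);
    }
}
```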


• Up till now, we’ve seen only iterative algorithms

• What about recursive algorithms?

• Following week: Refreshing recursion

• Then: Complexity Analysis with recurrence relation

Complexity and Recursion