
Amortized Analysis and Master Method

Deliverables

Copyright @ gdeepak.com 2 6/5/2012 7:15 PM

Amortized Analysis

Aggregate Analysis

Accounting Method

Potential Analysis

Solving Recurrence Relations

Master Method

Amortized Analysis

Analyze a sequence of operations on a data structure.

Goal is to show that although some individual operations may be expensive, on average the cost per operation is small.

Such analysis is not immediately obvious and needs deeper knowledge of problem parameters and complexity bounds.

No probability is involved: amortized analysis gives the average cost per operation in the worst case. The three techniques are aggregate analysis, the accounting method, and the potential method.


Aggregate Analysis

Aggregate analysis requires knowing which sequences of operations are possible. This is the common case with data structures, which have state that persists between operations.

The idea is that a worst-case operation can alter the state in such a way that the worst case cannot occur again for a long time, amortizing its cost. E.g., doubling an array is an occasional heavy-cost operation.

Aggregate analysis determines an upper bound T(n) on the total cost of a sequence of n operations, then calculates the average (amortized) cost per operation as T(n)/n.
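The array-doubling example can be sketched in code; this minimal `DynamicArray` class and its grow-by-doubling policy are illustrative assumptions, not taken from the slides:

```python
class DynamicArray:
    """Append-only array that doubles its capacity when full."""

    def __init__(self):
        self.capacity = 1
        self.size = 0
        self.copies = 0  # total elements moved during regrows

    def append(self, x):
        if self.size == self.capacity:
            # Expensive operation: copy all elements into a bigger array.
            self.copies += self.size
            self.capacity *= 2
        self.size += 1

n = 1024
arr = DynamicArray()
for i in range(n):
    arr.append(i)

# Aggregate analysis: total copy cost over n appends stays below 2n,
# so the amortized cost per append is O(1).
print(arr.copies, 2 * n)
```

The occasional O(n) regrow is paid for by the many cheap appends between regrows, which is exactly the amortization the slide describes.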


Aggregate Analysis-Example

k-bit binary counter A[0 . . k − 1] of bits, where A[0] is the least significant bit and A[k − 1] is the most significant bit. Counts upward from 0.

Value of the counter is Σ_{i=0}^{k−1} A[i] · 2^i

Initially, counter value is 0, so A[0 . . k − 1] = 0.

Binary counter algorithm

Increment(A, k)

i ← 0

while (i < k and A[i ] = 1)

do A[i ] ← 0

i ← i + 1

if i < k

then A[i ] ← 1

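The Increment pseudocode above translates directly to Python; a minimal sketch that also reports the number of bits flipped:

```python
def increment(A, k):
    """Increment a k-bit binary counter stored as a list of bits,
    A[0] least significant. Returns the number of bits flipped."""
    flips = 0
    i = 0
    while i < k and A[i] == 1:
        A[i] = 0          # clear each trailing 1
        flips += 1
        i += 1
    if i < k:
        A[i] = 1          # set the first 0 bit
        flips += 1
    return flips

A = [0, 0, 0]
for _ in range(5):
    increment(A, 3)
print(A)  # counter holds 5 = binary 101 -> [1, 0, 1]
```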

Binary Counter example k = 3

Counter value   A[2] A[1] A[0]   Total cost so far
0               0 0 0            0
1               0 0 1            1
2               0 1 0            3
3               0 1 1            4
4               1 0 0            7
5               1 0 1            8
6               1 1 0            10
7               1 1 1            11

Binary counter- analysis

Cost of Increment = # of bits flipped

Each call could flip k bits, so n Increments takes O(nk) time.

Observation is that not every bit flips every time.

bit     flips how often   times in n Increments
0       every time        n
1       1/2 the time      n/2
2       1/4 the time      n/4
...
i       1/2^i the time    n/2^i
i ≥ k   never             0


Binary counter-Aggregate analysis

Therefore, total # of flips = Σ_{i=0}^{k−1} n/2^i < n Σ_{i=0}^{∞} 1/2^i = 2n.

Average cost per operation = O(1)
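The 2n bound can be checked empirically; this self-contained sketch counts all bit flips over n increments and compares against 2n:

```python
def increment(A):
    """Increment the counter; return the number of bits flipped."""
    flips = 0
    i = 0
    while i < len(A) and A[i] == 1:
        A[i] = 0
        flips += 1
        i += 1
    if i < len(A):
        A[i] = 1
        flips += 1
    return flips

k, n = 16, 1000
A = [0] * k
total = sum(increment(A) for _ in range(n))
# Total flips = sum over bits i of floor(n / 2^i), always < 2n.
print(total, 2 * n)
```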

Stack Operations example

• Stack operations

• Push(S, x): O(1) ⇒ O(n) for any sequence of n operations.

• Pop(S): O(1) ⇒ O(n) for any sequence of n operations

• Multipop(S, k)

while S is not empty and k > 0

do Pop(S)

k ← k − 1

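The three stack operations can be run as a minimal Python sketch, using a plain list as the stack:

```python
def push(S, x):
    S.append(x)            # O(1)

def pop(S):
    return S.pop()         # O(1)

def multipop(S, k):
    """Pop min(k, len(S)) objects; cost is linear in the # of Pops."""
    while S and k > 0:
        pop(S)
        k -= 1

S = []
for i in range(10):
    push(S, i)
multipop(S, 4)    # pops 4 objects
multipop(S, 100)  # pops only the 6 remaining objects
print(S)  # []
```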

Stack Complexity for Multipop

Running time of Multipop is linear in the # of Pop operations.

# of iterations of the while loop is min(s, k), where s = # of objects on the stack; therefore, total cost = min(s, k).

Worst-case cost of a single Multipop is O(n).

For a sequence of n Push, Pop, and Multipop operations, the naive per-operation bound gives a worst-case cost of O(n²).


Aggregate Analysis for stacks

Each object can be popped at most once for each time it is pushed.

Have ≤ n Pushes ⇒ ≤ n Pops, including those in Multipop.

Therefore, total cost = O(n); averaged over n operations ⇒ O(1) per operation.

No probability is involved; this shows a worst-case O(n) cost for the whole sequence.


Accounting method

A set of elementary operations used by the algorithm is chosen, and each is assigned a charge. The charge can differ from the actual cost: an operation may include a prepayment, with the surplus saved as credit to be spent on later operations.

Assign different charges to different operations. Some are charged more than actual cost and some are charged less.

Amortized cost = amount we charge


Accounting Method


When amortized cost > actual cost, store the difference on specific objects in the data structure as credit.

Use credit later to pay for operations whose actual cost > amortized cost.

Differs from aggregate analysis because different operations can have different amortized costs, while in aggregate analysis all operations are assigned the same amortized cost T(n)/n.

Need credit to never go negative


Stack

• Intuition: When pushing an object, pay ₹2.

• ₹1 pays for the Push.

• ₹1 is prepayment for the object being popped, by either Pop or Multipop.

• Each object carries ₹1 of credit, so credit can never go negative.

• Total amortized cost = O(n), an upper bound on the total actual cost.

Operation   Actual cost   Amortized cost
Push        1             2
Pop         1             0
Multipop    min(k, s)     0
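The credit argument can be simulated directly; a sketch that charges 2 per Push as in the table above and asserts that the banked credit never goes negative (the random operation mix is illustrative):

```python
import random

credit = 0

def charge(amortized, actual):
    """Bank the difference between the amortized charge and actual cost."""
    global credit
    credit += amortized - actual
    assert credit >= 0, "credit must never go negative"

S = []
random.seed(1)
for _ in range(1000):
    op = random.choice(["push", "pop", "multipop"])
    if op == "push":
        S.append(0)
        charge(2, 1)              # pay 2: 1 for the Push, 1 banked
    elif op == "pop" and S:
        S.pop()
        charge(0, 1)              # paid from the object's credit
    elif op == "multipop":
        popped = min(random.randint(1, 5), len(S))
        del S[len(S) - popped:]
        charge(0, popped)         # each Pop paid from banked credit
print("credit remaining:", credit)
```

The remaining credit always equals the number of objects still on the stack, which is why it can never be negative.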

Accounting Method

Let c_i = actual cost of the ith operation and ĉ_i = amortized cost of the ith operation. Then we require, for all sequences of n operations,

Σ_{i=1}^{n} ĉ_i ≥ Σ_{i=1}^{n} c_i

Total credit stored = Σ_{i=1}^{n} ĉ_i − Σ_{i=1}^{n} c_i ≥ 0

Binary Counter-Example

• Charge ₹2 to set a bit to 1.

• ₹1 pays for setting the bit to 1.

• ₹1 is prepayment for flipping it back to 0.

• Have ₹1 of credit for every 1 in the counter; therefore, credit ≥ 0.

• Amortized cost of Increment: the cost of resetting bits to 0 is paid by credit.

• At most 1 bit is set to 1; therefore, amortized cost ≤ ₹2.

• For n operations, total amortized cost = O(n).

Potential Method

• Like the accounting method, but think of the credit as potential stored with the entire data structure.

• The accounting method stores credit with specific objects; the potential method stores potential in the data structure as a whole.

• Potential can be released to pay for future operations.

• It is the most flexible of the amortized-analysis methods.

• Let D_i = data structure after the ith operation.

Potential Method

• D_0 = initial data structure

• c_i = actual cost of the ith operation

• ĉ_i = amortized cost of the ith operation

• Potential function Φ : D_i → R; Φ(D_i) is the potential associated with data structure D_i

• ĉ_i = c_i + Φ(D_i) − Φ(D_{i−1})

• Φ(D_i) − Φ(D_{i−1}) is the increase in potential due to the ith operation

Potential Method

Total amortized cost = Σ_{i=1}^{n} ĉ_i = Σ_{i=1}^{n} (c_i + Φ(D_i) − Φ(D_{i−1}))

By telescoping (every term except Φ(D_0) and Φ(D_n) is both added and subtracted):

= Σ_{i=1}^{n} c_i + Φ(D_n) − Φ(D_0)

If we require Φ(D_i) ≥ Φ(D_0) for all i, then the total amortized cost is an upper bound on the total actual cost. Usually we take Φ(D_0) = 0 and show Φ(D_i) ≥ 0.

Stack Example

• Φ = # of objects in stack

• D0 = empty stack ⇒ Φ(D0) = 0.

• Since # of objects in stack is always ≥ 0, Φ(Di ) ≥ 0 = Φ(D0) for all i.

• Therefore, amortized cost of a sequence of n operations = O(n).


Operation   Actual cost      ΔΦ (s = # of objects before)   Amortized cost
Push        1                (s + 1) − s = 1                1 + 1 = 2
Pop         1                (s − 1) − s = −1               1 − 1 = 0
Multipop    k′ = min(k, s)   (s − k′) − s = −k′             k′ − k′ = 0

Binary Counter example k = 3

Counter value   A[2] A[1] A[0]   Total cost so far
0               0 0 0            0
1               0 0 1            1
2               0 1 0            3
3               0 1 1            4
4               1 0 0            7
5               1 0 1            8
6               1 1 0            10
7               1 1 1            11


Solving Recurrence Equations for Complexity – 3 methods

Guess the solution and prove

Using Recursion Tree Method

Master Method


Guess the solution and prove

1. Guess the form of the solution

2. Verify by induction and show that the solution works for a set of constants

T(n) = 2T(n/2) + n, which means the problem has been divided into two subproblems, each of size n/2.


Guess and Substitute by Example

We guess that T(n) = O(n lg n).

To prove: T(n) ≤ cn lg n for an appropriate constant c > 0.

Assuming the guess holds for n/2, i.e. T(n/2) ≤ c(n/2) lg(n/2), we have

T(n) = 2T(n/2) + n
     ≤ 2(c(n/2) lg(n/2)) + n
     = cn lg(n/2) + n
     = cn lg n − cn lg 2 + n
     = cn lg n − cn + n
     ≤ cn lg n   for c ≥ 1.
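The guess can be sanity-checked numerically; this sketch evaluates T(n) = 2T(n/2) + n exactly for powers of two (with T(1) = 1 assumed as the base case) and compares against c·n·lg n for c = 2:

```python
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # T(n) = 2T(n/2) + n, with assumed base case T(1) = 1
    if n <= 1:
        return 1
    return 2 * T(n // 2) + n

c = 2
for p in range(1, 16):
    n = 2 ** p
    # For powers of two the exact solution is n lg n + n <= 2 n lg n.
    assert T(n) <= c * n * math.log2(n)
print("T(n) <= 2 n lg n holds for n = 2 .. 2^15")
```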

Guess by reasoning and experience

T(n) = T(n/2) + d

Looks similar to binary search, so try O(log n).

Assume T(k) ≤ c log₂ k for all k < n; show T(n) = O(log₂ n).

T(n) = T(n/2) + d
     ≤ c log₂(n/2) + d
     = c log₂ n − c log₂ 2 + d
     = c log₂ n − c + d
     ≤ c log₂ n   if c ≥ d.

Residual: c log₂ 2 − d = c − d.

Recursion Tree

Guessing may be difficult for recurrences that do not resemble any common recurrence, e.g.

T(n) = 3T(n/4) + cn²
T(n) = T(n/3) + 2T(2n/3) + cn

We compute the cost of the tree at each level of recursion and sum the costs of all levels. For this, the depth of the tree and the number of leaves are required.

In the end, the answer can be verified by substitution, using the result as a guess.

Recursion Tree (Level 1)

T(n) = 3T(n/4) + cn²

          cn²
       /   |   \
 T(n/4)  T(n/4)  T(n/4)

Recursion Tree (Level 2)

T(n) = 3T(n/4) + cn²

            cn²                                 cost: cn²
         /   |   \
  c(n/4)²  c(n/4)²  c(n/4)²                     cost: 3 · c(n/4)² = (3/16)cn²
   / | \    / | \    / | \
  nine subproblems T(n/16)

T(n) = 3T(n/4) + cn²

Level 0:  cn²                      cost: cn²
Level 1:  3 × c(n/4)²              cost: (3/16)cn²
Level 2:  9 × c(n/16)²             cost: (3/16)²cn²
...
Level i:  3^i × c(n/4^i)²          cost: (3/16)^i cn²

What is the cost at each level? (3/16)^i cn², for i = 0, 1, ..., log₄ n − 1.


Important to grasp

• Data is divided into 4 parts at each level, so the subproblem size at depth d is n/4^d; the recursion bottoms out where

n/4^d = 1 ⇒ log n − d log 4 = 0 ⇒ d = log n / log 4 = log₄ n.

Important to grasp

With three children of every node, the number of leaves of the tree is 3^d = 3^{log₄ n} = n^{log₄ 3}.

Summing the costs of all levels:

T(n) = cn² + (3/16)cn² + (3/16)²cn² + ... + (3/16)^{log₄ n − 1} cn² + Θ(n^{log₄ 3})

     = Σ_{i=0}^{log₄ n − 1} (3/16)^i cn² + Θ(n^{log₄ 3})

Using the geometric series Σ_{k=0}^{∞} x^k = 1/(1 − x) for |x| < 1, with x = 3/16:

     < cn² · 1/(1 − 3/16) + Θ(n^{log₄ 3}) = (16/13)cn² + Θ(n^{log₄ 3})

Therefore T(n) = O(n²).
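The geometric-series bound can be checked numerically; this sketch evaluates T(n) = 3T(n/4) + n² for powers of four (with the base case T(n) = 1 for n < 4 assumed) and confirms T(n)/n² stays below 16/13:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # T(n) = 3T(n/4) + n^2, with assumed base case T(n) = 1 for n < 4
    if n < 4:
        return 1
    return 3 * T(n // 4) + n * n

for p in range(1, 11):
    n = 4 ** p
    ratio = T(n) / (n * n)
    # The level sums form a geometric series with ratio 3/16,
    # so T(n)/n^2 approaches 16/13 ≈ 1.2308 from below.
    assert ratio < 16 / 13
print(T(4 ** 10) / (4 ** 10) ** 2)
```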

Proving after Guess through recursion tree

Now we have a reasonable guess with the help of the recursion tree; let us prove it by substitution.

Assume T(k) ≤ ck² for all k < n; show T(n) ≤ cn².

Given T(n/4) ≤ c(n/4)²:

T(n) = 3T(n/4) + n²
     ≤ 3c(n/4)² + n²
     = (3/16)cn² + n²
     ≤ cn²   if c ≥ 16/13.

Recurrence equations-Master Method

T(n) = aT(n/b) + f(n)   if n > d
T(n) = c                if n ≤ d

• a ≥ 1 and b > 1 are constants, and f(n) is an asymptotically positive function.

• We divide a problem of size n into a subproblems, each of size n/b. The a subproblems are solved recursively, each in time T(n/b). The cost of dividing the problem and combining the results of the subproblems is described by f(n), which can be written as D(n) + C(n).

Three cases of Master Method

Compare f(n) with n^{log_b a}:

1. f(n) = O(n^{log_b a − ε}) for some constant ε > 0.

f(n) grows polynomially slower than n^{log_b a} (by an n^ε factor).

Solution: T(n) = Θ(n^{log_b a}).

2. f(n) = Θ(n^{log_b a} lg^k n) for some constant k ≥ 0.

f(n) and n^{log_b a} grow at similar rates.

Solution: T(n) = Θ(n^{log_b a} lg^{k+1} n)

Three Cases of Master Method

Compare f(n) with n^{log_b a}:

3. f(n) = Ω(n^{log_b a + ε}) for some constant ε > 0.

f(n) grows polynomially faster than n^{log_b a} (by an n^ε factor), and f(n) satisfies the regularity condition that a f(n/b) ≤ c f(n) for some constant c < 1.

Solution: T(n) = Θ(f(n))

Master Theorem

There is a gap between cases 1 and 2 when f (n) is smaller than but not polynomially smaller. Similarly, there is a gap between cases 2 and 3 when f (n) is larger but not polynomially larger. If the function f (n) falls into these gaps, or if regularity condition in case 3 fails to hold, master method cannot be used.


Easy Way

For T(n) = aT(n/b) + n^k, compare a with b^k:

if a > b^k then T(n) = O(n^{log_b a})   (Bottom Heavy)

if a = b^k then T(n) = O(n^k log n)     (Steady State)

if a < b^k then T(n) = O(n^k)           (Top Heavy)
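The easy-way comparison can be written as a small classifier; a sketch for recurrences of the form T(n) = aT(n/b) + n^k, where `master_easy` is a helper name chosen for illustration:

```python
import math

def master_easy(a, b, k):
    """Classify T(n) = a T(n/b) + n^k by comparing a with b^k."""
    if a > b ** k:                        # bottom heavy: leaves dominate
        return f"Theta(n^{math.log(a, b):g})"
    if a == b ** k:                       # steady state: all levels equal
        return f"Theta(n^{k} log n)"
    return f"Theta(n^{k})"                # top heavy: root dominates

print(master_easy(4, 2, 1))  # T(n) = 4T(n/2) + n    -> Theta(n^2)
print(master_easy(2, 2, 1))  # T(n) = 2T(n/2) + n    -> Theta(n^1 log n)
print(master_easy(4, 2, 3))  # T(n) = 4T(n/2) + n^3  -> Theta(n^3)
```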

T(n)=4T(n/2)+n

a = 4, b = 2, f(n) = n, so n^{log_b a} = n^{log₂ 4} = n²

Is f(n) = O(n^{2−ε})? Yes, for ε = 1.

We are in case 1, so T(n) is Θ(n²).

T(n)=2T(n/2)+nlogn

a = 2, b = 2, f(n) = n log n, so n^{log_b a} = n

Is f(n) = Θ(n lg^k n)? Yes, with k = 1.

We are in case 2 with k = 1, so T(n) is Θ(n log² n).

T(n) = 4T(n/2) + n³

a = 4, b = 2, f(n) = n³, so n^{log_b a} = n²

Is f(n) = Ω(n^{2+ε})? Yes, for ε = 1, so case 3 says T(n) is Θ(n³), provided the regularity condition holds:

a f(n/b) = 4(n/2)³ = (1/2)n³ ≤ δn³ for δ = 1/2 < 1.

T(n)=8T(n/2)+n2

Solution: log_b a = log₂ 8 = 3 and f(n) = n² = O(n^{3−ε}), so case 1 says T(n) is Θ(n³).

T(n)=9T(n/3)+n3

Solution: log_b a = log₃ 9 = 2 and f(n) = n³ = Ω(n^{2+ε}); regularity holds since 9(n/3)³ = (1/3)n³, so case 3 says T(n) is Θ(n³).

Master Theorem

T(n)=T(n/2)+1 (binary search)

Solution: log_b a = log₂ 1 = 0, so n^{log_b a} = 1 and f(n) = 1 = Θ(n^{log_b a} lg⁰ n); case 2 says T(n) is Θ(log n).

T(n) = 4T(n/2) + n2/lgn

a = 4, b = 2 ⇒ n^{log_b a} = n²; f(n) = n²/lg n.

The master method does not apply: n²/lg n is asymptotically smaller than n², but only by a lg n factor, not polynomially smaller (n²/lg n ≠ O(n^{2−ε}) for any ε > 0), so f(n) falls into the gap between cases 1 and 2.

Recursion tree for T(n) = aT(n/b) + f(n):

Level 0:  f(n)                     cost: f(n)
Level 1:  a nodes of f(n/b)        cost: a f(n/b)
Level 2:  a² nodes of f(n/b²)      cost: a² f(n/b²)
...

Depth of the tree

Data is divided by b at each level of the tree, so the recursion bottoms out at depth d where

n/b^d = 1 ⇒ log n − d log b = 0 ⇒ d = log_b n.


Proof - Master Method

Case 1: The weight increases geometrically from the root to the leaves. The leaves hold a constant fraction of the total weight; the last (leaf) term dominates.

Case 2: The weight is approximately the same on all levels; the per-level cost is a polynomial or slowly varying function.

Case 3: The weight decreases geometrically from the root to the leaves. The root holds a constant fraction of the total weight; the first term dominates.

Unrolling the recurrence:

T(n) = aT(n/b) + f(n)
     = a(aT(n/b²) + f(n/b)) + f(n)
     = a²T(n/b²) + a f(n/b) + f(n)
     = a³T(n/b³) + a² f(n/b²) + a f(n/b) + f(n)
     . . .
     = Θ(n^{log_b a}) + Σ_{i=0}^{log_b n − 1} a^i f(n/b^i)

T(n) = 4T(n/2) + cn

Level 0:  cn                       Total: cn
Level 1:  4 × c(n/2)               Total: 2cn
Level 2:  16 × c(n/4)              Total: 4cn
...

The level totals double at every level (bottom heavy): the leaves dominate.

T(n) = 4T(n/2) + cn³

Level 0:  cn³                      Total: cn³
Level 1:  4 × c(n/2)³              Total: (1/2)cn³
Level 2:  16 × c(n/4)³             Total: (1/4)cn³
...

The level totals halve at every level (top heavy): the root dominates.


T(n) = 2T(n/2) + n

Level 0:  cn                       Total: cn
Level 1:  2 × c(n/2)               Total: cn
Level 2:  4 × c(n/4)               Total: cn
...

Every level costs cn (steady state); with log₂ n levels, T(n) = Θ(n log n).

T(n) = T(n/3) + T(2n/3) + n

Even though the tree is unbalanced, every full level still costs n, and the longest root-to-leaf path has depth log_{3/2} n, so T(n) = O(n log n).

Questions, Comments and Suggestions


Question 1

For the recurrence T(n) = T(n−1) + n, what is the big-O complexity?

A) O(n)

B) O(n log n)

C) O(n²)

D) O(2ⁿ)
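Question 1 can be checked by unrolling the recurrence; with T(0) = 0 assumed as the base case, T(n) = T(n−1) + n sums to n(n+1)/2, which is Θ(n²), i.e. answer (C):

```python
def T(n):
    # T(n) = T(n-1) + n, assumed base case T(0) = 0,
    # unrolled iteratively: T(n) = n + (n-1) + ... + 1
    total = 0
    for i in range(1, n + 1):
        total += i
    return total

n = 100
print(T(n), n * (n + 1) // 2)  # both 5050: T(n) grows as Theta(n^2)
```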


Question 2

The recurrence relation that arises in relation to the complexity of binary search is

(A) T(n) = T(n/2) + k, k is a constant

(B) T(n) = 2T(n/2) + k, k is a constant

(C) T(n) = T(n/2) + log n

(D) T(n) = T(n/2) + n


Question 3

Which of the following sorting algorithms has the lowest worst-case complexity?

(A) Merge sort

(B) Bubble sort

(C) Quick sort

(D) Selection sort
