
Page 1:

Dr. Shahriar Bijani
Shahed University

Feb. 2014

Page 2:

Kleinberg and Tardos, Algorithm Design; CSE 5311, M. Kumar, Spring 2007.

Chapter 4, Computer Algorithms, Horowitz et al., 2001.

Chapter 16, Introduction to Algorithms, CLRS, 2001.


Page 3:


The problem: we are required to find a feasible solution that either maximizes or minimizes a given objective function.

It is easy to determine a feasible solution but not necessarily an optimal solution.

The greedy method solves this problem in stages: at each stage, a decision is made by considering inputs in an order determined by a selection procedure, which may be based on an optimization measure.


Page 4:


An optimization problem is one in which you want to find, not just a solution, but the best solution

A “greedy algorithm” sometimes works well for optimization problems

A greedy algorithm works in phases. At each phase:
◦ You take the best you can get right now, without regard for future consequences.
◦ You hope that by choosing a local optimum at each step, you will end up at a global optimum.


Page 5:


Suppose you want to count out a certain amount of money, using the fewest possible bills and coins

A greedy algorithm for this would be: at each step, take the largest possible bill or coin that does not overshoot the target amount.
◦ Example: To make $6.39, you can choose:
   a $5 bill
   a $1 bill, to make $6
   a 25¢ coin, to make $6.25
   a 10¢ coin, to make $6.35
   four 1¢ coins, to make $6.39

For US money, the greedy algorithm always gives the optimum solution
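A minimal Python sketch of this greedy change-making strategy (the function name and the cents-based representation are illustrative choices, not from the slides):

import heapq  # not needed here; standard library only

def greedy_change(amount_cents, denominations=(500, 100, 25, 10, 5, 1)):
    # Greedily take the largest denomination that does not overshoot.
    picked = []
    for d in sorted(denominations, reverse=True):
        count, amount_cents = divmod(amount_cents, d)
        picked.extend([d] * count)
    return picked

print(greedy_change(639))   # [500, 100, 25, 10, 1, 1, 1, 1]: a $5 bill, a $1 bill, 25¢, 10¢, four 1¢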


Page 6:


In some (fictional) monetary system, coins come in 1 Rial, 7 Rial, and 10 Rial denominations

Using a greedy algorithm to count out 15 Rials, you would get:
◦ a 10 Rial piece
◦ five 1 Rial pieces, for a total of 15 Rials
◦ this requires six coins

A better solution would be to use two 7 Rial pieces and one 1 Rial piece
◦ this only requires three coins

This greedy algorithm results in a solution, but not in an optimal solution


Page 7:


You have to run nine jobs, with running times of 3, 5, 6, 10, 11, 14, 15, 18, and 20 minutes

You have three processors on which you can run these jobs

You decide to do the longest-running jobs first, on whatever processor is available

[Figure: Gantt chart of the longest-job-first assignment on processors P1, P2, P3; e.g., P2 runs the 18-, 11-, and 6-minute jobs and finishes last, at 35 minutes]

Time to completion: 18 + 11 + 6 = 35 minutes. This solution isn't bad, but we might be able to do better.
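A short Python sketch of this longest-job-first heuristic, keeping each processor's finish time in a min-heap so the next job always goes to whichever processor frees up first (function and variable names are illustrative):

import heapq

def longest_job_first(jobs, num_processors=3):
    # Assign jobs in decreasing order of length to the processor that is free first.
    finish_times = [0] * num_processors
    heapq.heapify(finish_times)
    for job in sorted(jobs, reverse=True):
        earliest_free = heapq.heappop(finish_times)
        heapq.heappush(finish_times, earliest_free + job)
    return max(finish_times)

print(longest_job_first([3, 5, 6, 10, 11, 14, 15, 18, 20]))   # 35, matching the slide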


Page 8:


What would be the result if you ran the shortest job first? Again, the running times are 3, 5, 6, 10, 11, 14, 15, 18, and 20 minutes.

That wasn’t such a good idea; time to completion is now 6 + 14 + 20 = 40 minutes

Note, however, that the greedy algorithm itself is fast:
◦ All we had to do at each stage was pick the minimum or maximum.

[Figure: Gantt chart of the shortest-job-first assignment; the last processor runs the 6-, 14-, and 20-minute jobs and finishes at 40 minutes]


Page 9:


This solution is clearly optimal (why?)
Clearly, there are other optimal solutions (why?)
How do we find such a solution?

◦ One way: Try all possible assignments of jobs to processors
◦ Unfortunately, this approach can take exponential time

Better solutions do exist:

[Figure: an optimal assignment in which every processor finishes by 34 minutes, e.g. P1: 20 + 14, P2: 18 + 11 + 5, P3: 15 + 10 + 6 + 3]


Page 10:


The Huffman encoding algorithm is a greedy algorithm: you always pick the two smallest frequencies to combine.

Average bits/char: 0.22*2 + 0.12*3 + 0.24*2 + 0.06*4 + 0.27*2 + 0.09*4 = 2.42

The Huffman algorithm finds an optimal solution

[Figure: Huffman tree for frequencies A = 22, B = 12, C = 24, D = 6, E = 27, F = 9, with internal nodes 15, 27, 46, 54, 100]

Resulting codes: A = 00, B = 100, C = 01, D = 1010, E = 11, F = 1011


Page 11:


Procedure Huffman_Encoding(S, f)
Input: S (a string of characters) and f (an array of frequencies)
Output: T (the Huffman tree for S)

1. insert all characters into a heap H according to their frequencies;
2. while H is not empty do
3.   if H contains only one character x then
4.     x ← root(T);
5.   else
6.     z ← ALLOCATE_NODE();
7.     x ← left[T, z] ← EXTRACT_MIN(H);
8.     y ← right[T, z] ← EXTRACT_MIN(H);
9.     f_z ← f_x + f_y;
10.    INSERT(H, z);
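An executable Python sketch of the same greedy idea, built on the standard-library heapq module (variable names are mine; the exact 0/1 labels may differ from the slide's codes, but the code lengths agree):

import heapq

def huffman_codes(freq):
    # freq: dict mapping symbol -> frequency. Returns dict mapping symbol -> bit string.
    heap = [(f, i, sym) for i, (sym, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    counter = len(heap)                      # unique tie-breaker for heap entries
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)    # the two smallest frequencies
        f2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, counter, (left, right)))
        counter += 1
    codes = {}
    def walk(node, prefix=""):
        if isinstance(node, tuple):          # internal node: recurse left and right
            walk(node[0], prefix + "0")
            walk(node[1], prefix + "1")
        else:                                # leaf: record the accumulated code
            codes[node] = prefix or "0"
    walk(heap[0][2])
    return codes

print(huffman_codes({'A': 22, 'B': 12, 'C': 24, 'D': 6, 'E': 27, 'F': 9}))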

Page 12:


Complexity of Huffman algorithm

Building a heap in step 1 takes O(n) time.
The extract-min operations (steps 7 and 8) and the insertion (step 10) on H take O(log n) time each.
Therefore steps 2 through 10 take O(n log n) time.
Thus the overall complexity of the algorithm is O(n log n).

Page 13:

Assume a thief is robbing a store containing objects of various values and weights. Her bag can hold only W pounds.

The 0-1 knapsack problem: must take all or nothing

The fractional knapsack problem: can take all or part of the item


Page 14:

[Figure: 0-1 knapsack example with capacity 50 and items of weight 10, 20, 30 worth $60, $100, $120. Taking the 20 lb and 30 lb items gives $220 (the 0-1 optimum); the other full-item combinations give $160 and $180. Taking the 10 lb and 20 lb items plus 20/30 of the 30 lb item gives $240 in the fractional problem.]

Page 15:

Given n items of sizes w1, w2, ..., wn and values p1, p2, ..., pn, and a knapsack of capacity C, the problem is to find x1, x2, ..., xn that maximize

  Σ_{i=1..n} pi xi

subject to

  Σ_{i=1..n} wi xi ≤ C

Page 16:

Consider yi = pi / wi

What is yi? What is the solution?

The fractional problem has the greedy-choice property; the 0-1 problem does not.
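A minimal Python sketch of the resulting greedy rule for the fractional problem: sort items by value density yi = pi/wi and take the densest items first, splitting the last one if it does not fit (names are illustrative):

def fractional_knapsack(items, capacity):
    # items: list of (value, weight). Returns (total value, list of fractions x_i).
    order = sorted(range(len(items)), key=lambda i: items[i][0] / items[i][1], reverse=True)
    x = [0.0] * len(items)
    total = 0.0
    remaining = capacity
    for i in order:
        value, weight = items[i]
        take = min(weight, remaining)     # all of the item, or whatever still fits
        x[i] = take / weight
        total += value * x[i]
        remaining -= take
        if remaining == 0:
            break
    return total, x

# The earlier example: capacity 50, items worth $60, $100, $120 weighing 10, 20, 30 lb
print(fractional_knapsack([(60, 10), (100, 20), (120, 30)], 50))   # total value 240.0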

Page 17:


Activity On Edge (AOE) networks:
◦ Tasks (activities): a0, a1, ...
◦ Events: v0, v1, ...

[Figure: AOE network with events v0..v8 (v0 = start, v8 = finish) and activities a0 = 6, a1 = 4, a2 = 5, a3 = 1, a4 = 1, a5 = 2, a6 = 9, a7 = 7, a8 = 4, a9 = 2, a10 = 4]

Some definitions: predecessor, successor, immediate predecessor, immediate successor.

Page 18:


A critical path is a path of longest length from start to finish; here it is (v0, v1, v4, v7, v8).

[Figure: the same AOE network with the critical path v0 → v1 → v4 → v7 → v8 highlighted]

Length of the critical path: 6 + 1 + 7 + 4 = 18 (maximum)

Page 19:


The earliest time an activity ai can occur is the length of the longest path from the start vertex v0 to ai's start vertex. (Example: the earliest time activity a7 can occur is 7.)

We denote this time as early(i) for activity ai. Thus early(6) = early(7) = 7.

[Figure: the AOE network with each activity annotated early(i)/?, e.g. a6 and a7 annotated 7/?; the finish event is reached at time 18]

Page 20:


The latest time, late(i), of an activity ai is defined to be the latest time the activity may start without increasing the project duration.

Ex: early(5) = 5 & late(5) = 8; early(7) = 7 & late(7) = 7

[Figure: the AOE network annotated with early(i)/late(i) pairs for every activity, e.g. 6/6, 7/7, 5/8, 16/16, 14/14]

late(5) = 18 - 4 - 4 - 2 = 8
late(7) = 18 - 4 - 7 = 7

Page 21:


A critical activity is an activity for which early(i) = late(i).

The difference between late(i) and early(i) is a measure of how critical an activity is.

To solve the AOE problem: calculate the earliest times, then calculate the latest times, then find the critical path(s).

Page 22:


Let activity ai be represented by edge (u, v).
◦ early(i) = earliest[u]
◦ late(i) = latest[v] - duration of activity ai

We compute the times in two stages: a forward stage and a backward stage.

The forward stage:
◦ Step 1: earliest[0] = 0
◦ Step 2: earliest[j] = max { earliest[i] + duration of (i, j) : i ∈ P(j) }, where P(j) is the set of immediate predecessors of j.

Page 23:


The backward stage:
◦ Step 1: latest[n-1] = earliest[n-1]
◦ Step 2: latest[j] = min { latest[i] - duration of (j, i) : i ∈ S(j) }, where S(j) is the set of vertices adjacent from vertex j.

latest[8] = earliest[8] = 18
latest[6] = min{ latest[8] - 2 } = 16
latest[7] = min{ latest[8] - 4 } = 14
latest[4] = min{ latest[6] - 9, latest[7] - 7 } = 7
latest[1] = min{ latest[4] - 1 } = 6
latest[2] = min{ latest[4] - 1 } = 6
latest[5] = min{ latest[7] - 4 } = 10
latest[3] = min{ latest[5] - 2 } = 8
latest[0] = min{ latest[1] - 6, latest[2] - 4, latest[3] - 5 } = 0
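A small Python sketch of these two passes, with the example network written out as an edge list (u, v, duration) reconstructed from the figure; vertices are assumed to be numbered in topological order:

def cpm_times(n, edges):
    # edges: list of (u, v, duration). Returns (earliest, latest) per event vertex.
    succ = [[] for _ in range(n)]
    pred = [[] for _ in range(n)]
    for u, v, d in edges:
        succ[u].append((v, d))
        pred[v].append((u, d))
    # Forward stage: earliest[j] = max over predecessors i of earliest[i] + duration(i, j)
    earliest = [0] * n
    for j in range(n):
        for i, d in pred[j]:
            earliest[j] = max(earliest[j], earliest[i] + d)
    # Backward stage: latest[j] = min over successors i of latest[i] - duration(j, i)
    latest = [earliest[n - 1]] * n
    for j in reversed(range(n)):
        for i, d in succ[j]:
            latest[j] = min(latest[j], latest[i] - d)
    return earliest, latest

edges = [(0, 1, 6), (0, 2, 4), (0, 3, 5), (1, 4, 1), (2, 4, 1), (3, 5, 2),
         (4, 6, 9), (4, 7, 7), (5, 7, 4), (6, 8, 2), (7, 8, 4)]
print(cpm_times(9, edges))   # earliest[8] = 18 and latest[0] = 0, as computed above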

Page 24:


[Figure: the full AOE network, and beside it the subgraph containing only the critical activities a0, a3, a6, a7, a9, a10 on the path from v0 through v1, v4, v6, v7 to v8]

Activity   Early   Late   L - E   Critical
a0         0       0      0       Yes
a1         0       2      2       No
a2         0       3      3       No
a3         6       6      0       Yes
a4         4       6      2       No
a5         5       8      3       No
a6         7       7      0       Yes
a7         7       7      0       Yes
a8         7       10     3       No
a9         16      16     0       Yes
a10        14      14     0       Yes

Page 25:


The longest-path (critical path) problem can be solved by the critical path method (CPM):
Step 1: Find a topological ordering.
Step 2: Find the critical path.

Page 26:


A salesman must visit every city (starting from city A), and wants to cover the least possible distance
◦ He can revisit a city (and reuse a road) if necessary

He does this by using a greedy algorithm: He goes to the next nearest city from wherever he is

From A he goes to B. From B he goes to D. This is not going to result in a shortest path!
The best result he can get now will be ABDBCE, at a cost of 16.
An actual least-cost path from A is ADBCE, at a cost of 14.

[Figure: weighted graph on vertices A, B, C, D, E with edge costs 2, 3, 3, 4, 4, 4]


Page 27:


A greedy algorithm typically makes (approximately) n choices for a problem of size n
◦ (The first or last choice may be forced)

Hence the expected running time is O(n · choice(n)), where choice(n) is the cost of making a choice among n objects.
◦ Counting: must find the largest usable coin from among k sizes of coin (k is a constant), an O(k) = O(1) operation; therefore, coin counting is O(n).
◦ Huffman: must sort n values before making n choices; therefore, Huffman is O(n log n) + O(n) = O(n log n).


Page 28:


In a shortest-paths problem, we are given a weighted, directed graph G = (V,E), with weights assigned to each edge in the graph. The weight of the path p = (v0, v1, v2, …, vk) is the sum of the weights of its edges:

weight(p) = Σ_{i=1..k} weight(v_{i-1}, v_i),   for the path v0 → v1 → v2 → ... → v_{k-1} → v_k

The shortest-path weight from u to v is given by
d(u, v) = min { weight(p) : p is a path from u to v }, if there are one or more paths from u to v
d(u, v) = ∞, otherwise

Page 29:


Given G = (V, E), find the shortest path from a given vertex u ∈ V to every vertex v ∈ V (u ≠ v). For each vertex v ∈ V in the weighted directed graph, d[v] represents the distance from u to v.

Initially, d[v] = 0 when u = v; d[v] = ∞ if (u, v) is not an edge; d[v] = weight of edge (u, v) if (u, v) exists.

Dijkstra's algorithm: at every step, we compute d[y] = min { d[y], d[x] + w(x, y) }, where x, y ∈ V.

Dijkstra's algorithm is based on the greedy principle because at every step we pick the path of least weight.

Page 30:


[Figure: weighted directed graph on vertices u, a, b, c, d, e, f, g, h used as the running Dijkstra example]

Page 31:


[Figure: the same weighted directed graph as on the previous slide]

Step #   Vertex to    Distance to vertex                        Unmarked vertices
         be marked    u   a   b   c   d   e   f    g    h
0        u            0   1   5   ∞   9   ∞   ∞    ∞    ∞       a,b,c,d,e,f,g,h
1        a            0   1   5   3   9   ∞   ∞    ∞    ∞       b,c,d,e,f,g,h
2        c            0   1   5   3   7   ∞   12   ∞    ∞       b,d,e,f,g,h
3        b            0   1   5   3   7   8   12   ∞    ∞       d,e,f,g,h
4        d            0   1   5   3   7   8   12   11   ∞       e,f,g,h
5        e            0   1   5   3   7   8   12   11   9       f,g,h
6        h            0   1   5   3   7   8   12   11   9       f,g
7        g            0   1   5   3   7   8   12   11   9       f
8        f            0   1   5   3   7   8   12   11   9       --

Page 32:


Procedure Dijkstra_Single_Source_Shortest_Path(V, E, u)
Input: G = (V, E), the weighted directed graph, and u the source vertex
Output: for each vertex v, d[v] is the length of the shortest path from u to v
1. mark vertex u;
2. d[u] ← 0;
3. for each unmarked vertex v ∈ V do
4.   if edge (u, v) exists then d[v] ← weight(u, v);
5.   else d[v] ← ∞;
6. while there exists an unmarked vertex do
7.   let v be an unmarked vertex such that d[v] is minimal;
8.   mark vertex v;
9.   for all edges (v, x) such that x is unmarked do
10.    if d[x] > d[v] + weight[v, x] then
11.      d[x] ← d[v] + weight[v, x];
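A compact Python version of the same procedure, using a binary heap with lazy deletion instead of the linear scan in step 7 (function and variable names are illustrative):

import heapq

def dijkstra(graph, source):
    # graph: dict mapping u -> list of (v, weight). Returns dict of shortest distances from source.
    dist = {source: 0}
    heap = [(0, source)]
    marked = set()
    while heap:
        d, v = heapq.heappop(heap)        # unmarked vertex with minimal d[v]
        if v in marked:
            continue                      # stale heap entry, vertex already marked
        marked.add(v)
        for x, w in graph.get(v, ()):
            if x not in marked and d + w < dist.get(x, float('inf')):
                dist[x] = d + w           # relax edge (v, x)
                heapq.heappush(heap, (dist[x], x))
    return dist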

Page 33:


Complexity of Dijkstra's algorithm:
 Steps 1 and 2 take O(1) time.
 Steps 3 to 5 take O(V) time.
 The vertices are arranged in a heap in order of their path lengths from u.
 Updating the length of a path takes O(log V) time.
 There are V iterations, and at most E updates.
 Therefore the algorithm takes O((E + V) log V) time.

Page 34:

Spanning tree of a connected graph G: a connected acyclic subgraph of G that includes all of G’s vertices

Minimum spanning tree of a weighted, connected graph G: a spanning tree of G of the minimum total weight

The number of spanning trees of a complete graph is n^(n-2) (n = number of nodes).

Example: [Figure: weighted graph on vertices a, b, c, d with edge weights 1, 2, 3, 4, 6; the minimum spanning tree uses the edges of weight 1, 2, 3 (total weight 6), shown next to another spanning tree using the edges of weight 1, 4, 6 (total weight 11)]

Page 35:

Start with tree T1 consisting of one (any) vertex and “grow” tree one vertex at a time to produce MST through a series of expanding subtrees T1, T2, …, Tn

On each iteration, construct Ti+1 from Ti by adding the vertex not in Ti that is closest to those already in Ti (this is a “greedy” step!)

Stop when all vertices are included
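A brief Python sketch of Prim's algorithm, keeping a heap of candidate edges that leave the current tree (a lazy variant of the priority-queue implementations analyzed two slides below; names are illustrative):

import heapq

def prim_mst(graph, start):
    # graph: dict mapping u -> list of (weight, v) for a connected undirected graph.
    # Returns the MST edges as (weight, u, v).
    in_tree = {start}
    candidates = [(w, start, v) for w, v in graph[start]]
    heapq.heapify(candidates)
    mst = []
    while candidates and len(in_tree) < len(graph):
        w, u, v = heapq.heappop(candidates)   # greedy step: cheapest edge leaving the tree
        if v in in_tree:
            continue                          # both endpoints already inside the tree
        in_tree.add(v)
        mst.append((w, u, v))
        for w2, x in graph[v]:
            if x not in in_tree:
                heapq.heappush(candidates, (w2, v, x))
    return mst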

Page 36:

[Figure: Prim's algorithm on the example graph, shown step by step as the tree grows by one vertex per iteration]

Page 37:

Proof by induction that this construction actually yields an MST (CLRS, Ch. 23.1).

Efficiency:
◦ O(n²) for the weight-matrix representation of the graph and an array implementation of the priority queue
◦ O(m log n) for the adjacency-list representation of a graph with n vertices and m edges and a min-heap implementation of the priority queue

Page 38:


The algorithm maintains a collection VS of disjoint sets of vertices

Each set W in VS represents a connected set of vertices forming a spanning tree

Initially, each vertex is in a set by itself in VS.
Edges are chosen from E in order of increasing cost; we consider each edge (v, w) in turn, with v, w ∈ V.
If v and w are already in the same set (say W) of VS, we discard the edge.
If v and w are in distinct sets W1 and W2 (meaning v and/or w is not yet in T), we merge W1 with W2 and add (v, w) to T.

Page 39:


Procedure MCST_G(V, E) (Kruskal's Algorithm)
Input: An undirected graph G(V, E) with a cost function c on the edges
Output: T, the minimum cost spanning tree for G
1. T ← ∅;
2. VS ← ∅;
3. for each vertex v ∈ V do
4.   VS ← VS ∪ {v};
5. sort the edges of E in nondecreasing order of weight;
6. while |VS| > 1 do
7.   choose (v, w), an edge in E of lowest cost;
8.   delete (v, w) from E;
9.   if v and w are in different sets W1 and W2 in VS then
10.    W1 ← W1 ∪ W2;
11.    VS ← VS - W2;
12.    T ← T ∪ {(v, w)};
13. return T
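A runnable Python sketch of the same procedure, with the disjoint sets of VS kept in a simple union-find structure (illustrative names; the edge list is the example used on the next slide):

def kruskal_mst(vertices, edges):
    # edges: list of (cost, v, w). Returns the list of MST edges.
    parent = {v: v for v in vertices}        # each vertex starts in its own set

    def find(v):                             # representative of v's set, with path halving
        while parent[v] != v:
            parent[v] = parent[parent[v]]
            v = parent[v]
        return v

    mst = []
    for cost, v, w in sorted(edges):         # consider edges in nondecreasing order of cost
        rv, rw = find(v), find(w)
        if rv != rw:                         # different sets: merge them and keep the edge
            parent[rw] = rv
            mst.append((cost, v, w))
    return mst

edges = [(1, 'A', 'D'), (1, 'C', 'D'), (2, 'C', 'F'), (2, 'E', 'F'), (3, 'A', 'F'),
         (3, 'A', 'B'), (4, 'B', 'E'), (5, 'D', 'E'), (6, 'B', 'C')]
print(kruskal_mst('ABCDEF', edges))          # accepts (A,D), (C,D), (C,F), (E,F), (A,B)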

Page 40:


Consider the example graph shown earlier. The edges in nondecreasing order of weight are:

[(A,D),1], [(C,D),1], [(C,F),2], [(E,F),2], [(A,F),3], [(A,B),3], [(B,E),4], [(D,E),5], [(B,C),6]

Edge     Action   Sets in VS                       Spanning Tree, T
--       --       [{A},{B},{C},{D},{E},{F}]        {}
(A,D)    merge    [{A,D},{B},{C},{E},{F}]          {(A,D)}
(C,D)    merge    [{A,C,D},{B},{E},{F}]            {(A,D),(C,D)}
(C,F)    merge    [{A,C,D,F},{B},{E}]              {(A,D),(C,D),(C,F)}
(E,F)    merge    [{A,C,D,E,F},{B}]                {(A,D),(C,D),(C,F),(E,F)}
(A,F)    reject   [{A,C,D,E,F},{B}]                {(A,D),(C,D),(C,F),(E,F)}
(A,B)    merge    [{A,B,C,D,E,F}]                  {(A,D),(A,B),(C,D),(C,F),(E,F)}
(B,E)    reject   [{A,B,C,D,E,F}]                  {(A,D),(A,B),(C,D),(C,F),(E,F)}
(D,E)    reject   [{A,B,C,D,E,F}]                  {(A,D),(A,B),(C,D),(C,F),(E,F)}
(B,C)    reject   [{A,B,C,D,E,F}]                  {(A,D),(A,B),(C,D),(C,F),(E,F)}

[Figure: the example graph on vertices A–F and the equivalent minimum cost spanning tree]

Page 41:


Steps 1 through 4 take O(V) time.
Step 5 sorts the edges in nondecreasing order in O(E log E) time.
Steps 6 through 13 take O(E) time.
The total time for the algorithm is therefore O(E log E).
The edges can be maintained in a heap data structure with the property E[PARENT(i)] ≤ E[i]; this property can be used to extract the data elements in nondecreasing order.
Construct a heap of the edge weights; the edge with lowest cost is then at the root. During each step of edge removal, delete the root (the minimum element) from the heap and rearrange the heap.
Using a heap reduces the time taken because at every step we only pick up the minimum (root) element rather than sorting all the edge weights.

Page 42:


Consider a network of computers connected through bidirectional links. Each link is associated with a positive cost: the cost of sending a message on each link.

This network can be represented by an undirected graph with positive costs on each edge.

In bidirectional networks we can assume that the cost of sending a message on a link does not depend on the direction.

Suppose we want to broadcast a message to all the computers from an arbitrary computer.

The cost of the broadcast is the sum of the costs of links used to forward the message.

Page 43:


[Figure: A Local Area Network modeled as a weighted graph on computers A–F, and the equivalent graph with its MCST]

Page 44:

Problem formulation:
◦ Given a set of n activities, S = {a1, a2, ..., an}, that require exclusive use of a common resource, find the largest possible set of nonoverlapping activities (also called mutually compatible). For example, scheduling the use of a classroom.
◦ Assume that ai needs the resource during the period [si, fi), a half-open interval, where si = start time of the activity and fi = finish time of the activity.

Note: we could have many other objectives:
◦ Schedule the room for the longest total time.
◦ Maximize the income from rental fees.

Page 45:

Assume the following set S of activities, sorted by their finish time; find a maximum-size mutually compatible set.

i    1   2   3   4   5   6   7   8   9
si   1   2   4   1   5   8   9  11  13
fi   3   5   7   8   9  10  11  14  16

Page 46:

i    1   2   3   4   5   6   7   8   9
si   1   2   4   1   5   8   9  11  13
fi   3   5   7   8   9  10  11  14  16

Page 47:

Define Si,j = { ak ∈ S : fi ≤ sk < fk ≤ sj }
◦ the activities that start after ai finishes and finish before aj starts
Activities in Si,j are compatible with:
◦ ai and aj
◦ every aw that finishes not later than ai, and every aw that starts not earlier than aj
Add the following [imaginary] activities: a0 = [-∞, 0) and an+1 = [∞, ∞ + 1)
Hence, S = S0,n+1 and the range of Si,j is 0 ≤ i, j ≤ n+1

Page 48:

Assume that activities are sorted by monotonically increasing finish time:
◦ i.e., f0 ≤ f1 ≤ f2 ≤ ... ≤ fn < fn+1
Then, Si,j = ∅ for i ≥ j.
Proof: any ak ∈ Si,j would satisfy fi ≤ sk < fk ≤ sj < fj, so fi < fj, which contradicts i ≥ j.
Therefore, we only need to worry about Si,j where 0 ≤ i < j ≤ n+1.

Page 49:

Suppose that a solution to Si,j includes ak. We have 2 sub-problems:
◦ Si,k (start after ai finishes, finish before ak starts)
◦ Sk,j (start after ak finishes, finish before aj starts)
The solution to Si,j is
(solution to Si,k) ∪ {ak} ∪ (solution to Sk,j)
Since ak is in neither sub-problem, and the sub-problems are disjoint,
|solution to S| = |solution to Si,k| + 1 + |solution to Sk,j|

Page 50:

Let Ai,j = optimal solution to Si,j .

So Ai,j = Ai,k ∪ {ak} ∪ Ak,j, assuming:
◦ Si,j is nonempty, and
◦ we know ak.
Hence,

c[i,j]: the number of activities in a maximum-size subset of mutually compatible activities of Si,j

c[i,j] = 0                                          if Si,j = ∅
c[i,j] = max { c[i,k] + c[k,j] + 1 : ak ∈ Si,j }    if Si,j ≠ ∅
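A direct memoized Python rendering of this recurrence (less efficient than the greedy choice proved on the next slide, but it shows what c[i,j] computes; activities are assumed given as (start, finish) pairs sorted by finish time, and the names are illustrative):

from functools import lru_cache

def max_compatible(activities):
    # activities: list of (s, f) sorted by finish time. Returns c[0, n+1].
    s = [float('-inf')] + [a[0] for a in activities] + [float('inf')]
    f = [0] + [a[1] for a in activities] + [float('inf')]
    n = len(activities)

    @lru_cache(maxsize=None)
    def c(i, j):
        best = 0
        for k in range(i + 1, j):
            if f[i] <= s[k] and f[k] <= s[j]:   # a_k lies in S_ij
                best = max(best, c(i, k) + c(k, j) + 1)
        return best

    return c(0, n + 1)

acts = [(1, 3), (2, 5), (4, 7), (1, 8), (5, 9), (8, 10), (9, 11), (11, 14), (13, 16)]
print(max_compatible(acts))   # 4, e.g. activities a1, a3, a6, a8 from the earlier table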

Page 51:

Theorem: Let Si,j ≠ ∅, and let am be the activity in Si,j with the earliest finish time: fm = min { fk : ak ∈ Si,j }. Then:

1. am is used in some maximum-size subset of mutually compatible activities of Si,j

2. Si,m = ∅, so that choosing am leaves Sm,j as the only nonempty subproblem.

Page 52:

Page 53:


Page 54:

Θ(sorting) + Θ(n) = Θ(n log n)
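A minimal iterative Python sketch of the greedy activity selector that achieves this bound: sort by finish time, then make one linear pass, keeping each activity that starts no earlier than the last chosen finish (a sketch with illustrative names, not the slides' exact pseudocode):

def greedy_activity_selector(activities):
    # activities: list of (start, finish). Returns a maximum-size mutually compatible subset.
    chosen = []
    last_finish = float('-inf')
    for s, f in sorted(activities, key=lambda a: a[1]):   # Θ(n log n) sort by finish time
        if s >= last_finish:              # compatible with everything chosen so far
            chosen.append((s, f))
            last_finish = f
    return chosen

acts = [(1, 3), (2, 5), (4, 7), (1, 8), (5, 9), (8, 10), (9, 11), (11, 14), (13, 16)]
print(greedy_activity_selector(acts))     # [(1, 3), (4, 7), (8, 10), (11, 14)]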

Page 55:


A board has a certain number of coins on it.
A robot starts in the upper-left corner and walks to the bottom right-hand corner.
◦ The robot can only move in two directions: right and down.
◦ The robot collects coins as it goes.

You want to collect all the coins using the minimum number of robots

Example: Do you see a greedy algorithm for doing this?
Does the algorithm guarantee an optimal solution?
◦ Can you prove it?
◦ Can you find a counterexample?