David Luebke - CS 332: Algorithms - NP Completeness



Page 1

CS 332: Algorithms

NP Completeness

Page 2

Administrivia

Homework 5 clarifications:
- Deadline extended to Thursday, Dec 7
- However, no late assignments will be accepted, so that I can discuss it in class
- You may not work with others on this one

Page 3

Review: Dynamic Programming

When applicable:
- Optimal substructure: the optimal solution to the problem consists of optimal solutions to subproblems
- Overlapping subproblems: few subproblems in total, many recurring instances of each

Basic approach:
- Build a table of solved subproblems that are used to solve larger ones
- What is the difference between memoization and dynamic programming?
- Why might the latter be more efficient?
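The memoization vs. bottom-up distinction the slide asks about can be sketched with Fibonacci numbers (an illustrative example of mine, not from the lecture):

```python
from functools import lru_cache

# Top-down memoization: plain recursion, but each subproblem is
# cached the first time it is solved.
@lru_cache(maxsize=None)
def fib_memo(n):
    if n < 2:
        return n
    return fib_memo(n - 1) + fib_memo(n - 2)

# Bottom-up dynamic programming: solve subproblems in order of size,
# with no recursion and only O(1) extra state here.
def fib_dp(n):
    if n < 2:
        return n
    prev, cur = 0, 1
    for _ in range(2, n + 1):
        prev, cur = cur, prev + cur
    return cur
```

Both run in O(n) arithmetic operations; the bottom-up version can be faster in practice because it avoids recursion overhead and visits subproblems in a cache-friendly order.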

Page 4

Review: Greedy Algorithms

A greedy algorithm always makes the choice that looks best at the moment
The hope: a locally optimal choice will lead to a globally optimal solution

For some problems, it works:
- Yes: fractional knapsack problem
- No: playing a bridge hand

Dynamic programming can be overkill; greedy algorithms tend to be easier to code

Page 5

Review: Activity-Selection Problem

The activity-selection problem: get your money's worth out of a carnival
- Buy a wristband that lets you onto any ride
- Lots of rides, starting and ending at different times
- Your goal: ride as many rides as possible

Naïve first-year CS major strategy: ride the first ride; when you get off, get on the very next ride possible; repeat until the carnival ends
What is the sophisticated third-year strategy?

Page 6

Review: Activity-Selection

Formally: given a set S of n activities
- si = start time of activity i
- fi = finish time of activity i
- Find a max-size subset A of compatible activities
- Assume the activities are sorted by finish time

What is the optimal substructure for this problem?

Page 7

Review: Activity-Selection

Formally: given a set S of n activities
- si = start time of activity i
- fi = finish time of activity i
- Find a max-size subset A of compatible activities
- Assume the activities are sorted by finish time

What is the optimal substructure for this problem?
A: If k is the activity in A with the earliest finish time, then A - {k} is an optimal solution to S' = {i ∈ S : si ≥ fk}
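The greedy algorithm this substructure suggests (repeatedly take the compatible activity with the earliest finish time) can be sketched as follows; the function name and tuple format are mine:

```python
def select_activities(activities):
    """Greedy activity selection.

    activities: list of (start, finish) pairs.
    Returns a maximum-size subset of mutually compatible activities.
    """
    chosen = []
    last_finish = float("-inf")
    # Sort by finish time, then always take the next activity that
    # starts after the last chosen one finishes.
    for start, finish in sorted(activities, key=lambda a: a[1]):
        if start >= last_finish:  # compatible with everything chosen so far
            chosen.append((start, finish))
            last_finish = finish
    return chosen
```

With the activities already sorted by finish time, the scan itself is a single O(n) pass.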

Page 8

Review: Greedy Choice Property For Activity Selection

Dynamic programming? Memoize? Yes, but...
The activity-selection problem also exhibits the greedy-choice property:
- Locally optimal choice ⇒ globally optimal solution
- Theorem 17.1: if S is an activity-selection problem sorted by finish time, then there is an optimal solution A ⊆ S such that activity 1 ∈ A

Sketch of proof: if there is an optimal solution B that does not contain activity 1, we can always replace the first activity in B with activity 1 (Why?). Same number of activities, thus optimal.

Page 9

Review: The Knapsack Problem

The 0-1 knapsack problem:
- A thief must choose among n items, where the ith item is worth vi dollars and weighs wi pounds
- Carrying at most W pounds, maximize value

A variation, the fractional knapsack problem:
- The thief can take fractions of items
- Think of the items in the 0-1 problem as gold ingots, and in the fractional problem as buckets of gold dust

What greedy-choice algorithm works for the fractional problem but not the 0-1 problem?
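The answer to the slide's question is: take items in decreasing order of value per pound. A minimal sketch (function name and (value, weight) tuple format are mine; weights assumed positive):

```python
def fractional_knapsack(items, capacity):
    """Greedy fractional knapsack.

    items: list of (value, weight) pairs with weight > 0.
    Take items in decreasing value-per-pound order, taking a
    fraction of the last item if it does not fully fit.
    """
    total_value = 0.0
    for value, weight in sorted(items, key=lambda it: it[0] / it[1], reverse=True):
        if capacity <= 0:
            break
        take = min(weight, capacity)          # whole item, or whatever fits
        total_value += value * (take / weight)
        capacity -= take
    return total_value
```

For example, with items (60, 10), (100, 20), (120, 30) and capacity 50, this takes the first two whole plus two-thirds of the third, for a value of 240. The same ratio-greedy choice fails for the 0-1 version of that instance: it would pack 60 + 100 = 160, while taking the last two items whole gives 220.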

Page 10

Homework 4

Problem 1 (detecting overlap in rectangles):
- "Sweep" a vertical line across the rectangles; keep track of line-rectangle intersections with an interval tree
- Need to sort the rectangles by Xmin, Xmax
- Then need to do O(n) inserts and deletes
- Total time: O(n lg n)

Page 11

Homework 4

Problem 2: optimizing Kruskal's algorithm
Edge weights are integers from 1 to |V|:
- Use counting sort to sort the edges in O(V + E) time = O(E) time (Why?)
- The bottleneck is now the disjoint-set union operations, so the total algorithm time is now O(E α(E, V))

Edge weights are integers from 1 to a constant W:
- Again use counting sort to sort the edges, in O(W + E) time
- Which again = O(E) time
- Which again means O(E α(E, V)) total time
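The counting-sort step can be sketched as a bucket pass over the edges (illustrative; the (u, v, w) edge format and function name are my assumptions):

```python
def counting_sort_edges(edges, max_weight):
    """Stably sort edges (u, v, w) whose integer weights lie in
    1..max_weight, in O(max_weight + E) time."""
    buckets = [[] for _ in range(max_weight + 1)]  # one list per weight
    for u, v, w in edges:
        buckets[w].append((u, v, w))
    # Concatenating the buckets in weight order yields the sorted list.
    return [e for bucket in buckets for e in bucket]
```

When max_weight is O(V) (or a constant W) and the graph is connected, E ≥ V - 1, so O(V + E) and O(W + E) both collapse to O(E), which is the point of the slide.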

Page 12

Homework 4

Problem 3: optimizing Prim's algorithm
The running time of Prim's algorithm depends on the implementation of the priority queue
Edge weights are integers from 1 to a constant W:
- Implement the queue as an array Q[1..W+1]
- The ith slot in Q holds a doubly-linked list of vertices with key = i (What does slot W+1 represent?)
- Extract-Min() takes O(1) time (How?)
- Decrease-Key() takes O(1) time (How?)
- Total asymptotic running time of Prim's: O(E)

Page 13

Homework 4

Problem 4: No time to cover here, sorry

Page 14

NP-Completeness

Some problems are intractable: as they grow large, we are unable to solve them in reasonable time

What constitutes reasonable time?
- Standard working definition: polynomial time
- On an input of size n, the worst-case running time is O(n^k) for some constant k
- In polynomial time: O(n^2), O(n^3), O(1), O(n lg n)
- Not in polynomial time: O(2^n), O(n^n), O(n!)

Page 15

Polynomial-Time Algorithms

Are some problems solvable in polynomial time?
- Of course: every algorithm we've studied provides a polynomial-time solution to some problem
- We define P to be the class of problems solvable in polynomial time

Are all problems solvable in polynomial time?
- No: Turing's "Halting Problem" is not solvable by any computer, no matter how much time is given
- Such problems are clearly intractable, and not in P

Page 16

NP-Complete Problems

The NP-Complete problems are an interesting class of problems whose status is unknown:
- No polynomial-time algorithm has been discovered for any NP-Complete problem
- No superpolynomial lower bound has been proved for any NP-Complete problem, either

We call this the P = NP question
- The biggest open problem in CS

Page 17

An NP-Complete Problem: Hamiltonian Cycles

An example of an NP-Complete problem:
- A hamiltonian cycle of an undirected graph is a simple cycle that contains every vertex
- The hamiltonian-cycle problem: given a graph G, does it have a hamiltonian cycle?
- Draw on board: dodecahedron, odd bipartite graph
- Describe a naïve algorithm for solving the hamiltonian-cycle problem. Running time?
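One naïve algorithm, sketched below, answers the slide's question: try every ordering of the vertices, giving roughly O(n! · n) time, nowhere near polynomial. The function name and edge format are mine:

```python
from itertools import permutations

def has_hamiltonian_cycle(n, edges):
    """Naive decision procedure: test every ordering of the n vertices
    (numbered 0..n-1) as a candidate cycle -- O(n! * n) time."""
    adj = {u: set() for u in range(n)}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    for order in permutations(range(n)):
        # Accept if every consecutive pair (wrapping around) is an edge.
        if all(order[(i + 1) % n] in adj[order[i]] for i in range(n)):
            return True
    return False
```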

Page 18

P and NP

As mentioned, P is the set of problems that can be solved in polynomial time

NP (nondeterministic polynomial time) is the set of problems that can be solved in polynomial time by a nondeterministic computer
- What the hell is that?

Page 19

Nondeterminism

Think of a nondeterministic computer as a computer that magically "guesses" a solution, then has to verify that it is correct
- If a solution exists, the computer always guesses it

One way to imagine it: a parallel computer that can freely spawn an infinite number of processes
- Have one processor work on each possible solution
- All processors attempt to verify that their solution works
- If a processor finds it has a working solution, the machine answers "yes"

So: NP = problems verifiable in polynomial time

Page 20

P and NP

Summary so far:
- P = problems that can be solved in polynomial time
- NP = problems for which a solution can be verified in polynomial time
- Unknown whether P = NP (most suspect not)

The hamiltonian-cycle problem is in NP:
- No known algorithm solves it in polynomial time
- Easy to verify a solution in polynomial time (How?)
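The answer to the slide's "How?" is that a proposed cycle serves as a certificate that can be checked quickly. A minimal sketch (names and formats are mine):

```python
def verify_hamiltonian_cycle(n, edges, certificate):
    """Polynomial-time verifier: `certificate` is a proposed ordering of
    the vertices 0..n-1; checking it touches each vertex and each claimed
    edge a constant number of times."""
    edge_set = {frozenset(e) for e in edges}
    return (len(certificate) == n
            and set(certificate) == set(range(n))  # every vertex exactly once
            and all(frozenset((certificate[i], certificate[(i + 1) % n])) in edge_set
                    for i in range(n)))            # consecutive pairs are edges
```

Verification is easy even though no polynomial-time algorithm is known for finding such a cycle, which is exactly the P-vs-NP asymmetry the slide describes.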

Page 21

NP-Complete Problems

We will see that NP-Complete problems are the "hardest" problems in NP:
- If any one NP-Complete problem can be solved in polynomial time...
- ...then every NP-Complete problem can be solved in polynomial time...
- ...and in fact every problem in NP can be solved in polynomial time (which would show P = NP)

Thus: solve hamiltonian-cycle in O(n^100) time and you've proved that P = NP. Retire rich & famous.

Page 22

Reduction

The crux of NP-Completeness is reducibility
Informally, a problem P can be reduced to another problem Q if any instance of P can be "easily rephrased" as an instance of Q, the solution to which provides a solution to the instance of P
- What do you suppose "easily" means?
- This rephrasing is called a transformation

Intuitively: if P reduces to Q, P is "no harder to solve" than Q

Page 23

Reducibility

An example:
- P: Given a set of Booleans, is at least one TRUE?
- Q: Given a set of integers, is their sum positive?
- Transformation: map (x1, x2, ..., xn) to (y1, y2, ..., yn), where yi = 1 if xi = TRUE and yi = 0 if xi = FALSE

Another example:
- Solving linear equations is reducible to solving quadratic equations
- How can we easily use a quadratic-equation solver to solve linear equations?
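The Boolean-to-integer reduction above is small enough to write out directly (function names are mine):

```python
def transform(booleans):
    """The reduction: yi = 1 if xi is TRUE, yi = 0 if xi is FALSE."""
    return [1 if x else 0 for x in booleans]

def sum_is_positive(integers):
    """Q: given a set of integers, is their sum positive?"""
    return sum(integers) > 0

def at_least_one_true(booleans):
    """P, solved by rephrasing each instance as an instance of Q
    and calling a solver for Q."""
    return sum_is_positive(transform(booleans))
```

The transformation itself takes linear time, so an efficient solver for Q immediately yields an efficient solver for P; that is exactly the "P is no harder than Q" intuition from the previous slide.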

Page 24

Using Reductions

If P is polynomial-time reducible to Q, we denote this P ≤p Q

Definition of NP-Complete: if P is NP-Complete, all problems R ∈ NP are reducible to P
- Formally: R ≤p P for all R ∈ NP

If P ≤p Q and P is NP-Complete, Q is also NP-Complete
- This is the key idea you should take away today

Page 25

Coming Up

Given one NP-Complete problem, we can prove many interesting problems NP-Complete:
- Graph coloring (= register allocation)
- Hamiltonian cycle
- Hamiltonian path
- Knapsack problem
- Traveling salesman
- Job scheduling with penalties
- Many, many more

Page 26

The End