Lecture 12: Revision Lecture, Dr John Levine, 52236 Algorithms and Complexity, March 27th 2006

Page 1

Lecture 12: Revision Lecture

Dr John Levine

52236 Algorithms and Complexity

March 27th 2006

Page 2

Lecture List

• Lecture 1: Why algorithms are so important

• Lecture 2: Big-Oh notation, a first look at different complexity classes (log, linear, log-linear, quadratic, polynomial, exponential, factorial)

• Lecture 3: simple search in a list of items is O(n) with unordered data but O(log n) with ordered data, and can be even faster with a good indexing scheme and parallel processors

• Lecture 4: sorting: random sort is O(n!), naïve sort is O(n²), bubble sort is O(n²), quicksort is O(n log n)

Page 3

Lecture List

• Lecture 5: more on sorting: why comparison sort is O(n log n), doing better than O(n log n) by not doing comparisons (e.g. bucket sort)

• Lecture 6: harder search: how to represent a problem in terms of states and moves

• Lecture 7: uninformed search through states using an agenda: depth-first search and breadth-first search

• Lecture 8: making it smart: informed search using heuristics; how to use heuristic search without losing optimality – the A* algorithm

Page 4

Lecture List

• Lecture 9: game tree search (Mark Dunlop’s notes), minimax and alpha-beta pruning

• Lecture 10: class review, ADT for lookup table, hash tables

• Lecture 11: Dijkstra’s algorithm

• Lecture 12: revision, NP-hard problems

Page 5

Syllabus: Algorithmic Complexity

Introduction to Algorithmic Complexity: basic algorithmic classification, with examples; the order notation (Big-Oh); elementary complexity and estimation of run times; the tyranny of growth.

• Covered specifically in Lectures 1 and 2 and Practical 1, and also throughout the course

• Examinable

• Typical exam question: say what Big-Oh is, then give an analysis of some code fragments
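
To make the code-fragment part of that question concrete, here is a small illustrative sketch in Python (the fragments are invented for this revision note, not taken from a past paper); each function's comment states its complexity class:

```python
def sum_list(xs):
    # One pass over n items: O(n)
    total = 0
    for x in xs:
        total += x
    return total

def count_pairs(xs):
    # Two nested loops over n items: O(n^2)
    count = 0
    for i in range(len(xs)):
        for j in range(i + 1, len(xs)):
            if xs[i] == xs[j]:
                count += 1
    return count

def halve_until_one(n):
    # The loop variable halves each time: O(log n)
    steps = 0
    while n > 1:
        n //= 2
        steps += 1
    return steps
```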

Page 6

Syllabus: Algorithmic Complexity

• Big-Oh notation (see Wikipedia article)

• The major classes of complexity: O(1), O(log n), O(n), O(n log n), O(n²), O(n³), …, O(aⁿ), O(n!)

• Analysis of nested loops and operations on arrays

• Why complexity is important: the tyranny of growth (see the sketch after this list)

• Easy problems (P) are solvable in polynomial time

• Hard problems (NP) have no known polynomial-time algorithms; the best algorithms known take exponential time

• It is thought that P ≠ NP but no-one has proved this
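
To illustrate the "tyranny of growth" bullet above, a tiny Python sketch (purely illustrative) that tabulates how the major classes grow as n doubles:

```python
import math

# Print how the major complexity classes grow with n.
for n in (10, 20, 40, 80):
    print(f"n={n:3d}  n log n={n * math.log2(n):8.0f}  "
          f"n^2={n**2:6d}  2^n={2**n:25d}  n!={math.factorial(n):.2e}")
```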

Page 7

Syllabus: Searching and Sorting

Searching and Sorting: the complexity of a range of techniques, including the divide and conquer approach; the relative complexity of searching and sorting algorithms; the sorting algorithms covered will include bubble sort, insertion sort, merge sort and quick sort; searching, including sequential search and the binary chop; hashing.

• Partially covered by Lectures 3 and 4

• We’ll do a quick recap of the four sorting algorithms

• We’ll look at hash tables today

• Examinable

Page 8

Syllabus: Sorting

• Insertion sort: insert items one at a time into the correct position in a sorted list, aka naïve sort, O(n²)

• Bubble sort: pass through list swapping adjacent pairs, repeat until sorted, O(n²)

• Quicksort: choose pivot as estimate of middle item, partition into two sets (> pivot, < pivot), then partition each set using the same process and repeat until the list is sorted, O(n log n) on average

• Merge sort: divide into two sets, sort each set, then merge the two sorted sets, O(n log n)
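
As a minimal sketch (illustrative Python, not the course's reference code), here are two of the four: insertion sort at O(n²) and merge sort at O(n log n):

```python
def insertion_sort(xs):
    # Insert each item into its correct position in the sorted prefix: O(n^2).
    xs = list(xs)
    for i in range(1, len(xs)):
        item = xs[i]
        j = i - 1
        while j >= 0 and xs[j] > item:
            xs[j + 1] = xs[j]
            j -= 1
        xs[j + 1] = item
    return xs

def merge_sort(xs):
    # Divide into two halves, sort each, then merge the sorted halves: O(n log n).
    if len(xs) <= 1:
        return list(xs)
    mid = len(xs) // 2
    left, right = merge_sort(xs[:mid]), merge_sort(xs[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]

# insertion_sort([3, 1, 2]) == merge_sort([3, 1, 2]) == [1, 2, 3]
```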

Page 9

Syllabus: Sorting

• The best we can do by comparisons is O(n log n): n items implies n! possible unsorted sequences, each comparison at best divides this by 2, hence we need at least log(n!) comparisons to identify the sequence, and log(n!) ≈ n log n

• We don’t have to use comparisons: if we are sorting items which have a natural finite sequence, we can use something like bucket sort, which is O(n) (see the sketch after this list)

• There are numerous sorting algorithms: see the Wikipedia article for many more
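
Following up on the bucket sort point, a minimal counting-sort style sketch in Python, assuming the keys are small non-negative integers (the "natural finite sequence" condition); it never compares two items, so it runs in O(n + k) for n items and k possible keys:

```python
def counting_sort(xs, max_key):
    # One bucket (counter) per possible key value 0..max_key: no comparisons.
    counts = [0] * (max_key + 1)
    for x in xs:
        counts[x] += 1
    result = []
    for key, count in enumerate(counts):
        result.extend([key] * count)
    return result

# counting_sort([3, 1, 4, 1, 5], max_key=5) -> [1, 1, 3, 4, 5]
```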

Page 10

Syllabus: Searching

• Linear search through an (unordered) list of items is O(n), and if n is very large we want to do better

• Binary search is O(log n); search using a binary tree to store the data is O(log n)

• Using a hash table to hold the data, the search is O(1) on average, but finding a really good hash function is hard!
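
A short illustrative sketch of the last two points, assuming the data sits in a sorted Python list for the O(log n) case and in a dict (Python's built-in hash table) for the expected O(1) case:

```python
def binary_search(sorted_xs, target):
    # Halve the search interval each step: O(log n) comparisons.
    lo, hi = 0, len(sorted_xs) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_xs[mid] == target:
            return mid
        elif sorted_xs[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1  # not found

# Hash-table lookup: Python dicts give expected O(1) access.
index = {"ann": 21, "bob": 34}
print(binary_search([2, 5, 7, 11, 13], 11))  # 3
print(index.get("bob"))                      # 34
```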

Page 11

Syllabus: Game Trees

Binary Trees revisited: implementations by array; expression trees; binary tree implementation of sorted list; access times; algorithms covered include traversal, searching, balancing and deletion.

• Already covered in Programming Techniques

• We did state space search and game trees instead

• Game trees, minimax search and alpha-beta pruning are examinable; the rest is not (because last year’s students didn’t do it)

• Typical exam question: map out a simple game tree and say what move the computer should make next

Page 12

Syllabus: Game Trees

• Game trees: mapping out the states reached by each possible move by each player in turn

• Colouring the game tree: finding out what move the computer should make

• Minimax search: assume your opponent chooses the move to minimise the evaluation function, and you choose the move which forces the evaluation function to be maximised

• Alpha-beta pruning: don’t search branches of the game tree when you already know they are poor
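
A minimal sketch of minimax with alpha-beta pruning in Python (the tree shape and leaf scores are invented for illustration; leaves hold evaluation-function values and the maximising player moves at the root):

```python
def alphabeta(node, maximising, alpha=float("-inf"), beta=float("inf")):
    # Leaves are numbers (evaluation scores); internal nodes are lists of children.
    if isinstance(node, (int, float)):
        return node
    if maximising:
        value = float("-inf")
        for child in node:
            value = max(value, alphabeta(child, False, alpha, beta))
            alpha = max(alpha, value)
            if alpha >= beta:  # remaining siblings cannot affect the result
                break          # prune
        return value
    else:
        value = float("inf")
        for child in node:
            value = min(value, alphabeta(child, True, alpha, beta))
            beta = min(beta, value)
            if alpha >= beta:
                break
        return value

# A two-ply tree: the maximiser picks the branch whose minimised value is best.
tree = [[3, 5], [2, 9], [0, 7]]
print(alphabeta(tree, maximising=True))  # 3
```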

Page 13

Syllabus: Graph Algorithms

Graphs revisited: directed and undirected graphs; representations of graphs; basic graph algorithms; applications of graphs to real world problems (for example telecommunications, transportation systems, dependencies between objects).

• To be covered next week

• Representation of graphs, Dijkstra’s algorithm

• Application to the tube map problem

• Examinable

• Typical exam question: show application of Dijkstra’s algorithm to a real world problem

Page 14

Syllabus: Graph Algorithms

• Dijkstra’s algorithm: from the initial vertex A, find all vertices reachable in a single step.

• Add the closest reachable vertex, together with the path taken to reach it from A, to the set S.

• From this new vertex, find all vertices reachable in a single step.

• Now add to S the reachable vertex not already in S with the shortest total distance from A.

• Repeat until all vertices are in S.

• Simple analysis: O(V²)
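
For revision, a compact Python sketch of the same idea using a heap-based priority queue; the example graph is invented, and this variant runs in O((V + E) log V) rather than the O(V²) of the simple array-scan analysis above:

```python
import heapq

def dijkstra(graph, start):
    # graph: dict mapping vertex -> list of (neighbour, edge_weight) pairs.
    # Returns the shortest known distance from start to every reachable vertex.
    dist = {start: 0}
    queue = [(0, start)]
    while queue:
        d, u = heapq.heappop(queue)
        if d > dist.get(u, float("inf")):
            continue  # stale entry; u was already settled with a shorter path
        for v, w in graph.get(u, []):
            if d + w < dist.get(v, float("inf")):
                dist[v] = d + w
                heapq.heappush(queue, (d + w, v))
    return dist

graph = {"A": [("B", 4), ("C", 1)],
         "C": [("B", 2), ("D", 5)],
         "B": [("D", 1)]}
print(dijkstra(graph, "A"))  # {'A': 0, 'B': 3, 'C': 1, 'D': 4}
```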

Page 15

Syllabus: NP-Hard Problems

Permutations and Combinations: branch and bound; greedy algorithms; backtracking search; typical problems, for example the TSP and related problems, the knapsack problem.

• Mostly covered by our consideration of state space search and games, and also in Topics 1 and 2

• I will touch briefly on this area in my summary and revision lecture in Week 10

• Not directly examinable – see the Week 10 lecture for what you need to know in this area

Page 16

Syllabus: NP-Hard Problems

• What do you do with an NP-hard problem?

• Many of these are optimisation problems: many solutions are feasible, but very few are optimal

• Apply a polynomial-time algorithm and make do with a less-than-perfect solution

• Run an improvement algorithm and wait for as long as you possibly can (you always have a feasible solution to hand); see the sketch after this list

• Branch-and-bound: similar to alpha-beta pruning

• Most instances are not hard: you may be able to find some structure in the instance which you can exploit
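
To illustrate the "construct a feasible solution in polynomial time, then improve it" strategy on an NP-hard problem, here is a hedged Python sketch for a tiny TSP instance (the distance matrix is invented; nearest-neighbour construction followed by a simple 2-opt style improvement pass, not a course-prescribed algorithm):

```python
def tour_length(tour, dist):
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def nearest_neighbour(dist):
    # Greedy polynomial-time construction: always visit the closest unvisited city.
    unvisited = set(range(1, len(dist)))
    tour = [0]
    while unvisited:
        last = tour[-1]
        nxt = min(unvisited, key=lambda c: dist[last][c])
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

def two_opt(tour, dist):
    # Improvement pass: keep a feasible tour at all times, reverse segments that shorten it.
    improved = True
    while improved:
        improved = False
        for i in range(1, len(tour) - 1):
            for j in range(i + 1, len(tour)):
                candidate = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]
                if tour_length(candidate, dist) < tour_length(tour, dist):
                    tour, improved = candidate, True
    return tour

dist = [[0, 2, 9, 10],
        [2, 0, 6, 4],
        [9, 6, 0, 8],
        [10, 4, 8, 0]]
tour = two_opt(nearest_neighbour(dist), dist)
print(tour, tour_length(tour, dist))
```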