
Computer Algorithms

Submitted by: Rishi Jethwa

Suvarna Angal

Contents

Red-Black Trees: basics, properties, rotations, insertions

Union-Find algorithms: linked list representation, union by rank, path compression

Red-Black Trees

A red-black tree is a binary tree with one extra bit of storage per node: its color, which can be either RED or BLACK.

It is the same data structure as a binary search tree, with the only difference that the tree is kept approximately balanced.

Red-Black Trees A binary tree is a red-black tree if it satisfies the following rules:

Every node is either red or black.

The root is always black.

Leaf nodes are black.

If a node is red, then both its children are black.

Every path from a given node down to a leaf contains the same number of black nodes.

Red-Black Trees Properties of red-black trees

Suppose every path contains 10 black nodes; then the minimum height of the tree is 10 and the maximum height is at most 19. In general, the maximum height is at most one less than twice the minimum height.

The maximum path length is O(log n). Lookups are fast, O(log n). Insertion and deletion are not a large overhead either; their complexity is also O(log n).

Red-Black Trees

Rotation of red-black trees: a rotation is a local structural change to the tree. Insertion and deletion modify the tree, and the result may violate the red-black properties; to restore these properties, rotations are performed.

A rotation is either a left rotation or a right rotation.

Red-Black Trees

Diagram: left and right rotations. In the right diagram, a < b < d < c < e.
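
A minimal sketch of the two rotations in Python (the Node class and the tree object with a root field are illustrative assumptions, not from the slides); the pointer updates mirror the diagram above.

class Node:
    def __init__(self, key, color="RED"):
        self.key = key
        self.color = color      # "RED" or "BLACK"
        self.left = None
        self.right = None
        self.parent = None

def left_rotate(tree, x):
    # Rotate the edge between x and its right child y to the left:
    # y's left subtree becomes x's right subtree, and y takes x's place.
    y = x.right
    x.right = y.left
    if y.left is not None:
        y.left.parent = x
    y.parent = x.parent
    if x.parent is None:
        tree.root = y           # x was the root
    elif x is x.parent.left:
        x.parent.left = y
    else:
        x.parent.right = y
    y.left = x
    x.parent = y

def right_rotate(tree, y):
    # Mirror image of left_rotate: y's left child x moves up.
    x = y.left
    y.left = x.right
    if x.right is not None:
        x.right.parent = y
    x.parent = y.parent
    if y.parent is None:
        tree.root = x
    elif y is y.parent.left:
        y.parent.left = x
    else:
        y.parent.right = x
    x.right = y
    y.parent = x

Either rotation preserves the binary-search-tree ordering (a < b < d < c < e in the diagram) and takes O(1) time.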

Red-Black Trees

Insertion in red-black trees.

Diagram: the original tree containing 11, 2, 14, 1, 7, 15, 5, 8, and the same tree after the number 4 is added.

Red-Black Trees The idea of insertion is to traverse the tree to see where the new key fits (it is inserted as a leaf), and then traverse back up the tree to repair any violated properties.

Coloring rule during insertion: look at the father (parent) node. If it is red, the uncle node is also red, and the grandfather (grandparent) node is black, then recolor the father and uncle black and the grandfather red.

Red-Black Trees

Diagram depicting rule for insertion mentioned in the

previous slide.
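
A minimal sketch of just this recoloring case in Python, assuming the illustrative Node class from the rotation sketch; a complete insertion fixup also needs rotation cases when the uncle is black, which are not shown here.

def recolor_red_uncle(z):
    # Case where z's father and uncle are red and the grandfather is black:
    # recolor father and uncle black, grandfather red, and continue fixing
    # up from the grandfather.
    father = z.parent
    grandfather = father.parent
    uncle = grandfather.right if father is grandfather.left else grandfather.left
    if father.color == "RED" and uncle is not None and uncle.color == "RED":
        father.color = "BLACK"
        uncle.color = "BLACK"
        grandfather.color = "RED"
        return grandfather      # a violation may now exist at the grandfather
    return None                 # this case does not apply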

Red-Black Trees

Insertion example

Diagram: after 4 is added to the tree, the red-black properties are violated.

Red-Black Trees Insertion example

Diagram: Case 1.

Red-Black Trees

Insertion example

Diagram: Case 2.

Red-Black Trees Insertion example

Diagram: Case 3.

Union Operation

Diagram: example sets containing the numbers 1, 2, 3, 4.

Union: merge two sets and create a new set.

Initially, each number is a set by itself.

Starting from n singleton sets, we gradually merge them into larger sets.

After n-1 union operations we get a single set of n numbers.

The union operation is used for merging sets in Kruskal's algorithm.

Find operation

Every set has a name; thus Find(number) returns the name of the set containing that number.

A perfect application is in Kruskal's algorithm: when a new edge is considered, Find tells whether its endpoints already belong to the same set, and if so the edge is discarded.

Linked List Representation

Represent each set using a linked list

First object in each linked list serves as the set’s name

Each list maintains pointers head, to the representative, and tail, to the last object in the list.

Linked List Representation

Diagram: a list containing 2, 3, 6 and a list containing 5, 7, with extra pointers from each node pointing back to the head of its list.

When uniting these 2 sets, the pointers for nodes 5 and 7 will have to be made to point to 2.

Drawbacks:

With this representation, Find takes constant time, O(1).

But Union takes linear time, because after a union all of the absorbed list's pointers have to be redirected to the new head.
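
A minimal sketch of this linked-list representation in Python (the class and function names are illustrative). It shows why Find is O(1) while Union is linear: every node of the absorbed list must have its head pointer redirected.

class ListNode:
    def __init__(self, value):
        self.value = value
        self.next = None
        self.head = None            # extra pointer back to the set's head

class LinkedListSet:
    def __init__(self, value):
        node = ListNode(value)
        node.head = node
        self.head = node            # first object serves as the set's name
        self.tail = node

def find(node):
    # O(1): every node points directly at the head of its list.
    return node.head

def union(a, b):
    # Append b's list onto a's and redirect every node in b to a's head.
    a.tail.next = b.head
    a.tail = b.tail
    node = b.head
    while node is not None:
        node.head = a.head          # this loop is the linear-time cost
        node = node.next
    return a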

Develop new data structure

Diagram: the set with head 2 containing 5, 3, 7, 6, alongside the union of 1 and 4. Instead of relinking every node, we add a single pointer from the new set's head to the old one.

Combination of the 2 sets

Diagram: the combined structure after the union of the two sets.

While combining, we can add a pointer either from 1 to 2 or from 2 to 1, but we choose the one from 1 to 2.

This gives us a balanced structure: the maximum number of hops to the head remains 2.

Union by Rank Algorithm The root of the tree with fewer nodes is

made to point to the root of the tree with more nodes.

For each node, a rank is maintained that is an upper bound on the height of the node.

In union by rank, the root with smaller rank is made to point to the root with larger rank during a UNION operation.
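
A minimal union-by-rank sketch in Python, using illustrative parent and rank dictionaries; find here simply walks up to the root (path compression is added later).

parent = {}
rank = {}

def make_set(x):
    parent[x] = x
    rank[x] = 0

def find(x):
    # Walk up parent pointers until we reach the root (the set's name).
    while parent[x] != x:
        x = parent[x]
    return x

def union(x, y):
    # The root with smaller rank is made to point to the root with larger rank.
    root_x, root_y = find(x), find(y)
    if root_x == root_y:
        return
    if rank[root_x] < rank[root_y]:
        parent[root_x] = root_y
    elif rank[root_x] > rank[root_y]:
        parent[root_y] = root_x
    else:
        parent[root_y] = root_x
        rank[root_x] += 1           # equal ranks: pick one root, bump its rank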

Longest path length under unions:

When we always take singleton sets and keep merging them into one set, we get a star structure.

Time taken: n finds and n-1 unions. This is the best case.

Worst case

Merge sets of equal path lengths.

Diagram: merging two trees of equal path length built from nodes 1 through 8.

Here, the path length becomes 3.

For n-1 unions and n finds:

Unions: O(n).

Finds: path lengths can grow larger, so O(n log n).

Total: O(n log n).

Path Compression

For union by rank, the path length followed by a find is 1 in the best case and log n in the worst case.

Algorithm for Path Compression

1st walk: find the name of the set by walking up until we reach the root.

2nd walk: Retrace the path and join all the elements along the path to the root using another pointer.

This enables future finds to take shorter paths.
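
A sketch of this two-walk find in Python, reusing the illustrative parent dictionary from the union-by-rank sketch above.

def find_compress(x):
    # 1st walk: follow parent pointers up to the root (the set's name).
    root = x
    while parent[root] != root:
        root = parent[root]
    # 2nd walk: retrace the path and point every node on it directly at the root.
    while parent[x] != root:
        parent[x], x = root, parent[x]
    return root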

Path compression

Diagram: a chain of nodes 1, 2, 3 below the root before the find, and the same nodes all pointing directly at the root after the find.

Before Find: each node has a pointer to its parent.

After Find: each node points directly to the root.

Amortized Analysis

Time for n-1 unions and n finds: O(n log* n).

log* n is a very slowly growing function.

n              log n          log* n
2^2            2              1
2^2^2          4              2
2^2^2^2        2^2^2          3
2^2^2^2^2      2^2^2^2        4
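
A small illustrative Python function matching the table's reading of log* n: the number of times log2 must be applied before the value drops to 2 or below.

import math

def log_star(n):
    count = 0
    while n > 2:
        n = math.log2(n)
        count += 1
    return count

# log_star(2**2) == 1, log_star(2**2**2) == 2, log_star(2**2**2**2) == 3,
# matching the rows of the table above.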

Comparisons of functions

Diagram: comparison of the growth of f(n) = n, log n, log* n, and α(n).

Inverse Ackermann’s function

α(n) is the inverse of a very quickly growing function A_k(j), defined for k >= 0 and j >= 1 by:

A_k(j) = j + 1 if k = 0
A_k(j) = A_{k-1}^(j+1)(j) if k >= 1, i.e., A_{k-1} applied j + 1 times to j.

This is a recursive definition.

Calculations

A_3(2) = A_2(A_2(A_2(2)))

When k = 1: A_1(j) = 2j + 1.

For k = 2: A_2(j) = A_1(A_1(... A_1(j))) = 2^(j+1) (j + 1) - 1.

Using the above analysis: A_3(1) = 2047, and A_4(1) = A_3(2047), which is at least A_2(2047) = 2^(2047+1) (2047 + 1) - 1.
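
A small illustrative Python function implementing the recursive definition of A_k(j) above; it reproduces A_1(j) = 2j + 1 and A_3(1) = 2047 (A_4(1) is already far too large to compute this way).

def A(k, j):
    # A_0(j) = j + 1; for k >= 1, A_k(j) applies A_{k-1} to j a total of j + 1 times.
    if k == 0:
        return j + 1
    result = j
    for _ in range(j + 1):
        result = A(k - 1, result)
    return result

assert A(1, 5) == 2 * 5 + 1
assert A(2, 1) == 2 ** 2 * 2 - 1        # = 7
assert A(3, 1) == 2047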

Nature of function

From the analysis we can see that A_k is a very fast-growing function.

α(n) is the inverse of A_k.

This inverse grows very slowly, so the run time is O(n α(n)).