
Part-F2 AVL Trees


Page 1: Part-F2 AVL Trees


Part-F2

AVL Trees

[Figure: a small AVL tree with keys 6, 3, 8, 4; two of its nodes are labeled v and z.]

Page 2: Part-F2 AVL Trees


AVL Tree Definition (§ 9.2)

AVL trees are balanced. An AVL tree is a binary search tree T such that, for every internal node v of T, the heights of the children of v differ by at most 1.

An example of an AVL tree where the heights are shown next to the nodes:

[Figure: an AVL tree with keys 44, 17, 78, 32, 50, 88, 48, 62; the height of each node is shown next to it.]

Page 3: Part-F2 AVL Trees


Balanced nodes

An internal node is balanced if the heights of its two children differ by at most 1. Otherwise, the node is unbalanced.
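As an illustration (not from the slides), here is a minimal Python sketch of this balance condition, assuming a simple node class with key/left/right fields; all names are illustrative.

# Minimal sketch of the AVL balance condition (illustrative names).

class Node:
    def __init__(self, key, left=None, right=None):
        self.key = key
        self.left = left
        self.right = right

def height(node):
    # An empty (external) subtree has height 0.
    if node is None:
        return 0
    return 1 + max(height(node.left), height(node.right))

def is_balanced(node):
    # Balanced: the heights of the two children differ by at most 1.
    return abs(height(node.left) - height(node.right)) <= 1

def is_avl(node):
    # Every internal node of an AVL tree is balanced (BST ordering assumed).
    if node is None:
        return True
    return is_balanced(node) and is_avl(node.left) and is_avl(node.right)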

Page 4: Part-F2 AVL Trees


Height of an AVL Tree

Fact: The height of an AVL tree storing n keys is O(log n).

Proof: Let us bound n(h), the minimum number of internal nodes of an AVL tree of height h. We easily see that n(1) = 1 and n(2) = 2. For h > 2, an AVL tree of height h contains the root node, one AVL subtree of height h-1 and another of height h-2. That is, n(h) = 1 + n(h-1) + n(h-2). Knowing n(h-1) > n(h-2), we get n(h) > 2 n(h-2). So n(h) > 2 n(h-2), n(h) > 4 n(h-4), n(h) > 8 n(h-6), ... and, by induction, n(h) > 2^i n(h-2i).

Picking i so that h - 2i equals 1 or 2 and solving the base case, we get: n(h) > 2^{h/2 - 1}.

Taking logarithms: h < 2 log n(h) + 2. Thus the height of an AVL tree is O(log n).

[Figure: the minimum AVL trees of heights 1 and 2, giving n(1) and n(2), and the recursive structure with subtrees of heights h-1 and h-2.]
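The recurrence can also be checked numerically; the following small Python sketch (illustrative, not part of the slides) computes n(h) directly and verifies the bound used above.

def n(h):
    # n(h): minimum number of internal nodes in an AVL tree of height h.
    if h == 1:
        return 1
    if h == 2:
        return 2
    # Root plus minimal AVL subtrees of heights h-1 and h-2.
    return 1 + n(h - 1) + n(h - 2)

for h in range(1, 16):
    assert n(h) > 2 ** (h / 2 - 1), h   # the bound n(h) > 2^{h/2 - 1}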

Page 5: Part-F2 AVL Trees


Insertion in an AVL Tree

Insertion is as in a binary search tree: it is always done by expanding an external node.

Example: insert 54.

[Figure: the AVL tree with keys 44, 17, 78, 32, 50, 88, 48, 62 before the insertion, and the same tree after 54 is inserted below 62. The new node is labeled w, and nodes on the path from w to the root are labeled x, y, z (also a, b, c). After the insertion the tree is no longer balanced.]

Page 6: Part-F2 AVL Trees


Names of important nodes

w: the newly inserted node (the insertion itself follows the binary search tree method). The heights of some nodes in T might increase after inserting a node. Those nodes must be on the path from w to the root; other nodes are not affected.

z: the first unbalanced node we encounter in going up from w toward the root.

y: the child of z with the higher height. y must be an ancestor of w (why? because z became unbalanced only after inserting w).

x: the child of y with the higher height. The height of the sibling of x is smaller than that of x (otherwise, the height of y could not have increased). x must be an ancestor of w.

See the figure in the previous slide.

Page 7: Part-F2 AVL Trees


Algorithm restructure(x):
Input: A node x of a binary search tree T that has both a parent y and a grandparent z.
Output: Tree T after a trinode restructuring.

1. Let (a, b, c) be the list, in increasing order, of the nodes x, y, and z. Let T0, T1, T2, T3 be a left-to-right (inorder) listing of the four subtrees of x, y, and z not rooted at x, y, or z.
2. Replace the subtree rooted at z with a new subtree rooted at b.
3. Let a be the left child of b and let T0 and T1 be the left and right subtrees of a, respectively.
4. Let c be the right child of b and let T2 and T3 be the left and right subtrees of c, respectively.
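A Python sketch of this algorithm is given below. It assumes nodes carry key, left, right, and parent fields, and that the caller makes b the new root when z had no parent; it is an illustrative sketch, not the textbook's implementation, and it does not update any stored heights.

def restructure(x):
    # Trinode restructuring of x, its parent y, and its grandparent z.
    y = x.parent
    z = y.parent

    # Step 1: a, b, c are x, y, z in increasing key order; T0..T3 are the
    # inorder listing of their four subtrees not rooted at x, y, or z.
    a, b, c = sorted((x, y, z), key=lambda node: node.key)
    t0, t1, t2, t3 = [t for node in (a, b, c)
                      for t in (node.left, node.right)
                      if t is not x and t is not y and t is not z]

    # Step 2: replace the subtree rooted at z with one rooted at b.
    p = z.parent
    if p is None:
        b.parent = None              # caller should make b the new root
    elif p.left is z:
        p.left, b.parent = b, p
    else:
        p.right, b.parent = b, p

    # Step 3: a becomes the left child of b, with subtrees T0 and T1.
    b.left, a.parent = a, b
    a.left, a.right = t0, t1

    # Step 4: c becomes the right child of b, with subtrees T2 and T3.
    b.right, c.parent = c, b
    c.left, c.right = t2, t3

    # Reattach the parent pointers of the non-empty subtrees.
    for t, parent in ((t0, a), (t1, a), (t2, c), (t3, c)):
        if t is not None:
            t.parent = parent
    return b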

Page 8: Part-F2 AVL Trees

Restructuring (as Single Rotations)

Single rotations:

[Figure: the two single-rotation cases, (a = z, b = y, c = x) and (a = x, b = y, c = z); in each, a single rotation makes b = y the root of the restructured subtree, with the subtrees T0, T1, T2, T3 reattached in left-to-right order.]

Page 9: Part-F2 AVL Trees

Restructuring (as Double Rotations)

Double rotations:

[Figure: the two double-rotation cases, (a = z, b = x, c = y) and (a = y, b = x, c = z); in each, a double rotation makes b = x the root of the restructured subtree, with the subtrees T0, T1, T2, T3 reattached in left-to-right order.]

Page 10: Part-F2 AVL Trees

Insertion Example, continued

[Figure: after inserting 54 the tree is unbalanced at z = 78, with y = 50 and x = 62 (heights shown next to the nodes, subtrees labeled T0, T1, T2, T3). restructure(x) makes 62 the root of that subtree, with children 50 and 78, and the whole tree is balanced again.]

Page 11: Part-F2 AVL Trees


Theorem: After an insertion, one restructure operation is enough to ensure that the whole tree is balanced.

Proof: Left to the reader.

Page 12: Part-F2 AVL Trees


Removal in an AVL Tree

Removal begins as in a binary search tree, which means the node removed will become an empty external node. Its parent, w, may cause an imbalance.

Example:

[Figure: the AVL tree before the deletion of 32 and after the deletion; after removing 32, node 44 becomes unbalanced.]
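For illustration, a rough Python sketch of this first phase (ordinary binary search tree deletion) is shown below. It uses a recursive style without parent pointers or height fields, so it only shows the BST part of the operation, not the AVL bookkeeping; names are illustrative.

def remove_bst(root, key):
    # Standard BST deletion; returns the (possibly new) subtree root.
    if root is None:
        return None
    if key < root.key:
        root.left = remove_bst(root.left, key)
    elif key > root.key:
        root.right = remove_bst(root.right, key)
    else:
        # At most one child: splice the node out.
        if root.left is None:
            return root.right
        if root.right is None:
            return root.left
        # Two children: copy the inorder successor's key here,
        # then delete the successor from the right subtree.
        succ = root.right
        while succ.left is not None:
            succ = succ.left
        root.key = succ.key
        root.right = remove_bst(root.right, succ.key)
    return root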

Page 13: Part-F2 AVL Trees


Rebalancing after a Removal

Let z be the first unbalanced node encountered while travelling up the tree from w. Also, let y be the child of z with the larger height, and let x be the child of y defined as follows:

If one child of y is taller than the other, choose x as the taller child of y.

If both children of y have the same height, select x to be the child of y on the same side as y (i.e., if y is the left child of z, then x is the left child of y; and if y is the right child of z, then x is the right child of y).

Page 14: Part-F2 AVL Trees


Rebalancing after a Removal

We perform restructure(x) to restore balance at z. As this restructuring may upset the balance of another node higher in the tree, we must continue checking for balance until the root of T is reached.

[Figure: after the deletion, z = 44 is unbalanced, y = 62 is its taller child, and x = 78 (labeled a, b, c); restructure(x) makes 62 the new root, with children 44 and 78, and the tree is balanced again.]
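A Python sketch of this rebalancing loop is given below, assuming the height, is_balanced, and restructure helpers sketched earlier, plus a tree object with a root field; all names are illustrative.

def rebalance_after_removal(tree, w):
    # Walk from w toward the root, restructuring every unbalanced node.
    node = w
    while node is not None:
        if not is_balanced(node):
            z = node
            # y: the child of z with the larger height.
            y = z.left if height(z.left) > height(z.right) else z.right
            # x: the taller child of y; on a tie, the child on the
            # same side as y (so that a single rotation is used).
            if height(y.left) > height(y.right):
                x = y.left
            elif height(y.left) < height(y.right):
                x = y.right
            else:
                x = y.left if y is z.left else y.right
            node = restructure(x)
            if node.parent is None:
                tree.root = node     # the restructuring replaced the root
        node = node.parent
    # Unlike insertion, checking continues all the way up to the root.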

Page 15: Part-F2 AVL Trees

Unbalanced after restructuring

[Figure: a larger example (subtree heights h = 3, 4, 5 shown, nodes labeled w, x, y, z) in which the subtree around z is balanced by restructure(x), but an ancestor higher up in the tree is still unbalanced, so the upward checking must continue.]


Page 17: Part-F2 AVL Trees


Running Times for AVL Trees

A single restructure is O(1), using a linked-structure binary tree.

find is O(log n): the height of the tree is O(log n), and no restructuring is needed.

insert is O(log n): the initial find is O(log n), and restructuring up the tree while maintaining heights is O(log n).

remove is O(log n): the initial find is O(log n), and restructuring up the tree while maintaining heights is O(log n).

Page 18: Part-F2 AVL Trees


Part-G1

Merge Sort

[Figure: a merge-sort tree for the sequence 7 2 9 4, sorted to 2 4 7 9.]

Page 19: Part-F2 AVL Trees


Divide-and-Conquer (§ 10.1.1)

Divide-and-conquer is a general algorithm design paradigm:

Divide: divide the input data S into two disjoint subsets S1 and S2.

Recur: solve the subproblems associated with S1 and S2.

Conquer: combine the solutions for S1 and S2 into a solution for S.

The base cases for the recursion are subproblems of size 0 or 1.

Merge-sort is a sorting algorithm based on the divide-and-conquer paradigm.

Like heap-sort, it uses a comparator and has O(n log n) running time.

Unlike heap-sort, it does not use an auxiliary priority queue, and it accesses data in a sequential manner (suitable for sorting data on a disk).

Page 20: Part-F2 AVL Trees


Merge-Sort (§ 10.1)

Merge-sort on an input sequence S with n elements consists of three steps:

Divide: partition S into two sequences S1 and S2 of about n/2 elements each.

Recur: recursively sort S1 and S2.

Conquer: merge S1 and S2 into a unique sorted sequence.

Algorithm mergeSort(S, C)
  Input: sequence S with n elements, comparator C
  Output: sequence S sorted according to C
  if S.size() > 1
    (S1, S2) ← partition(S, n/2)
    mergeSort(S1, C)
    mergeSort(S2, C)
    S ← merge(S1, S2)
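For comparison, a short Python sketch of the same scheme on plain lists is shown below (illustrative only; it uses the default < comparison instead of an explicit comparator C, and relies on the merge routine sketched on the next slide).

def merge_sort(s):
    # Base case: sequences of size 0 or 1 are already sorted.
    if len(s) <= 1:
        return s
    mid = len(s) // 2
    s1, s2 = s[:mid], s[mid:]                      # divide
    return merge(merge_sort(s1), merge_sort(s2))   # recur, then conquer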

Page 21: Part-F2 AVL Trees


Merging Two Sorted Sequences

The conquer step of merge-sort consists of merging two sorted sequences A and B into a sorted sequence S containing the union of the elements of A and B. Merging two sorted sequences, each with n/2 elements and implemented by means of a doubly linked list, takes O(n) time.

Algorithm merge(A, B)
  Input: sequences A and B with n/2 elements each
  Output: sorted sequence of A ∪ B
  S ← empty sequence
  while ¬A.isEmpty() ∧ ¬B.isEmpty()
    if A.first().element() < B.first().element()
      S.insertLast(A.remove(A.first()))
    else
      S.insertLast(B.remove(B.first()))
  while ¬A.isEmpty()
    S.insertLast(A.remove(A.first()))
  while ¬B.isEmpty()
    S.insertLast(B.remove(B.first()))
  return S
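The same step on plain Python lists (an illustrative sketch, index-based rather than the remove/insertLast sequence operations above):

def merge(a, b):
    # Merge two sorted lists a and b into one sorted list.
    s, i, j = [], 0, 0
    while i < len(a) and j < len(b):
        if a[i] < b[j]:
            s.append(a[i]); i += 1
        else:
            s.append(b[j]); j += 1
    s.extend(a[i:])    # append whatever remains of a
    s.extend(b[j:])    # append whatever remains of b
    return s

With the merge_sort sketch from the previous slide, merge_sort([7, 2, 9, 4, 3, 8, 6, 1]) returns [1, 2, 3, 4, 6, 7, 8, 9].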

Page 22: Part-F2 AVL Trees


Merge-Sort Tree

An execution of merge-sort is depicted by a binary tree:

Each node represents a recursive call of merge-sort and stores the unsorted sequence before the execution (and its partition), and the sorted sequence at the end of the execution.

The root is the initial call; the leaves are calls on subsequences of size 0 or 1.

[Figure: the merge-sort tree for the sequence 7 2 9 4, whose root stores 7 2 9 4 before the call and 2 4 7 9 after.]

Page 23: Part-F2 AVL Trees

Execution Example

Partition

[Figure: the merge-sort tree for the input sequence 7 2 9 4 3 8 6 1 (final result 1 2 3 4 6 7 8 9), with the initial partition highlighted.]

Page 24: Part-F2 AVL Trees

Execution Example (cont.)

Recursive call, partition

[Figure: the merge-sort tree after this step.]

Page 25: Part-F2 AVL Trees

Execution Example (cont.)

Recursive call, partition

[Figure: the merge-sort tree after this step.]

Page 26: Part-F2 AVL Trees

Execution Example (cont.)

Recursive call, base case

[Figure: the merge-sort tree after this step.]

Page 27: Part-F2 AVL Trees

Execution Example (cont.)

Recursive call, base case

[Figure: the merge-sort tree after this step.]

Page 28: Part-F2 AVL Trees

Execution Example (cont.)

Merge

[Figure: the merge-sort tree after this step.]

Page 29: Part-F2 AVL Trees

Execution Example (cont.)

Recursive call, …, base case, merge

[Figure: the merge-sort tree after this step.]

Page 30: Part-F2 AVL Trees

Execution Example (cont.)

Merge

[Figure: the merge-sort tree after this step.]

Page 31: Part-F2 AVL Trees

Execution Example (cont.)

Recursive call, …, merge, merge

[Figure: the merge-sort tree after this step.]

Page 32: Part-F2 AVL Trees

Execution Example (cont.)

Merge

[Figure: the completed merge-sort tree, with the final merge producing 1 2 3 4 6 7 8 9.]

Page 33: Part-F2 AVL Trees

Analysis of Merge-Sort

The height h of the merge-sort tree is O(log n): at each recursive call we divide the sequence in half.

The overall amount of work done at the nodes of depth i is O(n): we partition and merge 2^i sequences of size n/2^i, and we make 2^(i+1) recursive calls.

Thus, the total running time of merge-sort is O(n log n).

depth   #seqs   size
0       1       n
1       2       n/2
i       2^i     n/2^i
…       …       …
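Written out as a short calculation (a sketch consistent with the table above):

% Work per level of the merge-sort tree, and the total running time.
\[
  \text{depth } i:\quad 2^{i}\ \text{sequences of size}\ \frac{n}{2^{i}}
  \quad\Longrightarrow\quad 2^{i}\cdot O\!\left(\tfrac{n}{2^{i}}\right) = O(n).
\]
\[
  \text{total:}\quad \sum_{i=0}^{O(\log n)} O(n) \;=\; O(n \log n).
\]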

Page 34: Part-F2 AVL Trees

Summary of Sorting Algorithms

Algorithm       Time         Notes
selection-sort  O(n^2)       slow, in-place; for small data sets (< 1K)
insertion-sort  O(n^2)       slow, in-place; for small data sets (< 1K)
heap-sort       O(n log n)   fast, in-place; for large data sets (1K - 1M)
merge-sort      O(n log n)   fast, sequential data access; for huge data sets (> 1M)