CSE373: Data Structures & Algorithms Lecture 13: Hash Collisions Aaron Bauer Winter 2014


Page 1: CSE373: Data Structures & Algorithms Lecture 13: Hash Collisions

CSE373: Data Structures & Algorithms
Lecture 13: Hash Collisions

Aaron Bauer, Winter 2014

Page 2: CSE373: Data Structures & Algorithms Lecture 13: Hash Collisions

2CSE373: Data Structures & Algorithms

Announcements

• Homework 3 due at 11 p.m. (or later with late days)
• Homework 4 has been posted (due Feb. 20)
  – Can be done with a partner
  – Partner selection due Feb. 12
  – Partner form linked from homework

Winter 2014

Page 3: CSE373: Data Structures & Algorithms Lecture 13: Hash Collisions

3CSE373: Data Structures & Algorithms

Hash Tables: Review

• Aim for constant-time (i.e., O(1)) find, insert, and delete
  – "On average" under some reasonable assumptions

• A hash table is an array of some fixed size
  – But growable as we'll see

Winter 2014

[Diagram: the client hands a key E to the hash table library, which hashes it to an int table index in the range 0 to TableSize - 1; when two keys map to the same index (a collision), collision resolution takes over.]

Page 4: CSE373: Data Structures & Algorithms Lecture 13: Hash Collisions

4CSE373: Data Structures & Algorithms

One expert suggestion

• int result = 17;
• for each field f
  – compute int fieldHashcode:
    • boolean: (f ? 1 : 0)
    • byte, char, short, int: (int) f
    • long: (int) (f ^ (f >>> 32))
    • float: Float.floatToIntBits(f)
    • double: Double.doubleToLongBits(f), then treat as a long (above)
    • Object: object.hashCode( )
  – result = 31 * result + fieldHashcode
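A minimal Java sketch of this recipe for a hypothetical class (the class and its fields are invented purely for illustration):

public final class Employee {
    private final boolean active;
    private final int id;
    private final long serial;
    private final double salary;
    private final String name;      // an Object-typed field

    public Employee(boolean active, int id, long serial, double salary, String name) {
        this.active = active;
        this.id = id;
        this.serial = serial;
        this.salary = salary;
        this.name = name;
    }

    @Override
    public int hashCode() {
        int result = 17;
        result = 31 * result + (active ? 1 : 0);                      // boolean
        result = 31 * result + id;                                    // int
        result = 31 * result + (int) (serial ^ (serial >>> 32));      // long
        long bits = Double.doubleToLongBits(salary);                  // double -> long bits, then as long
        result = 31 * result + (int) (bits ^ (bits >>> 32));
        result = 31 * result + (name == null ? 0 : name.hashCode());  // Object
        return result;
    }
}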

Winter 2014

Page 5: CSE373: Data Structures & Algorithms Lecture 13: Hash Collisions

5CSE373: Data Structures & Algorithms

Collision resolution

Collision: When two keys map to the same location in the hash table

We try to avoid collisions, but the number of keys generally exceeds the table size

So hash tables should support collision resolution
– Ideas?

Winter 2014

Page 6: CSE373: Data Structures & Algorithms Lecture 13: Hash Collisions

6CSE373: Data Structures & Algorithms

Separate Chaining

Chaining: All keys that map to the same table location are kept in a list (a.k.a. a "chain" or "bucket")

As easy as it sounds

Example: insert 10, 22, 107, 12, 42 with mod hashing and TableSize = 10

Winter 2014

[Table: buckets 0-9, all empty]

Page 7: CSE373: Data Structures & Algorithms Lecture 13: Hash Collisions

7CSE373: Data Structures & Algorithms

Separate Chaining

Winter 2014

[Table: bucket 0 → 10; buckets 1-9 empty]

Chaining: All keys that map to the same table location are kept in a list (a.k.a. a "chain" or "bucket")

As easy as it sounds

Example: insert 10, 22, 107, 12, 42 with mod hashing and TableSize = 10

Page 8: CSE373: Data Structures & Algorithms Lecture 13: Hash Collisions

8CSE373: Data Structures & Algorithms

Separate Chaining

Winter 2014

[Table: bucket 0 → 10; bucket 2 → 22; others empty]

Chaining: All keys that map to the same table location are kept in a list (a.k.a. a "chain" or "bucket")

As easy as it sounds

Example: insert 10, 22, 107, 12, 42 with mod hashing and TableSize = 10

Page 9: CSE373: Data Structures & Algorithms Lecture 13: Hash Collisions

9CSE373: Data Structures & Algorithms

Separate Chaining

Winter 2014

[Table: bucket 0 → 10; bucket 2 → 22; bucket 7 → 107; others empty]

Chaining: All keys that map to the same table location are kept in a list (a.k.a. a "chain" or "bucket")

As easy as it sounds

Example: insert 10, 22, 107, 12, 42 with mod hashing and TableSize = 10

Page 10: CSE373: Data Structures & Algorithms Lecture 13: Hash Collisions

10CSE373: Data Structures & Algorithms

Separate Chaining

Winter 2014

[Table: bucket 0 → 10; bucket 2 → 12 → 22; bucket 7 → 107; others empty]

Chaining: All keys that map to the same table location are kept in a list (a.k.a. a "chain" or "bucket")

As easy as it sounds

Example: insert 10, 22, 107, 12, 42 with mod hashing and TableSize = 10

Page 11: CSE373: Data Structures & Algorithms Lecture 13: Hash Collisions

11CSE373: Data Structures & Algorithms

Separate Chaining

Winter 2014

[Table: bucket 0 → 10; bucket 2 → 42 → 12 → 22; bucket 7 → 107; others empty]

Chaining: All keys that map to the same table location are kept in a list (a.k.a. a "chain" or "bucket")

As easy as it sounds

Example: insert 10, 22, 107, 12, 42 with mod hashing and TableSize = 10
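A minimal Java sketch of chained insert and find, assuming integer keys and mod hashing (the class and method names here are mine, not from the lecture):

import java.util.LinkedList;

class ChainedHashSet {
    private final LinkedList<Integer>[] table;   // one chain ("bucket") per index

    @SuppressWarnings("unchecked")
    ChainedHashSet(int tableSize) {
        table = new LinkedList[tableSize];
        for (int i = 0; i < tableSize; i++) {
            table[i] = new LinkedList<>();
        }
    }

    private int bucket(int key) {
        return Math.floorMod(key, table.length);   // mod hashing
    }

    void insert(int key) {
        table[bucket(key)].addFirst(key);          // O(1): prepend to the chain
    }

    boolean find(int key) {
        return table[bucket(key)].contains(key);   // walk the chain: O(chain length)
    }
}

Inserting 10, 22, 107, 12, 42 with TableSize = 10 reproduces the final picture above: bucket 2 ends up holding 42 → 12 → 22.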

Page 12: CSE373: Data Structures & Algorithms Lecture 13: Hash Collisions

12CSE373: Data Structures & Algorithms

Thoughts on chaining

• Worst-case time for find?
  – Linear
  – But only with really bad luck or bad hash function
  – So not worth avoiding (e.g., with balanced trees at each bucket)

• Beyond asymptotic complexity, some "data-structure engineering" may be warranted
  – Linked list vs. array vs. chunked list (lists should be short!)
  – Move-to-front
  – Maybe leave room for 1 element (or 2?) in the table itself, to optimize constant factors for the common case
    • A time-space trade-off…

Winter 2014

Page 13: CSE373: Data Structures & Algorithms Lecture 13: Hash Collisions

13CSE373: Data Structures & Algorithms

Time vs. space (constant factors only here)

Winter 2014

[Left table (chains only): bucket 0 → 10; bucket 2 → 42 → 12 → 22; bucket 7 → 107; others empty]

[Right table (first element stored directly in the table): slot 0 holds 10; slot 2 holds 42 with chain → 12 → 22; slot 7 holds 107; other slots marked X (empty)]

Page 14: CSE373: Data Structures & Algorithms Lecture 13: Hash Collisions

14CSE373: Data Structures & Algorithms

More rigorous chaining analysis

Definition: The load factor, λ, of a hash table is

  λ = N / TableSize    (N = number of elements)

Under chaining, the average number of elements per bucket is ___

Page 15: CSE373: Data Structures & Algorithms Lecture 13: Hash Collisions

15CSE373: Data Structures & Algorithms

More rigorous chaining analysis

Definition: The load factor, λ, of a hash table is

  λ = N / TableSize    (N = number of elements)

Under chaining, the average number of elements per bucket is λ

So if some inserts are followed by random finds, then on average:
• Each unsuccessful find compares against ____ items

Page 16: CSE373: Data Structures & Algorithms Lecture 13: Hash Collisions

16CSE373: Data Structures & Algorithms

More rigorous chaining analysis

Definition: The load factor, λ, of a hash table is

  λ = N / TableSize    (N = number of elements)

Under chaining, the average number of elements per bucket is λ

So if some inserts are followed by random finds, then on average:
• Each unsuccessful find compares against λ items
• Each successful find compares against _____ items

Page 17: CSE373: Data Structures & Algorithms Lecture 13: Hash Collisions

17CSE373: Data Structures & Algorithms

More rigorous chaining analysis

Definition: The load factor, λ, of a hash table is

  λ = N / TableSize    (N = number of elements)

Under chaining, the average number of elements per bucket is λ

So if some inserts are followed by random finds, then on average:
• Each unsuccessful find compares against λ items
• Each successful find compares against λ / 2 items

So we like to keep λ fairly low (e.g., 1 or 1.5 or 2) for chaining
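As a quick worked check (my numbers, not from the slides): in the chaining example above, N = 5 and TableSize = 10, so λ = 5/10 = 0.5; an unsuccessful find then compares against about 0.5 items on average, and a successful find against about λ/2 = 0.25 items beyond the one it is looking for.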

Page 18: CSE373: Data Structures & Algorithms Lecture 13: Hash Collisions

18CSE373: Data Structures & Algorithms

Alternative: Use empty space in the table

• Another simple idea: If h(key) is already full,
  – try (h(key) + 1) % TableSize. If full,
  – try (h(key) + 2) % TableSize. If full,
  – try (h(key) + 3) % TableSize. If full…

• Example: insert 38, 19, 8, 109, 10

Winter 2014

[Table: 8 = 38; others empty]

Page 19: CSE373: Data Structures & Algorithms Lecture 13: Hash Collisions

19CSE373: Data Structures & Algorithms

Alternative: Use empty space in the table

Winter 2014

[Table: 8 = 38, 9 = 19; others empty]

• Another simple idea: If h(key) is already full,
  – try (h(key) + 1) % TableSize. If full,
  – try (h(key) + 2) % TableSize. If full,
  – try (h(key) + 3) % TableSize. If full…

• Example: insert 38, 19, 8, 109, 10

Page 20: CSE373: Data Structures & Algorithms Lecture 13: Hash Collisions

20CSE373: Data Structures & Algorithms

Alternative: Use empty space in the table

Winter 2014

[Table: 0 = 8, 8 = 38, 9 = 19; others empty]

• Another simple idea: If h(key) is already full,
  – try (h(key) + 1) % TableSize. If full,
  – try (h(key) + 2) % TableSize. If full,
  – try (h(key) + 3) % TableSize. If full…

• Example: insert 38, 19, 8, 109, 10

Page 21: CSE373: Data Structures & Algorithms Lecture 13: Hash Collisions

21CSE373: Data Structures & Algorithms

Alternative: Use empty space in the table

Winter 2014

[Table: 0 = 8, 1 = 109, 8 = 38, 9 = 19; others empty]

• Another simple idea: If h(key) is already full,
  – try (h(key) + 1) % TableSize. If full,
  – try (h(key) + 2) % TableSize. If full,
  – try (h(key) + 3) % TableSize. If full…

• Example: insert 38, 19, 8, 109, 10

Page 22: CSE373: Data Structures & Algorithms Lecture 13: Hash Collisions

22CSE373: Data Structures & Algorithms

Alternative: Use empty space in the table

Winter 2014

[Table: 0 = 8, 1 = 109, 2 = 10, 8 = 38, 9 = 19; others empty]

• Another simple idea: If h(key) is already full,
  – try (h(key) + 1) % TableSize. If full,
  – try (h(key) + 2) % TableSize. If full,
  – try (h(key) + 3) % TableSize. If full…

• Example: insert 38, 19, 8, 109, 10

Page 23: CSE373: Data Structures & Algorithms Lecture 13: Hash Collisions

23CSE373: Data Structures & Algorithms

Probing hash tables

Trying the next spot is called probing (also called open addressing)
  – We just did linear probing
    • ith probe was (h(key) + i) % TableSize
  – In general have some probe function f and use (h(key) + f(i)) % TableSize

Open addressing does poorly with high load factor
  – So want larger tables
  – Too many probes means no more O(1)

Winter 2014

Page 24: CSE373: Data Structures & Algorithms Lecture 13: Hash Collisions

24CSE373: Data Structures & Algorithms

Other operations

insert finds an open table position using a probe function

What about find?
  – Must use same probe function to "retrace the trail" for the data
  – Unsuccessful search when reach empty position

What about delete?
  – Must use "lazy" deletion. Why?
    • Marker indicates "no data here, but don't stop probing"
  – Note: delete with chaining is plain-old list-remove
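A Java sketch of linear probing with lazy deletion, assuming integer keys and mod hashing (the class, field, and marker names are mine, chosen only for illustration):

class LinearProbingHashSet {
    private static final int EMPTY = 0, OCCUPIED = 1, DELETED = 2;
    private final int[] keys;
    private final int[] state;   // EMPTY, OCCUPIED, or DELETED ("lazy" marker)

    LinearProbingHashSet(int tableSize) {
        keys = new int[tableSize];
        state = new int[tableSize];
    }

    private int hash(int key) { return Math.floorMod(key, keys.length); }

    void insert(int key) {
        int firstFree = -1;
        for (int i = 0; i < keys.length; i++) {              // ith probe: (h(key) + i) % TableSize
            int slot = (hash(key) + i) % keys.length;
            if (state[slot] == OCCUPIED) {
                if (keys[slot] == key) return;               // already present
            } else {
                if (firstFree < 0) firstFree = slot;         // remember first empty/deleted slot
                if (state[slot] == EMPTY) break;             // end of this key's probe trail
            }
        }
        if (firstFree < 0) throw new IllegalStateException("table full");
        keys[firstFree] = key;
        state[firstFree] = OCCUPIED;
    }

    boolean find(int key) {
        for (int i = 0; i < keys.length; i++) {              // retrace the same probe sequence
            int slot = (hash(key) + i) % keys.length;
            if (state[slot] == EMPTY) return false;          // truly empty: unsuccessful search
            if (state[slot] == OCCUPIED && keys[slot] == key) return true;
            // DELETED: keep probing past the marker
        }
        return false;
    }

    void delete(int key) {
        for (int i = 0; i < keys.length; i++) {
            int slot = (hash(key) + i) % keys.length;
            if (state[slot] == EMPTY) return;                // not present
            if (state[slot] == OCCUPIED && keys[slot] == key) {
                state[slot] = DELETED;                       // lazy deletion: leave a marker
                return;
            }
        }
    }
}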

Winter 2014

Page 25: CSE373: Data Structures & Algorithms Lecture 13: Hash Collisions

25CSE373: Data Structures & Algorithms

(Primary) Clustering

It turns out linear probing is a bad idea, even though the probe function is quick to compute (which is a good thing)

Winter 2014

[R. Sedgewick]

Tends to produce clusters, which lead to long probing sequences
• Called primary clustering
• Saw this starting in our example

Page 26: CSE373: Data Structures & Algorithms Lecture 13: Hash Collisions

26CSE373: Data Structures & Algorithms

Analysis of Linear Probing

• Trivial fact: For any λ < 1, linear probing will find an empty slot
  – It is "safe" in this sense: no infinite loop unless table is full

• Non-trivial facts we won't prove: average # of probes given λ (in the limit as TableSize → ∞)
  – Unsuccessful search: (1/2) ( 1 + 1 / (1 - λ)² )
  – Successful search:   (1/2) ( 1 + 1 / (1 - λ) )

• This is pretty bad: need to leave sufficient empty space in the table to get decent performance (see chart)
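As a sanity check (my arithmetic, not from the slides): at λ = 0.75 an unsuccessful search averages (1/2)(1 + 1/0.25²) = 8.5 probes and a successful search (1/2)(1 + 1/0.25) = 2.5 probes, versus roughly λ = 0.75 comparisons for an unsuccessful find under chaining.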

Page 27: CSE373: Data Structures & Algorithms Lecture 13: Hash Collisions

27CSE373: Data Structures & Algorithms

In a chart

• Linear-probing performance degrades rapidly as table gets full
  – (Formula assumes "large table" but point remains)

• By comparison, chaining performance is linear in λ and has no trouble with λ > 1

Winter 2014

[Chart: average # of probes vs. load factor for linear probing, with curves for unsuccessful ("not found") and successful ("found") searches, shown at two y-axis scales; both curves climb steeply as the load factor approaches 1, the unsuccessful one far more so.]

Page 28: CSE373: Data Structures & Algorithms Lecture 13: Hash Collisions

28CSE373: Data Structures & Algorithms

Quadratic probing

• We can avoid primary clustering by changing the probe function (h(key) + f(i)) % TableSize

• A common technique is quadratic probing: f(i) = i²
  – So probe sequence is:
    • 0th probe: h(key) % TableSize
    • 1st probe: (h(key) + 1) % TableSize
    • 2nd probe: (h(key) + 4) % TableSize
    • 3rd probe: (h(key) + 9) % TableSize
    • …
    • ith probe: (h(key) + i²) % TableSize

• Intuition: Probes quickly "leave the neighborhood"
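A tiny runnable sketch of this probe sequence (the class and method names are mine, for illustration):

public class QuadraticProbeDemo {
    // ith probe for quadratic probing: (h(key) + i*i) % TableSize
    static int probe(int hash, int i, int tableSize) {
        return (hash + i * i) % tableSize;
    }

    public static void main(String[] args) {
        int tableSize = 10;
        int key = 79;
        int hash = key % tableSize;              // h(79) = 9
        for (int i = 0; i <= 3; i++) {
            System.out.println("probe " + i + ": index " + probe(hash, i, tableSize));
        }
        // Prints indices 9, 0, 3, 8 -- the probes quickly leave the neighborhood of 9
    }
}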

Page 29: CSE373: Data Structures & Algorithms Lecture 13: Hash Collisions

29CSE373: Data Structures & Algorithms

Quadratic Probing Example

Winter 2014

[Table: indices 0-9, all empty]

TableSize = 10. Insert: 89, 18, 49, 58, 79

Page 30: CSE373: Data Structures & Algorithms Lecture 13: Hash Collisions

30CSE373: Data Structures & Algorithms

Quadratic Probing Example

Winter 2014

[Table: 9 = 89; others empty]

TableSize = 10. Insert: 89, 18, 49, 58, 79

Page 31: CSE373: Data Structures & Algorithms Lecture 13: Hash Collisions

31CSE373: Data Structures & Algorithms

Quadratic Probing Example

Winter 2014

[Table: 8 = 18, 9 = 89; others empty]

TableSize = 10. Insert: 89, 18, 49, 58, 79

Page 32: CSE373: Data Structures & Algorithms Lecture 13: Hash Collisions

32CSE373: Data Structures & Algorithms

Quadratic Probing Example

Winter 2014

[Table: 0 = 49, 8 = 18, 9 = 89; others empty]

TableSize = 10. Insert: 89, 18, 49, 58, 79

Page 33: CSE373: Data Structures & Algorithms Lecture 13: Hash Collisions

33CSE373: Data Structures & Algorithms

Quadratic Probing Example

Winter 2014

[Table: 0 = 49, 2 = 58, 8 = 18, 9 = 89; others empty]

TableSize = 10. Insert: 89, 18, 49, 58, 79

Page 34: CSE373: Data Structures & Algorithms Lecture 13: Hash Collisions

34CSE373: Data Structures & Algorithms

Quadratic Probing Example

Winter 2014

[Table: 0 = 49, 2 = 58, 3 = 79, 8 = 18, 9 = 89; others empty]

TableSize = 10. Insert: 89, 18, 49, 58, 79

Page 35: CSE373: Data Structures & Algorithms Lecture 13: Hash Collisions

35CSE373: Data Structures & Algorithms

Another Quadratic Probing Example

Winter 2014

TableSize = 7

Insert:
  76 (76 % 7 = 6)
  40 (40 % 7 = 5)
  48 (48 % 7 = 6)
   5 ( 5 % 7 = 5)
  55 (55 % 7 = 6)
  47 (47 % 7 = 5)

[Table: indices 0-6, all empty]

Page 36: CSE373: Data Structures & Algorithms Lecture 13: Hash Collisions

36CSE373: Data Structures & Algorithms

Another Quadratic Probing Example

Winter 2014

TableSize = 7

Insert:
  76 (76 % 7 = 6)
  40 (40 % 7 = 5)
  48 (48 % 7 = 6)
   5 ( 5 % 7 = 5)
  55 (55 % 7 = 6)
  47 (47 % 7 = 5)

[Table: 6 = 76; others empty]

Page 37: CSE373: Data Structures & Algorithms Lecture 13: Hash Collisions

37CSE373: Data Structures & Algorithms

Another Quadratic Probing Example

Winter 2014

TableSize = 7

Insert:
  76 (76 % 7 = 6)
  40 (40 % 7 = 5)
  48 (48 % 7 = 6)
   5 ( 5 % 7 = 5)
  55 (55 % 7 = 6)
  47 (47 % 7 = 5)

[Table: 5 = 40, 6 = 76; others empty]

Page 38: CSE373: Data Structures & Algorithms Lecture 13: Hash Collisions

38CSE373: Data Structures & Algorithms

Another Quadratic Probing Example

Winter 2014

TableSize = 7

Insert:
  76 (76 % 7 = 6)
  40 (40 % 7 = 5)
  48 (48 % 7 = 6)
   5 ( 5 % 7 = 5)
  55 (55 % 7 = 6)
  47 (47 % 7 = 5)

[Table: 0 = 48, 5 = 40, 6 = 76; others empty]

Page 39: CSE373: Data Structures & Algorithms Lecture 13: Hash Collisions

39CSE373: Data Structures & Algorithms

Another Quadratic Probing Example

Winter 2014

TableSize = 7

Insert:
  76 (76 % 7 = 6)
  40 (40 % 7 = 5)
  48 (48 % 7 = 6)
   5 ( 5 % 7 = 5)
  55 (55 % 7 = 6)
  47 (47 % 7 = 5)

[Table: 0 = 48, 2 = 5, 5 = 40, 6 = 76; others empty]

Page 40: CSE373: Data Structures & Algorithms Lecture 13: Hash Collisions

40CSE373: Data Structures & Algorithms

Another Quadratic Probing Example

Winter 2014

TableSize = 7

Insert:
  76 (76 % 7 = 6)
  40 (40 % 7 = 5)
  48 (48 % 7 = 6)
   5 ( 5 % 7 = 5)
  55 (55 % 7 = 6)
  47 (47 % 7 = 5)

[Table: 0 = 48, 2 = 5, 3 = 55, 5 = 40, 6 = 76; others empty]

Page 41: CSE373: Data Structures & Algorithms Lecture 13: Hash Collisions

41CSE373: Data Structures & Algorithms

Another Quadratic Probing Example

Winter 2014

TableSize = 7

Insert:
  76 (76 % 7 = 6)
  40 (40 % 7 = 5)
  48 (48 % 7 = 6)
   5 ( 5 % 7 = 5)
  55 (55 % 7 = 6)
  47 (47 % 7 = 5)

[Table: 0 = 48, 2 = 5, 3 = 55, 5 = 40, 6 = 76; only indices 1 and 4 are open]

Doh!: For all n, ((n*n) + 5) % 7 is 0, 2, 5, or 6
• Excel shows it takes "at least" 50 probes and a pattern
• Proof (like induction) using (n² + 5) % 7 = ((n - 7)² + 5) % 7
• In fact, for all c and k, (n² + c) % k = ((n - k)² + c) % k

Page 42: CSE373: Data Structures & Algorithms Lecture 13: Hash Collisions

42CSE373: Data Structures & Algorithms

From Bad News to Good News

• Bad news:
  – Quadratic probing can cycle through the same full indices, never terminating despite table not being full

• Good news:
  – If TableSize is prime and λ < ½, then quadratic probing will find an empty slot in at most TableSize/2 probes
  – So: If you keep λ < ½ and TableSize is prime, no need to detect cycles
  – Optional: Proof is posted in lecture13.txt
    • Also, slightly less detailed proof in textbook
    • Key fact: For prime T and 0 < i, j < T/2 where i ≠ j,
      (k + i²) % T ≠ (k + j²) % T (i.e., no index repeat)
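A sketch of why the key fact holds (my argument, not the posted proof): if (k + i²) % T = (k + j²) % T, then T divides i² - j² = (i - j)(i + j); since T is prime it would have to divide i - j or i + j, but 0 < |i - j| < T/2 and 0 < i + j < T, so it divides neither. Contradiction.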

Winter 2014

Page 43: CSE373: Data Structures & Algorithms Lecture 13: Hash Collisions

43CSE373: Data Structures & Algorithms

Clustering reconsidered

• Quadratic probing does not suffer from primary clustering: no problem with keys initially hashing to the same neighborhood

• But it's no help if keys initially hash to the same index
  – Called secondary clustering

• Can avoid secondary clustering with a probe function that depends on the key: double hashing…

Winter 2014

Page 44: CSE373: Data Structures & Algorithms Lecture 13: Hash Collisions

44CSE373: Data Structures & Algorithms

Double hashing

Idea:
  – Given two good hash functions h and g, it is very unlikely that for some key, h(key) == g(key)
  – So make the probe function f(i) = i*g(key)

Probe sequence:
  • 0th probe: h(key) % TableSize
  • 1st probe: (h(key) + g(key)) % TableSize
  • 2nd probe: (h(key) + 2*g(key)) % TableSize
  • 3rd probe: (h(key) + 3*g(key)) % TableSize
  • …
  • ith probe: (h(key) + i*g(key)) % TableSize

Detail: Make sure g(key) cannot be 0

Page 45: CSE373: Data Structures & Algorithms Lecture 13: Hash Collisions

45CSE373: Data Structures & Algorithms

Double-hashing analysis

• Intuition: Because each probe is “jumping” by g(key) each time, we “leave the neighborhood” and “go different places from other initial collisions”

• But we could still have a problem like in quadratic probing where we are not "safe" (infinite loop despite room in table)
  – It is known that this cannot happen in at least one case:
    • h(key) = key % p
    • g(key) = q - (key % q)
    • 2 < q < p
    • p and q are prime
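A small runnable sketch of that safe case (the values of p and q below are mine, chosen only for illustration):

public class DoubleHashingDemo {
    static final int P = 11;    // table size, prime
    static final int Q = 7;     // secondary prime, 2 < q < p

    static int h(int key) { return Math.floorMod(key, P); }
    static int g(int key) { return Q - Math.floorMod(key, Q); }   // always in 1..Q, so never 0

    // ith probe: (h(key) + i*g(key)) % TableSize
    static int probe(int key, int i) {
        return (h(key) + i * g(key)) % P;
    }

    public static void main(String[] args) {
        int key = 33;                      // h(33) = 0, g(33) = 2
        for (int i = 0; i <= 3; i++) {
            System.out.println("probe " + i + ": index " + probe(key, i));
        }
        // Prints indices 0, 2, 4, 6 for this key; a key with the same h but a different
        // g follows a different trail, which is the point of double hashing
    }
}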

Winter 2014

Page 46: CSE373: Data Structures & Algorithms Lecture 13: Hash Collisions

46CSE373: Data Structures & Algorithms

More double-hashing facts

• Assume "uniform hashing"
  – Means probability of g(key1) % p == g(key2) % p is 1/p

• Non-trivial facts we won't prove: average # of probes given λ (in the limit as TableSize → ∞)
  – Unsuccessful search (intuitive): 1 / (1 - λ)
  – Successful search (less intuitive): (1/λ) ln ( 1 / (1 - λ) )

• Bottom line: unsuccessful is bad (but not as bad as linear probing), but successful is not nearly as bad

Page 47: CSE373: Data Structures & Algorithms Lecture 13: Hash Collisions

47CSE373: Data Structures & Algorithms

Charts

Winter 2014

[Charts: average # of probes vs. load factor, for linear probing (found / not found) and for uniform hashing (found / not found), each shown at two y-axis scales; the uniform-hashing curves grow far more slowly as the load factor approaches 1.]

Page 48: CSE373: Data Structures & Algorithms Lecture 13: Hash Collisions

48CSE373: Data Structures & Algorithms

Rehashing

• As with array-based stacks/queues/lists, if table gets too full, create a bigger table and copy everything

• With chaining, we get to decide what "too full" means
  – Keep load factor reasonable (e.g., < 1)?
  – Consider average or max size of non-empty chains?

• For probing, half-full is a good rule of thumb

• New table size
  – Twice-as-big is a good idea, except that won't be prime!
  – So go about twice-as-big
  – Can have a list of prime numbers in your code since you won't grow more than 20-30 times
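A tiny illustrative helper for the "about twice as big, and prime" step (the prime list below is my example, not from the lecture):

public class Rehash {
    static final int[] PRIMES = {11, 23, 47, 97, 197, 397, 797, 1597, 3203, 6421};

    static int nextTableSize(int currentSize) {
        for (int p : PRIMES) {
            if (p >= 2 * currentSize) return p;   // "about twice as big", and prime
        }
        throw new IllegalStateException("prime list exhausted");
    }

    public static void main(String[] args) {
        System.out.println(nextTableSize(11));    // 23
        System.out.println(nextTableSize(23));    // 47
    }
}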

Winter 2014

Page 49: CSE373: Data Structures & Algorithms Lecture 13: Hash Collisions

49CSE373: Data Structures & Algorithms

Graphs

• A graph is a formalism for representing relationships among items
  – Very general definition because very general concept

• A graph is a pair G = (V,E)
  – A set of vertices, also known as nodes: V = {v1, v2, …, vn}
  – A set of edges: E = {e1, e2, …, em}
    • Each edge ei is a pair of vertices (vj, vk)
    • An edge "connects" the vertices

• Graphs can be directed or undirected

Winter 2014

[Diagram: example graph with vertices Han, Leia, Luke]

V = {Han, Leia, Luke}
E = {(Luke,Leia), (Han,Leia), (Leia,Han)}
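One possible in-code representation of this example (an adjacency map; the lecture has not picked a representation, so this is only an illustrative sketch):

import java.util.List;
import java.util.Map;

public class ExampleGraph {
    public static void main(String[] args) {
        // Directed edges stored as source -> list of destinations
        Map<String, List<String>> adjacent = Map.of(
                "Luke", List.of("Leia"),
                "Han",  List.of("Leia"),
                "Leia", List.of("Han"));

        // |V| = 3, |E| = 3; Leia has out-degree 1 and in-degree 2
        System.out.println("Leia points to: " + adjacent.get("Leia"));   // [Han]
    }
}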

Page 50: CSE373: Data Structures & Algorithms Lecture 13: Hash Collisions

50CSE373: Data Structures & Algorithms

An ADT?

• Can think of graphs as an ADT with operations like isEdge((vj,vk))

• But it is unclear what the “standard operations” are

• Instead we tend to develop algorithms over graphs and then use data structures that are efficient for those algorithms

• Many important problems can be solved by:
  1. Formulating them in terms of graphs
  2. Applying a standard graph algorithm

• To make the formulation easy and standard, we have a lot of standard terminology about graphs

Winter 2014

Page 51: CSE373: Data Structures & Algorithms Lecture 13: Hash Collisions

51CSE373: Data Structures & Algorithms

Some Graphs

For each, what are the vertices and what are the edges?

• Web pages with links
• Facebook friends
• "Input data" for the Kevin Bacon game
• Methods in a program that call each other
• Road maps (e.g., Google maps)
• Airline routes
• Family trees
• Course pre-requisites
• …

Using the same algorithms for problems across so many domains sounds like “core computer science and engineering”

Winter 2014

Page 52: CSE373: Data Structures & Algorithms Lecture 13: Hash Collisions

52CSE373: Data Structures & Algorithms

Undirected Graphs

• In undirected graphs, edges have no specific direction
  – Edges are always "two-way"

Winter 2014

• Thus, (u,v) ∈ E implies (v,u) ∈ E
  – Only one of these edges needs to be in the set
  – The other is implicit, so normalize how you check for it

• Degree of a vertex: number of edges containing that vertex
  – Put another way: the number of adjacent vertices

[Diagram: undirected example graph on vertices A, B, C, D]

Page 53: CSE373: Data Structures & Algorithms Lecture 13: Hash Collisions

53CSE373: Data Structures & Algorithms

Directed Graphs

• In directed graphs (sometimes called digraphs), edges have a direction

• Thus, (u,v) ∈ E does not imply (v,u) ∈ E
• Let (u,v) ∈ E mean u → v
• Call u the source and v the destination

• In-degree of a vertex: number of in-bound edges, i.e., edges where the vertex is the destination
• Out-degree of a vertex: number of out-bound edges, i.e., edges where the vertex is the source

[Diagrams: two directed example graphs on vertices A, B, C, D; an edge in each direction between the same two vertices counts as 2 edges]

Page 54: CSE373: Data Structures & Algorithms Lecture 13: Hash Collisions

54CSE373: Data Structures & Algorithms

Self-Edges, Connectedness

• A self-edge, a.k.a. a loop, is an edge of the form (u,u)
  – Depending on the use/algorithm, a graph may have:
    • No self edges
    • Some self edges
    • All self edges (often therefore implicit, but we will be explicit)

• A node can have a degree / in-degree / out-degree of zero

• A graph does not have to be connected
  – Even if every node has non-zero degree

Winter 2014

Page 55: CSE373: Data Structures & Algorithms Lecture 13: Hash Collisions

55CSE373: Data Structures & Algorithms

More notation

For a graph G = (V,E)

• |V| is the number of vertices
• |E| is the number of edges
  – Minimum?
  – Maximum for undirected?
  – Maximum for directed?

Winter 2014

[Diagram: directed example graph on vertices A, B, C, D]

V = {A, B, C, D}
E = {(C, B), (A, B), (B, A), (C, D)}

Page 56: CSE373: Data Structures & Algorithms Lecture 13: Hash Collisions

56CSE373: Data Structures & Algorithms

More notation

For a graph G = (V,E)

• |V| is the number of vertices
• |E| is the number of edges
  – Minimum? 0
  – Maximum for undirected?
  – Maximum for directed?

Winter 2014

[Diagram: directed example graph on vertices A, B, C, D]

V = {A, B, C, D}
E = {(C, B), (A, B), (B, A), (C, D)}

Page 57: CSE373: Data Structures & Algorithms Lecture 13: Hash Collisions

57CSE373: Data Structures & Algorithms

More notation

For a graph G = (V,E)

• |V| is the number of vertices
• |E| is the number of edges
  – Minimum? 0
  – Maximum for undirected? |V|(|V|+1)/2 ∈ O(|V|²)
  – Maximum for directed?

Winter 2014

[Diagram: example graph on vertices A, B, C, D]

Page 58: CSE373: Data Structures & Algorithms Lecture 13: Hash Collisions

58CSE373: Data Structures & Algorithms

More notation

For a graph G = (V,E)

• |V| is the number of vertices
• |E| is the number of edges
  – Minimum? 0
  – Maximum for undirected? |V|(|V|+1)/2 ∈ O(|V|²)
  – Maximum for directed? |V|² ∈ O(|V|²)

(assuming self-edges allowed, else subtract |V|)
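As a quick check of the undirected maximum (my own arithmetic, with self-edges allowed): there are |V|(|V|-1)/2 unordered pairs of distinct vertices plus |V| possible self-edges, and |V|(|V|-1)/2 + |V| = |V|(|V|+1)/2, which is in O(|V|²).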

Winter 2014

[Diagram: example graph on vertices A, B, C, D]

Page 59: CSE373: Data Structures & Algorithms Lecture 13: Hash Collisions

59CSE373: Data Structures & Algorithms

More notation

For a graph G = (V,E):

• |V| is the number of vertices
• |E| is the number of edges
  – Minimum? 0
  – Maximum for undirected? |V|(|V|+1)/2 ∈ O(|V|²)
  – Maximum for directed? |V|² ∈ O(|V|²)

(assuming self-edges allowed, else subtract |V|)

• If (u,v) ∈ E
  – Then v is a neighbor of u, i.e., v is adjacent to u
  – Order matters for directed edges
    • u is not adjacent to v unless (v,u) ∈ E

[Diagram: example graph on vertices A, B, C, D]