Information Theory and Games (Ch. 16)
Information Theory
• Information theory studies information flow
• In this context, information has no intrinsic meaning
– Information may be partial (e.g., a sound)
– Information measures the degree of uncertainty
• Basic model: (1) a sender passes information to (2) a receiver
• The measure of information gained is a number in the [0,1] range:
– 0 bits: gained no information
– 1 bit: gained the most information
[Diagram: sender (1) passes information to receiver (2)]
– How much information did 2 gain?
– Was there any distortion (“noise”) while passing the information?
Recall: Probability Distribution
• The events E1, E2, …, Ek must meet the following conditions:
– Exactly one always occurs
– No two can occur at the same time
• The probabilities p1, …, pk are numbers associated with these events, such that 0 ≤ pi ≤ 1 and p1 + … + pk = 1

A probability distribution assigns probabilities to events such that the two properties above hold.
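As a quick sanity check (a sketch added here, not part of the original slides), these two conditions can be verified in a few lines of Python; the example values are made up:

import math

def is_distribution(probs):
    """True if every pi is in [0, 1] and the pi sum to 1."""
    return all(0.0 <= p <= 1.0 for p in probs) and math.isclose(sum(probs), 1.0)

print(is_distribution([0.5, 0.5]))        # fair coin -> True
print(is_distribution([0.99, 0.01]))      # very unfair coin -> True
print(is_distribution([0.7, 0.7, -0.4]))  # violates both conditions -> False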
Information Gain versus Probability
• Suppose that I flip a “fair” coin:
– What is the probability that it will come up heads? 0.5
– How much information do you gain when it falls? 1 bit
• Suppose that I flip a “totally unfair” coin (it always comes up heads):
– What is the probability that it will come up heads? 1
– How much information do you gain when it falls? 0 bits
Information Gain versus Probability (2)
• Suppose that I flip a “very unfair” coin (99% of the time it comes up heads):
– What is the probability that it will come up heads? 0.99
– How much information do you gain when it falls? A fraction of a bit
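Using the formula introduced two slides below, this fraction works out to I = log2(1/0.99) ≈ 0.0145 bits: the fall of a heavily biased coin tells you almost nothing.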
[Plot: information gain (y-axis) versus probability (x-axis); gain decreases as probability increases]
Information Gain versus Probability (3)
• Imagine a stranger, “JL”. Which of the following questions, once answered, will provide more information about JL?
– Did you have breakfast this morning?
– What is your favorite color?
• Hints:
– What are your chances of guessing the answer correctly?
– What if you knew JL and you knew his preferences?
Information Gain versus Probability (4)
• If the probability that an event occurs is high, I gain less information when the event actually occurs
• If the probability that an event occurs is smaller, I gain more information when the event actually occurs
• In general, the information provided by an event decreases as the probability of the event increases. Information gain of an event e (Shannon and Weaver, 1949):
I(e) = log2(1/p(e))
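A quick numeric check of this formula (an illustrative sketch, not from the slides), reproducing the three coin cases above:

import math

def information_gain(p):
    """I(e) = log2(1/p(e)), in bits; defined for p > 0."""
    return math.log2(1.0 / p)

print(information_gain(0.5))   # fair coin: 1.0 bit
print(information_gain(0.99))  # very unfair coin: ~0.0145 bits
print(information_gain(1.0))   # totally unfair coin: 0.0 bits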
Information, Uncertainty, and Meaningful Play
• Recall the discussion of the relation between uncertainty and games
– What happens if there is no uncertainty at all in a game (both at the macro-level and the micro-level)?
• What is the relation between uncertainty and information gain?

If there is no uncertainty, then information gain is 0. As a result, the player’s actions are not meaningful!
Let’s Play Twenty Questions
• I am thinking of an animal
• You can ask “yes/no” questions only
• Winning condition:
– You guess the animal correctly after asking 20 questions or fewer, and
– you make no more than 3 attempts to guess the right animal
What is happening? (Constitutive Rules)
• We are building a binary decision tree (each question node has two children)
[Diagram: a question node with two branches, “no” and “yes”]

# levels | # potential questions
0        | 2^0 = 1
1        | 2^1 = 2
2        | 2^2 = 4
3        | 2^3 = 8
# questions made = log2(# potential questions)
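To make the log2 relationship concrete, here is a small Python sketch (illustrative, not from the slides): with yes/no questions that halve the candidate set each time, 20 questions suffice to distinguish about a million animals.

import math

def questions_needed(num_candidates):
    """Number of halving yes/no questions needed to isolate one candidate."""
    return math.ceil(math.log2(num_candidates))

print(questions_needed(8))      # 3 questions, as in the table above
print(questions_needed(2**20))  # 20 questions distinguish 1,048,576 animals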
Same Principle Operates for Online Version
• Game: http://www.20q.net/
• OK, so how can this be done?
• It uses information gain:
Ex’ple | Bar | Fri | Hun | Pat  | Type   | Res | wait
x1     | no  | no  | yes | some | french | yes | yes
x4     | no  | yes | yes | full | thai   | no  | yes
x5     | no  | yes | no  | full | french | yes | no
x6–x11 | (rows not recovered in the source; a fuller version of this data appears in the Decision Tree Example table below)
Table of examples stored in the system

[Decision tree for the wait decision:]
Patrons?
– none → no
– some → yes
– full → WaitEstimate?
   – >60 → no
   – 30-60 → Alternate?
      – no → Reservation?
         – no → Bar? (no → no, yes → yes)
         – yes → yes
      – yes → Fri/Sat? (no → no, yes → yes)
   – 10-30 → Hungry?
      – no → yes
      – yes → Alternate?
         – no → yes
         – yes → Raining? (no → no, yes → yes)
   – 0-10 → yes
Nice: Resulting tree is optimal.
Decision Tree Example
Entry | Bar | Fri | Hungry | Patrons | Alt | Type    | wait
x1    | no  | no  | yes    | some    | yes | French  | yes
x4    | no  | yes | yes    | full    | yes | Thai    | yes
x5    | no  | yes | no     | full    | yes | French  | no
x6    | yes | no  | yes    | some    | no  | Italian | yes
x7    | yes | no  | no     | none    | no  | Burger  | no
x8    | no  | no  | yes    | some    | no  | Thai    | yes
x9    | yes | yes | no     | full    | no  | Burger  | no
x10   | yes | yes | yes    | full    | yes | Italian | no
x11   | no  | no  | no     | none    | no  | Thai    | no
Expected Information Gain
• We are given a probability distribution:
– The events E1, E2, …, Ek
– The probabilities p1, …, pk associated with these events
• We have the information gain for those events:
I(E1), I(E2), …, I(Ek)
• The Expected Information Gain (EIG):
EIG = p1 * I(E1) + … + pk * I(Ek)
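A small numeric sketch of this formula (illustrative, not from the slides); note that EIG is exactly the Shannon entropy of the distribution:

import math

def expected_information_gain(probs):
    """EIG = p1*I(E1) + ... + pk*I(Ek), with I(E) = log2(1/p(E))."""
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

print(expected_information_gain([0.5, 0.5]))    # fair coin: 1.0 bit
print(expected_information_gain([0.99, 0.01]))  # very unfair coin: ~0.081 bits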
Decision Tree
• Obtained using expected information gain
• In this example it has the minimum height, which is nice (why?)

[Decision tree learned from the table:]
Patrons?
– none → no
– some → yes
– full → Hungry?
   – no → no
   – yes → Type?
      – french → yes
      – italian → no
      – thai → Fri/Sat? (no → no, yes → yes)
      – burger → yes
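To show how expected information gain drives the choice of the root question, here is a compact Python sketch (my own illustration, not from the slides; the rows follow the Decision Tree Example table, and the helper names are made up):

import math

# Rows from the example table: (Bar, Fri, Hungry, Patrons, Alt, Type, wait)
ROWS = [
    ("no", "no", "yes", "some", "yes", "French", "yes"),
    ("no", "yes", "yes", "full", "yes", "Thai", "yes"),
    ("no", "yes", "no", "full", "yes", "French", "no"),
    ("yes", "no", "yes", "some", "no", "Italian", "yes"),
    ("yes", "no", "no", "none", "no", "Burger", "no"),
    ("no", "no", "yes", "some", "no", "Thai", "yes"),
    ("yes", "yes", "no", "full", "no", "Burger", "no"),
    ("yes", "yes", "yes", "full", "yes", "Italian", "no"),
    ("no", "no", "no", "none", "no", "Thai", "no"),
]
ATTRS = ["Bar", "Fri", "Hungry", "Patrons", "Alt", "Type"]

def entropy(rows):
    """EIG of the wait labels: sum of p * log2(1/p) over yes/no."""
    if not rows:
        return 0.0
    p = sum(1 for r in rows if r[-1] == "yes") / len(rows)
    return sum(q * math.log2(1.0 / q) for q in (p, 1 - p) if q > 0)

def gain(rows, attr_index):
    """Entropy reduction from splitting the rows on one attribute."""
    groups = {}
    for r in rows:
        groups.setdefault(r[attr_index], []).append(r)
    remainder = sum(len(g) / len(rows) * entropy(g) for g in groups.values())
    return entropy(rows) - remainder

best = max(range(len(ATTRS)), key=lambda i: gain(ROWS, i))
print(ATTRS[best])  # Patrons -- the root of the tree above

Running this selects Patrons as the most informative first question, matching the root of the learned tree.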
Noise and Redundancy
• Noise: affects component-to-component communication
– Example in a game?
• Redundancy: a counterbalance to noise
– Making sure information is communicated properly
– Example in a game?
• Balancing act: noise versus redundancy
– Too much information: the signal might be lost
– Too little information: the signal might be lost
Charades: playing with noise. Crossword puzzles: playing with redundancy. Other examples?
[Diagram: sender (1) passes information to receiver (2)]
– Noise: distortion in the communication. Example?
[Diagram: sender (1) passes the same information to receiver (2) over two channels]
– Redundancy: passing the same information by two or more different channels