Conditional Probability Practice
Data Science: Jordan Boyd-Graber, University of Maryland, January 13, 2018
Data Science: Jordan Boyd-Graber | UMD Conditional Probability Practice | 1 / 1
Don’t Panic
- Freakout on Piazza over NY / WV (4 districts = .1 points / 40)
- HW2 released in the next few days
- Office hours a huge success
Today
- Conditional probability problems
- Important for classification
- You’ll be estimating them in HW2
Dice
A die is rolled twice. What is the probability that the sum of the faces is greater than 7, given that
- the first outcome was 4?
- the first outcome was greater than 4?
- the first outcome was a 1?
- the first outcome was less than 5?
Dice
- p(X1 + X2 > 7 | X1 = 4) = p(X1 + X2 > 7 ∧ X1 = 4) / p(X1 = 4) = (3/36) / (1/6) = 1/2
- p(X1 + X2 > 7 | X1 > 4) = p(X1 + X2 > 7 ∧ X1 > 4) / p(X1 > 4) = (9/36) / (1/3) = 27/36 = 3/4
- p(X1 + X2 > 7 | X1 = 1) = p(X1 + X2 > 7 ∧ X1 = 1) / p(X1 = 1) = 0 / (1/6) = 0
- p(X1 + X2 > 7 | X1 < 5) = p(X1 + X2 > 7 ∧ X1 < 5) / p(X1 < 5) = (6/36) / (2/3) = 18/72 = 1/4
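A quick way to check all four answers (not in the original slides) is to enumerate the 36 equally likely outcomes and count:

```python
from itertools import product

# All 36 equally likely (X1, X2) outcomes of two die rolls.
outcomes = list(product(range(1, 7), repeat=2))

def cond_prob(event, given):
    """P(event | given) by counting outcomes satisfying the condition."""
    matching = [o for o in outcomes if given(o)]
    return sum(event(o) for o in matching) / len(matching)

sum_gt_7 = lambda o: o[0] + o[1] > 7

print(cond_prob(sum_gt_7, lambda o: o[0] == 4))  # 0.5
print(cond_prob(sum_gt_7, lambda o: o[0] > 4))   # 0.75
print(cond_prob(sum_gt_7, lambda o: o[0] == 1))  # 0.0
print(cond_prob(sum_gt_7, lambda o: o[0] < 5))   # 0.25
```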
Children
What is the probability that a family of two children has two boys,
- given that it has at least one boy?
- given that the first child is a boy?
Children
Let Xi = ⊤ if child i is a boy.
- P(X1 = ⊤, X2 = ⊤ | X1 = ⊤ ∨ X2 = ⊤) = P(X1 = ⊤, X2 = ⊤) / P(X1 = ⊤ ∨ X2 = ⊤) = (1/4) / (3/4) = 1/3
- P(X1 = ⊤, X2 = ⊤ | X1 = ⊤) = P(X1 = ⊤, X2 = ⊤) / P(X1 = ⊤) = (1/4) / (1/2) = 1/2
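The same counting check works here (a sketch, treating the four gender combinations as equally likely):

```python
from itertools import product

# All four equally likely (child1, child2) combinations: B = boy, G = girl.
families = list(product("BG", repeat=2))

both_boys = [f for f in families if f == ("B", "B")]
at_least_one_boy = [f for f in families if "B" in f]
first_is_boy = [f for f in families if f[0] == "B"]

# P(two boys | at least one boy): 1 of 3 qualifying families
print(len(both_boys) / len(at_least_one_boy))
# P(two boys | first child is a boy): 1 of 2 qualifying families
print(len(both_boys) / len(first_is_boy))   # 0.5
```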
Conditional Probabilities
One coin in a collection of 65 has two heads. The rest are fair. If a coin, chosen at random from the lot and then tossed, turns up heads 6 times in a row, what is the probability that it is the two-headed coin?
Conditional Probabilities
- Let C be the coin chosen (⊤ for fake)
- Let H be the number of heads out of six

P(C = ⊤ | H = 6) = P(C = ⊤ ∧ H = 6) / P(H = 6) = (1/65) / (1/65 + (64/65) · (1/2^6)) = (1/65) / (2/65) = 1/2
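The arithmetic is easy to get wrong by hand; exact rational arithmetic confirms the tidy answer (a check, not part of the slides):

```python
from fractions import Fraction

# One of 65 coins is two-headed; the other 64 are fair.
p_fake = Fraction(1, 65)
p_fair = Fraction(64, 65)

# Likelihood of six heads in a row under each coin.
p_6h_fake = Fraction(1)          # two-headed coin always shows heads
p_6h_fair = Fraction(1, 2) ** 6  # fair coin: (1/2)^6 = 1/64

# Bayes rule: P(fake | six heads)
posterior = (p_fake * p_6h_fake) / (p_fake * p_6h_fake + p_fair * p_6h_fair)
print(posterior)  # 1/2
```

The coincidence driving the clean result: (64/65) · (1/64) = 1/65, so the fair coins contribute exactly as much probability of six heads as the fake one.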
Bayes Rule
There’s a test for Boogie Woogie Fever (BWF). The probability of getting a positive test result given that you have BWF is 0.8, and the probability of getting a positive result given that you do not have BWF is 0.01. The overall incidence of BWF is 0.01.
1. What is the marginal probability of getting a positive test result?
2. What is the probability of having BWF given that you got a positive test result?
Bayes Rule
Let D be the disease, T be the test.
- P(T = ⊤) = Σ_{x ∈ {⊤, ⊥}} P(T = ⊤, D = x) = 0.01 · 0.8 + 0.99 · 0.01 = 0.0179 ≈ 0.02
- P(D = ⊤ | T = ⊤) = P(T = ⊤ | D = ⊤) P(D = ⊤) / P(T = ⊤) = (0.8 · 0.01) / 0.0179 ≈ 0.45
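Working the numbers out directly (a sketch; BWF and its rates are the problem's hypothetical setup, and this is the kind of estimate HW2 asks for):

```python
# Test characteristics for the hypothetical Boogie Woogie Fever.
p_disease = 0.01          # prior incidence: P(D = +)
p_pos_given_d = 0.80      # sensitivity: P(T = + | D = +)
p_pos_given_not_d = 0.01  # false-positive rate: P(T = + | D = -)

# Marginal probability of a positive test (law of total probability).
p_pos = p_disease * p_pos_given_d + (1 - p_disease) * p_pos_given_not_d
print(round(p_pos, 4))  # 0.0179

# Posterior via Bayes rule: P(D = + | T = +)
posterior = p_disease * p_pos_given_d / p_pos
print(round(posterior, 3))  # 0.447
```

Even with 80% sensitivity, fewer than half of positive tests indicate disease, because the 1% false-positive rate applied to the 99% healthy population produces roughly as many positives as the disease itself.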