Transcript
Page 1: CS 121: Lecture 23 Probability Review - Harvard University

CS 121: Lecture 23: Probability Review

Madhu Sudan

https://madhu.seas.Harvard.edu/courses/Fall2020

Book: https://introtcs.org

How to contact us: the whole staff (faster response): CS 121 Piazza. Only the course heads (slower): [email protected]

Page 2:

Announcements:
• Advanced section: Roei Tell, (De)Randomization
• 2nd feedback survey
• Homework 6 out today, due 12/3/2020
• Section 11 week starts

Page 3:

Where we are:

Part I: Circuits: Finite computation, quantitative study

Part II: Automata: Infinite restricted computation, quantitative study

Part III: Turing Machines: Infinite computation, qualitative study

Part IV: Efficient Computation: Infinite computation, quantitative study

Part V: Randomized computation: Extending studies to non-classical algorithms

Page 4:

Review of course so far

• Circuits:
  • Compute every finite function, but no infinite function
  • Measure of complexity: SIZE. ALL_n ⊆ SIZE(O(2^n / n)); ∃f ∉ SIZE(o(2^n / n))
  • Technique: counting

• Finite Automata:
  • Compute some infinite functions (all regular functions)
  • Do not compute many (easy) functions (e.g. EQ(x, y) = 1 ⇔ x = y)
  • Technique: pumping/pigeonhole

• Turing Machines:
  • Compute everything computable (definition/thesis)
  • HALT is not computable
  • Techniques: diagonalization, reductions

• Efficient Computation:
  • P: polytime-computable Boolean functions, our notion of efficiently computable
  • NP: efficiently verifiable Boolean functions, our wishlist
  • NP-complete: hardest problems in NP: 3SAT, ISET, MaxCut, EU3SAT, Subset Sum
  • Technique: polytime reductions

Page 5:

Last Module: Challenges to the STCT

• Strong Turing-Church Thesis (STCT): everything physically computable in polynomial time can be computed in polynomial time on a Turing Machine.

• Challenges:
  • Randomized algorithms: already run on our computers (weather simulation, market prediction, ...)
  • Quantum algorithms

• Status:
  • Randomized: may not add power?
  • Quantum: may add power?
  • Still can be studied with the same tools. New classes: BPP, BQP (B = bounded error; P = probabilistic; Q = quantum)

Page 6:

Today: Review of Basic Probability

• Sample space
• Events
• Union/intersection/negation: OR/AND/NOT of events
• Random variables
• Expectation
• Concentration / tail bounds

Page 7:

Throughout this lecture, the probabilistic experiment is tossing n independent unbiased coins.

Equivalently: choose x ∼ {0,1}^n uniformly at random.

Page 8:

Events

Fix the sample space to be x ∼ {0,1}^n.

An event is a set A ⊆ {0,1}^n. The probability that A happens is Pr[A] = |A| / 2^n.

Example: if x ∼ {0,1}^3, then Pr[x_0 = 1] = 4/8 = 1/2.
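Since the sample space is just {0,1}^n, such probabilities can be checked by brute-force enumeration; a minimal Python sketch (the helper name `prob` is my own):

```python
from itertools import product

def prob(event, n):
    """Pr[event(x)] for x drawn uniformly from {0,1}^n, by enumeration."""
    outcomes = list(product([0, 1], repeat=n))
    return sum(1 for x in outcomes if event(x)) / len(outcomes)

# The event {x_0 = 1} over {0,1}^3 contains 4 of the 8 outcomes.
print(prob(lambda x: x[0] == 1, 3))  # 0.5
```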

Page 9:

[Figure: the 8 outcomes 000, 001, 010, 011, 100, 101, 110, 111 shown as leaves of a binary tree of coin tosses]

Q: Let n = 3, A = {x_0 = 1}, B = {x_0 + x_1 + x_2 ≥ 2}, C = {x_0 + x_1 + x_2 = 1 mod 2}. What are (i) Pr[B], (ii) Pr[C], (iii) Pr[A ∩ B], (iv) Pr[A ∩ C], (v) Pr[B ∩ C]?


Page 11:

Q: Let n = 3, A = {x_0 = 1}, B = {x_0 + x_1 + x_2 ≥ 2}, C = {x_0 + x_1 + x_2 = 1 mod 2}. What are (i) Pr[B], (ii) Pr[C], (iii) Pr[A ∩ B], (iv) Pr[A ∩ C], (v) Pr[B ∩ C]?

A: (i) Pr[B] = 4/8 = 1/2, (ii) Pr[C] = 4/8 = 1/2, (iii) Pr[A ∩ B] = 3/8, (iv) Pr[A ∩ C] = 2/8 = 1/4, (v) Pr[B ∩ C] = 1/8.

Compare Pr[A ∩ B] vs Pr[A]·Pr[B], and Pr[A ∩ C] vs Pr[A]·Pr[C].
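These comparisons can be verified by enumerating {0,1}^3; a sketch (the events follow the slide's definitions):

```python
from itertools import product

outcomes = list(product([0, 1], repeat=3))

def pr(event):
    # Uniform probability = (number of outcomes in the event) / 8
    return sum(1 for x in outcomes if event(x)) / len(outcomes)

A = lambda x: x[0] == 1
B = lambda x: sum(x) >= 2
C = lambda x: sum(x) % 2 == 1

print(pr(lambda x: A(x) and B(x)), pr(A) * pr(B))  # 0.375 0.25 -> A, B not independent
print(pr(lambda x: A(x) and C(x)), pr(A) * pr(C))  # 0.25 0.25  -> A, C independent
```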

Page 12:

Operations on events

Pr[A or B happens] = Pr[A ∪ B]

Pr[A and B happen] = Pr[A ∩ B]

Pr[A doesn't happen] = Pr[¬A] = Pr[{0,1}^n ∖ A] = 1 − Pr[A]

Q: Prove the union bound: Pr[A ∪ B] ≤ Pr[A] + Pr[B].

Example: Pr[A] = 4/25, Pr[B] = 6/25, Pr[A ∪ B] = 9/25.
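The union bound follows from |A ∪ B| = |A| + |B| − |A ∩ B| ≤ |A| + |B|; a quick numeric check, using events over {0,1}^3 of my own choosing:

```python
from itertools import product

outcomes = list(product([0, 1], repeat=3))
A = {x for x in outcomes if x[0] == 1}    # {x_0 = 1}
B = {x for x in outcomes if sum(x) >= 2}  # {x_0 + x_1 + x_2 >= 2}

# |A ∪ B| = |A| + |B| - |A ∩ B| <= |A| + |B| gives the union bound.
p_union = len(A | B) / len(outcomes)
p_sum = len(A) / len(outcomes) + len(B) / len(outcomes)
print(p_union, p_sum)  # 0.625 1.0
```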

Page 13:

Independence

Informal: two events A, B are independent if knowing that A happened gives no information about whether B happened.

Formal: two events A, B are independent if Pr[A ∩ B] = Pr[A]·Pr[B].

Equivalently, Pr[B | A] = Pr[B]. "Conditioning is the soul of statistics."

Q: Which pairs are independent? (i) (ii) (iii) [figures omitted]

Page 14:

More than 2 events

Informal: events are independent if knowing whether some of them happened gives no information about whether the others happened.

Formal: three events A, B, C are independent if every pair (A, B), (A, C), (B, C) is independent and Pr[A ∩ B ∩ C] = Pr[A]·Pr[B]·Pr[C].

(i) (ii) [figures omitted]

Page 15:

Random variables

Assign a number to every outcome of the coins. Formally, a r.v. is a function X: {0,1}^n → ℝ.

Example: X(x) = x_0 + x_1 + x_2. On the outcomes 000, 001, ..., 111 it takes the values 0, 1, 1, 2, 1, 2, 2, 3.

v | Pr[X = v]
0 | 1/8
1 | 3/8
2 | 3/8
3 | 1/8

Page 16:

Expectation

Average value of X: E[X] = Σ_{x ∈ {0,1}^n} 2^{−n}·X(x) = Σ_{v ∈ ℝ} v·Pr[X = v]

Example: X(x) = x_0 + x_1 + x_2, with values 0, 1, 1, 2, 1, 2, 2, 3 on 000, ..., 111:

v | Pr[X = v]
0 | 1/8
1 | 3/8
2 | 3/8
3 | 1/8

Q: What is E[X]? 0·(1/8) + 1·(3/8) + 2·(3/8) + 3·(1/8) = 12/8 = 1.5
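Both formulas for E[X] give the same number; a sketch evaluating them on X(x) = x_0 + x_1 + x_2:

```python
from collections import Counter
from itertools import product

outcomes = list(product([0, 1], repeat=3))
X = lambda x: sum(x)  # X(x) = x_0 + x_1 + x_2

# E[X] as the uniform average over all 2^n outcomes:
e1 = sum(X(x) for x in outcomes) / len(outcomes)
# E[X] as the sum over values v of v * Pr[X = v]:
dist = Counter(X(x) for x in outcomes)
e2 = sum(v * count / len(outcomes) for v, count in dist.items())
print(e1, e2)  # 1.5 1.5
```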

Page 17:

Linearity of expectation

Lemma: E[X + Y] = E[X] + E[Y]

Corollary: E[x_0 + x_1 + x_2] = E[x_0] + E[x_1] + E[x_2] = 1.5

Proof: E[X + Y] = 2^{−n} Σ_x (X(x) + Y(x)) = 2^{−n} Σ_x X(x) + 2^{−n} Σ_x Y(x) = E[X] + E[Y]

Page 18:

Independent random variables

Def: X, Y are independent if the events {X = u} and {Y = v} are independent for all u, v.

Def: X_0, ..., X_{k−1} are independent if the events {X_0 = v_0}, ..., {X_{k−1} = v_{k−1}} are independent for all v_0, ..., v_{k−1}; i.e., for all v_0, ..., v_{k−1} and all S ⊆ [k]:

Pr[∧_{i ∈ S} X_i = v_i] = Π_{i ∈ S} Pr[X_i = v_i]

Q: Let x ∼ {0,1}^n. Let X_0 = x_0, X_1 = x_1, ..., X_{n−1} = x_{n−1}, and let Y_0 = Y_1 = ... = Y_{n−1} = x_0.

Are X_0, ..., X_{n−1} independent?

Are Y_0, ..., Y_{n−1} independent?

Page 19:

Independence and concentration

Q: Let x ∼ {0,1}^n. Let X_0 = x_0, X_1 = x_1, ..., X_{n−1} = x_{n−1}, and let Y_0 = Y_1 = ... = Y_{n−1} = x_0. Let X = X_0 + ... + X_{n−1} and Y = Y_0 + ... + Y_{n−1}.

Compute E[X] and E[Y].

For n = 100, estimate Pr[Y ∉ [0.4n, 0.6n]] and Pr[X ∉ [0.4n, 0.6n]].
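Both sums have expectation n/2, but only X concentrates around it. The two tail probabilities can be computed exactly: X is Binomial(n, 1/2), while Y is 0 or n with probability 1/2 each, so it always lands outside [0.4n, 0.6n]. A sketch:

```python
from math import comb

n = 100
# X = x_0 + ... + x_{n-1} ~ Binomial(n, 1/2); exact tail outside [0.4n, 0.6n]:
tail_X = sum(comb(n, k) for k in range(n + 1)
             if not 0.4 * n <= k <= 0.6 * n) / 2 ** n
# Y = n * x_0 is either 0 or n, never in [0.4n, 0.6n]:
tail_Y = 1.0
print(round(tail_X, 4), tail_Y)  # tail_X is about 0.035; tail_Y is 1.0
```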

Pages 20-26:

Sums of independent random variables

Histograms of X (a sum of n unbiased coins) for growing n, from https://www.stat.berkeley.edu/~stark/Java/Html/BinHist.htm:

n   | Pr[X ∉ [0.4, 0.6]·n]
5   | 37.5%
10  | 34.4%
20  | 26.3%
50  | 11.9%
100 | 3.6%
200 | 0.4%
400 | 0.01%
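These percentages can be reproduced (up to rounding and the applet's interval conventions) directly from the binomial distribution; a sketch:

```python
from math import comb

def tail(n):
    """Exact Pr[X not in [0.4n, 0.6n]] for X ~ Binomial(n, 1/2)."""
    inside = sum(comb(n, k) for k in range(n + 1) if 0.4 * n <= k <= 0.6 * n)
    return 1 - inside / 2 ** n

for n in (5, 10, 20, 50, 100, 200, 400):
    print(n, f"{tail(n):.1%}")  # the tail shrinks rapidly as n grows
```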

Page 27:

Concentration bounds

X a sum of n independent random variables is ≈ "Normal/Gaussian/bell curve":

Pr[X ∉ [0.99, 1.01]·E[X]] < exp(−δ·n)

Otherwise: weaker bounds.

In concentrated r.v.'s, expectation, median, mode, etc. are ≈ the same.

Chernoff Bound: let X_0, ..., X_{n−1} be i.i.d. (independent and identically distributed) r.v.'s with X_i ∈ [0,1]. Then if X = X_0 + ... + X_{n−1} and p = E[X_i], for every ε > 0:

Pr[|X − pn| > εn] < 2·exp(−2ε²·n)
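For unbiased coins (p = 1/2) the Chernoff bound can be compared against the exact binomial tail; a sketch with my own choice of ε = 0.1 and n = 100:

```python
from math import comb, exp

n, p, eps = 100, 0.5, 0.1
# Exact Pr[|X - pn| > eps*n] for X = sum of n unbiased coins:
exact = sum(comb(n, k) for k in range(n + 1)
            if abs(k - p * n) > eps * n) / 2 ** n
# Chernoff's guarantee:
bound = 2 * exp(-2 * eps ** 2 * n)
print(exact < bound, round(bound, 3))  # True 0.271
```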

Page 28:

For the Chernoff bound above: the right-hand side 2·exp(−2ε²·n) is < exp(−ε²·n) once n > 1/ε².

Page 29:

Simplest Bound: Markov's Inequality

Q: Suppose that the average age in a neighborhood is 20. Prove that at most 1/4 of the residents are 80 or older.

A: Suppose otherwise, i.e., more than 1/4 of the residents are 80 or older. Since ages are non-negative, those residents alone contribute more than (1/4)·80 = 20 to the average, so the average would exceed 20. Contradiction.

Thm (Markov's Inequality): let X be a non-negative r.v. and μ = E[X]. Then for every k > 1, Pr[X ≥ kμ] ≤ 1/k.

Proof: same as the question above.
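The age argument is exactly Markov's inequality with k = 4; a toy check on a made-up list of ages (hypothetical, chosen to average exactly 20):

```python
# Hypothetical ages with average exactly 20 (made up for illustration):
ages = [80, 80, 10, 5, 8, 7, 4, 0, 2, 4]
mean = sum(ages) / len(ages)
frac_80_plus = sum(1 for a in ages if a >= 80) / len(ages)
# Markov with k = 4: Pr[age >= 4 * mean] <= 1/4, and indeed 0.2 <= 0.25.
print(mean, frac_80_plus)  # 20.0 0.2
```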

Page 30:

Variance & Chebyshev

If X is a r.v. with μ = E[X], then Var(X) = E[(X − μ)²] = E[X²] − E[X]², and σ(X) = √Var(X).

Chebyshev's Inequality: for every r.v. X,

Pr[|X − μ| ≥ k standard deviations] ≤ 1/k²

(Proof: Markov applied to Y = (X − μ)².)

Compare with X = Σ X_i for i.i.d. (or other r.v.'s well approximated by a Normal), where

Pr[|X − μ| ≥ k standard deviations] ≈ exp(−k²)
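For a binomial sum, the actual tail sits far below Chebyshev's 1/k² guarantee, closer to the Gaussian exp(−k²) behavior; a sketch with n = 100 (my own choice):

```python
from math import comb, sqrt

n = 100
mu, sigma = n / 2, sqrt(n) / 2  # Binomial(n, 1/2): Var(X) = n/4
pmf = [comb(n, k) / 2 ** n for k in range(n + 1)]
for k in (2, 3):
    actual = sum(p for v, p in enumerate(pmf) if abs(v - mu) >= k * sigma)
    # Chebyshev guarantees at most 1/k^2; the true tail is much smaller.
    print(k, round(actual, 4), 1 / k ** 2)
```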

Page 31:

Next Lectures

• Randomized Algorithms
  • Some examples
• Randomized Complexity Classes
  • BPTIME(T(n)), BPP
  • Properties of randomized computation (reducing error, ...)
