
Lecture 5: Mathematical Expectation

Assist. Prof. Dr. Emel YAVUZ DUMAN

MCB1007 Introduction to Probability and Statistics
İstanbul Kültür University


Outline

1 Introduction

2 The Expected Value of a Random Variable

3 Moments

4 Chebyshev’s Theorem

5 Moment Generating Functions

6 Product Moments

7 Conditional Expectation



Introduction

A very important concept in probability and statistics is that of the mathematical expectation, expected value, or briefly the expectation, of a random variable. Originally, the concept of a mathematical expectation arose in connection with games of chance, and in its simplest form it is the product of the amount a player stands to win and the probability that he or she will win. For instance, if we hold one of 10,000 tickets in a raffle for which the grand prize is a trip worth $4,800, our mathematical expectation is

4,800 · (1/10,000) = $0.48.

This amount will have to be interpreted in the sense of an average: altogether the 10,000 tickets pay $4,800, or on the average $4,800/10,000 = $0.48 per ticket.



The Expected Value of a Random Variable

Definition 1

If X is a discrete random variable and f(x) is the value of its probability distribution at x, the expected value of X is

E(X) = Σ_x x·f(x).

Correspondingly, if X is a continuous random variable and f(x) is the value of its probability density at x, the expected value of X is

E(X) = ∫_{−∞}^{∞} x·f(x) dx.

In this definition it is assumed, of course, that the sum or the integral exists; otherwise, the mathematical expectation is undefined.


Example 2

Imagine a game in which, on any play, a player has a 20% chance of winning $3 and an 80% chance of losing $1. The probability distribution of the random variable X, the amount won or lost on a single play, is

x      $3    −$1
f(x)   0.2   0.8

and so the average amount won (actually lost, since it is negative) in the long run is

E(X) = ($3)(0.2) + (−$1)(0.8) = −$0.20.

What does "in the long run" mean? If you play, are you guaranteed to lose no more than 20 cents?


Solution. If you play and lose, you are guaranteed to lose $1. An expected loss of 20 cents means that if you played the game over and over and over again, the average of your $3 winnings and your $1 losses would be a 20 cent loss. "In the long run" means that you can't draw conclusions from one or two plays, but rather from thousands and thousands of plays.
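A quick simulation sketch (mine, not from the slides) makes "in the long run" concrete: the running average of many plays settles near the expected value.

    import random

    def play():
        # 20% chance of winning $3, 80% chance of losing $1
        return 3 if random.random() < 0.2 else -1

    n = 100_000
    print(sum(play() for _ in range(n)) / n)   # close to the expected value -0.20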


Example 3

A lot of 12 television sets includes 2 with white cords. If three of the sets are chosen at random for shipment to a hotel, how many sets with white cords can the shipper expect to send to the hotel?

Solution. Since x of the two sets with white cords and 3 − x of the 10 other sets can be chosen in C(2, x)·C(10, 3 − x) ways (C(n, k) denoting the binomial coefficient), three of the 12 sets can be chosen in C(12, 3) ways, and these C(12, 3) possibilities are presumably equiprobable, we find that the probability distribution of X, the number of sets with white cords shipped to the hotel, is given by

f(x) = C(2, x)·C(10, 3 − x) / C(12, 3)   for x = 0, 1, 2,


or, in tabular form,

x      0      1      2
f(x)   6/11   9/22   1/22

Now,

E(X) = Σ_{x=0}^{2} x·f(x) = 0·(6/11) + 1·(9/22) + 2·(1/22) = 1/2,

and since half a set cannot possibly be shipped, it should be clear that the term "expect" is not used in its colloquial sense. Indeed, it should be interpreted as an average pertaining to repeated shipments made under the given conditions.
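For readers who want to double-check the table, here is a small sketch of my own (using Python's math.comb) that recomputes f(x) and E(X).

    from math import comb

    f = {x: comb(2, x) * comb(10, 3 - x) / comb(12, 3) for x in range(3)}
    print(f)                                 # f(0) = 6/11, f(1) = 9/22, f(2) = 1/22
    print(sum(x * p for x, p in f.items()))  # 0.5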


Example 4

A saleswoman has been offered a new job with a fixed salary of $290. Her records from her present job show that her weekly commissions have the following probabilities:

commission    0      $100   $200   $300   $400
probability   0.05   0.15   0.25   0.45   0.1

Should she change jobs?

Solution. The expected value (expected income) from her present job is

($0 × 0.05) + ($100 × 0.15) + ($200 × 0.25) + ($300 × 0.45) + ($400 × 0.1) = $240.

If she is making a decision only on the amount of money earned, then clearly she ought to change jobs.


Example 5

Dan, Eric, and Nick decide to play a fun game. Each player puts $2 on the table and secretly writes down either heads or tails. Then one of them tosses a fair coin. The $6 on the table is divided evenly among the players who correctly predicted the outcome of the coin toss. If everyone guessed incorrectly, then everyone takes their money back. After many repetitions of this game, Dan has lost a lot of money - more than can be explained by bad luck. What's going on?

Solution. A tree diagram for this problem is worked out below, under the assumptions that everyone guesses correctly with probability 1/2 and everyone is correct independently.

The tree diagram lists the eight equally likely combinations of who guessed correctly, each with probability 1/8, together with Dan's payoff:

Dan right?   Eric right?   Nick right?   Probability   Dan's payoff
Y            Y             Y             1/8            $0
Y            Y             N             1/8            $1
Y            N             Y             1/8            $1
Y            N             N             1/8            $4
N            Y             Y             1/8           −$2
N            Y             N             1/8           −$2
N            N             Y             1/8           −$2
N            N             N             1/8            $0


In the payoff column, we are accounting for the fact that Dan has to put in $2 just to play. So, for example, if he guesses correctly and Eric and Nick are wrong, then he takes all $6 on the table, but his net profit is only $4. Working from the tree diagram, Dan's expected payoff is

E(payoff) = 0·(1/8) + 1·(1/8) + 1·(1/8) + 4·(1/8) + (−2)·(1/8) + (−2)·(1/8) + (−2)·(1/8) + 0·(1/8) = 0.

So the game is perfectly fair! Over time, he should neither win nor lose money.

A game is called fair if the expected value of the game is zero.


Let us say that Nick and Eric are collaborating; in particular, they always make opposite guesses. So our assumption that everyone is correct independently is wrong; actually the events that Nick is correct and Eric is correct are mutually exclusive! As a result, Dan can never win all the money on the table. When he guesses correctly, he always has to split his winnings with someone else. This lowers his overall expectation, as the corrected tree diagram below shows:

The corrected tree diagram, again summarized as a table (since exactly one of Eric and Nick is right on every toss, half of the branches now have probability zero):

Dan right?   Eric right?   Nick right?   Probability   Dan's payoff
Y            Y             Y             0              $0
Y            Y             N             1/4            $1
Y            N             Y             1/4            $1
Y            N             N             0              $4
N            Y             Y             0             −$2
N            Y             N             1/4           −$2
N            N             Y             1/4           −$2
N            N             N             0              $0


From this revised tree diagram, we can work out Dan's actual expected payoff:

E(payoff) = 0·0 + 1·(1/4) + 1·(1/4) + 4·0 + (−2)·0 + (−2)·(1/4) + (−2)·(1/4) + 0·0 = −1/2.

So he loses an average of a half-dollar per game!
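A simulation sketch (mine, not part of the slides) reproduces the half-dollar loss under the collusion scenario described above.

    import random

    def dans_net_payoff():
        coin = random.random() < 0.5          # True = heads
        dan  = random.random() < 0.5          # Dan guesses at random
        eric = random.random() < 0.5
        nick = not eric                       # Eric and Nick always guess oppositely
        winners = sum(guess == coin for guess in (dan, eric, nick))
        if winners == 0:
            return 0                          # nobody right: everyone takes the $2 back
        return (6 / winners - 2) if dan == coin else -2

    n = 200_000
    print(sum(dans_net_payoff() for _ in range(n)) / n)   # close to -0.5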


Example 6

Consider the following game in which two players each spin a fair wheel in turn. (The picture of the two wheels is not reproduced here; as the solution below makes clear, wheel A shows the numbers 2, 4, and 12 with equal probability, and wheel B shows the numbers 5 and 6 with equal probability.) The player that spins the higher number wins, and the loser has to pay the winner the difference of the two numbers (in dollars). Who would win more often? What is that player's expected winnings?


Solution. Assuming 3 equally likely outcomes for spinning A's wheel and 2 equally likely outcomes for B's wheel, there are 3 × 2 = 6 possible equally likely outcomes for this game:

Sample space (outcomes written as (A, B)):

B \ A    2         4         12
5        (2, 5)    (4, 5)    (12, 5)
6        (2, 6)    (4, 6)    (12, 6)

We see here that player A wins in 2 of the 6 games, and player B wins in 4 of the 6 games, so player B wins the most often. But let's compute the expected winnings for player B:

Winnings, X, for player B:

B \ A    2     4     12
5        $3    $1    −$7
6        $4    $2    −$6


Each of these possible amounts won has the same probability (1/6). The expected winnings for player B are then simply

E(X) = $3·(1/6) + $4·(1/6) + $1·(1/6) + $2·(1/6) + (−$7)·(1/6) + (−$6)·(1/6) = −$0.50.

Hence, we see that even though player B is more likely to win a single game, his expected winnings are actually negative. If he keeps on playing the game, he will eventually start losing money!


Example 7

Find the expected value of the random variable X whose probability density is given by

f(x) = x for 0 < x < 1,
       2 − x for 1 ≤ x < 2,
       0 elsewhere.

Solution.

E(X) = ∫_{−∞}^{∞} x·f(x) dx
     = ∫_{−∞}^{0} x·0 dx + ∫_0^1 x·x dx + ∫_1^2 x·(2 − x) dx + ∫_2^{∞} x·0 dx
     = x³/3 |_0^1 + (x² − x³/3) |_1^2
     = 1/3 + 2² − 2³/3 − 1 + 1/3 = 1.
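A numeric sanity check of this integral (an illustration of my own, using a midpoint Riemann sum):

    def f(x):
        return x if x < 1 else 2 - x        # the given density on (0, 2)

    n = 200_000
    dx = 2 / n
    ex = 0.0
    for i in range(n):
        x = (i + 0.5) * dx
        ex += x * f(x) * dx
    print(ex)                               # approximately 1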


Example 8

Certain coded measurements of the pitch diameter of threads of a fitting have the probability density

f(x) = 4 / [π(1 + x²)] for 0 < x < 1, and 0 elsewhere.

Find the expected value of this random variable.

Solution.

E(X) = ∫_{−∞}^{∞} x·f(x) dx = ∫_0^1 x · 4 / [π(1 + x²)] dx = (2/π) ∫_0^1 2x / (1 + x²) dx
     = (2/π) ln(x² + 1) |_0^1 = (2/π)(ln 2 − ln 1) = ln 4 / π ≈ 0.4413.
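The following sketch (my own illustration) checks the value ln 4 / π numerically.

    from math import pi, log

    def f(x):
        return 4 / (pi * (1 + x * x))       # the given density on (0, 1)

    n = 200_000
    dx = 1 / n
    ex = 0.0
    for i in range(n):
        x = (i + 0.5) * dx
        ex += x * f(x) * dx
    print(ex, log(4) / pi)                  # both approximately 0.4413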


Theorem 9

If X is a discrete random variable and f(x) is the value of its probability distribution at x, the expected value of g(X) is given by

E[g(X)] = Σ_x g(x)·f(x).

Correspondingly, if X is a continuous random variable and f(x) is the value of its probability density at x, the expected value of g(X) is given by

E[g(X)] = ∫_{−∞}^{∞} g(x)·f(x) dx.


Example 10

Roulette, introduced in Paris in 1765, is a very popular casino game. Suppose that Hannah's House of Gambling has a roulette wheel containing 38 numbers: zero (0), double zero (00), and the numbers 1, 2, 3, ..., 36. Let X denote the number on which the ball lands and u(x) denote the amount of money paid to the gambler, such that

u(x) = $5 if x = 0,
       $10 if x = 00,
       $1 if x is odd,
       $2 if x is even.

How much would Hannah's House of Gambling have to charge each gambler to play in order to ensure that Hannah's House of Gambling made some money?


Solution. Assuming that the ball has an equally likely chance of landing on each number, the probability distribution of X is

f(x) = 1/38 for each x = 0, 00, 1, 2, 3, ..., 36.

Therefore, the expected value of u(X) is

E[u(X)] = Σ_x u(x)·f(x)
        = $5·(1/38) + $10·(1/38) + [$1·(1/38) × 18] + [$2·(1/38) × 18] ≈ $1.82.


Note that the 18 that multiplies the $1 and the $2 is there because there are 18 odd and 18 even numbers on the wheel. Our calculation tells us that, in the long run, Hannah's House of Gambling would expect to have to pay out about $1.82 for each spin of the roulette wheel. Therefore, in order to ensure that the House made money, the House would have to charge at least $1.82 per play.
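A short sketch of the same calculation (my own illustration; the labels '0' and '00' are handled as strings):

    outcomes = ['0', '00'] + [str(k) for k in range(1, 37)]

    def payout(x):
        if x == '0':
            return 5
        if x == '00':
            return 10
        return 1 if int(x) % 2 == 1 else 2

    expected = sum(payout(x) / 38 for x in outcomes)   # each number has probability 1/38
    print(round(expected, 2))                          # 1.82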


Example 11

If X is the number of points rolled with a balanced die, find the expected value of g(X) = 2X² + 1.

Solution. Since each possible outcome has probability 1/6, we get

E[g(X)] = E[2X² + 1] = Σ_{x=1}^{6} g(x)·f(x) = Σ_{x=1}^{6} (2x² + 1)·(1/6)
        = (2·1² + 1)·(1/6) + (2·2² + 1)·(1/6) + ··· + (2·6² + 1)·(1/6) = 94/3.
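For completeness, a one-line check of this value (my own illustration):

    g = lambda x: 2 * x ** 2 + 1
    print(sum(g(x) * (1 / 6) for x in range(1, 7)))   # 31.333... = 94/3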


Example 12

If X has the probability density

f(x) = e^{−x} for x > 0, and 0 elsewhere,

find the expected value of g(X) = e^{3X/4}.

Solution.

E[g(X)] = E[e^{3X/4}] = ∫_{−∞}^{∞} g(x)·f(x) dx = lim_{c→∞} ∫_0^c e^{3x/4} e^{−x} dx
        = lim_{c→∞} ∫_0^c e^{−x/4} dx = −4 lim_{c→∞} e^{−x/4} |_0^c = −4 lim_{c→∞} (e^{−c/4} − e⁰) = 4.


Some Theorems on Expectation

The determination of mathematical expectations can often be simplified by using the following theorems. Since the steps are essentially the same, some proofs will be given either for the discrete case or the continuous case; others are left for the reader as exercises.


Theorem 13

If a and b are any constants, then

E(aX + b) = aE(X) + b.

Proof.

Using Theorem 9 with g(X) = aX + b, we get

E(aX + b) = ∫_{−∞}^{∞} (ax + b)·f(x) dx = a ∫_{−∞}^{∞} x·f(x) dx + b ∫_{−∞}^{∞} f(x) dx = aE(X) + b,

since the first integral is E(X) and the second integral equals 1.


If we set b = 0 or a = 0, it follows from Theorem 13 that:

Corollary 14

If a is a constant, then E(aX) = aE(X).

Corollary 15

If b is a constant, then E(b) = b.

Theorem 16

If c_1, c_2, ..., and c_n are constants, then

E[c_1 g_1(X) + c_2 g_2(X) + ··· + c_n g_n(X)] = c_1 E[g_1(X)] + c_2 E[g_2(X)] + ··· + c_n E[g_n(X)].


The concept of a mathematical expectation can easily be extended to situations involving more than one random variable. For instance, if Z is the random variable whose values are related to those of two random variables X and Y by means of the equation z = g(x, y), it can be shown that:

Theorem 17

If X and Y are discrete random variables and f(x, y) is the value of their joint probability distribution at (x, y), the expected value of g(X, Y) is

E[g(X, Y)] = Σ_x Σ_y g(x, y)·f(x, y).


Theorem 18

If X and Y are any random variables, then

E(X + Y) = E(X) + E(Y).

Proof. Let f(x, y) be the joint probability distribution of X and Y, assumed discrete. Then

E(X + Y) = Σ_x Σ_y (x + y)·f(x, y) = Σ_x Σ_y x·f(x, y) + Σ_x Σ_y y·f(x, y) = E(X) + E(Y).

Note that the theorem is true whether or not X and Y are independent.


Theorem 19

If X and Y are independent random variables, then

E(X·Y) = E(X)·E(Y).

Proof. Let f(x, y) be the joint probability distribution of X and Y, assumed discrete. If the variables X and Y are independent, we have f(x, y) = g(x)·h(y). Therefore,

E(X·Y) = Σ_x Σ_y (x·y)·f(x, y) = Σ_x Σ_y (x·y)·g(x)·h(y)
       = Σ_x [ x·g(x) Σ_y y·h(y) ] = Σ_x [ x·g(x)·E(Y) ] = E(X)·E(Y).


Note that the validity of this theorem hinges on whether f(x, y) can be expressed as a function of x multiplied by a function of y, for all x and y, i.e., on whether X and Y are independent. For dependent variables it is not true in general.


Example 20

Suppose the probability distribution of the discrete random variable X is

x      0     1     2     3
f(x)   0.2   0.1   0.4   0.3

Find E(X) and E(X²). Knowing these, what are E(4X²) and E(3X + 2X²)?

Solution.

E(X) = Σ_{x=0}^{3} x·f(x) = 0·0.2 + 1·0.1 + 2·0.4 + 3·0.3 = 1.8,
E(X²) = Σ_{x=0}^{3} x²·f(x) = 0²·0.2 + 1²·0.1 + 2²·0.4 + 3²·0.3 = 4.4,
E(4X²) = 4E(X²) = 4·4.4 = 17.6,
E(3X + 2X²) = 3E(X) + 2E(X²) = 3·1.8 + 2·4.4 = 14.2.
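The same numbers fall out of a few lines of code (my own sketch; values agree up to floating-point rounding):

    xs    = [0, 1, 2, 3]
    probs = [0.2, 0.1, 0.4, 0.3]

    EX  = sum(x * p for x, p in zip(xs, probs))        # 1.8
    EX2 = sum(x ** 2 * p for x, p in zip(xs, probs))   # 4.4
    print(EX, EX2, 4 * EX2, 3 * EX + 2 * EX2)          # 1.8  4.4  17.6  14.2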


Example 21

If the probability density of X is given by

f(x) = 2(1 − x) for 0 < x < 1, and 0 elsewhere,

(a) show that

E(X^r) = 2 / [(r + 1)(r + 2)],

(b) and use this result to evaluate

E[(2X + 1)²].

Solution.

(a)

E(X^r) = ∫_{−∞}^{∞} x^r·f(x) dx = ∫_0^1 x^r·2(1 − x) dx = 2 ∫_0^1 (x^r − x^{r+1}) dx
       = 2 [ x^{r+1}/(r + 1) − x^{r+2}/(r + 2) ] |_0^1 = 2 [ 1/(r + 1) − 1/(r + 2) ] = 2 / [(r + 1)(r + 2)].

(b) Since

E[(2X + 1)²] = E[4X² + 4X + 1] = 4E(X²) + 4E(X) + 1,

and substitution of r = 1 and r = 2 into the preceding formula yields

E(X) = 2/[(2)(3)] = 1/3 and E(X²) = 2/[(3)(4)] = 1/6,

we get

E[(2X + 1)²] = 4·(1/6) + 4·(1/3) + 1 = 3.



Moments

Definition 22

The r-th moment about the origin of a random variable X, denoted by μ′_r, is the expected value of X^r; symbolically,

μ′_r = E(X^r) = Σ_x x^r·f(x)

for r = 0, 1, 2, ... when X is discrete, and

μ′_r = E(X^r) = ∫_{−∞}^{∞} x^r·f(x) dx

when X is continuous.


1 When r = 0, we have μ′_0 = E(X⁰) = E(1) = 1.

2 When r = 1, we have μ′_1 = E(X¹) = E(X), which is just the expected value of the random variable X, and we give it a special symbol and a special name.

Definition 23

μ′_1 is called the mean of the distribution of X, or simply the mean of X, and it is denoted by μ.


The special moments we shall define next are of importance in statistics because they serve to describe the shape of the distribution of a random variable, that is, the shape of the graph of its probability distribution or probability density.

Definition 24

The r-th moment about the mean of a random variable X, denoted by μ_r, is the expected value of (X − μ)^r; symbolically,

μ_r = E[(X − μ)^r] = Σ_x (x − μ)^r·f(x)

for r = 0, 1, 2, ... when X is discrete, and

μ_r = E[(X − μ)^r] = ∫_{−∞}^{∞} (x − μ)^r·f(x) dx

when X is continuous.


1 When r = 0, we have μ_0 = E[(X − μ)⁰] = 1.

2 When r = 1, we have μ_1 = E[(X − μ)¹] = E(X − μ) = E(X) − E(μ) = μ − μ = 0 for any random variable for which μ exists.

The second moment about the mean is of special importance in statistics because it is indicative of the spread or dispersion of the distribution of a random variable; thus, it is given a special symbol and a special name.

Definition 25

μ_2 is called the variance of the distribution of X, or simply the variance of X, and it is denoted by σ², var(X), or V(X); σ, the positive square root of the variance, is called the standard deviation.


The accompanying figure (not reproduced here) shows how the variance reflects the spread or dispersion of the distribution of a random variable: it gives the histograms of the probability distributions of four random variables with the same mean μ = 5 but with variances equal to 5.26, 3.18, 1.66, and 0.88.


A large standard deviation indicates that the data points are far from the mean, and a small standard deviation indicates that they are clustered closely around the mean. For example, each of the three populations {0, 0, 14, 14}, {0, 6, 8, 14}, and {6, 6, 8, 8} has a mean of 7. Their standard deviations are 7, 5, and 1, respectively. The third population has a much smaller standard deviation than the other two because its values are all close to 7. The standard deviation has the same units as the data points themselves. If, for instance, the data set {0, 6, 8, 14} represents the ages of a population of four siblings in years, the standard deviation is 5 years. As another example, the population {1000, 1006, 1008, 1014} may represent the distances traveled by four athletes, measured in meters. It has a mean of 1007 meters and a standard deviation of 5 meters.


Theorem 26

σ² = μ′_2 − μ².

Proof.

σ² = E[(X − μ)²] = E(X² − 2Xμ + μ²) = E(X²) − 2μE(X) + μ² = E(X²) − 2μ² + μ² = E(X²) − μ² = μ′_2 − μ².


Example 27

Use Theorem 26 to calculate the variance of X, representing the number of points rolled with a balanced die.

Solution. First we compute

μ = E(X) = 1·(1/6) + 2·(1/6) + 3·(1/6) + 4·(1/6) + 5·(1/6) + 6·(1/6) = 7/2.

Now,

μ′_2 = E(X²) = 1²·(1/6) + 2²·(1/6) + 3²·(1/6) + 4²·(1/6) + 5²·(1/6) + 6²·(1/6) = 91/6,

and it follows that

σ² = μ′_2 − μ² = 91/6 − (7/2)² = 35/12.


Example 28

With reference to Example 8, find the standard deviation of the random variable X.

Solution. In Example 8 we showed that μ = E(X) = ln 4 / π. Now,

μ′_2 = E(X²) = (4/π) ∫_0^1 x²/(1 + x²) dx = (4/π) ∫_0^1 [1 − 1/(1 + x²)] dx
     = (4/π)(x − arctan x) |_0^1 = (4/π)[(1 − arctan 1) − (0 − arctan 0)] = (4/π)(1 − π/4) = 4/π − 1,

and it follows that

σ² = 4/π − 1 − (ln 4 / π)² ≈ 0.078519

and σ ≈ √0.078519 ≈ 0.280213.


Example 29

Find μ, μ′_2, and σ² for the random variable X that has the probability distribution f(x) = 1/2 for x = −2 and x = 2.

Solution.

μ = E(X) = Σ_x x·f(x) = (−2)·(1/2) + 2·(1/2) = 0,

μ′_2 = E(X²) = Σ_x x²·f(x) = (−2)²·(1/2) + 2²·(1/2) = 4,

σ² = μ_2 = μ′_2 − μ² = 4 − 0² = 4,

or, directly,

σ² = μ_2 = E[(X − μ)²] = Σ_x (x − μ)²·f(x) = (−2 − 0)²·(1/2) + (2 − 0)²·(1/2) = 4.


Example 30

Find μ, μ′_2, and σ² for the random variable X that has the probability density

f(x) = x/2 for 0 < x < 2, and 0 elsewhere.

Solution.

μ = E(X) = ∫_{−∞}^{∞} x·f(x) dx = ∫_0^2 x·(x/2) dx = x³/6 |_0^2 = 4/3,

μ′_2 = E(X²) = ∫_{−∞}^{∞} x²·f(x) dx = ∫_0^2 x²·(x/2) dx = x⁴/8 |_0^2 = 2,

σ² = μ_2 = μ′_2 − μ² = 2 − (4/3)² = 2/9,

or, directly,

σ² = μ_2 = E[(X − μ)²] = ∫_{−∞}^{∞} (x − μ)²·f(x) dx = ∫_0^2 (x − 4/3)²·(x/2) dx
   = (1/2) ∫_0^2 [x³ − (8/3)x² + (16/9)x] dx = (1/2) [x⁴/4 − (8/9)x³ + (8/9)x²] |_0^2 = 2/9.


Theorem 31

If X has the variance σ², then

var(aX + b) = a²σ².

Proof.

Since

E[aX + b] = aE(X) + b and E[(aX + b)²] = a²E(X²) + 2abE(X) + b²,

we have

var(aX + b) = E[(aX + b)²] − (E[aX + b])²
            = a²E(X²) + 2abE(X) + b² − [aE(X) + b]²
            = a²E(X²) + 2abE(X) + b² − a²[E(X)]² − 2abE(X) − b²
            = a²(E(X²) − [E(X)]²)
            = a²σ².


Since var(aX + b) = a²σ²:

1 For a = 1, we find that the addition of a constant to the values of a random variable, resulting in a shift of all the values of X to the left or to the right, in no way affects the spread of its distribution.

2 For b = 0, we find that if the values of a random variable are multiplied by a constant, the variance is multiplied by the square of that constant, resulting in a corresponding change in the spread of the distribution.


Theorem 32

If X and Y are independent random variables,

var(X + Y) = var(X) + var(Y) and var(X − Y) = var(X) + var(Y).

Proof.

The proof is left as an exercise.


Example 33

Suppose that X and Y are independent, with var(X) = 6, E(X²Y²) = 150, E(Y) = 2, and E(X) = 2. Compute var(Y).

Solution.

var(X) = 6 = E(X²) − (E(X))² = E(X²) − 2², so E(X²) = 10.

Since X and Y are independent, so are X² and Y², and hence E(X²Y²) = E(X²)·E(Y²). Therefore,

150 = E(X²Y²) = E(X²)·E(Y²) = 10·E(Y²), so E(Y²) = 15.

Also, since var(Y) = E(Y²) − (E(Y))², we obtain

var(Y) = 15 − 2² = 11.



Chebyshev’s Theorem

Theorem 34 (Chebyshev’s Theorem)

Suppose that X is a random variable (discrete or continuous) having mean μ and variance σ², both of which are finite. Then, if ε is any positive number,

P(|X − μ| ≥ ε) ≤ σ²/ε²

or, with ε = kσ,

P(|X − μ| ≥ kσ) ≤ 1/k².


Example 35

A random variable X has density function given by

f(x) = 2e^{−2x} for x ≥ 0, and 0 for x < 0.

(a) Find P(|X − μ| ≥ 1).

(b) Use Chebyshev's theorem to obtain an upper bound on P(|X − μ| ≥ 1) and compare it with the result in (a).

Solution.

(a) First, we need to find μ:

μ = E(X) = ∫_{−∞}^{∞} x·f(x) dx = ∫_0^{∞} 2x e^{−2x} dx = lim_{c→∞} ∫_0^c 2x e^{−2x} dx.

Integrating by parts with u = x and dv = 2e^{−2x} dx (so du = dx and v = −e^{−2x}),

μ = lim_{c→∞} [ −x e^{−2x} |_0^c + ∫_0^c e^{−2x} dx ] = lim_{c→∞} [ −x e^{−2x} − (1/2) e^{−2x} ] |_0^c
  = − lim_{c→∞} [ e^{−2c}(c + 1/2) − e⁰(0 + 1/2) ] = −(0 − 1/2) = 1/2 = 0.5.

Since P(|X − μ| ≥ 1) = 1 − P(|X − μ| < 1), and |X − μ| = |X − 0.5| < 1 is equivalent to −0.5 < X < 1.5, we have

P(|X − μ| ≥ 1) = 1 − P(|X − μ| < 1) = 1 − P(−0.5 < X < 1.5)
              = 1 − ∫_{−0.5}^{1.5} f(x) dx = 1 − ∫_0^{1.5} 2e^{−2x} dx
              = 1 + e^{−2x} |_0^{1.5} = 1 + (e^{−3} − e⁰) = e^{−3} ≈ 0.0498.

(b) If we take ε = 1 in Chebyshev's theorem, we obtain

P(|X − μ| ≥ 1) ≤ σ²/1² = σ².

One can show that σ² = 0.25. So an upper bound for the given probability is 0.25, which is quite far from the true value.


Example 36

If a random variable X is such that E[(X − 1)²] = 10 and E[(X − 2)²] = 6,

(a) find the mean of X;

(b) find the standard deviation of X;

(c) find an upper bound for P(|X − 7/2| ≥ √15).

Solution.

(a) Since

E[(X − 1)²] = E[X² − 2X + 1] = E[X²] − 2E[X] + 1 = 10, so E[X²] − 2E[X] = 9,

E[(X − 2)²] = E[X² − 4X + 4] = E[X²] − 4E[X] + 4 = 6, so E[X²] − 4E[X] = 2,

subtracting the second equation from the first gives 2E[X] = 7, and therefore

E[X²] = 16 and μ = E[X] = 7/2.

(b) σ = √var(X) = √(E[X²] − (E[X])²) = √(16 − 49/4) = √15/2.

(c) Using Chebyshev's inequality with μ = 7/2, ε = √15, and σ² = 15/4, we obtain

P(|X − μ| ≥ ε) ≤ σ²/ε², so P(|X − 7/2| ≥ √15) ≤ (15/4)/15 = 1/4.



Moment Generating Functions

Although the moments of most distributions can be determined directly by evaluating the necessary integrals or sums, an alternative procedure sometimes provides considerable simplification. This technique utilizes moment-generating functions.

Definition 37

The moment-generating function of a random variable X, where it exists, is given by

M_X(t) = E[e^{tX}] = Σ_x e^{tx}·f(x)

when X is discrete, and

M_X(t) = E[e^{tX}] = ∫_{−∞}^{∞} e^{tx}·f(x) dx

when X is continuous.

To explain why we refer to this function as a moment-generating function, let us substitute for e^{tx} its Maclaurin series expansion, that is,

e^{tx} = 1 + tx + t²x²/2! + t³x³/3! + ··· + t^r x^r / r! + ··· .

For the discrete case we thus get

M_X(t) = Σ_x [1 + tx + t²x²/2! + ··· + t^r x^r / r! + ···]·f(x)
       = Σ_x f(x) + t Σ_x x·f(x) + (t²/2!) Σ_x x²·f(x) + ··· + (t^r/r!) Σ_x x^r·f(x) + ···
       = 1 + μt + μ′_2 t²/2! + ··· + μ′_r t^r/r! + ···,

since Σ_x f(x) = 1, Σ_x x·f(x) = μ′_1 = μ, Σ_x x²·f(x) = μ′_2, and so on. It can be seen that in the Maclaurin series of the moment-generating function of X, the coefficient of t^r/r! is μ′_r, the r-th moment about the origin. In the continuous case, the argument is the same.


Example 38

Find the moment-generating function of the random variable whose probability density is given by

f(x) = e^{−x} for x > 0, and 0 elsewhere,

and use it to find an expression for μ′_r.

Solution. By definition,

M_X(t) = E(e^{tX}) = ∫_0^{∞} e^{tx} e^{−x} dx = ∫_0^{∞} e^{−x(1−t)} dx
       = lim_{c→∞} ∫_0^c e^{−x(1−t)} dx = −[1/(1 − t)] lim_{c→∞} e^{−x(1−t)} |_0^c
       = −[1/(1 − t)] lim_{c→∞} (e^{−c(1−t)} − e⁰) = 1/(1 − t)   for t < 1.

As is well known, when |t| < 1 the Maclaurin series for this moment-generating function is

M_X(t) = 1/(1 − t) = 1 + t + t² + t³ + ··· + t^r + ···
       = 1 + 1!·(t/1!) + 2!·(t²/2!) + 3!·(t³/3!) + ··· + r!·(t^r/r!) + ···,

and hence μ′_r = r! for r = 0, 1, 2, ... .
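As an illustration (my own, not from the slides), the first two moments can be recovered from M_X(t) = 1/(1 − t) by numerical differentiation at t = 0, anticipating Theorem 39 below.

    M = lambda t: 1 / (1 - t)      # the moment-generating function found above
    h = 1e-3
    m1 = (M(h) - M(-h)) / (2 * h)                # central difference: ~ 1 = 1!
    m2 = (M(h) - 2 * M(0) + M(-h)) / h ** 2      # second difference:  ~ 2 = 2!
    print(m1, m2)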


Theorem 39

d^r M_X(t) / dt^r evaluated at t = 0 equals μ′_r.

This follows from the fact that if a function is expanded as a power series in t, the coefficient of t^r/r! is the r-th derivative of the function with respect to t at t = 0.


Example 40

Given that X has the probability distribution f(x) = (1/8)·C(3, x) for x = 0, 1, 2, and 3, where C(3, x) is the binomial coefficient, find the moment-generating function of this random variable and use it to determine μ′_1 and μ′_2.

Solution. Since

M_X(t) = E(e^{tX}) = Σ_{x=0}^{3} e^{tx}·(1/8)·C(3, x) = (1/8)(1 + 3e^t + 3e^{2t} + e^{3t}) = (1/8)(1 + e^t)³,

then, by Theorem 39,

μ′_1 = M′_X(t) |_{t=0} = (3/8)(1 + e^t)² e^t |_{t=0} = 3/2

and

μ′_2 = M″_X(t) |_{t=0} = [(3/4)(1 + e^t) e^{2t} + (3/8)(1 + e^t)² e^t] |_{t=0} = 3.


Often the work involved in using moment-generating functions can be simplified by making use of the following theorem.

Theorem 41

If a and b are constants, then

(a) M_{X+a}(t) = E[e^{(X+a)t}] = e^{at} M_X(t);

(b) M_{bX}(t) = E[e^{bXt}] = M_X(bt);

(c) M_{(X+a)/b}(t) = E[e^{((X+a)/b)t}] = e^{(a/b)t} M_X(t/b).

Proof.

The proof is left as an exercise.



Product Moments

Definition 42

The r-th and s-th product moment about the origin of the random variables X and Y, denoted by μ′_{r,s}, is the expected value of X^r Y^s; symbolically,

μ′_{r,s} = E(X^r Y^s) = Σ_x Σ_y x^r y^s f(x, y)

for r = 0, 1, 2, ... and s = 0, 1, 2, ... when X and Y are discrete.

The double summation extends over the entire joint range of the two random variables. Note that μ′_{1,0} = E(X), which we denote here by μ_X, and that μ′_{0,1} = E(Y), which we denote here by μ_Y.


Definition 43

The r-th and s-th product moment about the means of the random variables X and Y, denoted by μ_{r,s}, is the expected value of (X − μ_X)^r (Y − μ_Y)^s; symbolically,

μ_{r,s} = E[(X − μ_X)^r (Y − μ_Y)^s] = Σ_x Σ_y (x − μ_X)^r (y − μ_Y)^s f(x, y)

for r = 0, 1, 2, ... and s = 0, 1, 2, ... when X and Y are discrete.

In statistics, μ_{1,1} is of special importance because it is indicative of the relationship, if any, between the values of X and Y; thus it is given a special symbol and a special name.


Definition 44

μ_{1,1} is called the covariance of X and Y, and it is denoted by σ_XY, cov(X, Y), or C(X, Y).

Observe that if there is a high probability that large values of X will go with large values of Y, and small values of X with small values of Y, the covariance will be positive; if there is a high probability that large values of X will go with small values of Y, and vice versa, the covariance will be negative. (The accompanying sketch, not reproduced here, marks the quadrants of the (x, y)-plane around the point (μ_X, μ_Y) where the product (x − μ_X)(y − μ_Y) is positive or negative.) It is in this sense that the covariance measures the relationship, or association, between the values of X and Y.


Theorem 45

σ_XY = μ′_{1,1} − μ_X μ_Y.

Proof.

Using the various theorems about expected values, we can write

σ_XY = E[(X − μ_X)(Y − μ_Y)]
     = E[XY − Xμ_Y − Yμ_X + μ_X μ_Y]
     = E(XY) − μ_Y E(X) − μ_X E(Y) + μ_X μ_Y
     = E(XY) − μ_Y μ_X − μ_X μ_Y + μ_X μ_Y
     = μ′_{1,1} − μ_X μ_Y.


Example 46

We know that the joint and marginal probabilities of X and Y, the numbers of aspirin and sedative caplets among two caplets drawn at random from a bottle containing three aspirin, two sedative, and four laxative caplets, were recorded as follows (empty cells have probability 0):

y \ x    0       1       2       h(y)
0        6/36    12/36   3/36    21/36
1        8/36    6/36            14/36
2        1/36                    1/36
g(x)     15/36   18/36   3/36    1

Find the covariance of X and Y.

Solution. Referring to the joint probabilities in the table, we get

μ′_{1,1} = E(XY) = Σ_{x=0}^{2} Σ_{y=0}^{2} x·y·f(x, y)
         = 0·0·(6/36) + 1·0·(12/36) + 2·0·(3/36) + 0·1·(8/36) + 1·1·(6/36) + 0·2·(1/36) = 6/36,

and using the marginal probabilities, we get

μ_X = E(X) = Σ_{x=0}^{2} x·g(x) = 0·(15/36) + 1·(18/36) + 2·(3/36) = 24/36

and

μ_Y = E(Y) = Σ_{y=0}^{2} y·h(y) = 0·(21/36) + 1·(14/36) + 2·(1/36) = 16/36.


It follows that

σ_XY = μ′_{1,1} − μ_X μ_Y = 6/36 − (24/36)·(16/36) = −7/54.

The negative result suggests that the more aspirin caplets we get, the fewer sedative caplets we will get, and vice versa, and this, of course, makes sense.
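The exact value −7/54 can be recomputed from the nonzero cells of the joint table (my own sketch; Python's Fraction keeps the arithmetic exact).

    from fractions import Fraction as F

    joint = {(0, 0): F(6, 36),  (1, 0): F(12, 36), (2, 0): F(3, 36),
             (0, 1): F(8, 36),  (1, 1): F(6, 36),
             (0, 2): F(1, 36)}

    E_XY = sum(x * y * p for (x, y), p in joint.items())
    E_X  = sum(x * p for (x, y), p in joint.items())
    E_Y  = sum(y * p for (x, y), p in joint.items())
    print(E_XY - E_X * E_Y)    # -7/54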


Example 47

A quarter is bent so that the probabilities of heads and tails are 0.4 and 0.6, respectively. If it is tossed twice, what is the covariance of Z, the number of heads obtained on the first toss, and W, the total number of heads obtained in the two tosses of the coin?

Solution. Writing f(z, w) for the joint probabilities, we have

f(0, 0) = P(T, T) = 0.6·0.6 = 0.36,   f(1, 1) = P(H, T) = 0.4·0.6 = 0.24,
f(0, 1) = P(T, H) = 0.6·0.4 = 0.24,   f(1, 2) = P(H, H) = 0.4·0.4 = 0.16.

In tabular form (empty cells have probability 0):

z \ w    0      1      2      g(z)
0        0.36   0.24          0.60
1               0.24   0.16   0.40
h(w)     0.36   0.48   0.16   1

σ_ZW = μ′_{1,1} − μ_Z μ_W = E(ZW) − E(Z)·E(W).

μ′_{1,1} = E(ZW) = Σ_{z=0}^{1} Σ_{w=0}^{2} z·w·f(z, w) = 0·0·0.36 + 0·1·0.24 + 1·1·0.24 + 1·2·0.16 = 0.56,

μ_Z = E(Z) = Σ_{z=0}^{1} z·g(z) = 0·0.60 + 1·0.40 = 0.40,

μ_W = E(W) = Σ_{w=0}^{2} w·h(w) = 0·0.36 + 1·0.48 + 2·0.16 = 0.80,

so that

σ_ZW = 0.56 − 0.40·0.80 = 0.24.
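A simulation of the two tosses (my own illustration) gives a sample covariance close to this value.

    import random

    n = 200_000
    zs, ws = [], []
    for _ in range(n):
        first  = random.random() < 0.4      # heads with probability 0.4
        second = random.random() < 0.4
        zs.append(int(first))
        ws.append(int(first) + int(second))

    mz = sum(zs) / n
    mw = sum(ws) / n
    cov = sum((z - mz) * (w - mw) for z, w in zip(zs, ws)) / n
    print(cov)                              # close to 0.24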


As far as the relation between X and Y is concerned, observe that if X and Y are independent, their covariance is zero; symbolically:

Theorem 48

If X and Y are independent, then σ_XY = 0.

Proof.

If X and Y are independent, then we know that E(XY) = E(X)·E(Y). Since

σ_XY = μ′_{1,1} − μ_X μ_Y = E(XY) − E(X)·E(Y),

it follows that

σ_XY = E(X)·E(Y) − E(X)·E(Y) = 0.


It is of interest to note that the independence of two random variables implies a zero covariance, but a zero covariance does not necessarily imply their independence. This is illustrated by the following example.

Example 49

If the joint distribution of X and Y is given by

y \ x    −1      0      1     h(y)
 −1      1/6    2/6    1/6    4/6
  0      0      0      0      0
  1      1/6    0      1/6    2/6
g(x)     2/6    2/6    2/6     1

show that their covariance is zero even though the two random variables are not independent.


Solution. Using the probabilities shown in the margins, we get

\[
\mu_X = \sum_{x=-1}^{1} x\, g(x) = (-1)\cdot\frac{2}{6} + 0\cdot\frac{2}{6} + 1\cdot\frac{2}{6} = 0,
\]
\[
\mu_Y = \sum_{y=-1}^{1} y\, h(y) = (-1)\cdot\frac{4}{6} + 0\cdot 0 + 1\cdot\frac{2}{6} = -\frac{2}{6},
\]
\[
\mu_{1,1}' = \sum_{x=-1}^{1}\sum_{y=-1}^{1} xy\, f(x,y)
= (-1)(-1)\cdot\frac{1}{6} + 1\cdot(-1)\cdot\frac{1}{6} + (-1)\cdot 1\cdot\frac{1}{6} + 1\cdot 1\cdot\frac{1}{6} = 0.
\]

Thus, \(\sigma_{XY} = 0 - 0\cdot\left(-\frac{2}{6}\right) = 0\): the covariance is zero, but the two random variables are not independent. For instance, \(f(x,y) \neq g(x)h(y)\) for x = −1 and y = −1, since f(−1,−1) = 1/6 while g(−1)h(−1) = (2/6)(4/6) = 8/36.
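The same check can be carried out mechanically; the sketch below (not part of the lecture) computes the covariance from the table and then compares one joint probability with the product of its marginals.

from fractions import Fraction as F

vals = (-1, 0, 1)
f = {(x, y): F(0) for x in vals for y in vals}            # start with all zeros
f.update({(-1, -1): F(1, 6), (0, -1): F(2, 6), (1, -1): F(1, 6),
          (-1, 1): F(1, 6), (1, 1): F(1, 6)})

g = {x: sum(f[(x, y)] for y in vals) for x in vals}       # marginal of X
h = {y: sum(f[(x, y)] for x in vals) for y in vals}       # marginal of Y

cov = (sum(x * y * p for (x, y), p in f.items())
       - sum(x * p for x, p in g.items()) * sum(y * p for y, p in h.items()))
print(cov)                                                # 0: zero covariance
print(f[(-1, -1)], g[-1] * h[-1])                         # 1/6 vs 2/9: not independent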


Theorem 50

\[
\mathrm{Var}(X \pm Y) = \mathrm{Var}(X) + \mathrm{Var}(Y) \pm 2\,\mathrm{Cov}(X,Y),
\]
or
\[
\sigma_{X\pm Y}^2 = \sigma_X^2 + \sigma_Y^2 \pm 2\sigma_{XY}.
\]

Proof.

\begin{align*}
\mathrm{Var}(X \pm Y) &= E\left[\big((X \pm Y) - (\mu_X \pm \mu_Y)\big)^2\right] \\
&= E\left[(X \pm Y)^2 - 2(X \pm Y)(\mu_X \pm \mu_Y) + (\mu_X \pm \mu_Y)^2\right] \\
&= E\left[X^2 \pm 2XY + Y^2 - 2X\mu_X \mp 2X\mu_Y \mp 2Y\mu_X - 2Y\mu_Y + \mu_X^2 \pm 2\mu_X\mu_Y + \mu_Y^2\right] \\
&= E\left[(X - \mu_X)^2 + (Y - \mu_Y)^2 \pm 2XY \mp 2X\mu_Y \mp 2Y\mu_X \pm 2\mu_X\mu_Y\right] \\
&= E\left[(X - \mu_X)^2\right] + E\left[(Y - \mu_Y)^2\right] \pm 2E(XY) \mp 2\mu_Y E(X) \mp 2\mu_X E(Y) \pm 2\mu_X\mu_Y \\
&= \mathrm{Var}(X) + \mathrm{Var}(Y) \pm 2\big(E(XY) - \mu_X\mu_Y\big) \\
&= \mathrm{Var}(X) + \mathrm{Var}(Y) \pm 2\,\mathrm{Cov}(X,Y).
\end{align*}
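To see the theorem at work numerically, here is a short sketch (an illustration only, not part of the proof) that reuses the joint table of the aspirin/sedative example as test data and compares Var(X + Y), computed directly, with Var(X) + Var(Y) + 2Cov(X, Y).

from fractions import Fraction as F

# joint table of the aspirin/sedative example, used here only as test data
f = {(0, 0): F(6, 36), (1, 0): F(12, 36), (2, 0): F(3, 36),
     (0, 1): F(8, 36), (1, 1): F(6, 36), (0, 2): F(1, 36)}

def E(u):
    """Expectation of u(X, Y) under the joint distribution f."""
    return sum(u(x, y) * p for (x, y), p in f.items())

var_X   = E(lambda x, y: x * x) - E(lambda x, y: x) ** 2
var_Y   = E(lambda x, y: y * y) - E(lambda x, y: y) ** 2
cov_XY  = E(lambda x, y: x * y) - E(lambda x, y: x) * E(lambda x, y: y)
var_sum = E(lambda x, y: (x + y) ** 2) - E(lambda x, y: x + y) ** 2

print(var_sum == var_X + var_Y + 2 * cov_XY)   # True: both sides equal 35/81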




Conditional Expectation

Definition 51

If X is a discrete random variable and f(x|y) is the value of the conditional probability distribution of X given Y = y at x, the conditional expectation of u(X) given Y = y is

\[
E[u(X)\,|\,y] = \sum_{x} u(x)\, f(x\,|\,y).
\]

Similar expressions based on the conditional probability distribution of Y given X = x define the conditional expectation of v(Y) given X = x.
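A minimal sketch of the definition in code, assuming a joint distribution stored as a dictionary f[(x, y)] of probabilities; the helper name cond_expect is hypothetical and not from the lecture.

def cond_expect(f, u, y):
    """E[u(X) | Y = y] for a joint distribution given as a dict f[(x, y)] -> probability."""
    h_y = sum(p for (_, yy), p in f.items() if yy == y)              # marginal h(y)
    return sum(u(x) * p for (x, yy), p in f.items() if yy == y) / h_y

For instance, cond_expect(f, lambda x: x, 1) gives the conditional mean of Definition 52, and cond_expect(f, lambda x: x**2, y) the second conditional moment used in Definition 53 below.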


Definition 52

If we let u(X) = X in Definition 51, we obtain the conditional mean of the random variable X given Y = y, which we denote by

\[
\mu_{X|y} = E(X\,|\,y) = \sum_{x} x\, f(x\,|\,y).
\]

Definition 53

Correspondingly, the conditional variance of X given Y = y is

\[
\sigma_{X|y}^2 = E\left[(X - \mu_{X|y})^2\,\middle|\,y\right] = E(X^2\,|\,y) - \mu_{X|y}^2,
\]
where E(X^2 | y) is given by Definition 51 with u(X) = X^2.


Example 54

With reference to Example 46, find the conditional mean of X given Y = 1.

Solution. Making use of the results obtained before, that is, f(0|1) = 4/7, f(1|1) = 3/7, and f(2|1) = 0, we get
\[
E(X\,|\,1) = \sum_{x=0}^{2} x\, f(x\,|\,1) = 0\cdot\frac{4}{7} + 1\cdot\frac{3}{7} + 2\cdot 0 = \frac{3}{7}.
\]
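As a sketch of the same computation in Python, using only the y = 1 row of the joint table of Example 46, exact fractions reproduce the value 3/7.

from fractions import Fraction as F

row = {0: F(8, 36), 1: F(6, 36), 2: F(0)}        # f(x, 1) from the table of Example 46
h1 = sum(row.values())                           # h(1) = 14/36
print(sum(x * p for x, p in row.items()) / h1)   # 3/7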


Example 55

Suppose that we have the following joint probability distribution

y \ x     1       2       3       4      h(y)
  1      0.01    0.07    0.09    0.03    0.20
  2      0.20    0       0.05    0.25    0.50
  3      0.09    0.03    0.06    0.12    0.30
g(x)     0.30    0.10    0.20    0.40     1

(a) Find the conditional expectation of u(X) = X^2 given Y = 2.

(b) Find the conditional mean of X given Y = 2.

(c) Find the conditional variance of X given Y = 2.


Solution. Since f(x|y) = f(x, y)/h(y), we have f(x|2) = f(x, 2)/h(2) = f(x, 2)/0.5 = 2f(x, 2), and therefore
\[
f(x\,|\,2) =
\begin{cases}
2f(1,2) = 0.40, & x = 1,\\
2f(2,2) = 0.00, & x = 2,\\
2f(3,2) = 0.10, & x = 3,\\
2f(4,2) = 0.50, & x = 4.
\end{cases}
\]

(a) The conditional expectation of u(X) = X^2 given Y = 2 is
\[
E[X^2\,|\,Y = 2] = \sum_{x=1}^{4} x^2 f(x\,|\,2) = 1^2 f(1|2) + 2^2 f(2|2) + 3^2 f(3|2) + 4^2 f(4|2)
= 1(0.40) + 4(0.00) + 9(0.10) + 16(0.50) = 9.30.
\]


(b) The conditional mean of X given Y = 2 is

\[
E[X\,|\,Y = 2] = \mu_{X|2} = \sum_{x=1}^{4} x\, f(x\,|\,2) = 1(0.40) + 2(0.00) + 3(0.10) + 4(0.50) = 2.70.
\]


(c) The conditional variance of X given Y = 2 is
\[
\sigma_{X|2}^2 = E\left[(X - \mu_{X|2})^2\,\middle|\,2\right] = \sum_{x=1}^{4} (x - 2.7)^2 f(x\,|\,2)
= (1-2.7)^2 f(1|2) + (2-2.7)^2 f(2|2) + (3-2.7)^2 f(3|2) + (4-2.7)^2 f(4|2)
= 2.89(0.40) + 0.49(0.00) + 0.09(0.10) + 1.69(0.50) = 2.01,
\]
or simply
\[
\sigma_{X|2}^2 = E(X^2\,|\,Y = 2) - \mu_{X|2}^2 = 9.30 - (2.70)^2 = 2.01.
\]
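A short sketch confirming parts (a)–(c), assuming only the y = 2 row of the table above; exact fractions give 93/10, 27/10 and 201/100, that is, 9.30, 2.70 and 2.01.

from fractions import Fraction as F

row = {1: F(20, 100), 2: F(0), 3: F(5, 100), 4: F(25, 100)}   # f(x, 2)
h2 = sum(row.values())                                        # h(2) = 1/2

cond = {x: p / h2 for x, p in row.items()}                    # f(x | 2)
m1 = sum(x * p for x, p in cond.items())                      # E(X | 2)   -> 27/10
m2 = sum(x * x * p for x, p in cond.items())                  # E(X^2 | 2) -> 93/10
print(m2, m1, m2 - m1 ** 2)                                   # 93/10 27/10 201/100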


Thank You!!!