Expectation for multivariate distributions

Definition

Let X1, X2, …, Xn denote n jointly distributed random variables with joint density function f(x1, x2, …, xn); then

$$E[g(X_1,\dots,X_n)] = \int_{-\infty}^{\infty}\cdots\int_{-\infty}^{\infty} g(x_1,\dots,x_n)\, f(x_1,\dots,x_n)\, dx_1\cdots dx_n$$

Example

Let X, Y, Z denote 3 jointly distributed random variables with joint density function

$$f(x,y,z) = \begin{cases} \dfrac{12}{7}\left(x^2 + yz\right) & 0 \le x \le 1,\ 0 \le y \le 1,\ 0 \le z \le 1\\[1ex] 0 & \text{otherwise}\end{cases}$$

Determine E[XYZ].

Solution:

$$E[XYZ] = \int_0^1\!\!\int_0^1\!\!\int_0^1 xyz\cdot\tfrac{12}{7}\left(x^2+yz\right)dx\,dy\,dz = \tfrac{12}{7}\int_0^1\!\!\int_0^1\!\!\int_0^1 \left(x^3yz + xy^2z^2\right)dx\,dy\,dz$$

$$= \tfrac{12}{7}\int_0^1\!\!\int_0^1\left[\tfrac{x^4}{4}yz + \tfrac{x^2}{2}y^2z^2\right]_{x=0}^{x=1}dy\,dz = \tfrac{3}{7}\int_0^1\!\!\int_0^1\left(yz + 2y^2z^2\right)dy\,dz$$

$$= \tfrac{3}{7}\int_0^1\left[\tfrac{y^2}{2}z + \tfrac{2y^3}{3}z^2\right]_{y=0}^{y=1}dz = \tfrac{3}{7}\int_0^1\left(\tfrac{z}{2} + \tfrac{2z^2}{3}\right)dz$$

$$= \tfrac{3}{7}\left[\tfrac{z^2}{4} + \tfrac{2z^3}{9}\right]_0^1 = \tfrac{3}{7}\left(\tfrac{1}{4} + \tfrac{2}{9}\right) = \tfrac{3}{7}\cdot\tfrac{17}{36} = \tfrac{17}{84}$$


Some Rules for Expectation

1. $$E[X_i] = \int_{-\infty}^{\infty}\cdots\int_{-\infty}^{\infty} x_i\, f(x_1,\dots,x_n)\, dx_1\cdots dx_n = \int_{-\infty}^{\infty} x_i\, f_i(x_i)\, dx_i$$

Thus you can calculate E[Xi] either from the joint distribution of X1, …, Xn or from the marginal distribution of Xi.

Proof:

$$\int\cdots\int x_i\, f(x_1,\dots,x_n)\, dx_1\cdots dx_n = \int x_i\left[\int\cdots\int f(x_1,\dots,x_n)\, dx_1\cdots dx_{i-1}\,dx_{i+1}\cdots dx_n\right]dx_i = \int x_i\, f_i(x_i)\, dx_i$$
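The discrete analogue of Rule 1 is easy to check directly. A small sketch with a hypothetical joint pmf (the numbers are illustrative, not from the slides):

```python
# Rule 1 on a toy discrete joint distribution: E[X1] computed from the
# joint pmf equals E[X1] computed from the marginal pmf of X1.
joint = {  # hypothetical p(x1, x2)
    (0, 0): 0.1, (0, 1): 0.2,
    (1, 0): 0.3, (1, 1): 0.4,
}

# From the joint distribution: sum of x1 * p(x1, x2) over all pairs.
e_joint = sum(x1 * p for (x1, x2), p in joint.items())

# From the marginal: first sum out x2, then take the expectation.
marginal = {}
for (x1, x2), p in joint.items():
    marginal[x1] = marginal.get(x1, 0.0) + p
e_marginal = sum(x1 * p for x1, p in marginal.items())

print(e_joint, e_marginal)   # both equal 0.7
```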

2. $$E[a_1X_1 + \cdots + a_nX_n] = a_1E[X_1] + \cdots + a_nE[X_n] \qquad\text{(the linearity property)}$$

Proof:

$$E[a_1X_1+\cdots+a_nX_n] = \int\cdots\int (a_1x_1+\cdots+a_nx_n)\, f(x_1,\dots,x_n)\, dx_1\cdots dx_n$$
$$= a_1\int\cdots\int x_1\, f(x_1,\dots,x_n)\, dx_1\cdots dx_n + \cdots + a_n\int\cdots\int x_n\, f(x_1,\dots,x_n)\, dx_1\cdots dx_n = a_1E[X_1]+\cdots+a_nE[X_n]$$

3. (The multiplicative property) Suppose X1, …, Xq are independent of Xq+1, …, Xk; then

$$E[g(X_1,\dots,X_q)\,h(X_{q+1},\dots,X_k)] = E[g(X_1,\dots,X_q)]\;E[h(X_{q+1},\dots,X_k)]$$

In the simple case when k = 2: E[XY] = E[X] E[Y] if X and Y are independent.

Proof: Independence means the joint density factors, $f(x_1,\dots,x_k) = f_1(x_1,\dots,x_q)\,f_2(x_{q+1},\dots,x_k)$. Hence

$$E[g\,h] = \int\cdots\int g(x_1,\dots,x_q)\,h(x_{q+1},\dots,x_k)\, f_1(x_1,\dots,x_q)\, f_2(x_{q+1},\dots,x_k)\, dx_1\cdots dx_k$$
$$= \int\cdots\int h(x_{q+1},\dots,x_k)\, f_2(x_{q+1},\dots,x_k)\left[\int\cdots\int g(x_1,\dots,x_q)\, f_1(x_1,\dots,x_q)\, dx_1\cdots dx_q\right]dx_{q+1}\cdots dx_k$$
$$= E[g(X_1,\dots,X_q)]\int\cdots\int h(x_{q+1},\dots,x_k)\, f_2(x_{q+1},\dots,x_k)\, dx_{q+1}\cdots dx_k = E[g(X_1,\dots,X_q)]\;E[h(X_{q+1},\dots,X_k)]$$
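For discrete variables the multiplicative property can be verified by brute force. A sketch with hypothetical marginal pmfs, building the joint pmf as the product of the marginals (which is exactly what independence means):

```python
# Multiplicative property check: for independent discrete X and Y,
# p(x, y) = px(x) * py(y), and E[XY] = E[X] E[Y].
px = {0: 0.5, 1: 0.3, 2: 0.2}   # hypothetical marginal pmf of X
py = {1: 0.6, 3: 0.4}           # hypothetical marginal pmf of Y

e_x = sum(x * p for x, p in px.items())
e_y = sum(y * p for y, p in py.items())

# Independence: joint pmf is the product of the marginals.
e_xy = sum(x * y * px[x] * py[y] for x in px for y in py)

print(e_xy, e_x * e_y)   # both equal 1.26
```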

Some Rules for Variance

$$\operatorname{Var}(X) = \sigma_X^2 = E\!\left[(X-\mu_X)^2\right] = E[X^2] - \mu_X^2$$

1. $$\operatorname{Var}(X \pm Y) = \operatorname{Var}(X) + \operatorname{Var}(Y) \pm 2\operatorname{Cov}(X,Y)$$

where $\operatorname{Cov}(X,Y) = E[(X-\mu_X)(Y-\mu_Y)]$.

Proof: $\operatorname{Var}(X+Y) = E[(X+Y-\mu_{X+Y})^2]$, where $\mu_{X+Y} = E[X+Y] = \mu_X + \mu_Y$. Thus

$$\operatorname{Var}(X+Y) = E\!\left[\left((X-\mu_X)+(Y-\mu_Y)\right)^2\right] = E\!\left[(X-\mu_X)^2 + 2(X-\mu_X)(Y-\mu_Y) + (Y-\mu_Y)^2\right]$$
$$= \operatorname{Var}(X) + 2\operatorname{Cov}(X,Y) + \operatorname{Var}(Y)$$

Note: If X and Y are independent, then

$$\operatorname{Cov}(X,Y) = E[(X-\mu_X)(Y-\mu_Y)] = E[X-\mu_X]\,E[Y-\mu_Y] = 0$$

and $\operatorname{Var}(X \pm Y) = \operatorname{Var}(X) + \operatorname{Var}(Y)$.
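Rule 1 is an algebraic identity, so it holds exactly for the empirical (population-style) moments of any paired data set, not just in expectation. A pure-Python sketch on seeded synthetic data:

```python
# Var(X+Y) = Var(X) + Var(Y) + 2 Cov(X,Y), checked exactly on the
# empirical moments of a paired data set (population denominators).
import random

random.seed(0)
n = 1000
x = [random.gauss(0, 1) for _ in range(n)]
y = [0.5 * xi + random.gauss(0, 1) for xi in x]   # correlated with x

def mean(v):
    return sum(v) / len(v)

def var(v):
    m = mean(v)
    return sum((vi - m) ** 2 for vi in v) / len(v)

def cov(u, v):
    mu, mv = mean(u), mean(v)
    return sum((ui - mu) * (vi - mv) for ui, vi in zip(u, v)) / len(u)

s = [xi + yi for xi, yi in zip(x, y)]
lhs = var(s)
rhs = var(x) + var(y) + 2 * cov(x, y)
print(lhs, rhs)   # identical up to floating-point rounding
```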

Definition: For any two random variables X and Y define the correlation coefficient $\rho_{XY}$ to be:

$$\rho_{XY} = \frac{\operatorname{Cov}(X,Y)}{\sqrt{\operatorname{Var}(X)\operatorname{Var}(Y)}} = \frac{\sigma_{XY}}{\sigma_X\,\sigma_Y}$$

Thus $\operatorname{Cov}(X,Y) = \rho_{XY}\,\sigma_X\,\sigma_Y$, and

$$\operatorname{Var}(X \pm Y) = \sigma_X^2 + \sigma_Y^2 \pm 2\rho_{XY}\,\sigma_X\,\sigma_Y \qquad (= \sigma_X^2 + \sigma_Y^2 \text{ if } X \text{ and } Y \text{ are independent})$$

Properties of the correlation coefficient $\rho_{XY}$

If X and Y are independent then $\rho_{XY} = 0$. (Reason: $\operatorname{Cov}(X,Y) = 0$.)

The converse is not necessarily true: $\rho_{XY} = 0$ does not imply that X and Y are independent.

More properties of the correlation coefficient $\rho_{XY}$

$-1 \le \rho_{XY} \le 1$, and $\rho_{XY} = \pm 1$ if and only if there exist a and b such that $P[Y = bX + a] = 1$, where $\rho_{XY} = +1$ if b > 0 and $\rho_{XY} = -1$ if b < 0.

Proof: Let $U = X - \mu_X$ and $V = Y - \mu_Y$, and for any b let

$$g(b) = E\!\left[(V - bU)^2\right] \ge 0 \quad\text{for all } b.$$

Consider choosing b to minimize g(b):

$$g(b) = E\!\left[V^2 - 2bVU + b^2U^2\right] = E[V^2] - 2bE[VU] + b^2E[U^2]$$

$$g'(b) = -2E[VU] + 2bE[U^2] = 0 \quad\Longrightarrow\quad b_{\min} = \frac{E[VU]}{E[U^2]}$$

Since g(b) ≥ 0 for all b, g(b_min) ≥ 0:

$$g(b_{\min}) = E[V^2] - 2b_{\min}E[VU] + b_{\min}^2E[U^2] = E[V^2] - 2\frac{E[VU]^2}{E[U^2]} + \frac{E[VU]^2}{E[U^2]} = E[V^2] - \frac{E[VU]^2}{E[U^2]} \ge 0$$

Hence

$$\frac{E[VU]^2}{E[U^2]\,E[V^2]} \le 1, \quad\text{or}\quad \rho_{XY}^2 = \frac{E[(X-\mu_X)(Y-\mu_Y)]^2}{E[(X-\mu_X)^2]\,E[(Y-\mu_Y)^2]} \le 1$$

Note: $\rho_{XY}^2 = 1$ if and only if $g(b_{\min}) = E[(V - b_{\min}U)^2] = 0$. This will be true if $P[V = b_{\min}U] = 1$, i.e. $P[Y - \mu_Y = b_{\min}(X - \mu_X)] = 1$, i.e. $P[Y = b_{\min}X + a] = 1$ where $a = \mu_Y - b_{\min}\mu_X$.

Summary: $-1 \le \rho_{XY} \le 1$, and $\rho_{XY} = \pm 1$ if and only if there exist a and b such that $P[Y = bX + a] = 1$, where

$$b = b_{\min} = \frac{E[(X-\mu_X)(Y-\mu_Y)]}{E[(X-\mu_X)^2]} = \frac{\operatorname{Cov}(X,Y)}{\operatorname{Var}(X)} = \frac{\sigma_{XY}}{\sigma_X^2} = \rho_{XY}\,\frac{\sigma_Y}{\sigma_X} \quad\text{and}\quad a = \mu_Y - b_{\min}\mu_X.$$

2. $$\operatorname{Var}(aX + bY) = a^2\operatorname{Var}(X) + b^2\operatorname{Var}(Y) + 2ab\operatorname{Cov}(X,Y)$$

Proof: $\operatorname{Var}(aX+bY) = E[(aX+bY-\mu_{aX+bY})^2]$, with $\mu_{aX+bY} = E[aX+bY] = a\mu_X + b\mu_Y$. Thus

$$\operatorname{Var}(aX+bY) = E\!\left[\left(a(X-\mu_X) + b(Y-\mu_Y)\right)^2\right] = E\!\left[a^2(X-\mu_X)^2 + 2ab(X-\mu_X)(Y-\mu_Y) + b^2(Y-\mu_Y)^2\right]$$
$$= a^2\operatorname{Var}(X) + 2ab\operatorname{Cov}(X,Y) + b^2\operatorname{Var}(Y)$$

3. $$\operatorname{Var}(a_1X_1 + \cdots + a_nX_n) = a_1^2\operatorname{Var}(X_1) + \cdots + a_n^2\operatorname{Var}(X_n)$$
$$\quad + 2a_1a_2\operatorname{Cov}(X_1,X_2) + \cdots + 2a_1a_n\operatorname{Cov}(X_1,X_n) + 2a_2a_3\operatorname{Cov}(X_2,X_3) + \cdots + 2a_{n-1}a_n\operatorname{Cov}(X_{n-1},X_n)$$

i.e.

$$\operatorname{Var}\!\Big(\sum_{i=1}^n a_iX_i\Big) = \sum_{i=1}^n a_i^2\operatorname{Var}(X_i) + 2\sum_{i<j} a_ia_j\operatorname{Cov}(X_i,X_j) = \sum_{i=1}^n a_i^2\operatorname{Var}(X_i) \ \text{ if } X_1,\dots,X_n \text{ are mutually independent}$$
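Rule 3 can also be checked exactly on data, since it is again an algebraic identity for empirical moments. A sketch with three seeded data columns and illustrative weights (the double sum below runs over all ordered pairs, so the $i \ne j$ terms automatically supply the $2\operatorname{Cov}$ factors):

```python
# Rule 3 check: empirical Var(a1 X1 + a2 X2 + a3 X3) equals
# sum_i ai^2 Var(Xi) + 2 sum_{i<j} ai aj Cov(Xi, Xj).
import random

random.seed(1)
n = 500
x1 = [random.gauss(0, 1) for _ in range(n)]
x2 = [0.3 * v + random.gauss(0, 1) for v in x1]   # correlated with x1
x3 = [random.gauss(0, 2) for _ in range(n)]
cols = [x1, x2, x3]
a = [2.0, -1.0, 0.5]                              # illustrative weights

def mean(v):
    return sum(v) / len(v)

def cov(u, v):   # population-style covariance; cov(v, v) is the variance
    mu, mv = mean(u), mean(v)
    return sum((ui - mu) * (vi - mv) for ui, vi in zip(u, v)) / len(u)

s = [sum(ai * col[t] for ai, col in zip(a, cols)) for t in range(n)]
lhs = cov(s, s)                                   # Var of the weighted sum
rhs = sum(a[i] * a[j] * cov(cols[i], cols[j])
          for i in range(3) for j in range(3))    # i != j terms give 2*Cov
print(lhs, rhs)
```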

Some Applications (rules of expectation and variance)

Let X1, …, Xn be n mutually independent random variables each having mean μ and standard deviation σ (variance σ²), and let

$$\bar{X} = \frac{1}{n}\sum_{i=1}^n X_i = \frac{1}{n}X_1 + \cdots + \frac{1}{n}X_n.$$

Then

$$\mu_{\bar{X}} = E[\bar{X}] = \frac{1}{n}E[X_1] + \cdots + \frac{1}{n}E[X_n] = \frac{1}{n}\mu + \cdots + \frac{1}{n}\mu = \mu.$$

Also

$$\sigma_{\bar{X}}^2 = \operatorname{Var}(\bar{X}) = \frac{1}{n^2}\operatorname{Var}(X_1) + \cdots + \frac{1}{n^2}\operatorname{Var}(X_n) = \frac{1}{n^2}\sigma^2 + \cdots + \frac{1}{n^2}\sigma^2 = \frac{n\sigma^2}{n^2} = \frac{\sigma^2}{n},$$

and thus $\sigma_{\bar{X}} = \sigma/\sqrt{n}$.

Hence the distribution of $\bar{X}$ is centered at μ and becomes more and more compact about μ as n increases.
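A quick simulation sketch of $\operatorname{Var}(\bar{X}) = \sigma^2/n$ (the parameter values are illustrative): average n = 25 seeded draws, repeat many times, and compare the spread of those averages with σ²/n.

```python
# Simulation check of E[Xbar] = mu and Var(Xbar) = sigma^2 / n.
import random

random.seed(2)
n, reps, mu, sigma = 25, 20000, 10.0, 3.0
xbars = [sum(random.gauss(mu, sigma) for _ in range(n)) / n
         for _ in range(reps)]

m = sum(xbars) / reps                       # should sit near mu
v = sum((xb - m) ** 2 for xb in xbars) / reps
print(m, v, sigma ** 2 / n)                 # v is close to sigma^2/n = 0.36
```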

Tchebychev's Inequality

Let X denote a random variable with mean μ = E(X) and variance Var(X) = E[(X − μ)²] = σ². Then

$$P[|X - \mu| \ge k\sigma] \le \frac{1}{k^2} \qquad\text{or equivalently}\qquad P[\mu - k\sigma < X < \mu + k\sigma] \ge 1 - \frac{1}{k^2}.$$

Note: $\sigma = \sqrt{\operatorname{Var}(X)} = \sqrt{E[(X-\mu)^2]}$ is called the standard deviation of X.

Proof:

$$\operatorname{Var}(X) = \sigma^2 = \int_{-\infty}^{\infty} (x-\mu)^2 f(x)\,dx$$
$$= \int_{-\infty}^{\mu-k\sigma} (x-\mu)^2 f(x)\,dx + \int_{\mu-k\sigma}^{\mu+k\sigma} (x-\mu)^2 f(x)\,dx + \int_{\mu+k\sigma}^{\infty} (x-\mu)^2 f(x)\,dx$$
$$\ge \int_{-\infty}^{\mu-k\sigma} (x-\mu)^2 f(x)\,dx + \int_{\mu+k\sigma}^{\infty} (x-\mu)^2 f(x)\,dx$$
$$\ge \int_{-\infty}^{\mu-k\sigma} k^2\sigma^2 f(x)\,dx + \int_{\mu+k\sigma}^{\infty} k^2\sigma^2 f(x)\,dx \qquad\text{(since } (x-\mu)^2 \ge k^2\sigma^2 \text{ on these ranges)}$$
$$= k^2\sigma^2\left(P[X \le \mu - k\sigma] + P[X \ge \mu + k\sigma]\right) = k^2\sigma^2\, P[|X-\mu| \ge k\sigma].$$

Thus $\sigma^2 \ge k^2\sigma^2\, P[|X-\mu| \ge k\sigma]$, or

$$P[|X-\mu| \ge k\sigma] \le \frac{1}{k^2} \quad\text{and}\quad P[|X-\mu| < k\sigma] \ge 1 - \frac{1}{k^2}.$$
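Since the inequality holds for any distribution, it also holds for the empirical distribution of a data set, using that data's own mean and (population-style) standard deviation. A quick check on a skewed seeded sample:

```python
# Tchebychev check on data: the observed fraction of points with
# |x - mu| >= k*sigma can never exceed 1/k^2.
import random

random.seed(3)
x = [random.expovariate(0.5) for _ in range(10000)]   # a skewed sample

mu = sum(x) / len(x)
sigma = (sum((xi - mu) ** 2 for xi in x) / len(x)) ** 0.5

tails = {}
for k in (1.5, 2.0, 3.0):
    tails[k] = sum(abs(xi - mu) >= k * sigma for xi in x) / len(x)
    print(k, tails[k], 1 / k ** 2)   # observed tail vs. the bound
```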

Tchebychev's inequality is very conservative:

• k = 1: $P[|X-\mu| < \sigma] = P[\mu - \sigma < X < \mu + \sigma] \ge 1 - \frac{1}{1^2} = 0$

• k = 2: $P[|X-\mu| < 2\sigma] = P[\mu - 2\sigma < X < \mu + 2\sigma] \ge 1 - \frac{1}{2^2} = \frac{3}{4}$

• k = 3: $P[|X-\mu| < 3\sigma] = P[\mu - 3\sigma < X < \mu + 3\sigma] \ge 1 - \frac{1}{3^2} = \frac{8}{9}$

The Law of Large Numbers

Let X1, …, Xn be n mutually independent random variables each having mean μ and variance σ², and let

$$\bar{X} = \frac{1}{n}\sum_{i=1}^n X_i.$$

Then for any ε > 0 (no matter how small)

$$P[|\bar{X} - \mu| < \varepsilon] = P[\mu - \varepsilon < \bar{X} < \mu + \varepsilon] \to 1 \ \text{ as } n \to \infty.$$

Proof:

We will use Tchebychev's inequality, which states that for any random variable X,

$$P[\mu_X - k\sigma_X < X < \mu_X + k\sigma_X] \ge 1 - \frac{1}{k^2}.$$

Now apply this to $\bar{X}$, for which $\mu_{\bar{X}} = \mu$ and $\sigma_{\bar{X}} = \sigma/\sqrt{n}$. Choose k so that $k\sigma_{\bar{X}} = \varepsilon$, i.e.

$$k = \frac{\varepsilon}{\sigma_{\bar{X}}} = \frac{\sqrt{n}\,\varepsilon}{\sigma}.$$

Then

$$P[|\bar{X} - \mu| < \varepsilon] = P[\mu - k\sigma_{\bar{X}} < \bar{X} < \mu + k\sigma_{\bar{X}}] \ge 1 - \frac{1}{k^2} = 1 - \frac{\sigma^2}{n\varepsilon^2} \to 1 \ \text{ as } n \to \infty.$$

Thus $P[|\bar{X} - \mu| < \varepsilon] \to 1$ as $n \to \infty$.

A special case: Let X1, …, Xn be n mutually independent random variables each having a Bernoulli distribution with parameter p:

$$X_i = \begin{cases} 1 & \text{if the } i\text{-th repetition is a success } S \text{ (prob } p)\\ 0 & \text{if the } i\text{-th repetition is a failure } F \text{ (prob } q = 1-p)\end{cases}$$

Then $E[X_i] = p$ and

$$\bar{X} = \frac{X_1 + \cdots + X_n}{n} = \hat{p} = \text{the proportion of successes}.$$

Thus the Law of Large Numbers states

$$P[|\hat{p} - p| < \varepsilon] \to 1 \ \text{ as } n \to \infty.$$
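This special case is easy to watch in a simulation. A sketch with a seeded stream of Bernoulli(p) trials (p = 0.3 is illustrative):

```python
# Law of Large Numbers sketch: the proportion of successes in seeded
# Bernoulli(p) trials settles down near p as n grows.
import random

random.seed(4)
p = 0.3
flips = [random.random() < p for _ in range(100_000)]

for n in (100, 10_000, 100_000):
    p_hat = sum(flips[:n]) / n
    print(n, p_hat)
```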

Thus the Law of Large Numbers states that

$$\hat{p} = \text{the proportion of successes}$$

converges to the probability of success p as $n \to \infty$.

Some people misinterpret this to mean that if the proportion of successes is currently lower than p, then the proportion of successes in the future will have to be larger than p to counter this and ensure that the Law of Large Numbers holds true. Of course, if in the infinite future the proportion of successes is p, then this is enough to ensure that the Law of Large Numbers holds true.

Some more applications: rules of expectation and rules of variance

The mean and variance of a Binomial random variable

We have already computed these by other methods:

1. Using the probability function p(x).
2. Using the moment generating function mX(t).

Suppose that we have observed n independent repetitions of a Bernoulli trial. Let X1, …, Xn be n mutually independent random variables each having a Bernoulli distribution with parameter p, defined by

$$X_i = \begin{cases} 1 & \text{if the } i\text{-th repetition is a success } S \text{ (prob } p)\\ 0 & \text{if the } i\text{-th repetition is a failure } F \text{ (prob } q = 1-p)\end{cases}$$

Now X = X1 + ⋯ + Xn has a Binomial distribution with parameters n and p; X is the total number of successes in the n repetitions. Then

$$E[X_i] = 1\cdot p + 0\cdot q = p, \qquad \mu_X = E[X] = E[X_1] + \cdots + E[X_n] = p + \cdots + p = np,$$

and

$$\operatorname{Var}(X_i) = \left(1^2\cdot p + 0^2\cdot q\right) - p^2 = p(1-p) = pq, \qquad \operatorname{Var}(X) = \operatorname{Var}(X_1) + \cdots + \operatorname{Var}(X_n) = pq + \cdots + pq = npq.$$
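The formulas np and npq can be checked exactly against the binomial probability function by summing over the whole support (the parameter values below are illustrative):

```python
# Exact check of E[X] = np and Var(X) = npq for Binomial(n, p),
# using the pmf p(x) = C(n, x) p^x q^(n-x).
from math import comb

n, p = 12, 0.35
q = 1 - p
pmf = [comb(n, x) * p ** x * q ** (n - x) for x in range(n + 1)]

mean = sum(x * pr for x, pr in enumerate(pmf))
var = sum(x * x * pr for x, pr in enumerate(pmf)) - mean ** 2

print(mean, n * p)      # both ~ 4.2
print(var, n * p * q)   # both ~ 2.73
```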

The mean and variance of a Hypergeometric distribution

The hypergeometric distribution arises when we sample without replacement n objects from a population of N = a + b objects. The population is divided into two groups (group A and group B); group A contains a objects while group B contains b objects.

Let X denote the number of objects in the sample of n that come from group A. The probability function of X is:

$$p(x) = \frac{\dbinom{a}{x}\dbinom{b}{n-x}}{\dbinom{a+b}{n}}$$

Then let X1, …, Xn be the n random variables defined by

$$X_i = \begin{cases} 1 & \text{if the } i\text{-th object selected comes from group } A\\ 0 & \text{if the } i\text{-th object selected comes from group } B\end{cases}$$

so that $X = X_1 + \cdots + X_n$. Then

$$P[X_i = 1] = \frac{a}{a+b} \quad\text{and}\quad P[X_i = 0] = \frac{b}{a+b}.$$

Proof: Counting ordered samples of n objects from the a + b, there are $\dfrac{(a+b)!}{(a+b-n)!}$ in all, and $a\,\dfrac{(a+b-1)!}{(a+b-n)!}$ of them have an object from group A in the i-th position. Hence

$$P[X_i = 1] = \frac{a\,\dfrac{(a+b-1)!}{(a+b-n)!}}{\dfrac{(a+b)!}{(a+b-n)!}} = \frac{a}{a+b}.$$

Therefore

$$E[X_i] = 1\cdot P[X_i=1] + 0\cdot P[X_i=0] = \frac{a}{a+b}, \qquad E[X_i^2] = 1^2\cdot P[X_i=1] + 0^2\cdot P[X_i=0] = \frac{a}{a+b},$$

and

$$\operatorname{Var}(X_i) = E[X_i^2] - \left(E[X_i]\right)^2 = \frac{a}{a+b} - \left(\frac{a}{a+b}\right)^2 = \frac{a}{a+b}\left(1 - \frac{a}{a+b}\right) = \frac{a}{a+b}\cdot\frac{b}{a+b} = \frac{ab}{(a+b)^2}.$$

Thus

$$E[X] = E[X_1 + \cdots + X_n] = \sum_{i=1}^n E[X_i] = n\,\frac{a}{a+b}.$$

Also

$$\operatorname{Var}(X_i) = \frac{a}{a+b}\cdot\frac{b}{a+b} = \frac{ab}{(a+b)^2}$$

and

$$\operatorname{Var}(X) = \operatorname{Var}(X_1 + \cdots + X_n) = \sum_{i=1}^n \operatorname{Var}(X_i) + 2\sum_{i<j}\operatorname{Cov}(X_i, X_j).$$

We need to also calculate Cov(Xi, Xj).

Note:

$$\operatorname{Cov}(U,V) = E[(U-\mu_U)(V-\mu_V)] = E[UV - \mu_V U - \mu_U V + \mu_U\mu_V]$$
$$= E[UV] - \mu_V\mu_U - \mu_U\mu_V + \mu_U\mu_V = E[UV] - E[U]E[V].$$

Thus $\operatorname{Cov}(X_i,X_j) = E[X_iX_j] - E[X_i]E[X_j]$, with $E[X_i] = \dfrac{a}{a+b}$.

Note:

$$E[X_iX_j] = 1\cdot P[X_iX_j = 1] + 0\cdot P[X_iX_j = 0] = P[X_i = 1, X_j = 1]$$

and, counting ordered samples as before,

$$P[X_i = 1, X_j = 1] = \frac{a(a-1)\,\dfrac{(a+b-2)!}{(a+b-n)!}}{\dfrac{(a+b)!}{(a+b-n)!}} = \frac{a(a-1)}{(a+b)(a+b-1)}.$$

Thus

$$E[X_iX_j] = \frac{a(a-1)}{(a+b)(a+b-1)}$$

and

$$\operatorname{Cov}(X_i,X_j) = E[X_iX_j] - E[X_i]E[X_j] = \frac{a(a-1)}{(a+b)(a+b-1)} - \left(\frac{a}{a+b}\right)^2$$
$$= \frac{a}{a+b}\left[\frac{a-1}{a+b-1} - \frac{a}{a+b}\right] = \frac{a}{a+b}\cdot\frac{(a-1)(a+b) - a(a+b-1)}{(a+b-1)(a+b)} = \frac{a}{a+b}\cdot\frac{-b}{(a+b)(a+b-1)} = \frac{-ab}{(a+b)^2(a+b-1)}.$$

Thus, with

$$\operatorname{Var}(X_i) = \frac{ab}{(a+b)^2} \quad\text{and}\quad \operatorname{Cov}(X_i,X_j) = \frac{-ab}{(a+b)^2(a+b-1)},$$

$$\operatorname{Var}(X) = \operatorname{Var}(X_1 + \cdots + X_n) = \sum_{i=1}^n \operatorname{Var}(X_i) + 2\sum_{i<j}\operatorname{Cov}(X_i,X_j)$$
$$= n\,\frac{ab}{(a+b)^2} - 2\,\frac{n(n-1)}{2}\,\frac{ab}{(a+b)^2(a+b-1)}.$$

Thus

$$\operatorname{Var}(X) = n\,\frac{ab}{(a+b)^2}\left[1 - \frac{n-1}{a+b-1}\right] = n\,p_A\,p_B\,(1-f)$$

where

$$p_A = \frac{a}{a+b}, \qquad p_B = \frac{b}{a+b}, \qquad\text{and}\qquad f = \frac{n-1}{a+b-1} = \frac{n-1}{N-1}.$$

Thus if X has a hypergeometric distribution with parameters a, b and n then

$$E[X] = n\,\frac{a}{a+b} = n\,p_A, \qquad \operatorname{Var}(X) = n\,p_A\,p_B\,(1-f)$$

where $p_A = \dfrac{a}{a+b}$, $p_B = \dfrac{b}{a+b}$, and $f = \dfrac{n-1}{a+b-1} = \dfrac{n-1}{N-1}$.
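These formulas can be verified exactly by enumerating the hypergeometric pmf (the values of a, b, n below are illustrative):

```python
# Exact check of the hypergeometric mean and variance: enumerate
# p(x) = C(a,x) C(b,n-x) / C(a+b,n) over the support and compare the
# resulting moments with E[X] = n*pA and Var(X) = n*pA*pB*(1 - f).
from math import comb

a, b, n = 7, 5, 4
N = a + b
pmf = {x: comb(a, x) * comb(b, n - x) / comb(N, n)
       for x in range(max(0, n - b), min(n, a) + 1)}

mean = sum(x * p for x, p in pmf.items())
var = sum(x * x * p for x, p in pmf.items()) - mean ** 2

pA, pB, f = a / N, b / N, (n - 1) / (N - 1)
print(mean, n * pA)                  # both 7/3
print(var, n * pA * pB * (1 - f))
```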

The mean and variance of a Negative Binomial distribution

The Negative Binomial distribution arises when we repeat a Bernoulli trial until k successes (S) occur. Then X = the trial on which the k-th success occurred. The probability function of X is:

$$p(x) = \binom{x-1}{k-1}p^k q^{x-k}, \qquad x = k,\ k+1,\ k+2,\ \dots$$

Let X1 = the number of the trial on which the 1st success occurred, and let Xi = the number of trials after the (i − 1)-st success on which the i-th success occurred (i ≥ 2).

Then X = X1 + ⋯ + Xk, where X1, …, Xk are mutually independent and each has a geometric distribution with parameter p. Thus

$$E[X_i] = \frac{1}{p} \quad\text{and}\quad \operatorname{Var}(X_i) = \frac{q}{p^2},$$

hence

$$E[X] = \sum_{i=1}^k E[X_i] = \frac{k}{p} \quad\text{and}\quad \operatorname{Var}(X) = \sum_{i=1}^k \operatorname{Var}(X_i) = \frac{kq}{p^2}.$$

Thus if X has a negative binomial distribution with parameters k and p then

$$E[X] = \frac{k}{p} \quad\text{and}\quad \operatorname{Var}(X) = \frac{kq}{p^2}.$$
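A quick numerical confirmation (k and p below are illustrative): sum the negative binomial pmf far enough into the tail that the truncation error is negligible, and compare the moments with k/p and kq/p².

```python
# Check of E[X] = k/p and Var(X) = k q / p^2 for the negative binomial,
# summing the pmf p(x) = C(x-1, k-1) p^k q^(x-k) over a long stretch of
# the support (the tail beyond x = 400 is negligible for this p).
from math import comb

k, p = 3, 0.4
q = 1 - p
pmf = {x: comb(x - 1, k - 1) * p ** k * q ** (x - k) for x in range(k, 400)}

total = sum(pmf.values())                 # ~1, confirms the truncation is safe
mean = sum(x * pr for x, pr in pmf.items())
var = sum(x * x * pr for x, pr in pmf.items()) - mean ** 2

print(total, mean, k / p, var, k * q / p ** 2)
```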

Multivariate Moments: Non-central and Central

Definition: Let X1 and X2 be jointly distributed random variables (discrete or continuous). Then for any pair of positive integers (k1, k2) the joint moment of (X1, X2) of order (k1, k2) is defined to be:

$$\mu_{k_1,k_2} = E\!\left[X_1^{k_1}X_2^{k_2}\right] = \begin{cases}\displaystyle\sum_{x_1}\sum_{x_2} x_1^{k_1}x_2^{k_2}\,p(x_1,x_2) & \text{if } X_1, X_2 \text{ are discrete}\\[2ex] \displaystyle\int_{-\infty}^{\infty}\!\!\int_{-\infty}^{\infty} x_1^{k_1}x_2^{k_2}\,f(x_1,x_2)\,dx_1\,dx_2 & \text{if } X_1, X_2 \text{ are continuous}\end{cases}$$

Definition: Let X1 and X2 be jointly distributed random variables (discrete or continuous). Then for any pair of positive integers (k1, k2) the joint central moment of (X1, X2) of order (k1, k2) is defined to be:

$$\mu_{k_1,k_2}^{0} = E\!\left[(X_1-\mu_1)^{k_1}(X_2-\mu_2)^{k_2}\right] = \begin{cases}\displaystyle\sum_{x_1}\sum_{x_2} (x_1-\mu_1)^{k_1}(x_2-\mu_2)^{k_2}\,p(x_1,x_2) & \text{if } X_1, X_2 \text{ are discrete}\\[2ex] \displaystyle\int_{-\infty}^{\infty}\!\!\int_{-\infty}^{\infty} (x_1-\mu_1)^{k_1}(x_2-\mu_2)^{k_2}\,f(x_1,x_2)\,dx_1\,dx_2 & \text{if } X_1, X_2 \text{ are continuous}\end{cases}$$

where μ1 = E[X1] and μ2 = E[X2].

Note

$$\mu_{1,1}^{0} = E[(X_1-\mu_1)(X_2-\mu_2)] = \operatorname{Cov}(X_1, X_2)$$

= the covariance of X1 and X2.
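These definitions are concrete for a discrete pair. A sketch with a hypothetical joint pmf, computing the non-central moment μ₁,₁ = E[X1 X2] and checking that the central moment μ⁰₁,₁ equals Cov(X1, X2) = E[X1X2] − E[X1]E[X2]:

```python
# Joint moments on a toy discrete pmf, verifying that the (1,1) central
# moment is the covariance E[X1 X2] - E[X1] E[X2].
joint = {(0, 0): 0.2, (0, 1): 0.1, (1, 0): 0.3, (1, 1): 0.4}

m10 = sum(x1 * p for (x1, x2), p in joint.items())        # E[X1]
m01 = sum(x2 * p for (x1, x2), p in joint.items())        # E[X2]
m11 = sum(x1 * x2 * p for (x1, x2), p in joint.items())   # E[X1 X2]

# Central moment of order (1, 1):
c11 = sum((x1 - m10) * (x2 - m01) * p for (x1, x2), p in joint.items())

print(c11, m11 - m10 * m01)   # both equal 0.05
```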


Distribution functions, Moments, and Moment generating functions in the Multivariate case

The distribution function F(x)

This is defined for any random variable X: F(x) = P[X ≤ x].

Properties

1. F(−∞) = 0 and F(∞) = 1.
2. F(x) is non-decreasing (i.e. if x1 < x2 then F(x1) ≤ F(x2)).
3. F(b) − F(a) = P[a < X ≤ b].

4. Discrete random variables: F(x) is a non-decreasing step function with

$$F(x) = P[X \le x] = \sum_{u \le x} p(u),$$

F(−∞) = 0 and F(∞) = 1, and $p(x) = F(x) - F(x^-)$ = the jump in F(x) at x.

[Figure: step-function cdf F(x) with jumps of size p(x).]

5. Continuous random variables: F(x) is a non-decreasing continuous function with

$$F(x) = P[X \le x] = \int_{-\infty}^{x} f(u)\,du,$$

F(−∞) = 0 and F(∞) = 1, and $f(x) = F'(x)$.

[Figure: continuous cdf F(x), with the density f(x) as its slope.]

To find the probability density function f(x), one first finds F(x); then $f(x) = F'(x)$.

The joint distribution function F(x1, x2, …, xk) is defined for k random variables X1, X2, …, Xk:

$$F(x_1, x_2, \dots, x_k) = P[X_1 \le x_1, X_2 \le x_2, \dots, X_k \le x_k].$$

For k = 2:

$$F(x_1, x_2) = P[X_1 \le x_1, X_2 \le x_2].$$

Properties

1. F(x1, −∞) = F(−∞, x2) = F(−∞, −∞) = 0.
2. F(x1, ∞) = P[X1 ≤ x1, X2 ≤ ∞] = P[X1 ≤ x1] = F1(x1) = the marginal cumulative distribution function of X1.
   F(∞, x2) = P[X1 ≤ ∞, X2 ≤ x2] = P[X2 ≤ x2] = F2(x2) = the marginal cumulative distribution function of X2.
   F(∞, ∞) = P[X1 ≤ ∞, X2 ≤ ∞] = 1.

3. F(x1, x2) is non-decreasing in both the x1 direction and the x2 direction, i.e. if a1 < b1 and a2 < b2 then
   i. F(a1, x2) ≤ F(b1, x2)
   ii. F(x1, a2) ≤ F(x1, b2)
   iii. F(a1, a2) ≤ F(b1, b2)

4. P[a < X1 ≤ b, c < X2 ≤ d] = F(b, d) − F(a, d) − F(b, c) + F(a, c).
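This rectangle formula can be checked mechanically on a discrete joint distribution (the pmf below is hypothetical): build the joint cdf by summation, then compare the inclusion-exclusion expression against the probability computed directly.

```python
# Property 4 on a toy discrete joint pmf:
# P[a < X1 <= b, c < X2 <= d] = F(b,d) - F(a,d) - F(b,c) + F(a,c).
joint = {(1, 1): 0.1, (1, 2): 0.2, (2, 1): 0.25, (2, 2): 0.15, (3, 2): 0.3}

def F(x1, x2):
    """Joint cdf: P[X1 <= x1, X2 <= x2]."""
    return sum(p for (u1, u2), p in joint.items() if u1 <= x1 and u2 <= x2)

a, b, c, d = 1, 2, 1, 2
direct = sum(p for (u1, u2), p in joint.items()
             if a < u1 <= b and c < u2 <= d)
via_cdf = F(b, d) - F(a, d) - F(b, c) + F(a, c)

print(direct, via_cdf)   # both equal 0.15
```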

5. Discrete random variables: F(x1, x2) is a step surface with

$$F(x_1,x_2) = P[X_1 \le x_1, X_2 \le x_2] = \sum_{u_1 \le x_1}\sum_{u_2 \le x_2} p(u_1,u_2)$$

and $p(x_1, x_2)$ = the jump in $F(x_1, x_2)$ at $(x_1, x_2)$.

6. Continuous random variables: F(x1, x2) is a surface with

$$F(x_1,x_2) = P[X_1 \le x_1, X_2 \le x_2] = \int_{-\infty}^{x_2}\!\!\int_{-\infty}^{x_1} f(u_1,u_2)\,du_1\,du_2$$

and

$$f(x_1,x_2) = \frac{\partial^2 F(x_1,x_2)}{\partial x_1\,\partial x_2} = \frac{\partial^2 F(x_1,x_2)}{\partial x_2\,\partial x_1}.$$


Multivariate Moment Generating functions

Recall: the moment generating function

$$m_X(t) = E\!\left[e^{tX}\right] = \begin{cases}\displaystyle\sum_{x} e^{tx}\,p(x) & \text{if } X \text{ is discrete}\\[2ex] \displaystyle\int_{-\infty}^{\infty} e^{tx}\,f(x)\,dx & \text{if } X \text{ is continuous}\end{cases}$$

Definition: Let X1, X2, …, Xk be jointly distributed random variables (discrete or continuous). Then the joint moment generating function is defined to be:

$$m_{X_1,\dots,X_k}(t_1,\dots,t_k) = E\!\left[e^{t_1X_1 + \cdots + t_kX_k}\right] = \begin{cases}\displaystyle\sum_{x_1}\cdots\sum_{x_k} e^{t_1x_1+\cdots+t_kx_k}\,p(x_1,\dots,x_k) & \text{if } X_1,\dots,X_k \text{ are discrete}\\[2ex] \displaystyle\int\cdots\int e^{t_1x_1+\cdots+t_kx_k}\,f(x_1,\dots,x_k)\,dx_1\cdots dx_k & \text{if } X_1,\dots,X_k \text{ are continuous}\end{cases}$$

Note:

$$m_{X_1,\dots,X_k}(0,\dots,0) = 1 \qquad\text{and}\qquad m_{X_1,\dots,X_k}(0,\dots,t_i,\dots,0) = m_{X_i}(t_i).$$
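Both facts in the note are easy to verify for a discrete pair. A sketch with a hypothetical joint pmf: the joint mgf is a finite sum, m(0, 0) = 1 because the pmf sums to 1, and setting t2 = 0 recovers the marginal mgf of X1.

```python
# Joint mgf of a toy discrete pair: m(t1, t2) = E[exp(t1 X1 + t2 X2)].
from math import exp

joint = {(0, 0): 0.2, (0, 1): 0.1, (1, 0): 0.3, (1, 1): 0.4}

def m(t1, t2):
    return sum(exp(t1 * x1 + t2 * x2) * p for (x1, x2), p in joint.items())

def m_x1(t):   # marginal mgf of X1
    return sum(exp(t * x1) * p for (x1, _), p in joint.items())

print(m(0.0, 0.0))             # 1.0
print(m(0.5, 0.0), m_x1(0.5))  # equal: setting t2 = 0 gives the marginal mgf
```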

Power series expansion of the joint moment generating function (k = 2):

$$m_{X,Y}(t,s) = E\!\left[e^{tX+sY}\right]$$

Using $e^u = 1 + u + \dfrac{u^2}{2!} + \dfrac{u^3}{3!} + \dfrac{u^4}{4!} + \cdots$,

$$m_{X,Y}(t,s) = E\!\left[1 + (tX+sY) + \frac{(tX+sY)^2}{2!} + \cdots\right]$$
$$= 1 + E[X]\,t + E[Y]\,s + E[X^2]\frac{t^2}{2!} + E[XY]\,ts + E[Y^2]\frac{s^2}{2!} + \cdots + E[X^kY^m]\frac{t^k s^m}{k!\,m!} + \cdots$$
$$= 1 + \mu_{1,0}\,t + \mu_{0,1}\,s + \mu_{2,0}\frac{t^2}{2!} + \mu_{1,1}\,ts + \mu_{0,2}\frac{s^2}{2!} + \cdots + \mu_{k,m}\frac{t^k s^m}{k!\,m!} + \cdots$$