The UNIVERSITY of NORTH CAROLINA at CHAPEL HILL

Chapter 8. Some Approximations to Probability Distributions: Limit Theorems

Sections 8.2 -- 8.3: Convergence in Probability and in Distribution

Jiaping Wang, Department of Mathematical Science

04/22/2013, Monday


Page 1

Page 2

Outline

Convergence in Probability

Convergence in Distribution

Page 3

Part 1. Convergence in Probability

Page 4

Introduction

Suppose that a coin has probability p, with 0 ≤ p ≤ 1, of coming up heads on a single flip, and suppose that we flip the coin n times. What can we say about the fraction of heads observed in the n flips? For example, if p = 0.5 and we simulate different numbers of trials, the result is given in the table below.

n                 100      200      300      400
fraction          0.4700   0.5200   0.4833   0.5050
|fraction − 0.5|  0.0300   0.0200   0.0167   0.0050

From here we can see that as n → ∞, the fraction gets closer to 0.5 and thus the difference gets closer to zero.
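A table like this can be reproduced with a short simulation; a minimal sketch (the seed and sample sizes are our choices, so the fractions will be near 0.5 but will not match the table's exact values):

```python
import random

random.seed(1)  # fixed seed so repeated runs give the same fractions

p = 0.5  # probability of heads on a single flip
for n in (100, 200, 300, 400):
    # count heads in n simulated flips, then compare the fraction to p
    heads = sum(random.random() < p for _ in range(n))
    frac = heads / n
    print(f"n={n}: fraction = {frac:.4f}, |fraction - p| = {abs(frac - p):.4f}")
```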

Page 5

Definition 8.1

In mathematical notation, let X denote the number of heads observed in the n tosses. Then E(X) = np and V(X) = np(1 − p). One way to measure the closeness of X/n to p is to ascertain the probability that the distance |X/n − p| will be less than a pre-assigned small value ε, that is, P(|X/n − p| < ε).

Definition 8.1: The sequence of random variables X1, X2, ..., Xn is said to converge in probability to the constant c if, for every positive number ε,

lim_{n→∞} P(|Xn − c| ≥ ε) = 0, or equivalently, lim_{n→∞} P(|Xn − c| < ε) = 1.

Page 6

Theorem 8.1

Weak Law of Large Numbers: Let X1, X2, ..., Xn be independent and identically distributed random variables, with E(Xi) = μ and V(Xi) = σ² < ∞ for each i = 1, ..., n. Let X̄ = (1/n) Σ Xi. Then, for any positive real number ε,

lim_{n→∞} P(|X̄ − μ| ≥ ε) = 0,

or lim_{n→∞} P(|X̄ − μ| < ε) = 1. Thus, X̄ converges in probability toward μ.

The proof can be shown based on Tchebysheff's theorem with X replaced by X̄ and σ² by V(X̄) = σ²/n, which gives P(|X̄ − μ| ≥ ε) ≤ σ²/(nε²); then let n → ∞.
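The Tchebysheff bound σ²/(nε²) used in the proof can be evaluated directly; a minimal sketch (the helper name chebyshev_bound is ours), using fair-coin indicator variables with σ² = p(1 − p) = 0.25:

```python
def chebyshev_bound(sigma2, n, eps):
    """Tchebysheff upper bound on P(|Xbar - mu| >= eps): sigma^2 / (n * eps^2)."""
    return sigma2 / (n * eps ** 2)

# For fair-coin indicators (sigma^2 = 0.25) and eps = 0.1,
# the bound shrinks toward zero as n grows, as the WLLN requires.
for n in (100, 1000, 10000):
    print(n, chebyshev_bound(0.25, n, 0.1))
```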

Page 7

Theorem 8.2

Suppose that Xn converges in probability toward μ1 and Yn converges in probability toward μ2. Then the following statements are also true.

1. Xn + Yn converges in probability toward μ1 + μ2.

2. XnYn converges in probability toward μ1μ2.

3. Xn/Yn converges in probability toward μ1/μ2, provided μ2 ≠ 0.

4. √Xn converges in probability toward √μ1, provided P(Xn ≥ 0) = 1.
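Theorem 8.2 can be illustrated empirically with sample means, which converge in probability to the population means by Theorem 8.1; a minimal Monte Carlo sketch (the uniform distributions, seed, and sample size are our choices):

```python
import random

random.seed(2)

# Xbar -> mu1 = 1 (mean of U(0,2)) and Ybar -> mu2 = 2 (mean of U(0,4))
# in probability, so their sum, product, and ratio converge accordingly.
n = 200_000
xbar = sum(random.uniform(0, 2) for _ in range(n)) / n
ybar = sum(random.uniform(0, 4) for _ in range(n)) / n
print(xbar + ybar)  # close to mu1 + mu2 = 3
print(xbar * ybar)  # close to mu1 * mu2 = 2
print(xbar / ybar)  # close to mu1 / mu2 = 0.5
```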

Page 8

Example 8.1

Let X be a binomial random variable with probability of success p and number of trials n. Show that X/n converges in probability toward p.

Answer: We have seen that we can write X as Σ Yi, with Yi = 1 if the i-th trial results in success and Yi = 0 otherwise. Then X/n = (1/n) Σ Yi. Also, E(Yi) = p and V(Yi) = p(1 − p). The conditions of Theorem 8.1 are then fulfilled with μ = p and σ² = p(1 − p) < ∞, and thus we can conclude that, for any positive ε,

lim_{n→∞} P(|X/n − p| ≥ ε) = 0.

Page 9

Example 8.2

Suppose that X1, X2, ..., Xn are independent and identically distributed random variables with E(Xi) = μ, V(Xi) = σ², and E(Xi⁴) all assumed finite. Let S² denote the sample variance, given by

S² = (1/(n − 1)) Σ (Xi − X̄)².

Show that S² converges in probability to V(Xi).

Answer: Notice that S² = [n/(n − 1)] [(1/n) Σ Xi² − X̄²]. The quantity (1/n) Σ Xi² is the average of n independent and identically distributed variables of the form Xi², with E(Xi²) = σ² + μ² and V(Xi²) = E(Xi⁴) − [E(Xi²)]², which is finite. Thus Theorem 8.1 tells us that (1/n) Σ Xi² converges in probability to σ² + μ². Likewise, X̄ converges in probability to μ, so by Theorem 8.2, X̄² converges in probability to μ². Finally, since n/(n − 1) → 1, Theorem 8.2 gives that S² converges in probability to σ² + μ² − μ² = σ² = V(Xi). This example shows that for large samples, the sample variance has a high probability of being close to the population variance.
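This convergence can be checked by simulation; a minimal sketch (the uniform(0, 1) population, whose variance is 1/12 ≈ 0.0833, and the helper name sample_variance are our choices):

```python
import random

random.seed(3)

def sample_variance(xs):
    """S^2 = (1/(n-1)) * sum((x_i - xbar)^2)."""
    n = len(xs)
    xbar = sum(xs) / n
    return sum((x - xbar) ** 2 for x in xs) / (n - 1)

# For U(0,1) draws, V(Xi) = 1/12; S^2 should approach it as n grows.
for n in (100, 10_000):
    xs = [random.random() for _ in range(n)]
    print(n, round(sample_variance(xs), 4))
```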

Page 10

Part 2. Convergence in Distribution

Page 11

Definition 8.2

In the last section we studied only the convergence of certain random variables toward constants. In this section we study the probability distributions of certain types of random variables as n tends toward infinity.

Definition 8.2: Let Xn be a random variable with distribution function Fn(x), and let X be a random variable with distribution function F(x). If

lim_{n→∞} Fn(x) = F(x)

at every point x for which F(x) is continuous, then Xn is said to converge in distribution toward X. F(x) is called the limiting distribution function of Xn.

Page 12

Example 8.3

Let X1, X2, …, Xn be independent uniform random variables over the interval (θ, 0) for a negative constant θ. In addition, let Yn=min(X1, X2, …, Xn). Find the limiting distribution of Yn.

Answer: The distribution function for the uniform random variable Xi is

F(x) = 0 for x ≤ θ; F(x) = (x − θ)/(0 − θ) for θ < x < 0; F(x) = 1 for x ≥ 0.

We know P(Yn > y) = P(X1 > y, X2 > y, ..., Xn > y) = [1 − F(y)]ⁿ, so we can find, for θ < y < 0,

F_{Yn}(y) = 1 − [1 − F(y)]ⁿ = 1 − (y/θ)ⁿ,

with F_{Yn}(y) = 0 for y ≤ θ and F_{Yn}(y) = 1 for y ≥ 0. Since 0 < y/θ < 1 when θ < y < 0, (y/θ)ⁿ → 0 as n → ∞. The limiting distribution function is therefore F(y) = 0 for y < θ and F(y) = 1 for y ≥ θ; that is, Yn converges in distribution to the constant θ.
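The convergence of F_{Yn} to a step function at θ can be seen numerically; a minimal sketch (θ = −1 and the function name F_min are our choices):

```python
# Distribution function of Y_n = min of n uniforms on (theta, 0), theta < 0:
# F_Yn(y) = 1 - (y/theta)^n for theta <= y <= 0.
theta = -1.0

def F_min(y, n):
    return 1.0 - (y / theta) ** n

# For any fixed y with theta < y < 0, the value tends to 1 as n grows,
# matching the limiting step function that jumps from 0 to 1 at theta.
for n in (1, 10, 100):
    print(n, F_min(-0.5, n))
```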

Page 13

Theorem 8.3

Let Xn and X be random variables with moment-generating functions Mn(t) and M(t), respectively. If

limn∞Mn(t)=M(t)

for all real t, then Xn converges in distribution toward X.

Page 14

Example 8.4

Let Xn be a binomial random variable with n trials and probability p of success on each trial. If n tends toward infinity and p tends toward zero with np = λ remaining fixed, show that Xn converges in distribution toward a Poisson random variable.

Answer: We know the moment-generating function for the binomial random variable Xn is

Mn(t) = (pe^t + 1 − p)ⁿ = [1 + p(e^t − 1)]ⁿ = [1 + λ(e^t − 1)/n]ⁿ,

based on np = λ. Recall that lim_{n→∞} (1 + k/n)ⁿ = e^k. Letting k = λ(e^t − 1), we have

lim_{n→∞} Mn(t) = exp[λ(e^t − 1)],

which is the moment-generating function of the Poisson random variable with mean λ.

As an example, when n = 10 and p = 0.1, the true probability from the binomial distribution that X is less than 2 is 0.73610, and the approximate value from the Poisson distribution with λ = np = 1 is 0.73576; they are very close. So we can approximate probabilities from the binomial distribution by the Poisson distribution when n is large and p is small.
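The numerical comparison above can be verified with a short stdlib computation; a minimal sketch (the helper names binom_cdf and poisson_cdf are ours):

```python
from math import comb, exp, factorial

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def poisson_cdf(k, lam):
    """P(X <= k) for X ~ Poisson(lam)."""
    return sum(exp(-lam) * lam**i / factorial(i) for i in range(k + 1))

# P(X < 2) = P(X <= 1) with n = 10, p = 0.1, and lambda = np = 1.
print(round(binom_cdf(1, 10, 0.1), 5))  # 0.7361
print(round(poisson_cdf(1, 1.0), 5))    # 0.73576
```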

Page 15

Example 8.5

In monitoring for pollution, an experimenter collects a small volume of water and counts the number of bacteria in the sample. Unlike earlier problems, we have only one observation. For purposes of approximating the probability distribution of counts, we can think of the volume as the quantity that is getting large. Let X denote the bacteria count per cubic centimeter of water and assume that X has a Poisson probability distribution with mean λ. We want to approximate probabilities for X when λ is large, which we do by showing that Y = (X − λ)/√λ converges in distribution toward a standard normal random variable as λ tends toward infinity. Specifically, if the allowable pollution in a water supply is a count of 110 bacteria per cubic centimeter, approximate the probability that X will be at most 110, assuming that λ = 100.

Page 16

Solution

Answer: We know the mgf for the Poisson random variable X is M_X(t) = exp[λ(e^t − 1)], thus the mgf of Y = (X − λ)/√λ is

M_Y(t) = e^{−t√λ} M_X(t/√λ) = exp[−t√λ + λ(e^{t/√λ} − 1)].

The term λ(e^{t/√λ} − 1) can be written as

λ(t/√λ + t²/(2λ) + t³/(6λ^{3/2}) + ···) = t√λ + t²/2 + t³/(6√λ) + ···

Thus M_Y(t) = exp[−t√λ + t√λ + t²/2 + t³/(6√λ) + ···] = exp[t²/2 + t³/(6√λ) + ···]. When λ → ∞, M_Y(t) → exp(t²/2), which is the mgf of the standard normal distribution. So we can approximate probabilities for the Poisson random variable by the standard normal distribution when λ is large enough (for example, λ ≥ 25). Here, with λ = 100,

P(X ≤ 110) ≈ P(Z ≤ (110 − 100)/√100) = P(Z ≤ 1) ≈ 0.8413.
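The final normal-approximation step can be computed with the error function; a minimal sketch (the helper name std_normal_cdf is ours):

```python
from math import erf, sqrt

def std_normal_cdf(z):
    """Phi(z) via the error function: Phi(z) = (1 + erf(z / sqrt(2))) / 2."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

lam = 100.0
z = (110 - lam) / sqrt(lam)  # standardize: (x - lambda) / sqrt(lambda) = 1.0
print(round(std_normal_cdf(z), 4))  # 0.8413
```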