PARAMETER ESTIMATION Chapter 7


EVALUATING A POINT ESTIMATOR

Let X = (X1, . . . , Xn) be a sample from a population whose distribution is specified up to an unknown parameter θ.

Let d = d(X) be an estimator of θ.

How are we to determine its worth as an estimator of θ?


r(d, θ) = E[(d(X) − θ)²] : the mean square error of the estimator d

An indicator of the worth of d as an estimator of θ

In general, there is no single estimator d that minimizes r(d, θ) for all possible values of θ.
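The point that no estimator minimizes the mean square error for every θ can be seen in a small Monte Carlo sketch. The setup below is an illustrative assumption (a normal population with unit variance, sample size 20): a silly estimator that always answers 5 beats the sample mean when θ happens to be 5, yet is terrible elsewhere.

```python
import numpy as np

rng = np.random.default_rng(0)

def mse(estimator, theta, n=20, reps=200_000):
    """Monte Carlo estimate of r(d, theta) = E[(d(X) - theta)^2]."""
    x = rng.normal(loc=theta, scale=1.0, size=(reps, n))
    return float(np.mean((estimator(x) - theta) ** 2))

sample_mean = lambda x: x.mean(axis=1)        # the usual estimator of theta
always_five = lambda x: np.full(len(x), 5.0)  # ignores the data entirely

for theta in (0.0, 5.0):
    print(theta, mse(sample_mean, theta), mse(always_five, theta))
```

At θ = 5 the constant estimator has zero mean square error; at θ = 0 its error is 25, while the sample mean stays near 1/n = 0.05 at both values. Hence no uniform winner.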


Minimum mean square error estimators therefore rarely exist.

It is, however, possible to find an estimator having the smallest mean square error among all estimators that satisfy a certain property.

One such property is unbiasedness.

An estimator is unbiased if its expected value always equals the value of the parameter it is attempting to estimate.

The bias of d as an estimator of θ is defined as

b(θ) = E[d(X)] − θ
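A quick numerical sketch of the bias definition, using an assumed example not on the slides: two familiar estimators of a population variance θ. Dividing the sum of squared deviations by n gives a biased estimator (expected value (n − 1)θ/n), while dividing by n − 1 gives an unbiased one.

```python
import numpy as np

rng = np.random.default_rng(1)
theta = 2.0                         # true population variance
n, reps = 10, 200_000
x = rng.normal(0.0, np.sqrt(theta), size=(reps, n))

# Two estimators of the variance theta:
d_biased   = x.var(axis=1, ddof=0)  # divides by n
d_unbiased = x.var(axis=1, ddof=1)  # divides by n - 1

print("bias of /n estimator:     ", d_biased.mean() - theta)    # ~ -theta/n
print("bias of /(n-1) estimator: ", d_unbiased.mean() - theta)  # ~ 0
```

The Monte Carlo averages approximate E[d(X)], so subtracting θ approximates b(θ): about −0.2 for the first estimator and about 0 for the second.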


Example: Let X = (X1, . . . , Xn) be a sample from a population whose distribution is specified up to an unknown parameter θ.

Let d = d(X) be an estimator of θ.

Find the bias for the following estimators:


Combining Independent Unbiased Estimators:

Let d1 and d2 denote independent unbiased estimators of θ, having known variances σ1² and σ2². That is, for i = 1, 2,

E[di] = θ, Var(di) = σi²

Any weighted combination d = λd1 + (1 − λ)d2, 0 ≤ λ ≤ 1, gives a new estimator from the old ones.


It will be unbiased, since E[d] = λθ + (1 − λ)θ = θ.

To determine the value of λ that results in d having the smallest possible mean square error, note that, being unbiased, d has mean square error equal to its variance:

Var(d) = λ²σ1² + (1 − λ)²σ2²

Differentiating with respect to λ and setting the result to zero yields the minimizing value λ = (1/σ1²)/(1/σ1² + 1/σ2²).


The optimal weight to give an estimator is inversely proportional to its variance (when all the estimators are unbiased and independent).
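The inverse-variance weighting rule can be checked by simulation. This is a sketch under assumed numbers (σ1 = 1, σ2 = 2, with the two estimators modeled directly as independent noisy measurements of θ); it compares the optimal weights with naive equal weights.

```python
import numpy as np

rng = np.random.default_rng(2)
theta = 3.0
sigma1, sigma2 = 1.0, 2.0
reps = 500_000

# Two independent unbiased estimators of theta, modeled as noisy measurements.
d1 = rng.normal(theta, sigma1, reps)
d2 = rng.normal(theta, sigma2, reps)

# Optimal weight: lambda = (1/sigma1^2) / (1/sigma1^2 + 1/sigma2^2)
lam = (1 / sigma1**2) / (1 / sigma1**2 + 1 / sigma2**2)
d_opt   = lam * d1 + (1 - lam) * d2
d_naive = 0.5 * d1 + 0.5 * d2      # equal weights, for comparison

print("variance, optimal weights:", d_opt.var())    # theory: 1/(1/s1^2 + 1/s2^2) = 0.8
print("variance, equal weights:  ", d_naive.var())  # theory: (s1^2 + s2^2)/4 = 1.25
```

With these numbers λ = 0.8: the more precise estimator d1 gets four times the weight of d2, matching its fourfold precision advantage (1/σ1² = 1 versus 1/σ2² = 0.25).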


A generalization of the result that the mean square error of an unbiased estimator is equal to its variance is that the mean square error of any estimator is equal to its variance plus the square of its bias:

r(d, θ) = Var(d) + b(θ)²
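The variance-plus-squared-bias decomposition can be verified numerically. The estimator below is an assumed toy example (a deliberately biased shrunken sample mean from a normal population); note the decomposition holds exactly for the empirical averages, not just in expectation.

```python
import numpy as np

rng = np.random.default_rng(3)
theta, n, reps = 1.0, 25, 200_000
x = rng.normal(theta, 1.0, size=(reps, n))

d = 0.8 * x.mean(axis=1)        # a deliberately biased estimator of theta

mse  = np.mean((d - theta) ** 2)
var  = d.var()                  # variance of the estimator
bias = d.mean() - theta         # its bias (about -0.2 here)

print(mse, var + bias ** 2)     # identical up to floating-point rounding
```

Here Var(d) ≈ 0.64/25 = 0.0256 and b(θ)² ≈ 0.04, so the bias term dominates the mean square error of about 0.066.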



Let X = (X1, . . . , Xn) be a sample from a uniform distribution on (0, θ), where the parameter θ is unknown.

Let us evaluate the following estimators:


Is d1 unbiased?


How do we find the mean and variance of this estimator?

First, we have to find its distribution:
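Assuming the estimator in question is the sample maximum M = maxᵢ Xᵢ (consistent with the later slides), its distribution function is P(M ≤ x) = (x/θ)ⁿ on (0, θ), giving density nxⁿ⁻¹/θⁿ, mean nθ/(n + 1), and variance nθ²/((n + 1)²(n + 2)). A simulation sketch, with assumed values θ = 1 and n = 10, checks these moments:

```python
import numpy as np

rng = np.random.default_rng(4)
theta, n, reps = 1.0, 10, 500_000
m = rng.uniform(0, theta, size=(reps, n)).max(axis=1)

# The maximum has density n x^(n-1) / theta^n on (0, theta), hence:
print(m.mean(), n * theta / (n + 1))                    # E[M] = n*theta/(n+1)
print(m.var(), n * theta**2 / ((n + 1)**2 * (n + 2)))   # Var(M)
```

In particular E[M] = nθ/(n + 1) < θ, which is what makes the maximum a biased estimator of θ.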


IS THE FOLLOWING ESTIMATOR BIASED?


The (biased) estimator ((n + 2)/(n + 1)) maxᵢ Xᵢ has about half the mean square error of the MLE maxᵢ Xᵢ.
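The "about half" claim can be checked by simulation (a sketch with assumed values θ = 1, n = 10; the exact ratio is (n + 2)/(2(n + 1)), which tends to 1/2 as n grows):

```python
import numpy as np

rng = np.random.default_rng(5)
theta, n, reps = 1.0, 10, 500_000
m = rng.uniform(0, theta, size=(reps, n)).max(axis=1)

mse_mle    = np.mean((m - theta) ** 2)                      # theory: 2*theta^2/((n+1)(n+2))
mse_scaled = np.mean(((n + 2) / (n + 1) * m - theta) ** 2)  # theory: theta^2/(n+1)^2

print(mse_scaled / mse_mle)   # theory: (n+2)/(2(n+1)), i.e. 12/22 for n = 10
```

Rescaling trades a little extra variance for a large reduction in squared bias, which is exactly the variance-plus-bias-squared decomposition at work.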


Thank you for your attention