Hadley Wickham
Stat 310: Estimation
Sunday, 11 April 2010
Misc.
• Test grading in progress. Tests should be back by Thursday.
• Will also calculate overall year-to-date grades.
• Two party planning commissioners needed to help plan end of class party - please email me if interested.
1. Recap
2. Method of moments
3. Maximum likelihood
4. Feedback
Recap
Inference
Up to now: Given a sequence of random variables with distribution F, what can we say about the mean of a sample?
What we really want: Given the mean of a sample, what can we say about the underlying sequence of random variables?
Marijuana
Up to now: If I told you that smoking pot was Binomially distributed with probability p, you could tell me what I would expect to see if I sampled n people at random.
What we want: If I sample n people at random and find out m of them smoke pot, what does that tell me about p?
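A minimal sketch of the natural answer (the numbers here are hypothetical, not from the slide):

```python
# If m of n randomly sampled people say yes, the natural point
# estimate of p is the sample proportion m / n.
n = 10  # people sampled (hypothetical)
m = 3   # number who said yes (hypothetical)
p_hat = m / n
print(p_hat)  # 0.3
```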
Your turn
Suppose I selected 10 people at random from the Rice campus and asked them whether they’d smoked pot in the last month. Three said yes.
Let Yi be 1 if person i smoked pot and 0 otherwise. What’s a good guess for the distribution of Yi?
[Figure: likelihood plotted against p, for p from 0.0 to 1.0; vertical axis “prob” runs from 0 to about 3.5e−08]
Aim
Given data, we want to figure out what the true parameters of the distribution are.
We also want to know how much error is associated with our estimate.
Definitions
Parameter space: set of all allowed parameter values
Estimator: process/function that takes data and produces a best guess for the parameter
Point estimate: a single-number guess at the parameter
Hat notation
Usually write the estimate of a parameter with a little hat over it. Subscript identifies type of estimator used.
µ̂   σ̂²   µ̂MM   µ̂ML
Lower/upper case
X1, X2, X3, ... = random variables that define an experiment. IID.
x1, x2, x3, ... = results of single experiment
So we collect x1, x2, x3, ... and want to say something about the distribution of X
Plug-in principle
A good first guess at the true mean, variance, or any other parameter of a distribution is simply to estimate it from the data.
Two more systematic approaches: the method of moments and the maximum likelihood estimator
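A minimal sketch of the plug-in idea, with made-up data:

```python
# Plug-in principle: estimate the distribution's mean and variance
# directly from the data (hypothetical numbers).
x = [2, 4, 4, 4, 5, 5, 7, 9]
n = len(x)
mean_hat = sum(x) / n                                # sample mean
var_hat = sum((xi - mean_hat) ** 2 for xi in x) / n  # plug-in variance
print(mean_hat, var_hat)  # 5.0 4.0
```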
Method of moments
Method of moments
We know how to calculate sample moments (e.g. mean and variance of data)
We know what the moments of the distribution are in terms of the parameters.
Why not just match them up?
Gamma distribution
X ~ Gamma(α, β)
E(X) = αβ,  Var(X) = αβ²
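Matching these two moments to their sample versions and solving gives (a sketch, with x̄ and s² the sample mean and variance):

```latex
\bar{x} = \hat\alpha\hat\beta, \qquad s^2 = \hat\alpha\hat\beta^2
\quad\Longrightarrow\quad
\hat\beta_{MM} = \frac{s^2}{\bar{x}}, \qquad
\hat\alpha_{MM} = \frac{\bar{x}}{\hat\beta_{MM}} = \frac{\bar{x}^2}{s^2}
```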
Steps
• Write down formulas for mean and variance.
• Rewrite to use sample estimates
• Solve for the parameters
• (If there are more parameters, you’ll need to use more moments, but that won’t come up since we haven’t learned any distributions with more than two parameters)
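The steps above can be sketched in code for the Gamma case, where E(X) = αβ and Var(X) = αβ² (the data are hypothetical):

```python
# Method of moments for Gamma(alpha, beta), hypothetical data.
x = [2.1, 3.5, 1.8, 4.2, 2.9]
n = len(x)
xbar = sum(x) / n                           # sample mean
s2 = sum((xi - xbar) ** 2 for xi in x) / n  # plug-in variance
# Match xbar = alpha*beta and s2 = alpha*beta^2, then solve:
beta_hat = s2 / xbar
alpha_hat = xbar ** 2 / s2
print(alpha_hat, beta_hat)
```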
Your turn
What are the method of moments estimators for the mean and variance of the normal distribution?
What about the Poisson distribution? Is there anything different about the Poisson?
But...
Let X ~ Unif(0, θ)
What is a method of moments estimator for θ?
For one experiment we got the values 3, 5, 6, 18. What is the method of moments estimate for θ? Is it a good estimator?
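Since E(X) = θ/2, the method of moments gives θ̂ = 2x̄. With the slide’s data that estimate is smaller than one of the observations, which is impossible under Unif(0, θ):

```python
# MoM for X ~ Unif(0, theta): E(X) = theta/2, so theta_hat = 2 * xbar.
x = [3, 5, 6, 18]
theta_hat = 2 * sum(x) / len(x)
print(theta_hat, max(x))  # 16.0 < 18: the estimate contradicts the data
```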
Pros/cons
Pro: Simple!
Con: Doesn’t always work.
Con: Often not the best estimator
Con: Tells us nothing about the error in the estimate
Maximum likelihood
Example
X ~ Binomial(10, p)
We repeat the random experiment 10 times and get the following values: 4 5 1 5 3 2 4 2 2 4
What is p?
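One way to answer is numerically, a sketch: evaluate the likelihood of each candidate p on a grid and keep the best one.

```python
# Grid search over the binomial likelihood for the slide's data.
from math import comb  # Python 3.8+

xs = [4, 5, 1, 5, 3, 2, 4, 2, 2, 4]  # observed counts
n = 10                                # trials per observation

def likelihood(p):
    L = 1.0
    for x in xs:
        L *= comb(n, x) * p ** x * (1 - p) ** (n - x)
    return L

grid = [i / 1000 for i in range(1, 1000)]
p_hat = max(grid, key=likelihood)
print(p_hat)  # 0.32, i.e. sum(xs) / (len(xs) * n)
```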
[Figure: likelihood plotted against p, for p from 0.0 to 1.0; vertical axis “prob” runs from 0 to about 3.5e−08]
Maximum likelihood
Write down the joint pdf (pdf of iid random variables is?)
We have the data, so we can work out how likely each possible value of p is.
Then pick the p that is most likely.
Can do this numerically (like that graphic) or algebraically (with calculus)
likelihood = joint pdf
Steps
Write out likelihood (= joint pdf)
Write out log-likelihood
(Discard constants)
Find maximum:
Differentiate and set to 0
(Check second derivative is negative, so it’s a maximum)
(Check end points)
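Why bother with the log-likelihood at all? A hypothetical sketch of the numerical reason: a product of many probabilities underflows to 0.0, while a sum of their logs stays representable.

```python
# Raw likelihood vs log-likelihood for many observations.
from math import log

p = 0.3
n = 10_000              # imagine 10,000 Bernoulli observations
raw = p ** n            # product of 10,000 probabilities: underflows
loglik = n * log(p)     # sum of 10,000 log-probabilities: fine
print(raw, loglik)
```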
Binomial
1. Joint pdf - be careful about x’s
2. Log-likelihood - why?
3. Differentiate and set to zero
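A sketch of these three steps for n iid observations x₁, …, xₙ from Binomial(m, p):

```latex
L(p) = \prod_{i=1}^{n} \binom{m}{x_i}\, p^{x_i} (1-p)^{m - x_i}

\ell(p) = \mathrm{const} + \Big(\textstyle\sum_i x_i\Big) \log p
        + \Big(nm - \textstyle\sum_i x_i\Big) \log(1-p)

\ell'(p) = \frac{\sum_i x_i}{p} - \frac{nm - \sum_i x_i}{1-p} = 0
\quad\Longrightarrow\quad
\hat{p}_{ML} = \frac{\sum_i x_i}{nm}
```

With the slide’s data, Σxᵢ = 32 and nm = 100, so p̂ML = 0.32.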
Feedback