
  • Extremum Estimators

    Walter Sosa-Escudero

    Econ 507. Econometric Analysis. Spring 2009

    May 5, 2009


  • Motivation

A general class of estimators that includes all the cases studied in this course.

Structure: an estimator is defined, and its asymptotic properties are studied (consistency, asymptotic normality, efficiency).

An asymptotic variance is obtained together with asymptotic normality, and a consistent estimator for it is proposed.

Reference: Newey and McFadden (1994) provide a detailed and more general treatment.


  • Extremum estimators

Definition: $\hat\theta$ is an extremum estimator if there is an objective function $Q_n(\theta)$ such that

$$\hat\theta = \operatorname*{argmax}_{\theta \in \Theta} Q_n(\theta)$$

The function $Q_n(\theta)$ depends on a sample of size $n$. $\Theta$ is the parameter space.

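To fix ideas, here is a minimal numerical sketch of the definition: pick a sample objective $Q_n(\theta)$ and maximize it over a compact $\Theta$. The particular objective, the data, and the use of scipy are illustrative assumptions, not part of the slides.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Sketch of the general recipe: theta_hat = argmax over Theta of Q_n(theta).
# Q_n below is a toy sample objective chosen only for illustration.
rng = np.random.default_rng(0)
z = rng.normal(loc=1.5, scale=1.0, size=300)   # hypothetical sample

def Q_n(theta):
    # Minus the squared gap between the sample mean and theta,
    # so Q_n is maximized at theta = z.mean().
    return -(z.mean() - theta) ** 2

# Maximize over a compact parameter space Theta = [-10, 10]
# by minimizing -Q_n(theta).
res = minimize_scalar(lambda t: -Q_n(t), bounds=(-10, 10), method="bounded")
theta_hat = res.x
print(theta_hat)  # numerically equal to z.mean()
```

The same pattern (define $Q_n$, hand $-Q_n$ to a numerical optimizer over $\Theta$) carries over to each particular case below.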

  • Particular cases

    1) Maximum likelihood

$z_1, \ldots, z_n$ an iid sample with density function $f(z;\theta_0)$. Take

$$Q_n(\theta) = \frac{1}{n}\sum_{i=1}^{n} \ln f(z_i;\theta)$$

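A sketch of case 1), assuming an exponential density $f(z;\theta) = \theta e^{-\theta z}$, $z>0$; the density, data, and optimizer are illustrative choices, not from the slides.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Case 1): Q_n(theta) = (1/n) * sum of ln f(z_i; theta).
# Assumed example density: exponential, f(z; theta) = theta * exp(-theta * z).
rng = np.random.default_rng(1)
theta0 = 2.0
z = rng.exponential(scale=1 / theta0, size=1000)

def Q_n(theta):
    # Average log-likelihood: ln f(z_i; theta) = ln(theta) - theta * z_i.
    return np.mean(np.log(theta) - theta * z)

res = minimize_scalar(lambda t: -Q_n(t), bounds=(1e-6, 50), method="bounded")
print(res.x, 1 / z.mean())  # the MLE equals 1 / sample mean
```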

  • 2) Non-linear least squares and OLS

Let $z_i = (y_i, x_i)$ and $E(y|x) = h(x;\theta_0)$. Then take

$$Q_n(\theta) = -\frac{1}{n}\sum_{i=1}^{n} \bigl(y_i - h(x_i;\theta)\bigr)^2$$

This is the NLS estimator. Obviously the OLS estimator corresponds to the case $h(x,\theta_0) = x'\theta_0$.

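A sketch of case 2) with an assumed regression function $h(x;\theta) = \exp(\theta x)$; the function, data generating process, and optimizer are illustrative assumptions, not from the slides.

```python
import numpy as np
from scipy.optimize import minimize

# Case 2): Q_n(theta) = -(1/n) * sum of (y_i - h(x_i; theta))^2.
# Assumed regression function: h(x; theta) = exp(theta * x).
rng = np.random.default_rng(2)
n, theta0 = 500, 0.7
x = rng.uniform(0, 2, size=n)
y = np.exp(theta0 * x) + rng.normal(scale=0.3, size=n)

def Q_n(theta):
    return -np.mean((y - np.exp(theta * x)) ** 2)

# NLS: maximize Q_n, i.e. minimize the average squared residual.
res = minimize(lambda t: -Q_n(t[0]), x0=[0.0])
print(res.x[0])  # close to theta0 = 0.7
```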

  • GMM and TSLS

    3) GMM, IV and TSLS

Suppose there exists a vector of functions $g(z;\theta)$ such that $E(g(z;\theta_0)) = 0$, and a psd matrix $W$. Then, take

$$Q_n(\theta) = -\left[\frac{1}{n}\sum_{i=1}^{n} g(z_i;\theta)\right]' W \left[\frac{1}{n}\sum_{i=1}^{n} g(z_i;\theta)\right]$$

This is the GMM estimator. For the IV case in the linear model, take

$$g(z,\theta) = z\,(y - x'\theta)$$

where $z$ is the vector of instruments. The TSLS estimator corresponds to $W = \left(n^{-1}\sum_{i=1}^{n} z_i z_i'\right)^{-1}$.

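The sketch below sets up case 3) in the linear IV model with the TSLS weight matrix; the data generating process and variable names are hypothetical, not from the slides.

```python
import numpy as np

# Case 3), linear IV: g(z, theta) = z * (y - x'theta),
# with W = (n^{-1} * sum z_i z_i')^{-1} (the TSLS weight matrix).
rng = np.random.default_rng(3)
n, beta0 = 2000, np.array([1.0, 0.5])
z2 = rng.normal(size=n)                       # excluded instrument
u = rng.normal(size=n)
x2 = 0.8 * z2 + 0.5 * u + rng.normal(size=n)  # endogenous regressor
y = beta0[0] + beta0[1] * x2 + u
X = np.column_stack([np.ones(n), x2])         # regressors (K = 2)
Z = np.column_stack([np.ones(n), z2])         # instruments (p = 2)

# GMM objective: Q_n(beta) = -gbar' W gbar with gbar = (1/n) Z'(y - X beta).
W = np.linalg.inv(Z.T @ Z / n)
def Q_n(beta):
    gbar = Z.T @ (y - X @ beta) / n
    return -gbar @ W @ gbar

# In the just-identified case the maximizer solves Z'(y - X beta) = 0,
# i.e. the IV estimator:
beta_iv = np.linalg.solve(Z.T @ X, Z.T @ y)
print(beta_iv, Q_n(beta_iv))  # Q_n is numerically zero at beta_iv
```

In this just-identified setup the objective attains its maximum of zero at the IV estimator; with more instruments than parameters one would maximize $Q_n(\theta)$ numerically instead.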

  • Consistency

A general consistency result: if there is a function $Q_0(\theta)$ such that

1. $Q_0(\theta)$ is uniquely maximized at $\theta_0$.
2. $\Theta$ is compact.
3. $Q_0(\theta)$ is continuous.
4. $Q_n(\theta)$ converges uniformly in probability to $Q_0(\theta)$,

then $\hat\theta \stackrel{p}{\to} \theta_0$.

Proof: we did it when we proved consistency of the MLE estimator (the last step).


  • Consistency: the movie

$z \sim N(\theta, 1)$, $\theta_0 = 2$

$Q_0(\theta) = E(l(\theta)) \propto -0.5\,(5 - 4\theta + \theta^2)$

$Q_n(\theta) \propto -0.5\,\left(n^{-1}\sum z_i^2 - 2\theta\bar{z} + \theta^2\right)$

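A simulation sketch of this "movie" (no plotting; the seed, grid, and sup-norm check are additions for illustration, not from the slides):

```python
import numpy as np

# The "movie": z ~ N(theta0, 1) with theta0 = 2.
#   Q_0(theta) is proportional to -0.5 * (5 - 4*theta + theta^2)
#   Q_n(theta) is proportional to -0.5 * (mean(z^2) - 2*theta*zbar + theta^2)
# As n grows, Q_n gets uniformly close to Q_0 and its maximizer approaches 2.
rng = np.random.default_rng(4)
theta0 = 2.0
grid = np.linspace(-1, 5, 601)

def Q_0(theta):
    return -0.5 * (5 - 4 * theta + theta ** 2)

for n in (50, 100, 200, 2500):
    z = rng.normal(theta0, 1.0, size=n)
    Q_n = -0.5 * (np.mean(z ** 2) - 2 * grid * np.mean(z) + grid ** 2)
    gap = np.max(np.abs(Q_n - Q_0(grid)))          # sup-norm distance on the grid
    print(n, grid[np.argmax(Q_n)], round(gap, 3))  # argmax -> 2, gap -> 0
```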

  • [Figures: $Q_n(\theta)$ plotted against $Q_0(\theta)$ for n = 50, 100, 200, 2500]

  • Discussion

1 (Unique maximizer at true value) This is usually an identification assumption.

2 (Compactness) This implies a bounded parameter space; it is a restrictive assumption.

3 (Continuity) Usually a consequence of 4).

4 (Uniform Convergence) Typically implies imposing primitive conditions to use a uniform LLN (moment existence, sampling).


  • Identification

MLE: information inequality: if $\theta \neq \theta_0$ implies $f(\theta) \neq f(\theta_0)$, then $E(l(\theta))$ is uniquely maximized by $\theta_0$.

NLS: the limiting function is $E\left[(y - h(x,\theta))^2\right]$. By the properties of conditional expectations, this is minimized by the conditional expectation, in this case $h(x,\theta_0)$, so for identification in NLS we need

$$\theta \neq \theta_0 \text{ implies } h(x,\theta) \neq h(x,\theta_0)$$

IV/GMM in the linear model: the rank condition guarantees identification: $E(z_i x_i') = \Sigma_{zx}$ is a $p \times K$ matrix that exists, is finite, and has full column rank.

Non-linear GMM: recall the global vs. local identification discussion.

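A small sketch of checking the IV rank condition through the sample analogue of $E(z_i x_i')$; the data and variable names are hypothetical, not from the slides.

```python
import numpy as np

# Rank condition for linear IV identification:
# Sigma_zx = E(z_i x_i') must have full column rank (rank K).
# Inspect the sample analogue on hypothetical data.
rng = np.random.default_rng(5)
n = 5000
z2 = rng.normal(size=n)
x2 = 0.8 * z2 + rng.normal(size=n)      # instrument is relevant
X = np.column_stack([np.ones(n), x2])   # K = 2 regressors
Z = np.column_stack([np.ones(n), z2])   # p = 2 instruments

S_zx = Z.T @ X / n                      # p x K sample analogue of Sigma_zx
print(np.linalg.matrix_rank(S_zx))      # 2: full column rank, identified

# An irrelevant instrument would make the columns of S_zx nearly collinear,
# pushing it toward reduced rank (failure of identification).
```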

  • Uniform convergence and continuity

A general uniform LLN: if the data are i.i.d., $\Theta$ is compact, $a(z_i,\theta)$ is a continuous function at each $\theta \in \Theta$ w.p.1, and there is $d(z)$ with $\|a(z,\theta)\| \leq d(z)$ for all $\theta \in \Theta$ and $E[d(z)] < \infty$, then $E[a(z,\theta)]$ is continuous and $\sup_{\theta \in \Theta}\left\| n^{-1}\sum_{i=1}^{n} a(z_i,\theta) - E[a(z,\theta)]\right\| \stackrel{p}{\to} 0$.

  • Example: Consistency of MLE

    Under 1) $z_i$, $i = 1,\ldots,n$, iid with density $f(z_i;\theta_0)$, 2) $\theta \neq \theta_0$ implies $f(z_i;\theta) \neq f(z_i;\theta_0)$, 3) $\Theta$ a compact set, 4) $\ln f(z_i;\theta)$ continuous at each $\theta \in \Theta$ w.p.1, and 5) $E\left[\sup_{\theta \in \Theta} |\ln f(z;\theta)|\right] < \infty$, the MLE is consistent: $\hat\theta \stackrel{p}{\to} \theta_0$.

  • Asymptotic Normality

Assume the conditions for consistency hold, and add the following conditions:

1. $\theta_0$ is an interior point of $\Theta$.
2. $Q_n(\theta)$ is twice continuously differentiable in a neighborhood $N$ of $\theta_0$.
3. $\sqrt{n}\,\nabla_\theta Q_n(\theta_0) \stackrel{d}{\to} N(0,\Sigma)$.
4. There is $H(\theta)$ continuous at $\theta_0$ such that $\nabla_{\theta\theta} Q_n(\theta)$ converges uniformly in probability to $H(\theta)$.
5. $H \equiv H(\theta_0)$ is non-singular.

Then

$$\sqrt{n}\,(\hat\theta - \theta_0) \stackrel{d}{\to} N\left(0,\, H^{-1}\Sigma H^{-1}\right)$$

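A sketch of estimating the sandwich variance $H^{-1}\Sigma H^{-1}$ in the normal-mean example used for the consistency "movie"; the plug-in estimators below are a standard choice assumed for illustration, not spelled out in the slides.

```python
import numpy as np

# Sandwich asymptotic variance H^{-1} Sigma H^{-1} for the normal-mean example
# (z ~ N(theta0, 1), Q_n = average log-likelihood).
# Here score_i(theta) = z_i - theta and the second derivative is -1,
# so the theoretical sandwich is (-1)^{-1} * 1 * (-1)^{-1} = 1.
rng = np.random.default_rng(6)
theta0, n = 2.0, 5000
z = rng.normal(theta0, 1.0, size=n)

theta_hat = z.mean()              # maximizer of Q_n
scores = z - theta_hat            # d/dtheta of ln f(z_i; theta) at theta_hat
Sigma_hat = np.mean(scores ** 2)  # estimate of Sigma = Var(score)
H_hat = -1.0                      # second derivative of Q_n (constant here)
avar = Sigma_hat / H_hat ** 2     # H^{-1} Sigma H^{-1}, about 1
print(theta_hat, avar)

# Approximate standard error of theta_hat: sqrt(avar / n).
print(np.sqrt(avar / n))
```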

  • Proof:

Conditions 1-3 imply that $\hat\theta$ satisfies the FOCs

$$\nabla_\theta Q_n(\hat\theta) = 0$$

Take a mean value expansion around $\theta_0$ and solve to get

$$\sqrt{n}\,(\hat\theta - \theta_0) = -\bar{H}(\bar\theta)^{-1}\,\sqrt{n}\,\nabla_\theta Q_n(\theta_0)$$

with $\bar{H}(\theta) \equiv \nabla_{\theta\theta} Q_n(\theta)$ and $\bar\theta$ a mean value lying between $\hat\theta$ and $\theta_0$. Since $\bar{H}(\theta)$ converges uniformly in probability to $H(\theta)$, and since $\bar\theta \stackrel{p}{\to} \theta_0$ (because $\hat\theta \stackrel{p}{\to} \theta_0$), then

$$\bar{H}(\bar\theta) \stackrel{p}{\to} H(\theta_0)$$

and by continuity of matrix inversion $\bar{H}(\bar\theta)^{-1} \stackrel{p}{\to} H(\theta_0)^{-1}$. The result follows from 3) and Slutsky's Theorem.


  • Discussion

AN is driven mostly by AN of the first derivatives.

The result starts by linearizing the FOCs and then solving for the normalized estimator.

This produces two factors: one that does not explode (related to the inverse of the second derivatives) and another that is asymptotically normal (the first derivatives).

Slutsky's theorem implies normality with a sandwich-type asymptotic variance.
