Lecture 4b: Optimization Tools, Lagrangian Method


  • 8/13/2019 Lecture4b Optimization Tools Lagrangian Method.

    1/22

Lecture 4: Optimization Tools, Lagrangian Methods

    Managerial Economics

    October 14, 2011

Thomas F. Rutherford
Center for Energy Policy and Economics
Department of Management, Technology and Economics
ETH Zürich


    Good Mathematical References for Economics

Mathematics for Economists, by Carl P. Simon and Lawrence Blume, Norton, 1994. (an essential reference)

Optimization in Economic Theory, by Avinash K. Dixit, Oxford, 1975. (a sentimental favorite)

Mathematical Methods for Economic Theory: A Tutorial, by Martin J. Osborne, economics.utoronto.ca/osborne. (open access, very nicely organized)

Microeconomic Analysis, by Hal Varian, Chapters 26 and 27. (terse but useful)


    The First Derivative

Let $f : \mathbb{R} \to \mathbb{R}$. The derivative of $f$ at $\bar{x}$ is denoted $Df(\bar{x})$. Because $f(x)$ is a scalar function, we have:

$$Df(x) = \frac{df(x)}{dx}$$

The first derivative can be used to approximate the value of $f$ at points close to $\bar{x}$. For small distances $t$, we have

$$f(\bar{x} + t) \approx f(\bar{x}) + Df(\bar{x})\,t.$$

Alternatively, we might write:

$$f(x) \approx L(x|\bar{x}) \equiv f(\bar{x}) + Df(\bar{x})(x - \bar{x})$$

where $L(x|\bar{x})$ denotes the linear approximation to $f$ anchored at $\bar{x}$.


    An Example of Linear Approximation

To illustrate how a linear approximation works, suppose that $f(x) = \sin(x)$. We have $Df(x) = \cos(x)$. A local approximation to $f(x)$ is then

$$L(x|\bar{x}) = \sin(\bar{x}) + \cos(\bar{x})(x - \bar{x})$$
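As a quick numerical check (not from the slides; the helper name is mine), the linear approximation can be sketched in a few lines of Python:

```python
import math

def linear_approx(f, df, x_bar):
    """Return L(x | x_bar) = f(x_bar) + Df(x_bar) * (x - x_bar)."""
    f0, d0 = f(x_bar), df(x_bar)
    return lambda x: f0 + d0 * (x - x_bar)

# Anchor the approximation to sin at x_bar = 0.5
L = linear_approx(math.sin, math.cos, 0.5)

# The error grows with the distance from the anchor
err_near = abs(L(0.51) - math.sin(0.51))
err_far = abs(L(1.50) - math.sin(1.50))
```

Near the anchor the error is of order $(x - \bar{x})^2$, which is why a tangent line hugs the sine curve only locally.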


Alternative Linear Approximations to $\sin(x)$


    Second Order Approximations

A second order Taylor series approximation can be employed when the function to be approximated has continuous second derivatives. We can define a quadratic approximation to $f(x)$ as:

$$Q(x; \bar{x}) = f(\bar{x}) + Df(\bar{x})(x - \bar{x}) + \frac{1}{2}(x - \bar{x})^T D^2 f(\bar{x})(x - \bar{x})$$

The following figure illustrates the relationship between the underlying sine function and three different quadratic approximations.


    Alternative Quadratic Approximations
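The quadratic approximations in the figure can be sketched numerically; a minimal sketch (the helper name is mine):

```python
import math

def quadratic_approx(f, df, d2f, x_bar):
    """Return Q(x; x_bar) = f + Df*(x - x_bar) + (1/2)*D2f*(x - x_bar)^2."""
    f0, d0, h0 = f(x_bar), df(x_bar), d2f(x_bar)
    return lambda x: f0 + d0 * (x - x_bar) + 0.5 * h0 * (x - x_bar) ** 2

# Quadratic approximation to sin anchored at x_bar = 1.0,
# using sin'' = -sin
Q = quadratic_approx(math.sin, math.cos, lambda x: -math.sin(x), 1.0)
```

Its error near the anchor is of order $(x - \bar{x})^3$, one order better than the linear approximation.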


    The Gradient Vector

When $f(x)$ is a scalar function with vector arguments, i.e. $m = 1$ or $f : \mathbb{R}^n \to \mathbb{R}$, the gradient of $f$ at $x$ is a vector whose coordinates are the partial derivatives of $f$ at $x$:

$$Df(x) = \left( \frac{\partial f(x)}{\partial x_1}, \ldots, \frac{\partial f(x)}{\partial x_n} \right)$$

The gradient vector is also denoted $\nabla f(x)$.
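In practice the gradient can be estimated by central differences, which is handy for checking analytic derivatives; a sketch using plain Python lists for vectors (helper name is mine):

```python
def numerical_gradient(f, x, h=1e-6):
    """Central-difference estimate of the gradient of f at the point x."""
    grad = []
    for i in range(len(x)):
        xp, xm = list(x), list(x)
        xp[i] += h
        xm[i] -= h
        grad.append((f(xp) - f(xm)) / (2 * h))
    return grad

# f(x) = x1^2 + 3*x1*x2 has gradient (2*x1 + 3*x2, 3*x1)
f = lambda x: x[0] ** 2 + 3 * x[0] * x[1]
g = numerical_gradient(f, [1.0, 2.0])   # close to [8.0, 3.0]
```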


    Definition

A quadratic form on $\mathbb{R}^n$ is a real-valued function of the form:

$$Q(x_1, \ldots, x_n) = \sum_{i \le j} a_{ij} x_i x_j$$

in which each term is a monomial of degree two. We can write this type of function compactly with vector-matrix notation, i.e.

$$Q(x) = x^T A x$$

in which $A$ is a symmetric matrix.


Quadratic Forms in Two Dimensions

When $n = 2$, we have:

$$Q(x) = a_{11} x_1^2 + a_{12} x_1 x_2 + a_{22} x_2^2$$

provided that

$$A = \begin{pmatrix} a_{11} & \frac{1}{2} a_{12} \\ \frac{1}{2} a_{12} & a_{22} \end{pmatrix}$$

The Hessian matrix of a function provides a typical example of the symmetric matrices which appear in quadratic forms.

Note that if $A$ is a non-symmetric square matrix, the associated quadratic form has the same value as that of the related symmetric matrix:

$$\bar{A} = \frac{1}{2} A + \frac{1}{2} A^T$$
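The symmetrization claim is easy to verify numerically; a small sketch with a hypothetical 2x2 matrix (helper name is mine):

```python
def quad_form(A, x):
    """Evaluate x^T A x for a square matrix A given as nested lists."""
    n = len(x)
    return sum(A[i][j] * x[i] * x[j] for i in range(n) for j in range(n))

A = [[1.0, 4.0],
     [0.0, 2.0]]                                   # non-symmetric
n = len(A)
# A_bar = (1/2) A + (1/2) A^T, the symmetrized matrix
A_bar = [[(A[i][j] + A[j][i]) / 2 for j in range(n)] for i in range(n)]

x = [3.0, -1.0]
# quad_form(A, x) and quad_form(A_bar, x) agree for every x
```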


    Definiteness and Quadratic Forms

Recall our quadratic approximation to a function $f$:

$$f(x) \approx f(\bar{x}) + Df(\bar{x})(x - \bar{x}) + \frac{1}{2}(x - \bar{x})^T D^2 f(\bar{x})(x - \bar{x})$$

Suppose that we have selected an $\bar{x}$ such that $Df(\bar{x}) = 0$. Then the value of $f(x)$ is given by:

$$f(x) \approx f(\bar{x}) + (x - \bar{x})^T A (x - \bar{x})$$

where $A = \frac{1}{2} D^2 f(\bar{x})$.

If $A$ is positive definite, then $\bar{x}$ is a local minimizer of $f(\cdot)$. If $A$ is negative definite, then $\bar{x}$ is a local maximizer of $f(\cdot)$.
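For a 2x2 Hessian, definiteness can be read off the determinant and the leading entry (Sylvester's criterion); a sketch, with my own classification labels:

```python
def classify_critical_point(H):
    """Classify a point with Df(x_bar) = 0 from its 2x2 Hessian H."""
    det = H[0][0] * H[1][1] - H[0][1] * H[1][0]
    if det > 0 and H[0][0] > 0:
        return "local minimizer"   # H positive definite
    if det > 0 and H[0][0] < 0:
        return "local maximizer"   # H negative definite
    if det < 0:
        return "saddle point"      # H indefinite
    return "inconclusive"          # semidefinite: second-order test fails

# f(x) = x1^2 + x2^2 has Hessian 2I at its critical point (0, 0)
label = classify_critical_point([[2.0, 0.0], [0.0, 2.0]])
```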


    Concavity

A function of one variable is concave if

$$f(tx + (1 - t)y) \ge t f(x) + (1 - t) f(y)$$

for all $t \in [0, 1]$. For example, the $\sin(x)$ function is concave between $x = 0.2$ and $y = 1.6$, as illustrated in the following figure.


    Local Concavity of the Sine Function
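The chord inequality above can be spot-checked on a grid of $t$ values; a sketch for sin on the interval from the slide (helper name is mine):

```python
import math

def concave_between(f, x, y, samples=50):
    """Check f(t*x + (1-t)*y) >= t*f(x) + (1-t)*f(y) on a grid of t."""
    for k in range(samples + 1):
        t = k / samples
        chord = t * f(x) + (1 - t) * f(y)
        if f(t * x + (1 - t) * y) < chord - 1e-12:
            return False
    return True

# sin lies above its chords on [0.2, 1.6], where sin'' = -sin < 0
ok = concave_between(math.sin, 0.2, 1.6)
```

On an interval where sin is convex instead, such as (pi, 2*pi), the same check fails.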


    Convexity

1. If $f$ is a convex function, then $f''(x) \ge 0$ for all $x$.

2. If $f$ is a convex function, then

$$f(x) \ge f(y) + f'(y)(x - y)$$

3. If $f$ is a convex function and $f'(x) = 0$, then $x$ minimizes the function $f$.
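Property 2 (every tangent line supports the function from below) can be checked numerically; a sketch for the convex function $f(x) = x^2$ (my example, helper name is mine):

```python
def tangent_below(f, df, xs, ys):
    """Check the supporting-line property f(x) >= f(y) + f'(y)*(x - y)."""
    return all(f(x) >= f(y) + df(y) * (x - y) - 1e-12
               for x in xs for y in ys)

f, df = lambda x: x * x, lambda x: 2.0 * x
pts = [-2.0, -0.5, 0.0, 1.0, 3.0]

ok = tangent_below(f, df, pts, pts)   # every tangent line lies below f
# And since df(0) = 0, property 3 says x = 0 minimizes f
```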


    Unconstrained minimization

If $f$ is differentiable at a local minimum $x^* \in U$ (open), then

$$\nabla f(x^*) = 0.$$

This is a necessary condition, not a sufficient condition. (All local minima satisfy this condition, but there exist points which are not local minima that also satisfy it, e.g. local maxima or saddle points.)
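A concrete instance of a stationary point that is not a local minimum (my example, not from the slides) is $f(x) = x^3$ at $x = 0$:

```python
f = lambda x: x ** 3
df = lambda x: 3.0 * x ** 2

stationary = (df(0.0) == 0.0)   # first-order condition holds at x = 0
# ... but x = 0 is not a local minimum: f is negative just left of 0
local_min = all(f(0.0) <= f(x) for x in (-0.1, 0.1))
```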


    Descent directions

Let $f : U \to \mathbb{R}$ be differentiable and $x \in U$ (open).

If $\nabla f(x)^T v < 0$, then $v$ is a descent direction for $f$ at $x$: for sufficiently small $t > 0$, $f(x + tv) < f(x)$.
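The descent-direction condition can be sketched numerically (my helper names), using $f(x) = x_1^2 + x_2^2$:

```python
def is_descent_direction(f, grad, x, v, t=1e-4):
    """If grad(x)^T v < 0, a small step along v should decrease f."""
    slope = sum(g * vi for g, vi in zip(grad(x), v))
    step = [xi + t * vi for xi, vi in zip(x, v)]
    return slope < 0 and f(step) < f(x)

f = lambda x: x[0] ** 2 + x[1] ** 2
grad = lambda x: [2.0 * x[0], 2.0 * x[1]]

# The negative gradient is a descent direction away from the origin
ok = is_descent_direction(f, grad, [1.0, 1.0], [-2.0, -2.0])
```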


    Equality Constrained Optimization

$$\min_{x \in \mathbb{R}^n} f(x) \quad \text{subject to} \quad g(x) = 0 \qquad (P)$$

where $f$ and $g$ are differentiable on $\mathbb{R}^n$, $g : \mathbb{R}^n \to \mathbb{R}^m$, and $m \le n$.


Lagrange's Theorem

Theorem (Lagrange). If $x^*$ is a local minimum of (P), and the Jacobian matrix $\nabla g(x^*)$ has rank $m$, then there exist numbers $\lambda_1, \ldots, \lambda_m$ such that

$$\nabla f(x^*) + \sum_{i=1}^m \lambda_i \nabla g_i(x^*) = 0$$

The numbers $\lambda_1, \ldots, \lambda_m$ are called Lagrange multipliers.

The function $L(x, \lambda) = f(x) + \sum_{i=1}^m \lambda_i g_i(x)$ is the Lagrangian for (P).


Practical usefulness of Lagrange's method

Solving a constrained optimization problem with $n$ variables and $m$ constraints can be reduced to solving a nonlinear system of $n + m$ equations.

    For economists, this result enormously simplifies the formulation andsolution of market equilibrium models, because we are able toincorporate multiple agents, each of which optimizes a separateobjective function subject to constraints.
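A tiny worked instance (my example, not from the slides): for $\min x_1^2 + x_2^2$ subject to $x_1 + x_2 = 1$, the stationarity conditions of the Lagrangian form a system in $n + m = 3$ unknowns, solvable here by substitution:

```python
def solve_lagrange_example():
    """Solve dL/dx1 = dL/dx2 = dL/dlam = 0 for
    L(x, lam) = x1^2 + x2^2 + lam * (x1 + x2 - 1):

        2*x1 + lam  = 0
        2*x2 + lam  = 0
        x1 + x2 - 1 = 0
    """
    # The first two equations give x1 = x2 = -lam/2; substituting into
    # the constraint gives -lam = 1.
    lam = -1.0
    x1 = x2 = -lam / 2.0
    return x1, x2, lam

x1, x2, lam = solve_lagrange_example()   # (0.5, 0.5, -1.0)
```

The same system could be handed to any nonlinear equation solver; here the linearity makes it solvable by hand.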


    Geometry of Constrained Optimization


    Need for the regularity condition

The assumption that rank $\nabla g(x^*) = m$ is a regularity condition.

Lagrange's theorem is not valid unless the regularity condition holds.

EXAMPLE:

$$\min x_1 \quad \text{subject to} \quad x_1^2 + (x_2 - 1)^2 = 1, \quad x_1^2 + (x_2 + 1)^2 = 1 \qquad (P)$$

Note: (P) has only one feasible point, $x^* = (0, 0)$.

$$\nabla f(x^*) = (1, 0), \quad \nabla g_1(x^*) = (0, -2), \quad \nabla g_2(x^*) = (0, 2)$$

Both constraint gradients point along the $x_2$ axis, so no combination $\lambda_1 \nabla g_1(x^*) + \lambda_2 \nabla g_2(x^*)$ can cancel the first component of $\nabla f(x^*)$: the Lagrange multipliers cannot exist here.
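The failure can be seen directly from the gradients on the slide: every choice of multipliers leaves the first component of the stationarity equation equal to 1:

```python
# Gradients at the unique feasible point x* = (0, 0), from the slide
grad_f = (1.0, 0.0)
grad_g1 = (0.0, -2.0)
grad_g2 = (0.0, 2.0)

def first_component(l1, l2):
    """First coordinate of grad_f + l1*grad_g1 + l2*grad_g2."""
    return grad_f[0] + l1 * grad_g1[0] + l2 * grad_g2[0]

# The constraint gradients span only the x2 axis (rank 1 < m = 2),
# so the stationarity condition has no solution in (l1, l2).
```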


    Irregular Example: No Multipliers Exist