
Optimization With More Than One Variable


Page 1: Optimization With More Than One Variable

EC115 - Methods of Economic Analysis
Lecture 4: Optimization with more than one variable
Renshaw - Chapter 15
Domenico Tabasso, University of Essex - Department of Economics
Week 19


Page 2: Optimization With More Than One Variable

Introduction

A function of several variables is a relation between some independent variables x1, x2, x3, ..., xn and a dependent variable z such that:

z = f(x1, x2, x3, ..., xn)

specifies the value of z given the values of x1, x2, x3, ..., xn. Today we will analyze the maximum and minimum values of these types of functions. We do this by building on our knowledge of partial derivatives.


Page 3: Optimization With More Than One Variable

Stationary Points

Consider the function z = f(x, y). The point (x = x0, y = y0) is called a stationary point of z if:

∂z/∂x = ∂z/∂y = 0 at (x = x0, y = y0).

This is just like the definition of a stationary point for a function of one variable: there the function w = g(x) had a stationary point at x = x0 if dw/dx = 0 at x = x0. However, when we deal with functions of more than one variable it turns out that there are three kinds of stationary points: maximum points, minimum points and saddle points.


Page 4: Optimization With More Than One Variable

Conditions for a Minimum

Consider the following function:

z = x² + y².

The first and second order partial derivatives are given by:

∂z/∂x = 2x;   ∂²z/∂x² = 2,

∂z/∂y = 2y;   ∂²z/∂y² = 2,

∂²z/∂x∂y = ∂²z/∂y∂x = 0.


Page 5: Optimization With More Than One Variable

Can we use these derivatives to establish whether the function has a minimum or a maximum?


Page 6: Optimization With More Than One Variable

Conditions for a Minimum

The function clearly has a minimum. In order to study it we first treat one of the variables as a constant.

Assume y = y0 = 0. Graph this function. What is the value of x that makes z as small as possible?

- Here f(x, y0) = f(x, 0) = x², which is minimized with respect to x when x = 0.

Assume x = x0 = 0. Graph this function. What is the value of y that makes z as small as possible?

- Here f(x0, y) = f(0, y) = y², which is minimized with respect to y when y = 0.


Page 7: Optimization With More Than One Variable

[Figure: Using the iso-x section to find necessary conditions for a minimum]


Page 8: Optimization With More Than One Variable

Note that for any value of y, the function decreases for all values of x < 0 and then increases:

∂z/∂x < 0 for all x < 0 and ∂z/∂x > 0 for all x > 0.

Also note that for any value of x, the function decreases for all values of y < 0 and then increases:

∂z/∂y < 0 for all y < 0 and ∂z/∂y > 0 for all y > 0.

Note that in both cases the second order own partial derivative is always positive.


Page 9: Optimization With More Than One Variable

The first order conditions (FOC) for z = f(x, y) to have a minimum at (x = x0, y = y0) are given by:

∂z/∂x = 0 at (x = x0, y = y0),

∂z/∂y = 0 at (x = x0, y = y0).


Page 10: Optimization With More Than One Variable

The second order conditions (SOC) for z = f(x, y) to have a minimum at (x = x0, y = y0) are given by:

∂²z/∂x² > 0 at (x = x0, y = y0),

∂²z/∂y² > 0 at (x = x0, y = y0),

(∂²z/∂x²) × (∂²z/∂y²) > (∂²z/∂x∂y) × (∂²z/∂y∂x) = (∂²z/∂x∂y)² at (x = x0, y = y0),

where the last equality follows from Young's Theorem.
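As a quick check, here is a minimal SymPy sketch (SymPy is an assumed tool, not part of the slides) that applies these first and second order conditions to the example z = x² + y²:

```python
# A minimal SymPy sketch (SymPy is an assumed tool, not part of the slides)
# applying the FOC and SOC for a minimum to the example z = x^2 + y^2.
import sympy as sp

x, y = sp.symbols('x y')
z = x**2 + y**2

# First order conditions: both first partials equal to zero.
print(sp.solve([sp.diff(z, x), sp.diff(z, y)], [x, y]))   # {x: 0, y: 0}

# Second order conditions at the stationary point.
zxx = sp.diff(z, x, 2)                    # 2
zyy = sp.diff(z, y, 2)                    # 2
zxy = sp.diff(z, x, y)                    # 0, equal to z_yx by Young's Theorem
print(zxx > 0, zyy > 0, zxx*zyy > zxy**2) # True True True -> minimum
```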


Page 11: Optimization With More Than One Variable

The first order conditions and the first two parts of the second order conditions for z = f(x, y) to have a minimum at (x = x0, y = y0) are the same as the first and second order conditions for f(x, y0) to have a minimum with respect to x at x = x0 and for f(x0, y) to have a minimum with respect to y at y = y0.

The third part of the second order conditions is needed in addition to the first two parts because we can vary x and y in other ways than just keeping one of them fixed while changing the other.


Page 12: Optimization With More Than One Variable

Conditions for a Maximum

Consider the following function:

z = −(x² + y²).

The first and second order partial derivatives are given by:

∂z/∂x = −2x;   ∂²z/∂x² = −2,

∂z/∂y = −2y;   ∂²z/∂y² = −2,

∂²z/∂x∂y = ∂²z/∂y∂x = 0.


Page 13: Optimization With More Than One Variable


Page 14: Optimization With More Than One Variable

Assume y = y0 = 0. Graph this function. What is the value of x that makes z as big as possible?

- Here f(x, y0) = f(x, 0) = −x², which is maximized with respect to x when x = 0.

Assume x = x0 = 0. Graph this function. What is the value of y that makes z as big as possible?

- Here f(x0, y) = f(0, y) = −y², which is maximized with respect to y when y = 0.


Page 15: Optimization With More Than One Variable

For any value of y, the function increases for all values of x < 0 and then decreases:

∂z/∂x > 0 for all x < 0 and ∂z/∂x < 0 for all x > 0.

For any value of x, the function increases for all values of y < 0 and then decreases:

∂z/∂y > 0 for all y < 0 and ∂z/∂y < 0 for all y > 0.

Note that in both cases the second order own partial derivative is always negative.


Page 16: Optimization With More Than One Variable

[Figure: Using the iso-y section to find necessary conditions for a maximum]


Page 17: Optimization With More Than One Variable

The first order conditions for z = f(x, y) to have a maximum at (x = x0, y = y0) are given by:

∂z/∂x = 0 at (x = x0, y = y0),

∂z/∂y = 0 at (x = x0, y = y0).


Page 18: Optimization With More Than One Variable

The second order conditions for z = f(x, y) to have a maximum at (x = x0, y = y0) are given by:

∂²z/∂x² < 0 at (x = x0, y = y0),

∂²z/∂y² < 0 at (x = x0, y = y0),

(∂²z/∂x²) × (∂²z/∂y²) > (∂²z/∂x∂y) × (∂²z/∂y∂x) = (∂²z/∂x∂y)² at (x = x0, y = y0),

where, again, the last equality follows from Young's Theorem.
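For readers who prefer a numerical check, here is a small sketch (an illustrative assumption, not part of the slides) that approximates the second order partials of z = −(x² + y²) at the stationary point (0, 0) with central finite differences and tests the three maximum conditions:

```python
# A small numerical sketch (illustrative only, not from the slides): central
# finite differences approximate the second order partials of z = -(x^2 + y^2)
# at (0, 0), and the three maximum conditions are then tested.
def f(x, y):
    return -(x**2 + y**2)

h = 1e-4
x0, y0 = 0.0, 0.0

zxx = (f(x0 + h, y0) - 2*f(x0, y0) + f(x0 - h, y0)) / h**2
zyy = (f(x0, y0 + h) - 2*f(x0, y0) + f(x0, y0 - h)) / h**2
zxy = (f(x0 + h, y0 + h) - f(x0 + h, y0 - h)
       - f(x0 - h, y0 + h) + f(x0 - h, y0 - h)) / (4 * h**2)

print(zxx, zyy, zxy)                       # approximately -2, -2, 0
print(zxx < 0, zyy < 0, zxx*zyy > zxy**2)  # True True True -> maximum
```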


Page 19: Optimization With More Than One Variable

Example of Failure of SOC for a Maximum

Consider the following function:

z = (x + y)² − 2(x − y)² = −x² + 6xy − y².

The first and second order partial derivatives are given by:

∂z/∂x = 6y − 2x;   ∂²z/∂x² = −2;

∂z/∂y = 6x − 2y;   ∂²z/∂y² = −2;

∂²z/∂x∂y = ∂²z/∂y∂x = 6.


Page 20: Optimization With More Than One Variable

Example of Failure of SOC for a Maximum

At x = y = 0 we have ∂z/∂x = 0, ∂²z/∂x² = −2, ∂z/∂y = 0 and ∂²z/∂y² = −2.

Therefore the value of x which makes z as large as possible when y = 0 is x = 0 and, similarly, the value of y which makes z as large as possible when x = 0 is y = 0. In addition, x = y = 0 is the only solution to the first order conditions.


Page 21: Optimization With More Than One Variable

Example of Failure of SOC for a Maximum

However, since ∂²z/∂x∂y = ∂²z/∂y∂x = 6, we have (∂²z/∂x∂y)² = 36, while (∂²z/∂x²) × (∂²z/∂y²) = 4 < 36, so the third part of the second order conditions is not satisfied.

At x = y = 0 we have z = 0, but at x = y = a we have z = 4a², which is greater than zero whenever a ≠ 0. Therefore, z does not have a maximum at x = y = 0.
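A two-line computation (illustrative only, not part of the slides) makes the failure concrete: along the diagonal x = y = a the function climbs above its value at the origin.

```python
# A short numerical sketch (illustrative only, not from the slides): moving
# along the diagonal x = y = a, the function z = -x^2 + 6xy - y^2 equals 4a^2,
# which exceeds z(0, 0) = 0 for every a != 0, so (0, 0) is not a maximum.
def z(x, y):
    return -x**2 + 6*x*y - y**2

print(z(0, 0))                          # 0
print([z(a, a) for a in (1, 2, 3)])     # [4, 16, 36] -- all larger than z(0, 0)
```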


Page 22: Optimization With More Than One Variable

[Figure: Need for third part of 2nd order conditions]


Page 23: Optimization With More Than One Variable

Conditions for a Saddle Point

The first order conditions for z = f(x, y) to have a saddle point at (x = x0, y = y0) are given by:

∂z/∂x = 0 at (x = x0, y = y0),

∂z/∂y = 0 at (x = x0, y = y0).


Page 24: Optimization With More Than One Variable

The second order conditions for z = f(x, y) to have a saddle point at (x = x0, y = y0) are given by:

(∂²z/∂x²) × (∂²z/∂y²) < (∂²z/∂x∂y) × (∂²z/∂y∂x) = (∂²z/∂x∂y)² at (x = x0, y = y0),

where, again, the last equality follows from Young's Theorem.

This condition is automatically met if ∂²z/∂x² and ∂²z/∂y² have opposite signs.
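As an illustration, here is a short SymPy sketch with an assumed textbook example, z = x² − y² (this function is not taken from the slides): its own second partials have opposite signs, so the saddle point condition holds automatically at its stationary point.

```python
# A SymPy sketch with an assumed example, z = x^2 - y^2 (not one of the
# slides' functions): its own second partials have opposite signs, so the
# saddle point condition is met automatically at the stationary point (0, 0).
import sympy as sp

x, y = sp.symbols('x y')
z = x**2 - y**2

print(sp.solve([sp.diff(z, x), sp.diff(z, y)], [x, y]))   # {x: 0, y: 0}

zxx, zyy, zxy = sp.diff(z, x, 2), sp.diff(z, y, 2), sp.diff(z, x, y)
print(zxx, zyy, zxy)          # 2 -2 0
print(zxx*zyy < zxy**2)       # True -> saddle point at (0, 0)
```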


Page 25: Optimization With More Than One Variable

[Figure: Example of a saddle point]


Page 26: Optimization With More Than One Variable

Strategy for Optimization

Identify locations of stationary points by determining where the first order conditions are satisfied.

- The first order conditions are necessary conditions for a stationary point.

For each stationary point, determine whether it is of the desired type (maximum, minimum, saddle point) by examining whether or not it satisfies the appropriate second-order conditions.

- The appropriate second-order conditions combined with the first order conditions are sufficient conditions for the desired type of stationary point.
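The two-step strategy above can be sketched in a few lines of SymPy (an assumption; any computer algebra system, or careful hand computation, does the same job). The helper below solves the FOCs and then applies the sign tests to each candidate; as a tie-in, it is run on the earlier function z = −x² + 6xy − y², which it labels a saddle point.

```python
# A sketch of the two-step strategy using SymPy (an assumed tool, not part of
# the slides). It assumes each candidate point is numeric, so the sign tests
# below evaluate to True or False.
import sympy as sp

def classify(z, x, y):
    """Solve the FOCs of z(x, y), then label each candidate via the SOCs."""
    zx, zy = sp.diff(z, x), sp.diff(z, y)
    zxx, zyy, zxy = sp.diff(z, x, 2), sp.diff(z, y, 2), sp.diff(z, x, y)
    results = []
    for pt in sp.solve([zx, zy], [x, y], dict=True):   # Step 1: FOCs
        a, b, c = (e.subs(pt) for e in (zxx, zyy, zxy))
        d = a*b - c**2                                 # Step 2: SOCs
        if d > 0:
            label = 'minimum' if a > 0 else 'maximum'
        elif d < 0:
            label = 'saddle point'
        else:
            label = 'test inconclusive'
        results.append((pt, label))
    return results

x, y = sp.symbols('x y')
print(classify(-x**2 + 6*x*y - y**2, x, y))
# [({x: 0, y: 0}, 'saddle point')]
```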


Page 27: Optimization With More Than One Variable

A few examples - 1

Find the maximum or the minimum of the following function:

z = f(x, y) = 6 − x² − 2xy − 3y² − 3x + 4y.

First Step: Search for all the critical points of the function ⇒ look for all the points that satisfy the first order conditions.

FOCs:

∂z/∂x = 0 ⇒ −2x − 2y − 3 = 0

∂z/∂y = 0 ⇒ −2x − 6y + 4 = 0


Page 28: Optimization With More Than One Variable

Example 1 - Cont.

The FOCs are represented by two equations, simultaneously equal to 0. We solve them as a system, and the solutions represent the candidate points for maxima and/or minima.

Second Step: Solve the FOCs

−2x − 2y − 3 = 0
−2x − 6y + 4 = 0

From the first equation, x = −3/2 − y. Substituting into the second gives 3 + 2y − 6y + 4 = 0, so that:

x* = −13/4,   y* = 7/4


Page 29: Optimization With More Than One Variable

Example 1 - Cont.

So the point (−13/4, 7/4) is the only candidate. Is it a max or a min (or maybe a saddle point)?

Third Step: Check the signs of ALL the second order conditions:

∂²z/∂x² > 0 or < 0 ?

∂²z/∂y² > 0 or < 0 ?

(∂²z/∂x²) × (∂²z/∂y²) > or < (∂²z/∂x∂y)² ?


Page 30: Optimization With More Than One Variable

Example 1 - Cont.

In our case:

∂²z/∂x² = −2 < 0 ∀ x, y

∂²z/∂y² = −6 < 0 ∀ x, y

It looks like we're dealing with a maximum. Really? YES:

(∂²z/∂x²) × (∂²z/∂y²) = (−2) × (−6) = 12 > (∂²z/∂x∂y)² = (−2)² = 4.

So the function z = f(x, y) = 6 − x² − 2xy − 3y² − 3x + 4y has a maximum at (−13/4, 7/4).
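For completeness, a minimal SymPy check of Example 1 (SymPy is an assumed tool, not part of the slides) reproduces the critical point and the signs used in the second order conditions:

```python
# A minimal SymPy check of Example 1 (SymPy is an assumed tool, not part of
# the slides): the only stationary point and the second order sign tests.
import sympy as sp

x, y = sp.symbols('x y')
z = 6 - x**2 - 2*x*y - 3*y**2 - 3*x + 4*y

print(sp.solve([sp.diff(z, x), sp.diff(z, y)], [x, y]))   # {x: -13/4, y: 7/4}

zxx, zyy, zxy = sp.diff(z, x, 2), sp.diff(z, y, 2), sp.diff(z, x, y)
print(zxx, zyy, zxy)                          # -2 -6 -2
print(zxx < 0, zyy < 0, zxx*zyy > zxy**2)     # True True True -> maximum
```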

Page 31: Optimization With More Than One Variable

Economic Applications

A firm produces good Q in a competitive market and sells it at price P. The firm needs both capital and labour to be able to produce any quantity of Q. It hires labour at a wage of w and rents capital at a rate of r. Let the production function of the firm be Cobb-Douglas:

Q = K^(1/2) L^(1/3).

Write down the firm's profit function.
What are the demands for labour and capital if this firm maximizes profits?
Graph your answer.


Page 32: Optimization With More Than One Variable

Solution - 1

The profit function is

Π = p K^(1/2) L^(1/3) − wL − rK

and the first order conditions associated with it are:

∂Π/∂K = 0 ⇒ (1/2) p K^(−1/2) L^(1/3) − r = 0

∂Π/∂L = 0 ⇒ (1/3) p K^(1/2) L^(−2/3) − w = 0


Page 33: Optimization With More Than One Variable

Solution - 2

Solving the two FOCs simultaneously we obtain:

[(1/2) p K^(−1/2) L^(1/3)] / [(1/3) p K^(1/2) L^(−2/3)] = r/w

By simplifying and solving with respect to L we get:

L = (2/3)(r/w) K     (1)

which we can now plug back into either of the two FOCs.


Page 34: Optimization With More Than One Variable

Solution - 3

Substituting (1) into the first FOC we obtain:

(1/2) p K^(−1/2) [(2/3)(r/w) K]^(1/3) − r = 0

that can be solved for K in order to get

K* = p⁶/(144r⁴w²)

Substituting K* into (1) we eventually obtain:

L* = p⁶/(216r³w³)
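A short SymPy sketch (again, an assumed tool rather than part of the slides) confirms these closed forms by substituting K* and L* back into the two first order conditions:

```python
# A SymPy sketch (an assumed tool, not part of the slides) verifying the
# closed-form factor demands: substituting K* and L* back into the two first
# order conditions should give zero in both cases.
import sympy as sp

p, w, r, K, L = sp.symbols('p w r K L', positive=True)
profit = p * sp.sqrt(K) * L**sp.Rational(1, 3) - w*L - r*K

Kstar = p**6 / (144 * r**4 * w**2)
Lstar = p**6 / (216 * r**3 * w**3)

foc_K = sp.diff(profit, K).subs({K: Kstar, L: Lstar})
foc_L = sp.diff(profit, L).subs({K: Kstar, L: Lstar})
print(sp.simplify(foc_K), sp.simplify(foc_L))   # should print 0 0
```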


Page 35: Optimization With More Than One Variable

Solution - 4

So the point (K* = p⁶/(144r⁴w²), L* = p⁶/(216r³w³)) is a critical point. Is it a max? Let's check the second order conditions.

∂²Π/∂K² = −(1/4) p K^(−3/2) L^(1/3) < 0, for any positive p, K, L

∂²Π/∂L² = −(2/9) p K^(1/2) L^(−5/3) < 0, for any positive p, K, L

So the first two SOCs are satisfied for any value of K and L, and hence also for K*, L* (remember that K and L are production inputs, so assuming that they are always positive is a very mild hypothesis).


Page 36: Optimization With More Than One Variable

Solution - 5

Furthermore:

∂²Π/∂K∂L = (1/6) p K^(−1/2) L^(−2/3) > 0, for any positive p, K, L

so:

(∂²Π/∂K²) × (∂²Π/∂L²) = [−(1/4) p K^(−3/2) L^(1/3)] × [−(2/9) p K^(1/2) L^(−5/3)]

= (2/36) p² K^(−1) L^(−4/3)

> (1/36) p² K^(−1) L^(−4/3) = (∂²Π/∂K∂L)²
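The determinant condition can also be checked symbolically; the sketch below (SymPy assumed, not part of the slides) shows that ∂²Π/∂K² × ∂²Π/∂L² − (∂²Π/∂K∂L)² reduces to p²/(36 K L^(4/3)), which is strictly positive for positive p, K and L:

```python
# A SymPy sketch (an assumed tool, not part of the slides) of the second order
# check for the profit function: the own second partials are negative and the
# determinant of the Hessian is positive whenever p, K and L are positive.
import sympy as sp

p, w, r, K, L = sp.symbols('p w r K L', positive=True)
profit = p * sp.sqrt(K) * L**sp.Rational(1, 3) - w*L - r*K

pKK = sp.diff(profit, K, 2)    # -(1/4) p K^(-3/2) L^(1/3)
pLL = sp.diff(profit, L, 2)    # -(2/9) p K^(1/2) L^(-5/3)
pKL = sp.diff(profit, K, L)    #  (1/6) p K^(-1/2) L^(-2/3)

print(sp.simplify(pKK * pLL - pKL**2))
# should print p**2/(36*K*L**(4/3))
```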


Page 37: Optimization With More Than One Variable

Solution - 6

So, as all the second order conditions are satisfied, the point (K* = p⁶/(144r⁴w²), L* = p⁶/(216r³w³)) indeed represents a maximum.
