
Chap 5 The Multiple Regression Model


Page 1: Chap 5 The Multiple Regression Model


Econometrics (NA1031)

Chap 5: The Multiple Regression Model

Page 2: Chap 5 The Multiple Regression Model


Multiple regression
• An example of an economic model is:

  E(SALES) = \beta_1 + \beta_2 PRICE + \beta_3 ADVERT

• The econometric model adds a random error e = SALES - E(SALES):

  SALES = E(SALES) + e = \beta_1 + \beta_2 PRICE + \beta_3 ADVERT + e

• \beta_2 is the effect of PRICE on expected sales, with ADVERT held constant:

  \beta_2 = \frac{\Delta E(SALES)}{\Delta PRICE}\bigg|_{ADVERT \text{ held constant}} = \frac{\partial E(SALES)}{\partial PRICE}
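A minimal sketch of how this model could be estimated by least squares in Stata, assuming a dataset containing the variables sales, price and advert is already loaded (the variable names are illustrative, not taken from the slides):

  * OLS fit of E(SALES) = b1 + b2*PRICE + b3*ADVERT (hypothetical variable names)
  regress sales price advert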

Page 3: Chap 5 The Multiple Regression Model


FIGURE 5.1 The multiple regression plane

Page 4: Chap 5 The Multiple Regression Model


Multiple regression
• In a general multiple regression model, a dependent variable y is related to a number of explanatory variables x_2, x_3, ..., x_K through a linear equation that can be written as:

  y = \beta_1 + \beta_2 x_2 + \beta_3 x_3 + \cdots + \beta_K x_K + e

• A single parameter, call it \beta_k, measures the effect of a change in the variable x_k upon the expected value of y, all other variables held constant:

  \beta_k = \frac{\Delta E(y)}{\Delta x_k}\bigg|_{\text{other } x\text{'s held constant}} = \frac{\partial E(y)}{\partial x_k}

Page 5: Chap 5 The Multiple Regression Model


Assumptions
• MR1. y_i = \beta_1 + \beta_2 x_{i2} + \cdots + \beta_K x_{iK} + e_i, \quad i = 1, \ldots, N
• MR2. E(y_i) = \beta_1 + \beta_2 x_{i2} + \cdots + \beta_K x_{iK} \;\Leftrightarrow\; E(e_i) = 0
• MR3. var(y_i) = var(e_i) = \sigma^2
• MR4. cov(y_i, y_j) = cov(e_i, e_j) = 0
• MR5. The values of each x_{ik} are not random and are not exact linear functions of the other explanatory variables
• MR6. y_i \sim N(\beta_1 + \beta_2 x_{i2} + \cdots + \beta_K x_{iK},\, \sigma^2) \;\Leftrightarrow\; e_i \sim N(0, \sigma^2)

Page 6: Chap 5 The Multiple Regression Model


Estimation by OLS
• Say there are two explanatory variables in the model:

  y_i = \beta_1 + \beta_2 x_{i2} + \beta_3 x_{i3} + e_i

• The least squares estimates are the values of \beta_1, \beta_2, \beta_3 that minimize

  S(\beta_1, \beta_2, \beta_3) = \sum_{i=1}^{N} \left( y_i - E(y_i) \right)^2 = \sum_{i=1}^{N} \left( y_i - \beta_1 - \beta_2 x_{i2} - \beta_3 x_{i3} \right)^2
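Continuing the hypothetical sales/price/advert example above, a short sketch of how the minimized value of S can be inspected in Stata after estimation (regress reports it as the residual sum of squares):

  regress sales price advert
  * e(rss) stores the minimized sum of squared residuals S(b1, b2, b3)
  display e(rss)
  * the same number can be rebuilt from the residuals
  predict ehat, residuals
  generate ehat2 = ehat^2
  summarize ehat2
  display r(sum)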

Page 7: Chap 5 The Multiple Regression Model


Least squares estimators
• The least squares estimators are random variables and have sampling properties.
• According to the Gauss-Markov theorem, if assumptions MR1–MR5 hold, then the least squares estimators are the best linear unbiased estimators (BLUE) of the parameters.
• For example, it can be shown that:

  var(b_2) = \frac{\sigma^2}{(1 - r_{23}^2) \sum_{i=1}^{N} (x_{i2} - \bar{x}_2)^2}

  where r_{23} is the sample correlation between x_2 and x_3:

  r_{23} = \frac{\sum (x_{i2} - \bar{x}_2)(x_{i3} - \bar{x}_3)}{\sqrt{\sum (x_{i2} - \bar{x}_2)^2 \sum (x_{i3} - \bar{x}_3)^2}}
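A small sketch (same hypothetical variables as above) of how the pieces of this formula can be computed in Stata: the sample correlation r_{23} between the two explanatory variables and the sum of squared deviations of a regressor around its mean:

  * sample correlation between the two explanatory variables (r23)
  correlate price advert
  * sum of squared deviations of price around its mean: (N - 1) * sample variance
  quietly summarize price
  display (r(N) - 1) * r(Var)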

Page 8: Chap 5 The Multiple Regression Model


Least squares estimators
• From the formula for var(b_2) we can see that:

1. Larger error variances \sigma^2 lead to larger variances of the least squares estimators.
2. Larger sample sizes N imply smaller variances of the least squares estimators.
3. More variation in an explanatory variable around its mean leads to a smaller variance of the least squares estimator.
4. A larger correlation between x_2 and x_3 leads to a larger variance of b_2.

• Exact collinearity arises when the correlation between x_2 and x_3 is perfect (i.e. r_{23} = 1).
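As an illustration of exact collinearity (again with hypothetical variable names): if one regressor is an exact linear function of another, the denominator of the variance formula above is zero and separate estimates cannot be computed, so Stata drops one of the collinear variables:

  * price2 is an exact linear function of price, so it is perfectly collinear with price
  generate price2 = 2*price + 1
  regress sales price price2 advert
  * Stata omits one of the collinear regressors and reports the collinearity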

Page 9: Chap 5 The Multiple Regression Model


Least squares estimators
• We can arrange the variances and covariances in a matrix format:

  cov(b_1, b_2, b_3) =
    \begin{bmatrix}
      var(b_1)      & cov(b_1, b_2) & cov(b_1, b_3) \\
      cov(b_1, b_2) & var(b_2)      & cov(b_2, b_3) \\
      cov(b_1, b_3) & cov(b_2, b_3) & var(b_3)
    \end{bmatrix}

• Using estimates of these we can construct interval estimates and conduct hypothesis testing as we did for the simple regression model.
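A minimal sketch of obtaining estimates of this matrix in Stata (continuing the hypothetical sales/price/advert example); after regress, estat vce displays the estimated variance-covariance matrix of the coefficient estimates:

  regress sales price advert
  * estimated variances on the diagonal, covariances off the diagonal
  estat vce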

Page 10: Chap 5 The Multiple Regression Model


Stata

• Start Stata

mkdir C:\PE
cd C:\PE
copy http://users.du.se/~rem/chap05_15.do chap05_15.do
doedit chap05_15.do

Page 11: Chap 5 The Multiple Regression Model


Assignment
• Exercise 5.12, 5.13.a, 5.13.b.i, and 5.13.b.ii, pages 204 and 205 in the textbook.