FINANCIAL TIME-SERIES ECONOMETRICS
SUN LIJIAN
Feb 23, 2001
CHAPTER 1
UNIVARIATE LINEAR STOCHASTIC PROCESS
Contents
1. BASIC CONCEPTS
Financial Economics and Uncertainty
Stochastic Process, Stationarity and Autocorrelation
Stochastic process (e.g., a nondeterministic discrete time series)
two features: dependency and lack of replication
Realizations and statistics of the probability distribution: mean, variance, autocovariance
Stationarity: a particular state of statistical equilibrium
strict stationarity: distribution properties are unaffected by a change of time origin
weak stationarity: the first and second moments do not depend on time
Ergodicity: the conditions for consistency between sample statistics and population statistics
Autocorrelation function (correlogram) and partial autocorrelation
ACF [ρ(s) = γ(s)/γ(0)] and the structure of the random process
PACF: “indirect” correlation, eliminating the other past effects
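The sample autocorrelation ρ(s) = γ(s)/γ(0) can be computed directly from a realization. A minimal numpy sketch (`sample_acf` is a hypothetical helper, not part of any package mentioned in the text):

```python
import numpy as np

def sample_acf(y, max_lag):
    """Sample ACF: rho(s) = gamma(s) / gamma(0), gamma(s) the lag-s autocovariance."""
    y = np.asarray(y, dtype=float)
    d = y - y.mean()
    gamma0 = d @ d / len(y)
    acf = [1.0]
    for s in range(1, max_lag + 1):
        acf.append((d[s:] @ d[:-s]) / len(y) / gamma0)
    return np.array(acf)

# For white noise, every autocorrelation at lag s >= 1 should be near zero.
rng = np.random.default_rng(0)
wn = rng.standard_normal(5000)
print(sample_acf(wn, 3))
```

Plotting these values against s gives the correlogram used in identification.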
Stationary linear stochastic process
White noise model
E(u_t) = 0; var(u_t) = σ²; cov(u_t, u_s) = 0 for t ≠ s
Autoregressive model [AR(p)]
y_t = φ₁y_{t-1} + φ₂y_{t-2} + … + φ_p y_{t-p} + u_t
p: lag order; u_t: innovation (a white noise process)
stationarity: the characteristic roots of φ(L) = 1 − Σφ_i L^i must lie outside the unit circle
the second moments are calculated from the Yule–Walker equations
for an AR(p), there is no partial autocorrelation between y_t and y_{t-s} for s > p
Moving average model [MA(q)]
y_t = u_t + θ₁u_{t-1} + θ₂u_{t-2} + … + θ_q u_{t-q}
a stationary and non-deterministic process
each statistic is calculated from its definition
for an MA(q), there is no autocorrelation between y_t and y_{t-s} for s > q
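For an AR(2) the Yule–Walker equations give ρ₁ = φ₁/(1 − φ₂) and ρ₂ = φ₂ + φ₁·ρ₁, which can be checked against a simulated series. A sketch with illustrative coefficients (φ₁ = 0.5, φ₂ = 0.3 are not from the text):

```python
import numpy as np

# Simulate y_t = 0.5*y_{t-1} + 0.3*y_{t-2} + u_t (illustrative coefficients,
# chosen to satisfy the stationarity conditions).
phi1, phi2 = 0.5, 0.3
rng = np.random.default_rng(1)
n = 200_000
u = rng.standard_normal(n)
y = np.zeros(n)
for t in range(2, n):
    y[t] = phi1 * y[t - 1] + phi2 * y[t - 2] + u[t]

# Yule-Walker: rho_1 = phi1/(1 - phi2); rho_2 = phi2 + phi1*rho_1.
rho1_theory = phi1 / (1 - phi2)
rho2_theory = phi2 + phi1 * rho1_theory

d = y - y.mean()
g0 = d @ d / n
rho1_hat = (d[1:] @ d[:-1]) / n / g0
rho2_hat = (d[2:] @ d[:-2]) / n / g0
print(rho1_hat, rho1_theory)
print(rho2_hat, rho2_theory)
```

With a long simulated sample, the sample autocorrelations sit close to the Yule–Walker values.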
MA invertibility and AR stationarity
for an invertible MA, the PACF coefficients exhibit a geometrically decaying pattern
ARMA(p,q) model
y_t = Σ_{i=1..p} φ_i y_{t-i} + Σ_{j=0..q} θ_j u_{t-j} (θ₀ = 1), i.e. φ(L)y_t = θ(L)u_t
stationarity [same as AR(p)]; invertibility [same as MA(q)]
the ACF begins to decay at lag q, while the PACF begins to decay at lag p
Autoregressive integrated moving average model [ARIMA(p,d,q)]
trend elimination by differencing
Wold's decomposition theorem
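Checking stationarity (roots of φ(L) outside the unit circle) and invertibility (roots of θ(L) outside the unit circle) reduces to a polynomial-root computation. A sketch with made-up ARMA(1,1) coefficients:

```python
import numpy as np

# ARMA(1,1): phi(L)*y_t = theta(L)*u_t with phi(L) = 1 - 0.7L and
# theta(L) = 1 + 0.4L (made-up coefficients for illustration).
phi = [1.0, -0.7]    # coefficients of phi(z), lowest power first
theta = [1.0, 0.4]

def roots_outside_unit_circle(coefs_low_to_high):
    """True if every root of the polynomial lies outside the unit circle."""
    roots = np.roots(coefs_low_to_high[::-1])  # np.roots wants highest power first
    return bool(np.all(np.abs(roots) > 1.0))

print(roots_outside_unit_circle(phi))            # stationary?
print(roots_outside_unit_circle(theta))          # invertible?
print(roots_outside_unit_circle([1.0, -1.2]))    # explosive AR(1): False
```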
2. Box-Jenkins Methodology
Identification → Estimation → Diagnostic Checking → (Forecasting)
Principle of Parsimony
Identification (Model Building)
Plotting the time-series data
Pattern of the ACF and PACF
Tests on the sample ACF and PACF (t test, Q test)
Nonstationarity and seasonality adjustment (integrated process): trend (mean by differencing, variance by log transformation); seasonality (regular by seasonal differencing, irregular by additive or multiplicative SARIMA)
Estimation
General method: covariance matrix → ML function f(y; φ₁,…,φ_p, σ²) → estimator (ML, QML, CML); a long sample period is needed
Special methods:
AR(p): OLS, Yule–Walker equations
MA(q) and ARMA(p,q): Gauss–Newton method (grid search)
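For the AR(p) special case, OLS on lagged values is a consistent estimator. A sketch that recovers illustrative coefficients (φ₁ = 0.5, φ₂ = 0.3, not from the text) from a simulated AR(2):

```python
import numpy as np

# Simulate an AR(2) with illustrative coefficients, then estimate by OLS:
# regress y_t on (y_{t-1}, y_{t-2}).
rng = np.random.default_rng(2)
phi_true = np.array([0.5, 0.3])
n = 100_000
u = rng.standard_normal(n)
y = np.zeros(n)
for t in range(2, n):
    y[t] = phi_true[0] * y[t - 1] + phi_true[1] * y[t - 2] + u[t]

X = np.column_stack([y[1:-1], y[:-2]])  # columns: y_{t-1}, y_{t-2}
phi_hat, *_ = np.linalg.lstsq(X, y[2:], rcond=None)
print(phi_hat)  # should be close to [0.5, 0.3]
```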
Diagnostic Checking (Model Selection)
Residuals plot
Information criteria [AIC (1969), SBIC (1978), etc.]
AIC = log σ̂² + 2(p+q)/T, where σ̂² is the estimator of var(u_t)
SBIC = log σ̂² + (p+q)·log(T)/T
They should be as small as possible (comparisons must use the same sample period)
SBIC has superior large-sample properties (it is asymptotically consistent)
Overfitting and sample-splitting analysis
Forecast adequacy
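With AIC = log σ̂² + 2(p+q)/T and SBIC = log σ̂² + (p+q)·log(T)/T, both criteria trade residual fit against a parameter penalty, SBIC penalizing extra lags more heavily. A sketch with made-up residual variances showing both criteria preferring the parsimonious model:

```python
import numpy as np

def aic(sigma2_hat, p, q, T):
    """AIC = log(sigma2_hat) + 2*(p+q)/T."""
    return np.log(sigma2_hat) + 2 * (p + q) / T

def sbic(sigma2_hat, p, q, T):
    """SBIC = log(sigma2_hat) + (p+q)*log(T)/T."""
    return np.log(sigma2_hat) + (p + q) * np.log(T) / T

# Made-up comparison: an AR(5) that barely improves on an AR(1) in residual
# variance loses on both criteria once the parameter penalty is applied.
T = 200
print(aic(1.00, 1, 0, T), sbic(1.00, 1, 0, T))   # parsimonious AR(1)
print(aic(0.99, 5, 0, T), sbic(0.99, 5, 0, T))   # overfitted AR(5)
```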
3. FORECASTING
Basic Concepts
Optimal forecast and prediction error
Stationarity and convergence of the forecast weights and of the error variance
The role of the forecast model: AR(∞) to make a forecast; MA(∞) for forecast-error analysis
Significance level and confidence intervals
Forecast Function
Iterative method
Solution methodology
Some Comments
Efficacy of forecasts (short horizons)
Conditional forecasts (on the start period)
A large sample is needed
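For a zero-mean AR(1), the iterative method gives ŷ_{T+h} = φ^h·y_T with forecast-error variance σ²(1 + φ² + … + φ^{2(h-1)}): the confidence band widens and the point forecast converges to the mean as the horizon grows. A sketch (`ar1_forecast` is a hypothetical helper):

```python
import numpy as np

def ar1_forecast(y_T, phi, sigma2, horizon, z=1.96):
    """Iterative forecasts for y_t = phi*y_{t-1} + u_t (zero-mean AR(1)).
    Returns (forecast, lower, upper) per step; the error variance follows
    the recursion var_h = sigma2 + phi^2 * var_{h-1}."""
    out = []
    yhat, var = y_T, 0.0
    for _ in range(horizon):
        yhat = phi * yhat              # point forecast: phi^h * y_T
        var = sigma2 + phi ** 2 * var  # accumulates sigma2 * sum(phi^(2i))
        half = z * np.sqrt(var)        # 95% band with z = 1.96
        out.append((yhat, yhat - half, yhat + half))
    return out

for h, (f, lo, hi) in enumerate(ar1_forecast(2.0, 0.8, 1.0, 3), start=1):
    print(h, round(f, 3), round(lo, 3), round(hi, 3))
```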
4. SUMMARY AND CONCLUSIONS
By definition, an ARMA model is weakly stationary in that it has a finite and time-invariant mean and covariances. For an ARMA model to be stationary, the characteristic roots of the difference equation must lie inside the unit circle (equivalently, the roots of the lag polynomial φ(L) must lie outside the unit circle). Moreover, the process must have started infinitely far in the past, or the process must always be in equilibrium.
A well-estimated model (1) is parsimonious; (2) has coefficients that imply stationarity and invertibility; (3) fits the data well; (4) has residuals that approximate a white-noise process; (5) has coefficients that do not change over the sample period; and (6) has good out-of-sample forecasts.
Appendix: TSP Programs to Accompany Chapter 2
BJIDENT (options) variables;
Options: NDIFF, NSDIFF (NSPAN), NLAG, NLAGP
Plots: series, ACF+PACF (20 lags), Q (s-p-q-1)
Output values: ACF, PACF, Q (stored in @AC, @PAC)
BJEST (options) variables (start values);
Options (unnecessary to specify if the same):
NBACK (start condition): MA, a small value (5); AR, a large value (10)
Start: value specification (order: AR, MA, CONST) or previous estimation results
Residuals: Q, p, periodogram (45-degree line)
AIC = −2·log L + 2(p+q)
BJFRCST (options) variables S start-variable values;
Options: CONBOUND (95%), NHORIZ, ORGBEG, ORGEND