TIME SERIES ANALYSIS
FOR
FORECASTING
Rupika Abeynayake
Professor in Applied Statistics
Introduction
Frequently there is a time lag between awareness of an impending event or need and the occurrence of that event. This lead time is the main reason for planning and forecasting: if the lead time is zero or very small, there is no need for planning; if the lead time is long, and the outcome of the final event is conditional on identifiable factors, planning can play an important role.
In management and administrative situations the need for planning is great, because the lead time for decision making ranges from several years down to a few days, hours, or seconds. Forecasting is therefore an important aid to effective and efficient planning.
Forecasting is a prediction of what will occur in the future, and it is an inherently uncertain process. One of the most powerful methodologies for generating forecasts is time series analysis.
A data set containing observations on a single phenomenon observed over multiple time periods is called a time series. In time series data, both the values and the ordering of the data points carry meaning. For many agricultural products, data are routinely collected over time.
Introduction…
Time series analysis and its applications have become increasingly important in many fields of research, such as business, economics, agriculture, engineering, medicine, the social sciences, and politics.
Because "time is money" in business activities, the time series analysis techniques presented here are a necessary tool for a wide range of managerial decisions in which time and money are directly related.
Forecasting scenario
On the time scale we stand at a certain point, the point of reference (Yt); we look backward over past observations (Yt-1, Yt-2, …, Yt-n+1) and forward into the future (Ft+1, Ft+2, …, Ft+m).
Once a forecasting model has been selected, we fit the model to the known data and obtain the fitted values. For the known observations this allows calculation of the fitted errors (Yt-1 - Ft-1), a measure of the goodness of fit of the model; as new observations become available, we can examine the forecast errors (Yt+1 - Ft+1).
Measuring Forecast Accuracy
Let et = Yt - Ft be the forecast error in period t, with n errors available (sums run over t = 1, ..., n):
Mean error:                      ME = (1/n) * Σ et
Mean absolute error:             MAE = (1/n) * Σ |et|
Mean squared error:              MSE = (1/n) * Σ et^2
Percentage error:                PEt = 100*(Yt - Ft)/Yt
Mean percentage error:           MPE = (1/n) * Σ PEt
Mean absolute percentage error:  MAPE = (1/n) * Σ |PEt|
Main components of time series data
A time series has four types of components:
Seasonal component (S)
Trend component (T)
Cyclical component (C)
Irregular component (I)
These combine to give the series, Yt = f(St, Tt, Ct, It); in the multiplicative form, Yt = St * Tt * Ct * It.
In decomposition, the seasonal component is removed using smoothing, the trend using regression, and the cyclical component using percentage ratios; what remains is the irregular component (I).
Moving averages
Smoothing techniques are used to reduce irregularities (random
fluctuations) in time series data
Moving averages rank among the most popular techniques for the preprocessing of time series. They are used to filter random "white noise" from the data and to make the time series smoother.
There are several moving average methods:
Simple Moving Averages
Double moving averages
Centered Moving Average
Weighted Moving Average
Simple Moving Averages
A moving average (MA) is an effective and efficient approach provided the time series is stationary in both mean and variance.
To compute a simple moving average, take a certain number of past periods, add them together, and divide by the number of periods. When an odd number of observations is used, the average can be centered on the middle value of the span being averaged.
The following formula gives the moving average of order n, MA(n), for period t+1:
MAt+1 = [Yt + Yt-1 + ... + Yt-n+1] / n
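The MA(n) formula can be sketched as follows; the data are the monthly series from the worked example (the function name is illustrative):

```python
def simple_ma(y, n):
    """One-step-ahead simple moving average: MA[t+1] = mean of the last n values."""
    return [sum(y[t - n + 1:t + 1]) / n for t in range(n - 1, len(y))]

# Monthly series from the worked example
y = [266.0, 145.9, 183.1, 119.3, 180.3, 168.5, 231.8,
     224.5, 192.8, 122.9, 336.5, 185.9, 194.3, 149.5]
ma3 = simple_ma(y, 3)   # ma3[0] forecasts April: (266.0 + 145.9 + 183.1) / 3
```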
Example
(3 MA = three-month moving average; 5 MA = five-month moving average; each value is the forecast for the period shown)

Month   Period   Observed   3 MA      5 MA
Jan     1        266.0      -         -
Feb     2        145.9      -         -
Mar     3        183.1      -         -
Apr     4        119.3      198.333   -
May     5        180.3      149.433   -
Jun     6        168.5      160.900   178.92
Jul     7        231.8      156.033   159.42
Aug     8        224.5      193.533   176.60
Sep     9        192.8      208.267   184.88
Oct     10       122.9      216.367   199.58
Nov     11       336.5      180.067   188.10
Dec     12       185.9      217.400   221.70
Jan     13       194.3      215.100   212.52
Feb     14       149.5      238.900   206.48
Example…
[Moving average (length 3) plot: actual, smoothed, and forecast values over time. MAPE 32.03, MAD 57.39, MSD 5165.47]
Example…
[Moving average (length 5) plot: actual, smoothed, and forecast values over time. MAPE 26.23, MAD 52.74, MSD 4496.37]
Centered Moving Average
Example

Month   Period   Observed   4 MA     2 x 4 MA
Jan     1        266.0      -        -
Feb     2        145.9      -        -
Mar     3        183.1      178.6    167.863
Apr     4        119.3      157.2    159.975
May     5        180.3      162.8    168.887
Jun     6        168.5      174.9    188.125
Jul     7        231.8      201.3    202.837
Aug     8        224.5      204.4    198.700
Sep     9        192.8      193.0    206.088
Oct     10       122.9      219.2    214.350
Nov     11       336.5      209.5    209.712
Dec     12       185.9      209.9    -
Jan     13       194.3      216.6    -
Feb     14       149.5      -        -

(Each 4 MA averages four consecutive observations and therefore falls between two periods; the 2 x 4 MA averages two adjacent 4 MAs, centering the result on an actual period.)
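The 2 x 4 centered moving average can be reproduced by averaging successive four-period means. A minimal sketch using the monthly data above:

```python
def centered_ma(y, n=4):
    """2 x n centered MA: average each pair of adjacent n-period means."""
    ma = [sum(y[i:i + n]) / n for i in range(len(y) - n + 1)]
    return [(a + b) / 2 for a, b in zip(ma, ma[1:])]

y = [266.0, 145.9, 183.1, 119.3, 180.3, 168.5, 231.8,
     224.5, 192.8, 122.9, 336.5, 185.9, 194.3, 149.5]
cma = centered_ma(y)    # cma[0] is centered on March (period 3)
```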
Weighted Moving Average
This method is very powerful compared with simple moving averages.
A weighted MA(3) can be expressed as
Weighted MA(3) = w1*Yt + w2*Yt-1 + w3*Yt-2
where w1, w2, and w3 are weights.
There are many schemes for selecting appropriate weights (Kendall, Stuart, and Ord, 1983). The weights are any positive numbers such that
w1 + w2 + w3 = 1
One way of choosing the weights is
w1 = 3/(1 + 2 + 3) = 3/6, w2 = 2/6, and w3 = 1/6
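A sketch of the weighted MA(3) with the weights above; the data values are the first few observations from the earlier example, and the function name is illustrative:

```python
def weighted_ma3(y, w=(3/6, 2/6, 1/6)):
    """Weighted MA(3): the most recent observation gets the largest weight."""
    assert abs(sum(w) - 1.0) < 1e-12          # weights must sum to 1
    return [w[0] * y[t] + w[1] * y[t - 1] + w[2] * y[t - 2]
            for t in range(2, len(y))]

wma = weighted_ma3([266.0, 145.9, 183.1, 119.3])
```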
Exponential Smoothing Techniques
One of the most successful forecasting methods is exponential smoothing (ES).
ES is an averaging technique that uses unequal weights, assigning exponentially decreasing weights as the observations get older.
There are several exponential smoothing techniques
Single Exponential Smoothing
Holt’s linear method
Holt-Winters’ trend and seasonality method
Single Exponential Smoothing
Single exponential smoothing takes the forecast for the previous period and adjusts it using the forecast error (forecast error = Yt - Ft):
Ft+1 = Ft + a(Yt - Ft)
or, equivalently,
Ft+1 = aYt + (1 - a)Ft
where:
Yt is the actual value
Ft is the forecast value
a is the weighting (smoothing) factor, which ranges from 0 to 1
t is the current time period
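The recursion can be sketched as follows; with the data from the worked example (F2 initialized to Y1 = 200), a = 0.5 reproduces the table's smoothed values:

```python
def ses(y, alpha):
    """Single exponential smoothing: F[t+1] = a*Y[t] + (1 - a)*F[t].
    Returns the forecasts F2, F3, ..., initializing F2 = Y1."""
    f = [y[0]]
    for t in range(1, len(y)):
        f.append(alpha * y[t] + (1 - alpha) * f[-1])
    return f

y = [200.0, 135.0, 195.0, 197.5, 310.0, 175.0,
     155.0, 130.0, 220.0, 277.5, 235.0]
f = ses(y, alpha=0.5)   # f[1] is F3 = 0.5*135 + 0.5*200 = 167.5
```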
Choosing the Best Value for the Parameter a (alpha)
In practice, the smoothing parameter is often chosen by a grid search of the parameter space: different values of a from 0.1 to 0.9 are tried, in increments of 0.1, and the value of a that produces the smallest sum of squares (or mean square) of the residuals is chosen.
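A sketch of this grid search, choosing a by the smallest sum of squared one-step errors (the data are from the worked example; the function name is illustrative):

```python
def sse(y, alpha):
    """Sum of squared one-step forecast errors for a given alpha."""
    f, total = y[0], 0.0                  # initialize F2 = Y1
    for t in range(1, len(y)):
        total += (y[t] - f) ** 2
        f = alpha * y[t] + (1 - alpha) * f
    return total

y = [200.0, 135.0, 195.0, 197.5, 310.0, 175.0,
     155.0, 130.0, 220.0, 277.5, 235.0]
# Try a = 0.1, 0.2, ..., 0.9 and keep the value with the smallest SSE
best = min((a / 10 for a in range(1, 10)), key=lambda a: sse(y, a))
```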
Exponentially smoothed values:

Month   Period   Observed   a = 0.1   a = 0.5   a = 0.9
Jan     1        200.0      -         -         -
Feb     2        135.0      200.0     200.0     200.0
Mar     3        195.0      193.5     167.5     141.5
Apr     4        197.5      193.7     181.3     189.7
May     5        310.0      194.0     189.4     196.7
Jun     6        175.0      205.6     249.7     298.7
Jul     7        155.0      202.6     212.3     187.4
Aug     8        130.0      197.8     183.7     158.2
Sep     9        220.0      191.0     156.8     132.8
Oct     10       277.5      193.9     188.4     211.3
Nov     11       235.0      202.3     233.0     270.9
Dec     12       -          205.6     234.0     238.6
Analysis of Errors (test period: 2 to 11)

                                        a = 0.1   a = 0.5   a = 0.9
Mean error                              5.56      6.80      4.29
Mean absolute error                     47.76     56.94     61.32
Mean absolute percentage error (MAPE)   24.58     29.20     30.81
Mean squared error (MSE)                3438.33   4347.24   5039.37
Theil's U-statistic                     0.81      0.92      0.98
Time Series Plots
[Line plot of monthly shipments, months 1 to 12: observed values together with single exponential smoothing at a = 0.1, 0.5, and 0.9]
Holt’s linear method
Holt (1957) extended single exponential smoothing to linear exponential smoothing to allow forecasting of data with trends.
The forecast for Holt’s linear exponential smoothing is found using two smoothing constants, a and β (both between 0 and 1), and three equations:
Smoothing of the level:        Lt = aYt + (1 - a)(Lt-1 + bt-1)
Smoothing of the trend:        bt = β(Lt - Lt-1) + (1 - β)bt-1
Forecast for m periods ahead:  Ft+m = Lt + m*bt
The initialization process: L1 = Y1 and b1 = Y2 - Y1, or b1 = (Y4 - Y1) / 3
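A minimal sketch of Holt's method (the demonstration series is made up; a perfectly linear series shows the trend being tracked exactly):

```python
def holt(y, alpha, beta, m=1):
    """Holt's linear method: returns the forecast F[t+m] = L[t] + m*b[t]."""
    level, trend = y[0], y[1] - y[0]      # initialization: L1 = Y1, b1 = Y2 - Y1
    for t in range(1, len(y)):
        prev = level
        level = alpha * y[t] + (1 - alpha) * (prev + trend)
        trend = beta * (level - prev) + (1 - beta) * trend
    return level + m * trend

# A perfectly linear series is forecast exactly
f = holt([10.0, 12.0, 14.0, 16.0, 18.0], alpha=0.5, beta=0.5, m=1)
```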
Holt-Winters’ trend and seasonality method
Holt’s method was extended by Winters (1960) to capture seasonality.
The Holt-Winters method is based on three smoothing equations, one for the level, one for the trend, and one for seasonality (s is the length of the seasonal cycle):
Level:     Lt = a(Yt / St-s) + (1 - a)(Lt-1 + bt-1)
Trend:     bt = β(Lt - Lt-1) + (1 - β)bt-1
Seasonal:  St = γ(Yt / Lt) + (1 - γ)St-s
Forecast:  Ft+m = (Lt + m*bt)St-s+m
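A sketch of the multiplicative Holt-Winters recursion. The initialization used here (first-season mean as the level, zero trend, first-season ratios as seasonal factors) is one simple choice among several, assumed for this sketch:

```python
def holt_winters(y, s, alpha, beta, gamma, m=1):
    """Multiplicative Holt-Winters; s is the length of the seasonal cycle."""
    level = sum(y[:s]) / s                 # simple initialization choices
    trend = 0.0
    seas = [y[i] / level for i in range(s)]
    for t in range(s, len(y)):
        prev = level
        level = alpha * (y[t] / seas[t - s]) + (1 - alpha) * (prev + trend)
        trend = beta * (level - prev) + (1 - beta) * trend
        seas.append(gamma * (y[t] / level) + (1 - gamma) * seas[t - s])
    # F[t+m] = (L[t] + m*b[t]) * S[t-s+m], with t = len(y) - 1
    return (level + m * trend) * seas[len(y) - 1 + m - s]

# A purely seasonal series (period 2, no trend) is forecast exactly
f = holt_winters([10.0, 20.0, 10.0, 20.0], s=2,
                 alpha=0.5, beta=0.5, gamma=0.5, m=1)
```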
Example…
[Winters’ multiplicative model for Yt: plot of actual, predicted, and forecast values over time. Smoothing constants: alpha (level) = 0.200, gamma (trend) = 0.200, delta (season) = 0.200; MAPE 5.163, MAD 12.557, MSD 356.695]
Seasonal Factor

Season   Sales   Average sales   Seasonal factor
Spring   200     250             200/250 = 0.8
Summer   350     250             350/250 = 1.4
Fall     300     250             300/250 = 1.2
Winter   150     250             150/250 = 0.6
Total    1000    1000
Ratio-to-moving-average
If next year's sales are expected to increase by 10%, the total becomes 1100 and the average per season is 1100/4 = 275:

Season   Average sales   Next-year forecast
Spring   275             275*0.8 = 220
Summer   275             275*1.4 = 385
Fall     275             275*1.2 = 330
Winter   275             275*0.6 = 165
Total    1100            1100
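The seasonal-factor forecast in the tables above can be reproduced directly (variable names are illustrative):

```python
sales = {"Spring": 200.0, "Summer": 350.0, "Fall": 300.0, "Winter": 150.0}
avg = sum(sales.values()) / len(sales)            # 1000 / 4 = 250
factor = {s: v / avg for s, v in sales.items()}   # e.g. Spring: 200/250 = 0.8

# Assume next year's total rises 10%: 1000 -> 1100, i.e. 275 per season
next_avg = 1.10 * sum(sales.values()) / len(sales)
forecast = {s: next_avg * f for s, f in factor.items()}
```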
The table below shows the quarterly sales figures:

Year   Q1   Q2   Q3   Q4
2008   20   30   39   60
2009   40   51   62   81
2010   50   64   74   85

Year   Quarter   Time index   Sales   Centered MA(4)   Sales/MA*100
2008 Q1 1 20
2008 Q2 2 30
2008 Q3 3 39
2008 Q4 4 60 39.750 150.943
2009 Q1 5 40 44.875 89.136
2009 Q2 6 51 50.375 101.241
2009 Q3 7 62 55.875 110.962
2009 Q4 8 81 59.750 135.565
2010 Q1 9 50 62.625 79.840
2010 Q2 10 64 65.750 97.338
2010 Q3 11 74 67.750 109.225
2010 Q4 12 85
The sales-to-MA ratios are grouped by quarter and averaged; the means are then rescaled by the adjusted factor so that the four seasonal indices sum to 400:

Year             Q1        Q2       Q3        Q4
2008             -         -        -         150.94
2009             89.14     101.24   110.96    135.56
2010             79.84     97.33    109.22    -
Mean             84.49     99.29    110.095   143.25     (sum 437.125)
AF               0.915     0.915    0.915     0.915
Seasonal index   77.3085   90.85    100.736   131.0737   (sum 399.9693)

Adjusted factor (AF) = 400/437.125 = 0.915
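The whole ratio-to-moving-average computation above can be sketched end to end; the ratios are aligned with the centered MA exactly as in the table (the first centered MA, 39.75, is paired with 2008 Q4):

```python
q = [20.0, 30.0, 39.0, 60.0, 40.0, 51.0, 62.0, 81.0, 50.0, 64.0, 74.0, 85.0]

# Centered MA(4): average of two successive four-quarter means
ma4 = [sum(q[i:i + 4]) / 4 for i in range(len(q) - 3)]
cma = [(a + b) / 2 for a, b in zip(ma4, ma4[1:])]

# Ratio of sales to the centered MA, in percent
ratio = [100.0 * q[i + 3] / cma[i] for i in range(len(cma))]

# Average the ratios by quarter, then rescale so the indices sum to 400
groups = {}
for i, r in enumerate(ratio):
    groups.setdefault((i + 3) % 4, []).append(r)   # 0 = Q1, ..., 3 = Q4
mean = {k: sum(v) / len(v) for k, v in groups.items()}
af = 400.0 / sum(mean.values())                    # adjusted factor, about 0.915
index = {k: m * af for k, m in mean.items()}       # seasonal indices
```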