Page 1: Lecture_6 forecasting

Lecture 6: Forecasting

(Reading: Ch15 Studenmund)

(Reading: Ch22 Gujarati and Porter)


Page 2: Lecture_6 forecasting

Overview

• Why Forecast?

• An Overview of Forecasting Techniques

• The Basic Steps in a Forecasting Task

• Forecasting Methods

Page 3: Lecture_6 forecasting

Why Forecast?

• To take appropriate actions and to plan ahead

• Forecasting is an integral part of the decision-making process

• The accuracy of a forecast depends on uncontrollable external events and controllable internal events

Page 4: Lecture_6 forecasting

An Overview of Forecasting Techniques

Types of forecasting methods

• Quantitative

– time series

– explanatory

• Qualitative

– little or no quantitative information is available, but sufficient qualitative knowledge exists

• Unpredictable

– little or no information is available

Page 5: Lecture_6 forecasting

Basic Steps in Forecasting

• Problem Definition

– how, who, what

• Gathering Information

– statistical data and accumulated judgement and expertise

• Preliminary (Exploratory) Analysis

– graphing for visual inspection ⇒ statistical analysis

• Choosing and Fitting Models

– extrapolation; exponential smoothing models; regression; ARIMA; VAR

• Using and Evaluating a Model

– fitting errors vs forecasting errors

Page 6: Lecture_6 forecasting

Basic Approaches for Forecasting

• 5 economic forecasting approaches based on time series

data:

– single equation regression model

– simultaneous equation model

– exponential smoothing methods

– ARIMA

– VAR

Page 7: Lecture_6 forecasting

Regression

• Apply linear regression techniques with a set of explanatory variables to estimate the constant and slope coefficients of the model

• Next, use the estimated regression equation to forecast future values

• If the regression model does not give good summary statistics (R², MSE, etc.), modify some or all of the explanatory variables and try again
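A minimal sketch of this workflow in Python with statsmodels (not from the slides: the data, variable names, and assumed future values are all hypothetical):

```python
# Sketch: estimate a linear regression, inspect summary statistics,
# then forecast by plugging in assumed future explanatory values.
import pandas as pd
import statsmodels.api as sm

# Hypothetical data: sales explained by income and price
df = pd.DataFrame({
    "sales":  [12.1, 13.4, 14.0, 15.2, 15.9, 17.1],
    "income": [100, 104, 109, 113, 118, 124],
    "price":  [9.5, 9.4, 9.6, 9.3, 9.2, 9.0],
})

X = sm.add_constant(df[["income", "price"]])   # constant + slopes
model = sm.OLS(df["sales"], X).fit()
print(model.summary())                         # R², t-stats, MSE, etc.

# Forecast with assumed future values of the explanatory variables
X_future = pd.DataFrame({"const": 1.0, "income": [130], "price": [8.9]})
print(model.predict(X_future))
```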

Page 8: Lecture_6 forecasting

Smoothing

• To eliminate or reduce consistent short-term fluctuations, such as seasonal fluctuations

• Useful for analysing trends and variable behaviour

• Removes only the seasonal (pattern) fluctuations, not the irregular fluctuations

Page 9: Lecture_6 forecasting

Smoothing Technique - Moving Average

• n-period moving average:

Ŷt = (1/n)(Yt + Yt-1 + … + Yt-n+1)

– the larger n is, the smoother the series will be

– but the moving average uses only past values, with equal weights

• To address the equal-weight problem: exponential smoothing, which weights recent observations more heavily

Ŷt+1 = αYt + α(1-α)Yt-1 + α(1-α)²Yt-2 + …
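Both techniques are one-liners in pandas; a sketch on a hypothetical series (α = 0.3 is an arbitrary choice), where `ewm(..., adjust=False)` applies exactly the geometrically declining weights in the expansion above:

```python
# Sketch: n-period moving average vs simple exponential smoothing.
import pandas as pd

y = pd.Series([112, 118, 132, 129, 121, 135, 148, 148, 136, 119])

ma4 = y.rolling(window=4).mean()      # equal weights on the last 4 values

alpha = 0.3                           # smoothing constant (hypothetical)
ses = y.ewm(alpha=alpha, adjust=False).mean()  # recent values weighted more

print(pd.DataFrame({"y": y, "MA(4)": ma4, "SES": ses}))
```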

Page 10: Lecture_6 forecasting

ARIMA

• Normally, we use linear regression equations to forecast the dependent variable by plugging values of the independent variables into the estimated equation and calculating the predicted value of Y

• ARIMA completely ignores independent variables in making forecasts

• ARIMA uses current and past values of the dependent variable to produce forecasted values

• Increasingly popular, especially for forecasts of stock market prices based entirely on past patterns of movement of those prices

Page 11: Lecture_6 forecasting

When to use ARIMA?

• Ignores independent variables

– ignores theory ⇒ ARIMA is appropriate when little is known about the dependent variable being forecasted

– when the independent variables known to be important cannot be forecasted effectively

– when only short-term forecasts are needed

– to produce forecasts of residuals from a regression

Page 12: Lecture_6 forecasting

Three Phases of the Application of the Box-Jenkins Methodology

Page 13: Lecture_6 forecasting

Data Preparation

EXAMINING TIME SERIES DATA

Autocorrelation (r)

• indicates how successive values of y relate to each other

• for example, r(2) indicates how y values two periods apart relate to each other, and so on

• the autocorrelations at lags 1, 2, … make up the autocorrelation function (ACF)

rk = Σ_{t=k+1}^{n} (yt - ȳ)(yt-k - ȳ) / Σ_{t=1}^{n} (yt - ȳ)²
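The formula translates directly into numpy; a sketch with a hypothetical series:

```python
# Sketch: lag-k autocorrelation r_k computed exactly as in the formula.
import numpy as np

def autocorr(y, k):
    y = np.asarray(y, dtype=float)
    ybar = y.mean()
    num = np.sum((y[k:] - ybar) * (y[:-k] - ybar))   # t = k+1, ..., n
    den = np.sum((y - ybar) ** 2)                    # t = 1, ..., n
    return num / den

y = [13, 8, 15, 4, 4, 12, 11, 7, 14, 12]              # hypothetical data
print([round(autocorr(y, k), 3) for k in (1, 2, 3)])  # first few ACF values
```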

Page 14: Lecture_6 forecasting

Data Preparation

EXAMINING TIME SERIES DATA

Partial autocorrelation coefficient

• a measure of the relationship between two variables when the effect of other variables has been removed or held constant

• for time series in particular, it is used to measure the degree of association between yt and yt-k when the effects of the intermediate time lags 1, 2, 3, …, k-1 are removed
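For illustration, statsmodels ships ready-made ACF and PACF estimators; a sketch on a simulated series (the AR(1) coefficient 0.7 is arbitrary):

```python
# Sketch: sample ACF and PACF of a simulated autocorrelated series.
import numpy as np
from statsmodels.tsa.stattools import acf, pacf

rng = np.random.default_rng(0)
y = np.zeros(200)
for t in range(1, 200):
    y[t] = 0.7 * y[t - 1] + rng.normal()   # AR(1)-type dependence

print(acf(y, nlags=5))    # ACF at lags 0-5
print(pacf(y, nlags=5))   # partial autocorrelations at lags 0-5
```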

Page 15: Lecture_6 forecasting

Data Preparation

Page 17: Lecture_6 forecasting

Data Preparation

EXAMINING TIME SERIES DATA

Stationarity

• a stationary series shows no growth or decline in the data

• the data fluctuate around a constant mean (stationary in the mean), and the variance remains unchanged (stationary in the variance)

• an ACF plot can also be used to check for stationarity in a time series

• ACF for a stationary series: significantly different from zero only for the first few lags (k < 5); at higher lags the ACF is close to zero

Page 18: Lecture_6 forecasting

Data Preparation

[Figure: time-series plots of GDP and DGDP (first-differenced GDP), annual data, 1950–2005]

Page 19: Lecture_6 forecasting

Data Preparation

EXAMINING TIME SERIES DATA

Test for Stationarity - Dickey-Fuller Test

• Use OLS to run regressions of the following forms and test whether ρ = 1 (equivalently δ = 0, since subtracting Yt-1 from both sides of the level form gives δ = ρ - 1)

Testing regressions (level ⇒ first difference):

• (No constant, no trend) Yt = ρYt-1 + et ⇒ ∆Yt = δYt-1 + et

• (With constant) Yt = α + ρYt-1 + et ⇒ ∆Yt = α + δYt-1 + et

• (Constant & trend) Yt = α + βT + ρYt-1 + et ⇒ ∆Yt = α + βT + δYt-1 + et

• In each case the null hypothesis is H0: ρ = 1 (unit root) ⇒ H0: δ = 0 (unit root)

Page 20: Lecture_6 forecasting

Data Preparation

EXAMINING TIME SERIES DATA

Test for Stationarity - Dickey-Fuller Test

• Decision rule:

• If t < τ (the critical value), H0 is rejected: Yt is stationary.

• If t > τ, H0 is not rejected: Yt is non-stationary.

• Critical τ values at the 1%, 5%, and 10% levels:

• –2.5897, –1.9439, –1.6177 for the model with no constant and no trend

• –3.5064, –2.8947, –2.5842 for the model with a constant

• –4.0661, –3.4614, –3.1567 for the model with a constant and trend
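A sketch of this test using statsmodels' augmented Dickey-Fuller routine (the series is simulated; `regression='c'` is the with-constant form, `'ct'` adds a trend, `'n'` has neither):

```python
# Sketch: (augmented) Dickey-Fuller unit-root test with statsmodels.
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(1)
gdp = 100 + np.cumsum(rng.normal(size=200))   # random walk: non-stationary

stat, pvalue, usedlag, nobs, crit, icbest = adfuller(gdp, regression="c")
print(stat, crit)   # reject H0 (unit root) only if stat < critical tau
```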

Page 21: Lecture_6 forecasting

Data Preparation

EXAMINING TIME SERIES DATA

Removing Non-Stationarity in a Time Series

• Stationarity can be obtained by differencing:

• ∆yt = yt - yt-1

• Occasionally, the differenced data will not appear stationary and it may be necessary to difference the data a second time

• yt″ (i.e., ∆²yt = ∆yt - ∆yt-1) is referred to as the series of second-order differences. In practice, it is almost never necessary to go beyond second-order differences.
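In pandas, first- and second-order differencing is a sketch like the following (values hypothetical):

```python
# Sketch: first- and second-order differencing.
import pandas as pd

y = pd.Series([100, 104, 109, 113, 120, 128, 135])

dy = y.diff()           # first differences: y_t - y_{t-1}
d2y = y.diff().diff()   # second-order differences (rarely needed beyond)
print(pd.DataFrame({"y": y, "dy": dy, "d2y": d2y}))
```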

Page 23: Lecture_6 forecasting

Model Selection: ARIMA model

• There is a huge variety of ARIMA models. Any

such model can be written using the uniform

notation ARIMA(p,d,q), where

• AR : p = order of the autoregressive part

• I : d = degree of first differencing involved

• MA: q = order of the moving average part

Page 24: Lecture_6 forecasting

Model Selection: AR models

Autoregressive Models of Order One

• In the context of Box-Jenkins Modeling, the parameters of AR

models are conventionally denoted by φi.

• AR(1) or ARIMA(1,0,0): Yt = φ0 + φ1Yt-1 + εt

Higher Order Autoregressive Models

• The number of past stationary observations used in an

autoregressive model is known as the order. So, in general, a

pth order AR model is defined as follows:

• Yt = φ0 + φ1Yt-1 + φ2Yt-2 +… + φpYt-p + εt
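A sketch of estimating an AR(1), i.e. ARIMA(1,0,0), with statsmodels on a simulated series (the true values φ0 = 2, φ1 = 0.6 are arbitrary choices):

```python
# Sketch: simulate an AR(1) process and recover its parameters.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(2)
y = np.zeros(300)
for t in range(1, 300):
    y[t] = 2.0 + 0.6 * y[t - 1] + rng.normal()   # Yt = 2 + 0.6*Yt-1 + eps

fit = ARIMA(y, order=(1, 0, 0)).fit()
# Note: statsmodels reports the process mean phi0/(1-phi1) ≈ 5 as 'const'
print(fit.params)
```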

Page 25: Lecture_6 forecasting

Model Selection: MA models

• Moving Average Models of Order One

• In the context of Box-Jenkins Modeling, the parameters of MA models are conventionally denoted by -θi.

• MA(1) or ARIMA(0,0,1): Yt = θ0 + εt - θ1εt-1

• The parameter θ1 is restricted to lie between -1 and +1

• Higher Order Moving Average Models

• In general, a qth order MA model is defined as follows: Yt = θ0 + εt - θ1εt-1 - θ2εt-2 - … - θqεt-q

Page 26: Lecture_6 forecasting

Expected patterns in the ACF and PACF for AR and MA models

Type of model | Typical pattern of ACF | Typical pattern of PACF
AR(p) | Decays exponentially, or with a damped sine wave pattern, or both | Significant spikes through lag p
MA(q) | Significant spikes through lag q | Declines exponentially
ARMA(p,q) | Exponential decay | Exponential decay

Page 27: Lecture_6 forecasting

Expected patterns

Page 28: Lecture_6 forecasting

ARMA

• AR and MA models can be combined to form an ARMA model.

• For example, Yt = c + φ1Yt-1 + εt - θ1εt-1 combines AR(1) and MA(1) to form ARMA(1,1), or ARIMA(1,0,1).

• An ARMA model with higher-order terms is written as:

Yt = c + φ1Yt-1 + φ2Yt-2 + … + φpYt-p + εt - θ1εt-1 - θ2εt-2 - … - θqεt-q

Page 29: Lecture_6 forecasting

ARMA

Page 30: Lecture_6 forecasting

ARIMA

• If differencing is added to an ARMA model to handle non-stationarity, we obtain the ARIMA(p,d,q) model.
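An end-to-end sketch with statsmodels: let the ARIMA routine difference a non-stationary series once (d = 1), fit ARMA(1,1) on the differences, and forecast ahead (the series is simulated, and the order (1,1,1) is an assumption; in practice it is chosen from the ACF/PACF patterns above):

```python
# Sketch: fit ARIMA(1,1,1) to a trending series and forecast ahead.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(3)
y = np.cumsum(1.0 + rng.normal(size=200))   # non-stationary in the mean

fit = ARIMA(y, order=(1, 1, 1)).fit()       # p=1, d=1, q=1
print(fit.summary())
print(fit.forecast(steps=4))                # next four out-of-sample values
```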
