Auto Correlation Presentation

21,859 views · 35 slides · May 31, 2015

About This Presentation

Econometrics


Slide Content

BS Statistics, 6th Semester (Regular)
University of Sargodha, Session 2011-2015

Introduction
Causes of Autocorrelation
OLS Estimation
BLUE Estimator
Consequences of using OLS
Detecting Autocorrelation

1. Introduction
Autocorrelation occurs in time-series studies
when the errors associated with a given time
period carry over into future time periods.
For example, if we are predicting the growth of
stock dividends, an overestimate in one year is
likely to lead to overestimates in succeeding
years.

1. Introduction
Times series data follow a natural
ordering over time.
It is likely that such data exhibit
intercorrelation, especially if the time
interval between successive observations
is short, such as weeks or days.

1. Introduction
We expect stock market prices to move up or
move down for several days in succession.
In situations like this, the assumption of no auto
or serial correlation in the error term that
underlies the CLRM will be violated.
We experience autocorrelation when

E(ui uj) ≠ 0 for i ≠ j

1. Introduction
Sometimes the terms autocorrelation and serial
correlation are used interchangeably.
However, some authors prefer to distinguish between
them.
For example, Tintner defines autocorrelation as 'lag
correlation of a given series within itself, lagged by a
number of time units', whereas serial correlation is the
'lag correlation between two different series'.
We will use both terms interchangeably in this lecture.

1. Introduction
There are different types of serial correlation.
With first-order serial correlation, errors in
one time period are correlated directly with
errors in the ensuing time period.
With positive serial correlation, errors in one
time period are positively correlated with errors
in the next time period.

2. Causes of Autocorrelation
Inertia - Macroeconomic data experience
cycles/business cycles.
Specification Bias - Excluded variable
Appropriate equation:
Yt = β1 + β2X2t + β3X3t + β4X4t + ut
Estimated equation:
Yt = β1 + β2X2t + β3X3t + vt
Estimating the second equation implies
vt = β4X4t + ut

2. Causes of Autocorrelation
Specification Bias - Incorrect Functional Form
Yt = β1 + β2X2t + β3X2t² + vt   (appropriate)
Yt = β1 + β2X2t + ut            (estimated)
ut = β3X2t² + vt

2.Causes of Autocorrelation
Cobweb Phenomenon
In agricultural market, the supply reacts to
price with a lag of one time period because
supply decisions take time to implement. This
is known as the cobweb phenomenon.
Thus, at the beginning of this year’s planting
of crops, farmers are influenced by the price
prevailing last year.

2. Causes of Autocorrelation
Lags
Consumptiont = β1 + β2Consumptiont-1 + ut
The above equation is known as an autoregression
because one of the explanatory variables is the
lagged value of the dependent variable.
If you neglect the lagged term, the resulting error
term will reflect a systematic pattern due to the
influence of lagged consumption on current
consumption.

2. Causes of Autocorrelation
Data Manipulation
Level form:
Yt = β1 + β2Xt + ut
Yt-1 = β1 + β2Xt-1 + ut-1
Subtracting the second equation from the first gives
the first difference form, a dynamic regression model:
ΔYt = β2ΔXt + vt
Note that the error term in the level form is not
autocorrelated, but it can be shown that the error
term vt = ut - ut-1 in the first difference form is
autocorrelated.
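As a quick numerical check (a sketch using NumPy; the sample size and variable names are illustrative), simulating i.i.d. level-form errors shows that the first-difference errors vt = ut - ut-1 are serially correlated, with lag-1 autocorrelation near -0.5:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# i.i.d. errors of the level-form model: no autocorrelation
u = rng.normal(size=n)

# errors of the first-difference form: v_t = u_t - u_{t-1}
v = np.diff(u)

# lag-1 autocorrelation of v; theory gives -0.5 for i.i.d. u
rho1 = np.corrcoef(v[1:], v[:-1])[0, 1]
print(round(rho1, 3))
```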

2. Causes of Autocorrelation
Nonstationarity
When dealing with time series data, we
should check whether the given time series is
stationary.
A time series is stationary if its characteristics
(e.g. mean, variance and covariance) are time
invariant; that is, they do not change over time.
If that is not the case, we have a
nonstationary time series.

Suppose Yt is related to X2t and X3t, but we
wrongfully do not include X3t in our model.
The effect of X3t will be captured by the
disturbances ut. If X3t, like many economic series,
exhibits a trend over time, then X3t depends on
X3t-1, X3t-2 and so on. Similarly, ut then depends
on ut-1, ut-2 and so on.

Suppose Yt is related to X2t with a quadratic
relationship:
Yt = β1 + β2X2t² + ut
but we wrongfully assume and estimate a straight
line:
Yt = β1 + β2X2t + ut
Then the error term obtained from the straight line
will depend on X2t².

Suppose a company updates its inventory at a
given period in time.
If a systematic error occurred then the cumulative
inventory stock will exhibit accumulated
measurement errors.
These errors will show up as an autocorrelated
process.

The simplest and most commonly observed case is
first-order autocorrelation.
Consider the multiple regression model:
Yt = β1 + β2X2t + β3X3t + β4X4t + … + βkXkt + ut
in which the current observation of the error term
ut is a function of the previous (lagged)
observation of the error term:
ut = ρut-1 + et
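A minimal simulation of this first-order scheme (a NumPy sketch; ρ = 0.7 is an arbitrary illustrative value) confirms that the sample lag-1 autocorrelation of ut recovers ρ:

```python
import numpy as np

rng = np.random.default_rng(1)
n, rho = 50_000, 0.7          # illustrative sample size and coefficient

e = rng.normal(size=n)        # white-noise innovations e_t
u = np.empty(n)
u[0] = e[0]
for t in range(1, n):
    u[t] = rho * u[t - 1] + e[t]   # u_t = rho*u_{t-1} + e_t

# sample lag-1 autocorrelation should be close to rho
rho_hat = np.corrcoef(u[1:], u[:-1])[0, 1]
print(round(rho_hat, 3))
```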

The coefficient ρ is called the first-order
autocorrelation coefficient and takes values from
-1 to +1.
It is obvious that the size of ρ will determine
the strength of serial correlation.
We can have three different cases:

If ρ is zero, then we have no autocorrelation.
If ρ approaches unity, the value of the previous
observation of the error becomes more important
in determining the value of the current error, and
therefore a high degree of autocorrelation exists. In
this case we have positive autocorrelation.
If ρ approaches -1, we have a high degree of
negative autocorrelation.

Second-order when:
ut = ρ1ut-1 + ρ2ut-2 + et
Third-order when:
ut = ρ1ut-1 + ρ2ut-2 + ρ3ut-3 + et
p-th order when:
ut = ρ1ut-1 + ρ2ut-2 + ρ3ut-3 + … + ρput-p + et

The OLS estimators are still unbiased and consistent.
This is because both unbiasedness and consistency do
not depend on assumption 6, which is in this case violated.
The OLS estimators will be inefficient and therefore no
longer BLUE.
The estimated variances of the regression coefficients will
be biased and inconsistent, and therefore hypothesis
testing is no longer valid. In most cases, R² will
be overestimated and the t-statistics will tend to be higher.

There are two ways in general.
The first is the informal way which is done
through graphs and therefore we call it the
graphical method.
The second is through formal tests for
autocorrelation, like the following ones:
The Durbin-Watson Test
The Breusch-Godfrey Test
The Runs Test

The following assumptions should be
satisfied:
The regression model includes a constant
Autocorrelation is assumed to be of first-order
only
The equation does not include a lagged
dependent variable as an explanatory variable

Step 1: Estimate the model by OLS and obtain the
residuals
Step 2: Calculate the DW statistic
Step 3: Construct the table with the calculated DW
statistic and the dU, dL, 4-dU and 4-dL critical
values.
Step 4: Conclude
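Step 2 can be sketched in NumPy as follows (durbin_watson here is a hypothetical helper written for this lecture, not a library call; statsmodels ships a production version):

```python
import numpy as np

def durbin_watson(resid):
    """DW = sum_t (e_t - e_{t-1})^2 / sum_t e_t^2.
    Ranges from 0 to 4; values near 2 suggest no first-order
    autocorrelation, values near 0 positive, near 4 negative."""
    resid = np.asarray(resid, dtype=float)
    return np.sum(np.diff(resid) ** 2) / np.sum(resid ** 2)

# i.i.d. residuals should give a DW statistic close to 2
rng = np.random.default_rng(2)
dw = durbin_watson(rng.normal(size=10_000))
print(round(dw, 2))
```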

It is a Lagrange Multiplier test that resolves the
drawbacks of the DW test.
Consider the model:
Yt = β1 + β2X2t + β3X3t + β4X4t + … + βkXkt + ut
where:
ut = ρ1ut-1 + ρ2ut-2 + ρ3ut-3 + … + ρput-p + et
Combining those two we get:
Yt = β1 + β2X2t + β3X3t + β4X4t + … + βkXkt
   + ρ1ut-1 + ρ2ut-2 + ρ3ut-3 + … + ρput-p + et
Topic Nine: Serial Correlation

The null and the alternative hypotheses are:
H0: ρ1 = ρ2 = … = ρp = 0 (no autocorrelation)
Ha: at least one of the ρ's is not zero, thus autocorrelation
Step 1: Estimate the model and obtain the residuals.
Step 2: Run the full LM model, with the number of lags used
being determined by the assumed order of autocorrelation.
Step 3: Compute the LM statistic = (n-p)R² from the LM model
and compare it with the chi-square critical value.
Step 4: Conclude.
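The steps above might be sketched with plain NumPy least squares as follows (breusch_godfrey_lm is a hypothetical helper, and the data-generating values are illustrative; statsmodels provides a production implementation):

```python
import numpy as np

def breusch_godfrey_lm(y, X, p=1):
    """LM statistic for autocorrelation up to order p.
    X must already include a constant column."""
    n = len(y)
    # Step 1: OLS on the original model, keep the residuals
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    u = y - X @ beta
    # Step 2: auxiliary regression of u_t on X and p lags of u
    lags = np.column_stack(
        [np.r_[np.zeros(j), u[:-j]] for j in range(1, p + 1)])
    Z = np.column_stack([X, lags])
    g, *_ = np.linalg.lstsq(Z, u, rcond=None)
    e = u - Z @ g
    r2 = 1.0 - (e @ e) / (u @ u)
    # Step 3: LM = (n - p) * R^2, compared with chi-square(p)
    return (n - p) * r2

# illustrative data with strong AR(1) errors (rho = 0.8)
rng = np.random.default_rng(3)
n = 2_000
x = rng.normal(size=n)
eps = rng.normal(size=n)
u = np.empty(n)
u[0] = eps[0]
for t in range(1, n):
    u[t] = 0.8 * u[t - 1] + eps[t]
y = 1.0 + 2.0 * x + u

X = np.column_stack([np.ones(n), x])
lm = breusch_godfrey_lm(y, X, p=1)
print(round(lm, 1))   # far above the 5% chi-square(1) value of 3.84
```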

We have two different cases:
When ρ is known
When ρ is unknown
Consider the model:
Yt = β1 + β2X2t + β3X3t + β4X4t + … + βkXkt + ut
where
ut = ρut-1 + et

Write the model at t-1:
Yt-1 = β1 + β2X2t-1 + β3X3t-1 + β4X4t-1 + … + βkXkt-1 + ut-1
Multiply both sides by ρ to get:
ρYt-1 = ρβ1 + ρβ2X2t-1 + ρβ3X3t-1 + ρβ4X4t-1 + … + ρβkXkt-1 + ρut-1
Subtract those two equations:
Yt - ρYt-1 = (1-ρ)β1 + β2(X2t - ρX2t-1) + β3(X3t - ρX3t-1)
           + … + βk(Xkt - ρXkt-1) + (ut - ρut-1)
or
Y*t = β*1 + β*2X*2t + β*3X*3t + … + β*kX*kt + et

Now the problem of autocorrelation is
resolved, because et is no longer autocorrelated.
Note that because the transformation loses one
observation, in order to avoid that loss we
generate Y*1 and X*i1 as follows:
Y*1 = Y1·sqrt(1 - ρ²)
X*i1 = Xi1·sqrt(1 - ρ²)
This transformation is known as quasi-differencing
or generalised differencing.
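Assuming ρ is known, the generalised differencing above, with the sqrt(1-ρ²) scaling for the first observation, might be coded as follows (quasi_difference is a hypothetical helper; the simulated model and ρ = 0.6 are illustrative):

```python
import numpy as np

def quasi_difference(y, X, rho):
    """Quasi-difference y and X with known rho, keeping observation 1
    via the sqrt(1 - rho^2) scaling."""
    y = np.asarray(y, dtype=float)
    X = np.asarray(X, dtype=float)
    s = np.sqrt(1.0 - rho ** 2)
    y_star = np.empty_like(y)
    X_star = np.empty_like(X)
    y_star[0], X_star[0] = s * y[0], s * X[0]
    y_star[1:] = y[1:] - rho * y[:-1]
    X_star[1:] = X[1:] - rho * X[:-1]
    return y_star, X_star

# illustrative check: with AR(1) errors (rho = 0.6), OLS on the
# starred variables recovers the slope of y = 1 + 2*x + u
rng = np.random.default_rng(4)
n, rho = 5_000, 0.6
x = rng.normal(size=n)
eps = rng.normal(size=n)
u = np.empty(n)
u[0] = eps[0]
for t in range(1, n):
    u[t] = rho * u[t - 1] + eps[t]
y = 1.0 + 2.0 * x + u

X = np.column_stack([np.ones(n), x])
y_star, X_star = quasi_difference(y, X, rho)
beta, *_ = np.linalg.lstsq(X_star, y_star, rcond=None)
print(np.round(beta, 2))
```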

The Cochrane-Orcutt iterative procedure:
Step 1: Estimate the regression and obtain the
residuals.
Step 2: Estimate ρ by regressing the residuals
on their lagged terms.
Step 3: Transform the original variables into
starred variables using the ρ obtained in step 2.
Step 4: Run the regression again with the
transformed variables and obtain the residuals.
Step 5 and on: Continue repeating steps 2 to 4
until (stopping rule) the estimates of ρ from two
successive iterations differ by no more than some
preselected small value, such as 0.001.
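Steps 1 to 5 can be sketched as follows (cochrane_orcutt is a hypothetical NumPy implementation; the tolerance and simulated data are illustrative):

```python
import numpy as np

def cochrane_orcutt(y, X, tol=1e-3, max_iter=50):
    """Iterative Cochrane-Orcutt estimation.
    X must include a constant column; returns (beta, rho)."""
    y = np.asarray(y, dtype=float)
    X = np.asarray(X, dtype=float)
    rho = 0.0                          # first pass is plain OLS
    for _ in range(max_iter):
        # Steps 3-4: quasi-difference (dropping observation 1,
        # as Cochrane-Orcutt does) and re-estimate
        y_star = y[1:] - rho * y[:-1]
        X_star = X[1:] - rho * X[:-1]
        beta, *_ = np.linalg.lstsq(X_star, y_star, rcond=None)
        # Steps 1-2: residuals from the original variables,
        # then regress u_t on u_{t-1} to update rho
        u = y - X @ beta
        rho_new = (u[1:] @ u[:-1]) / (u[:-1] @ u[:-1])
        if abs(rho_new - rho) < tol:   # step 5 stopping rule
            rho = rho_new
            break
        rho = rho_new
    return beta, rho

# illustrative data: y = 1 + 2*x + u with AR(1) errors, rho = 0.6
rng = np.random.default_rng(5)
n, true_rho = 5_000, 0.6
x = rng.normal(size=n)
eps = rng.normal(size=n)
u = np.empty(n)
u[0] = eps[0]
for t in range(1, n):
    u[t] = true_rho * u[t - 1] + eps[t]
y = 1.0 + 2.0 * x + u

beta, rho_hat = cochrane_orcutt(y, np.column_stack([np.ones(n), x]))
print(np.round(beta, 2), round(rho_hat, 2))
```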

MUHAMMAD IRFAN HUSSIAN(39)
DR ANWAR UL HAQ ( 28)
AMMER UMER (MPA) (30)
CH AAMAR LASHARI (45)

Thank You