class: center, middle, inverse, title-slide

.title[
# Econometrics - Lecture 6
]
.subtitle[
## Introduction to Time Series
]
.author[
### Jonas Björnerstedt
]
.date[
### 2024-11-18
]

---

## Time series

Part IV. Regression Analysis of Economic Time Series Data

- Ch. 15. _Introduction to Time Series Regression and Forecasting_
    - Time series concepts (today)
    - Testing and forecasting
- Ch. 16. Estimation of Dynamic Causal Effects
- Ch. 17. Additional Topics in Time Series Regression

* Focus on intuition

---

## Lecture Content

- Analysis of `\(Y_i\)`
    - Additional regressors `\(X_{ki}\)` next lecture
- Autoregressive models
- Stationarity

Chapter 15.1 - 15.3, 15.6 (first part)

---

## Time series data

- Assumption of random draws not appropriate
    - Same individual, at different points in time
    - Unobserved variables can have persistence
- Errors are
    - mean zero
    - not correlated with regressors
    - correlated over time
        - _Autocorrelation_
- `\(t\)` subscript rather than `\(i\)`

---

## Plotting autocorrelation

<img src="Figures/autocorr_pos.PNG" alt="Drawing" style="width: 350px;"/>
<img src="Figures/autocorr_neg.PNG" alt="Drawing" style="width: 350px;"/>
<img src="Figures/autocorr_none.PNG" alt="Drawing" style="width: 350px;"/>

---

## Autocorrelation

- We will assume that variances and covariances are independent of time
    - Covariance stationary
    - Variance given by `\(var(Y_t) = var(Y_{s})\)` for all `\(t\)` and `\(s\)`
- Definition of the `\(j\)`-th _autocovariance_: `\(cov(Y_t,Y_{t-j})\)`
- Autocorrelation coefficient `\(\rho_j\)`
`$$\rho_j = \frac{cov(Y_t,Y_{t-j})}{var(Y_t)}$$`
- Note the unusual notation on p. 575

---

## 15.3 Autoregressions

- In an autoregressive model past values `\(Y_{t-s}\)` affect `\(Y_{t}\)`
- First order AR(1)
`$$Y_t = \beta_0 + \beta_1 Y_{t-1} + u_t$$`
- Second order AR(2)
`$$Y_t = \beta_0 + \beta_1 Y_{t-1} + \beta_2 Y_{t-2} + u_t$$`

---

## Stationarity

- (Strict) stationarity
    - Distribution of `\(Y_t\)` does not depend on time `\(t\)`
- Covariance stationarity
    - Variance `\(\mathrm{Var}(Y_t) = 
\sigma^2_Y\)` and covariance `\(\mathrm{Cov}(Y_t,Y_{t-s})\)` do not depend on time `\(t\)`

---

## [AR(1) and AR(2) dynamic processes <sup> 🔗 </sup>](http://192.121.208.72:3939/time_series06-figs.Rmd#section-dynamic-process)

- `\(Y_t\)` depends on previous values
- Here `\(Y_t = 0.5 Y_{t-1}\)`

![](time_series06_files/figure-html/unnamed-chunk-2-1.png)<!-- -->

???

* Describe AR(1)
* Noise
* Zero, positive and negative autocorrelation
* Stationarity

---

## [Random walk <sup> 🔗 </sup>](http://192.121.208.72:3939/time_series06-figs.Rmd#section-random-walk)

![](time_series06_files/figure-html/unnamed-chunk-3-1.png)<!-- -->

---

## [AR(1) Stationarity and drift <sup> 🔗 </sup>](http://192.121.208.72:3939/time_series06-figs.Rmd#section-stationarity)

![](time_series06_files/figure-html/unnamed-chunk-4-1.png)<!-- -->

---

## [Stationarity and AR(2) <sup> 🔗 </sup>](http://192.121.208.72:3939/time_series06-figs.Rmd#section-ar2)

![](time_series06_files/figure-html/unnamed-chunk-5-1.png)<!-- -->

---

## What is a trend?

- Autoregressive AR(1)
`$$Y_t = \beta_0 + \beta_1 Y_{t-1} + u_t$$`
- Deterministic trend
`$$Y_t = \beta_0 + \alpha t + u_t$$`
- Stochastic trend
`$$Y_t = Y_{t-1} + u_t$$`
- Stochastic trend with drift
`$$Y_t = \beta_0 + Y_{t-1} + u_t$$`

---

## Problems caused by stochastic trends

- What happens in an AR(1) as `\(\beta_1 \rightarrow 1\)`?
    - Bias
    - Nonnormal distribution
- Spurious regression
    - It looks like there are deterministic time trends
    - With repeated sampling or a longer time span the effect disappears
- Forecasts differ between a deterministic time trend and a stochastic trend with drift

---

## Autocorrelation: correlation over time

- Previously, we have assumed independent draws
    - `\(u_{s}\)` and `\(u_{t}\)` are independent.
- Assume instead (sequential) exogeneity:
`$$E\left(u_{t}\left| Y_{t-1}, Y_{t-2},\ldots\right.\right)=0$$`
- Distribution does not change over time
    - Stationarity: Distribution of `\(Y_t\)` does not change over time
    - Ergodicity: Correlation between observations far apart goes to zero
- Large outliers are unlikely and no perfect multicollinearity (many observations)

---

## [Plotting AR(1) <sup> 🔗 </sup>](http://192.121.208.72:3939/time_series06-figs.Rmd#section-ar1-plot)

.pull-left[
#### Plot over time

![](time_series06_files/figure-html/unnamed-chunk-6-1.png)<!-- -->
]
.pull-right[
#### Plot `\(Y_{t}\)` against `\(Y_{t-1}\)`

![](time_series06_files/figure-html/unnamed-chunk-7-1.png)<!-- -->
]

---

## [AR(1) Bias <sup> 🔗 </sup>](http://192.121.208.72:3939/time_series06-figs.Rmd#section-ar1-bias)

- The estimate of the lag parameter is biased in small samples
- The estimator is consistent: it approaches the true value with many observations

![](time_series06_files/figure-html/unnamed-chunk-8-1.png)<!-- -->

---

## Estimation

- With a large sample size, the estimate is close to the true value

```r
y = ar_sim(30000, 0.5)
lm(y ~ lag(y))
```

```
## 
## Call:
## lm(formula = y ~ lag(y))
## 
## Coefficients:
## (Intercept)       lag(y)  
##   -0.001441     0.506487
```

---

## Forecasts

Given an estimate, we can forecast the value of `\(Y_{T+1}\)` in period `\(T\)`

`$$\hat Y_{T+1|T} = \hat\beta_0 + \hat\beta_1 Y_{T}$$`

- Predict in the dataset

---

## Forecast errors

- The forecast error can be determined at time `\(T+1\)` by comparing with the actual value `\(Y_{T+1}\)`
- Root Mean Square Forecast Error (RMSFE)
`$$\mathrm{RMSFE}= \sqrt{E\left[\left(Y_{T+1} - \hat Y_{T+1|T}\right)^2\right]}$$`
- Two sources of error
    1. Uncertainty in the estimates `\(\hat\beta_i\)`
    1. Uncertainty due to the future error `\(u_{T+1}\)`
- Estimated by the standard error of the regression (SER)

---

## Next lecture

- Ch. 15. Introduction to Time Series Regression and Forecasting (testing)
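
---

## Example: simulating, estimating and forecasting an AR(1)

The steps in this lecture can be sketched end to end in R: simulate an AR(1), estimate it by OLS, and form the one-step-ahead forecast `\(\hat Y_{T+1|T}\)`. The `ar_sim` function below is an assumed implementation (the slides use a function of that name without showing its code), and the lag is built explicitly rather than with `lag()` to avoid depending on dplyr.

```r
# Assumed implementation of the simulator used in the slides:
# Y_t = beta0 + beta1 * Y_{t-1} + u_t with standard normal errors u_t
ar_sim <- function(n, beta1, beta0 = 0) {
  u <- rnorm(n)
  y <- numeric(n)
  y[1] <- u[1]
  for (t in 2:n) y[t] <- beta0 + beta1 * y[t - 1] + u[t]
  y
}

set.seed(1)
y <- ar_sim(1000, 0.5)

# OLS of Y_t on Y_{t-1}: build the lagged regressor explicitly
y_lag <- c(NA, y[-length(y)])
fit <- lm(y ~ y_lag)
b <- coef(fit)

# One-step-ahead forecast: Yhat_{T+1|T} = b0 + b1 * Y_T
y_fc <- b[1] + b[2] * y[length(y)]

# The SER approximates the RMSFE when estimation uncertainty is small
ser <- summary(fit)$sigma
```

With 1000 observations the slope estimate should be close to the true value 0.5, and the SER close to the error standard deviation of 1.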