This appendix introduces the statistical and econometric tools most frequently encountered in macroeconomic research. It presupposes familiarity with basic probability theory (random variables, expectations, variances, conditional distributions) and elementary linear regression (OLS, $t$-statistics, $R^2$). The goal is to develop the tools needed to read empirical macroeconomics papers and to interpret quantitative claims critically.
D.1 Time-Series Fundamentals
Stationarity and Autocovariance
A time series $y_t$ is covariance-stationary if its mean, variance, and autocovariances are all finite and time-invariant:

$$E[y_t] = \mu, \qquad \operatorname{Var}(y_t) = \gamma_0, \qquad \operatorname{Cov}(y_t, y_{t-j}) = \gamma_j \quad \text{for all } t.$$
The autocorrelation function (ACF) is $\rho_j = \gamma_j / \gamma_0$. The partial autocorrelation function (PACF) at lag $j$ is the correlation between $y_t$ and $y_{t-j}$ after removing the linear dependence explained by $y_{t-1}, \ldots, y_{t-j+1}$.
An AR(1) process $y_t = \phi y_{t-1} + \varepsilon_t$, with $|\phi| < 1$ and $\varepsilon_t \sim \text{iid}(0, \sigma^2)$, is stationary with:

$$E[y_t] = 0, \qquad \operatorname{Var}(y_t) = \frac{\sigma^2}{1 - \phi^2}, \qquad \rho_j = \phi^j.$$
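These stationary moments can be checked by simulation. The sketch below (plain NumPy; the parameter values are illustrative) simulates a long AR(1) path and compares the sample variance and first autocorrelation with the formulas above.

```python
import numpy as np

# Simulate a long AR(1) path and compare sample moments with the
# theoretical values Var(y) = sigma^2 / (1 - phi^2) and rho_1 = phi.
rng = np.random.default_rng(0)
phi, sigma, T = 0.8, 1.0, 200_000

y = np.zeros(T)
eps = rng.normal(0.0, sigma, T)
for t in range(1, T):
    y[t] = phi * y[t - 1] + eps[t]

var_theory = sigma**2 / (1 - phi**2)
var_sample = y.var()
rho1_sample = np.corrcoef(y[1:], y[:-1])[0, 1]

print(var_sample, var_theory, rho1_sample)
```

With 200,000 observations the sample moments should lie close to their theoretical counterparts; with short macroeconomic samples the discrepancies are much larger.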
Unit Root Processes
A process has a unit root if $\phi = 1$ in the AR(1) above (a random walk) or, more generally, if the autoregressive polynomial in $\phi(L) y_t = \theta(L)\varepsilon_t$ has a root on the unit circle. Unit-root processes are non-stationary: their variance grows without bound and shocks have permanent effects. The Augmented Dickey–Fuller (ADF) test of $H_0\!: \rho = 0$ (unit root) against $H_1\!: \rho < 0$ estimates:

$$\Delta y_t = \alpha + \beta t + \rho\, y_{t-1} + \sum_{i=1}^{p} \gamma_i\, \Delta y_{t-i} + \varepsilon_t$$
and tests $H_0$ using the $t$-statistic on $\hat{\rho}$, which has a non-standard distribution under the null (critical values from Dickey and Fuller, 1979). The $p$ lags of $\Delta y_{t-i}$ control for serial correlation in $\Delta y_t$ and are chosen by AIC or BIC.
The Phillips–Perron (PP) test is a non-parametric alternative that corrects the test statistic for serial correlation in the errors without adding lagged differences. Both tests have low power in small samples: they often fail to reject the unit-root null even when the true process is stationary with a root close to one.
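The mechanics of the (non-augmented) Dickey–Fuller regression can be seen in a few lines of NumPy. This is a minimal sketch with simulated data, not a full ADF implementation: it omits the lag terms and the deterministic trend, and the critical values quoted in the comment are the standard constant-only Dickey–Fuller values.

```python
import numpy as np

def df_tstat(y):
    """Dickey-Fuller t-statistic: regress dy_t on [1, y_{t-1}] by OLS
    and return the t-statistic on the y_{t-1} coefficient (rho)."""
    dy = np.diff(y)
    X = np.column_stack([np.ones(len(dy)), y[:-1]])
    beta, *_ = np.linalg.lstsq(X, dy, rcond=None)
    resid = dy - X @ beta
    s2 = resid @ resid / (len(dy) - X.shape[1])
    se = np.sqrt(s2 * np.linalg.inv(X.T @ X)[1, 1])
    return beta[1] / se

rng = np.random.default_rng(1)
T = 1000
eps = rng.normal(size=T)

random_walk = np.cumsum(eps)      # unit root: rho = 0 in the DF regression
stationary = np.zeros(T)          # AR(1) with phi = 0.8 (root inside unit circle)
for t in range(1, T):
    stationary[t] = 0.8 * stationary[t - 1] + eps[t]

t_rw, t_st = df_tstat(random_walk), df_tstat(stationary)
print(t_rw, t_st)                 # compare with the 5% DF critical value of about -2.86
```

The stationary series produces a strongly negative statistic, while the random walk's statistic typically stays above the critical value, illustrating why the test fails to reject the null under a unit root.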
Cointegration
Two or more unit-root series are cointegrated if a linear combination of them is stationary. Cointegration implies a long-run stable relationship: the series share a common stochastic trend and deviations from the long-run relationship are transitory.
If $x_t \sim I(1)$ and $y_t \sim I(1)$ (integrated of order 1) and $y_t - \beta x_t \sim I(0)$ (stationary), then $y_t$ and $x_t$ are cointegrated with cointegrating vector $(1, -\beta)$. Economically, the money demand relation $m_t - p_t = \beta_y y_t - \beta_i i_t + u_t$ may be a cointegrating relationship: each component is non-stationary, but this specific linear combination is stationary, representing a stable long-run equilibrium.
The Engle–Granger two-step procedure: (1) regress $y_t$ on $x_t$ by OLS to obtain the residuals $\hat{u}_t$; (2) test whether $\hat{u}_t$ is stationary using an ADF test (with critical values adjusted for the fact that the cointegrating vector is estimated). The Johansen procedure tests for cointegration in a multivariate system using maximum likelihood and identifies all cointegrating vectors.
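The two steps can be sketched with simulated data. This is an illustrative NumPy version (the data-generating process and the residual-based unit-root statistic are simplified; a proper application would use Engle–Granger critical values from a library implementation).

```python
import numpy as np

def df_tstat(u):
    """Dickey-Fuller t-statistic on residuals (no constant needed:
    OLS residuals are mean zero by construction)."""
    du = np.diff(u)
    x = u[:-1]
    rho = (x @ du) / (x @ x)
    resid = du - rho * x
    s2 = resid @ resid / (len(du) - 1)
    return rho / np.sqrt(s2 / (x @ x))

rng = np.random.default_rng(11)
T = 1000
x = np.cumsum(rng.normal(size=T))        # I(1) common stochastic trend
y = 2.0 * x + rng.normal(size=T)         # cointegrated: y - 2x is stationary

# Step 1: OLS of y on x (with a constant) estimates the cointegrating coefficient.
X = np.column_stack([np.ones(T), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
u_hat = y - X @ beta

# Step 2: unit-root test on the residuals. A strongly negative statistic
# rejects "no cointegration" (Engle-Granger, not standard DF, critical values apply).
print(beta[1], df_tstat(u_hat))
```

Note the superconsistency of the first-stage OLS: with I(1) regressors the cointegrating coefficient converges at rate $T$ rather than $\sqrt{T}$, so $\hat\beta$ is very close to 2 even in moderate samples.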
D.2 Regression with Time-Series Data
OLS with Serially Correlated Errors
When the error term $u_t$ in $y_t = x_t'\beta + u_t$ is serially correlated, which is the rule rather than the exception in time-series data, OLS remains unbiased and consistent (under stationarity and ergodicity), but its conventional standard errors are invalid. The Newey–West estimator provides heteroskedasticity and autocorrelation consistent (HAC) standard errors:
$$\widehat{V}_{\text{NW}} = \hat{\Gamma}_0 + \sum_{j=1}^{m} w_j \left( \hat{\Gamma}_j + \hat{\Gamma}_j' \right),$$

where $\hat{\Gamma}_j = \frac{1}{T} \sum_{t=j+1}^{T} \hat{u}_t \hat{u}_{t-j}\, x_t x_{t-j}'$ and $w_j = 1 - \frac{j}{m+1}$ (Bartlett weights). The bandwidth $m$ is typically set to $m = \lfloor 0.75\, T^{1/3} \rfloor$ or by a similar data-dependent rule.
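A hand-rolled version of this estimator makes the sandwich structure explicit. The sketch below (illustrative data-generating process; in practice one would use a library implementation) shows how HAC standard errors widen relative to naive OLS standard errors when both the regressor and the error are persistent.

```python
import numpy as np

def newey_west_se(X, resid, m):
    """HAC standard errors with Bartlett weights w_j = 1 - j/(m+1):
    V = (X'X/T)^{-1} S (X'X/T)^{-1} / T, where S is the weighted sum of
    autocovariances of the scores x_t * u_t."""
    T = X.shape[0]
    scores = X * resid[:, None]
    S = scores.T @ scores / T                    # Gamma_0
    for j in range(1, m + 1):
        w = 1 - j / (m + 1)
        G = scores[j:].T @ scores[:-j] / T       # Gamma_j
        S += w * (G + G.T)
    A = np.linalg.inv(X.T @ X / T)
    return np.sqrt(np.diag(A @ S @ A / T))

rng = np.random.default_rng(12)
T = 2000
x = np.zeros(T); u = np.zeros(T)
for t in range(1, T):                            # persistent regressor and error
    x[t] = 0.7 * x[t - 1] + rng.normal()
    u[t] = 0.7 * u[t - 1] + rng.normal()
y = 1.0 + 0.5 * x + u

X = np.column_stack([np.ones(T), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
m = int(0.75 * T ** (1 / 3))                     # rule-of-thumb bandwidth

s2 = resid @ resid / (T - 2)
se_naive = np.sqrt(s2 * np.diag(np.linalg.inv(X.T @ X)))
se_hac = newey_west_se(X, resid, m)
print(se_naive[1], se_hac[1])                    # HAC s.e. is noticeably larger
```

With positively autocorrelated scores, the naive formula understates the true sampling variability, which is exactly the failure the HAC correction addresses.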
Instrumental Variables in Time Series
The IV estimator requires: (i) the instrument $z_t$ is correlated with the endogenous regressor (relevance: a first-stage $F$-statistic comfortably above 10 as a rule of thumb); (ii) $z_t$ is uncorrelated with the error (exclusion). In time series, finding valid instruments is challenging because all lagged variables are potentially endogenous. Narrative instruments (Romer–Romer monetary policy shocks, Ramey defense news) are constructed to be exogenous from first principles.
Regression Discontinuity in Macroeconomics
The sharp RD estimator identifies the local average treatment effect at the threshold $c$:

$$\tau_{\text{RD}} = \lim_{x \downarrow c} E[y_i \mid x_i = x] - \lim_{x \uparrow c} E[y_i \mid x_i = x],$$
estimated by local polynomial regression within a bandwidth $h$ around $c$. Bandwidth selection balances bias (too wide a window includes observations far from the threshold, where linearity may fail) against variance (too narrow a window leaves few observations). The MSE-optimal bandwidth of Imbens and Kalyanaraman (2012) is widely used.
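The local-linear version of this estimator fits a line on each side of the cutoff within the bandwidth and differences the two intercepts. The sketch below uses simulated data with a known jump of 2 at the cutoff (all parameter values illustrative; a fixed bandwidth is used rather than an optimal one).

```python
import numpy as np

rng = np.random.default_rng(3)
n, c, tau, h = 5000, 0.0, 2.0, 0.5

x = rng.uniform(-1, 1, n)                   # running variable
d = (x >= c).astype(float)                  # sharp assignment rule
y = 1.0 + 0.8 * x + tau * d + rng.normal(0, 0.5, n)

def local_linear_at_cutoff(side):
    """Local linear fit within the bandwidth on one side of the cutoff;
    the intercept is the fitted value of y at x = c."""
    mask = (np.abs(x - c) < h) & side
    X = np.column_stack([np.ones(mask.sum()), x[mask] - c])
    b, *_ = np.linalg.lstsq(X, y[mask], rcond=None)
    return b[0]

tau_hat = local_linear_at_cutoff(x >= c) - local_linear_at_cutoff(x < c)
print(tau_hat)   # should be close to the true jump of 2
```

Centering the running variable at $c$ makes the intercept of each regression equal to the boundary fit, so the treatment effect is simply the difference of the two intercepts.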
D.3 Spectral Analysis
Spectral analysis decomposes the variance of a time series across frequencies, identifying which periodicities are most important. The spectral density of a stationary process with autocovariances $\gamma_j$ is:

$$f(\omega) = \frac{1}{2\pi} \sum_{j=-\infty}^{\infty} \gamma_j e^{-i\omega j}, \qquad \omega \in [-\pi, \pi].$$
The spectral density gives the contribution to total variance of components with frequency $\omega$ (period $2\pi/\omega$). For an AR(1) with $\phi > 0$, $f(\omega) = \frac{\sigma^2}{2\pi}\, \frac{1}{1 - 2\phi\cos\omega + \phi^2}$: the spectrum is highest at low frequencies (long cycles), reflecting the positive autocorrelation. For white noise the spectrum is flat at $f(\omega) = \sigma^2 / 2\pi$: all frequencies contribute equally.
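Both properties can be verified numerically: integrating the spectral density over $[-\pi, \pi]$ recovers the total variance, and for $\phi > 0$ the low-frequency power dwarfs the high-frequency power. A short NumPy check (parameter values illustrative):

```python
import numpy as np

# Spectral density of an AR(1): f(w) = (sigma^2 / (2*pi)) / (1 - 2*phi*cos(w) + phi^2).
phi, sigma2 = 0.9, 1.0
w = np.linspace(1e-4, np.pi, 20_000)
f = (sigma2 / (2 * np.pi)) / (1 - 2 * phi * np.cos(w) + phi**2)

# Variance check: integrating f over [-pi, pi] recovers sigma^2 / (1 - phi^2).
dw = w[1] - w[0]
total_variance = 2 * f.sum() * dw        # symmetry: 2 * integral over (0, pi]
print(total_variance, sigma2 / (1 - phi**2))

# Positive autocorrelation concentrates power at low frequencies.
print(f[0] / f[-1])                      # low- vs high-frequency power
```

The same integral for white noise gives exactly $\sigma^2$, with the power spread uniformly across frequencies.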
In macroeconomics, spectral analysis identifies the “business cycle frequencies”: Hodrick and Prescott (1997) define the business cycle as fluctuations with periods of 6–32 quarters (frequencies corresponding to cycles of 1.5–8 years), motivating the HP filter which removes components outside this range.
The band-pass filter (Baxter and King, 1999) directly extracts components within a specified frequency band, using a moving average filter designed in the frequency domain. It is more flexible than the HP filter; rather than producing distorted estimates at the sample endpoints, it simply drops the first and last $K$ observations, where $K$ is the truncation length of the moving average.
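The Baxter–King weights are a truncated version of the ideal band-pass filter's weights, adjusted so they sum to zero (which removes unit roots and deterministic trends). The sketch below constructs the standard 6–32 quarter weights with $K = 12$ (a simplified illustration; library implementations handle the details of unbalanced variants).

```python
import numpy as np

def bk_weights(low_period=6, high_period=32, K=12):
    """Truncated ideal band-pass weights (Baxter-King), adjusted to sum to zero."""
    wl, wh = 2 * np.pi / high_period, 2 * np.pi / low_period
    j = np.arange(1, K + 1)
    b = np.concatenate([[(wh - wl) / np.pi],
                        (np.sin(j * wh) - np.sin(j * wl)) / (np.pi * j)])
    b = np.concatenate([b[:0:-1], b])    # symmetric: b_{-K}, ..., b_0, ..., b_K
    return b - b.mean()                  # force sum(b) = 0 (removes unit roots)

def bk_filter(y, **kw):
    b = bk_weights(**kw)
    K = (len(b) - 1) // 2
    # Two-sided moving average; the first and last K observations are lost.
    return np.convolve(y, b, mode="valid"), K

rng = np.random.default_rng(4)
y = np.cumsum(rng.normal(size=400))      # random-walk stand-in for log GDP
cycle, K = bk_filter(y)
print(len(cycle), K)                     # 400 - 2K observations survive
```

Because the weights sum to zero, the filter maps any constant (and any linear trend, by symmetry) to zero, which is why it can be applied directly to integrated series.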
D.4 Decomposing Trend and Cycle
Beyond the HP filter (developed in Chapter 6), several approaches to decomposing macroeconomic series into trend and cyclical components have been proposed.
Beveridge–Nelson Decomposition
The Beveridge–Nelson (BN) decomposition (Beveridge and Nelson, 1981) decomposes an integrated series into a random-walk trend and a stationary cyclical component:

$$y_t = \tau_t + c_t,$$
where $\tau_t$ is the "permanent component": the value the series would converge to, net of deterministic drift, if no further shocks arrived:

$$\tau_t = \lim_{h \to \infty} \left( E_t[y_{t+h}] - h\mu \right),$$

where $\mu$ is the average growth rate,
and $c_t = y_t - \tau_t$ is the transitory "cyclical" component. Under the BN decomposition, trend and cycle innovations are perfectly correlated (they are driven by the same shock), which typically yields a volatile, jagged trend and a small, short-lived cycle: the opposite of what one might naively expect.
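When the first difference follows an AR(1), the BN trend has a closed form: $\tau_t = y_t + \frac{\phi}{1-\phi}(\Delta y_t - \mu)$, the cumulated expected future growth in excess of trend growth. The sketch below (illustrative parameters, treated as known rather than estimated) verifies the volatile-trend property.

```python
import numpy as np

# BN decomposition when dy_t - mu = phi (dy_{t-1} - mu) + eps_t:
#   tau_t = y_t + phi / (1 - phi) * (dy_t - mu)
# Parameters are assumed known here; in practice they are estimated from an ARMA fit.
rng = np.random.default_rng(5)
mu, phi, T = 0.5, 0.4, 500

dy = np.zeros(T)
for t in range(1, T):
    dy[t] = mu + phi * (dy[t - 1] - mu) + rng.normal()
y = np.cumsum(dy)

tau = y + (phi / (1 - phi)) * (dy - mu)     # BN "permanent component"
cycle = y - tau                             # = -(phi/(1-phi)) * (dy - mu), stationary

# Trend innovations (std = sigma/(1-phi)) are MORE volatile than the growth
# rate of the series itself: the BN trend is jagged, the cycle small.
print(np.std(np.diff(tau)), np.std(np.diff(y)), np.std(cycle))
```

Algebra confirms what the simulation shows: $\Delta\tau_t = \mu + \varepsilon_t/(1-\phi)$, so positive persistence in growth ($\phi > 0$) amplifies the trend innovation relative to the one-period shock.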
Unobserved Components Models
An unobserved components (UC) model treats the trend and cycle as separate stochastic processes and estimates both simultaneously by the Kalman filter:

$$y_t = \tau_t + c_t, \qquad \tau_t = \mu + \tau_{t-1} + \eta_t, \qquad c_t = \phi_1 c_{t-1} + \phi_2 c_{t-2} + \epsilon_t.$$
The signal-to-noise ratio $q = \sigma_\eta^2 / \sigma_\epsilon^2$ governs the smoothness of the trend: small $q$ implies a smooth trend (the HP filter with large $\lambda$ is a special case), while large $q$ implies a volatile trend. Watson (1986) and Clark (1987) estimated UC models for U.S. real GDP, finding that the business cycle component explains a substantial share of output variance.
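The simplest UC model is the local level model (random-walk trend plus white-noise cycle), for which the Kalman filter reduces to a few lines. The sketch below (variances assumed known rather than estimated; illustrative data) shows how the signal-to-noise ratio $q$ controls trend smoothness.

```python
import numpy as np

def local_level_kalman(y, q):
    """Kalman filter for y_t = tau_t + eps_t, tau_t = tau_{t-1} + eta_t,
    with signal-to-noise ratio q = var(eta)/var(eps), var(eps) normalized to 1."""
    tau, P = y[0], 1e6                   # diffuse initialization
    trend = np.empty(len(y))
    for t, obs in enumerate(y):
        P = P + q                        # predict: trend variance grows by var(eta)
        K = P / (P + 1.0)                # Kalman gain
        tau = tau + K * (obs - tau)      # update with the prediction error
        P = (1 - K) * P
        trend[t] = tau
    return trend

rng = np.random.default_rng(6)
T = 300
tau_true = np.cumsum(0.1 * rng.normal(size=T))   # slowly moving true trend
y = tau_true + rng.normal(size=T)                # noisy observations

trend_smooth = local_level_kalman(y, q=0.01)     # small q: smooth trend
trend_rough = local_level_kalman(y, q=10.0)      # large q: trend chases the data
print(np.std(np.diff(trend_smooth)), np.std(np.diff(trend_rough)))
```

With $q$ matched to the true data-generating process, the filtered trend tracks the unobserved trend far better than the raw series does; with $q$ too large, the "trend" simply absorbs the noise.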
D.5 Panel Data Econometrics
A balanced panel consists of observations on $N$ cross-sectional units (countries, firms, households) over $T$ time periods. The general linear panel model:

$$y_{it} = x_{it}'\beta + \alpha_i + \varepsilon_{it}, \qquad i = 1, \ldots, N, \quad t = 1, \ldots, T.$$
Fixed effects (FE) estimation treats the $\alpha_i$ as free parameters to be estimated (by including unit dummy variables or by the within-transformation). FE controls for all time-invariant heterogeneity across units, so identification comes from within-unit variation over time. The FE estimator is consistent as $N \to \infty$ for fixed $T$ under strict exogeneity, but suffers from Nickell bias when $T$ is small: including lags of $y_{it}$ as regressors induces a downward bias of order $1/T$.
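The value of the within-transformation is easiest to see when the unit effects are correlated with the regressor, so that pooled OLS is biased while FE is not. A small simulated illustration (all parameter values are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(7)
N, T, beta = 50, 10, 1.5

alpha = rng.normal(size=N)                        # unit effects
x = alpha[:, None] + rng.normal(size=(N, T))      # x_it correlated with alpha_i
y = beta * x + alpha[:, None] + rng.normal(size=(N, T))

# (1) Within transformation: demean x and y by unit, then pooled OLS.
x_w = x - x.mean(axis=1, keepdims=True)
y_w = y - y.mean(axis=1, keepdims=True)
beta_fe = (x_w * y_w).sum() / (x_w ** 2).sum()

# (2) Pooled OLS ignoring alpha_i is biased upward here, because x loads
# positively on the omitted unit effect.
xc, yc = x.ravel() - x.mean(), y.ravel() - y.mean()
beta_pooled = (xc * yc).sum() / (xc ** 2).sum()

print(beta_fe, beta_pooled)   # FE near 1.5; pooled OLS biased away from it
```

Demeaning by unit is numerically identical to including $N$ unit dummies, which is why the within-estimator is the standard computational route when $N$ is large.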
Random effects (RE) estimation treats $\alpha_i$ as drawn from a distribution and estimates $\beta$ by generalized least squares. RE is more efficient than FE if $\alpha_i$ is uncorrelated with $x_{it}$, but inconsistent if not. The Hausman test distinguishes FE and RE: under $H_0$ (no correlation between $\alpha_i$ and $x_{it}$), both are consistent but RE is efficient; under $H_1$, FE is consistent but RE is not. The Hausman statistic:

$$H = (\hat{\beta}_{FE} - \hat{\beta}_{RE})' \left[ \operatorname{Var}(\hat{\beta}_{FE}) - \operatorname{Var}(\hat{\beta}_{RE}) \right]^{-1} (\hat{\beta}_{FE} - \hat{\beta}_{RE}) \;\sim\; \chi^2_k \quad \text{under } H_0.$$
Two-way fixed effects (TWFE) includes both unit fixed effects $\alpha_i$ and time fixed effects $\delta_t$. With staggered treatment timing, the TWFE estimator can produce misleading, even sign-reversed, estimates because some comparisons receive negative weights when already-treated units serve as controls (Callaway and Sant'Anna, 2021; Goodman-Bacon, 2021). Modern DiD methods address this by estimating group-time average treatment effects and aggregating them appropriately.
D.6 Measuring Uncertainty and Risk
Macroeconomic models increasingly incorporate uncertainty as a state variable, requiring empirical measures. Three approaches are standard.
Cross-sectional dispersion measures disagreement among forecasters or across firms. The standard deviation of forecasts in the Survey of Professional Forecasters (SPF) or the Federal Reserve’s Greenbook is used as a proxy for aggregate uncertainty. Higher dispersion suggests that agents disagree about the state of the economy, consistent with higher effective uncertainty.
Realized volatility measures the ex-post variance of a variable computed from high-frequency data; implied volatility is its forward-looking counterpart. The VIX (CBOE Volatility Index) is the market-implied standard deviation of S&P 500 returns over the next 30 days, derived from options prices. Bloom (2009) uses stock-market volatility measures of this kind as a proxy for macroeconomic uncertainty in his study of investment responses to uncertainty shocks.
Text-based measures use natural language processing applied to newspapers, central bank communications, or firm earnings calls to construct indices of economic policy uncertainty. Baker, Bloom, and Davis (2016) construct the Economic Policy Uncertainty (EPU) index from: the frequency of newspaper articles mentioning economic uncertainty and policy; the number of expiring tax provisions; and the dispersion of economic forecasts. The EPU index has become a widely-used empirical proxy for policy uncertainty.
D.7 Bayesian Statistics for Macroeconomics
Bayesian inference updates a prior belief $p(\theta)$ about parameters $\theta$ using observed data $Y$ to obtain a posterior belief $p(\theta \mid Y) \propto p(Y \mid \theta)\, p(\theta)$.
Prior distributions for macroeconomic parameters are typically chosen to: (i) restrict parameters to theoretically plausible ranges (e.g., a persistence parameter in $(0,1)$: Beta distribution; a shock standard deviation $\sigma > 0$: Inverse-Gamma distribution); (ii) center near values from microeconomic studies; (iii) be relatively diffuse to let the data speak.
The Metropolis–Hastings (MH) algorithm is the workhorse for sampling from posterior distributions that cannot be evaluated analytically. At iteration $s$:

1. Propose $\theta^*$ from a proposal distribution $q(\theta^* \mid \theta^{(s-1)})$.
2. Compute the acceptance probability $\alpha = \min\!\left(1, \dfrac{p(\theta^* \mid Y)\, q(\theta^{(s-1)} \mid \theta^*)}{p(\theta^{(s-1)} \mid Y)\, q(\theta^* \mid \theta^{(s-1)})}\right)$.
3. Set $\theta^{(s)} = \theta^*$ with probability $\alpha$; otherwise set $\theta^{(s)} = \theta^{(s-1)}$.
After a burn-in period, the chain yields samples from the posterior $p(\theta \mid Y)$.
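The steps above can be sketched for a toy conjugate problem where the exact posterior is known, so the chain can be validated: a normal mean with known unit variance and an assumed $N(0, 10^2)$ prior, sampled with a symmetric random-walk proposal (so the $q$ terms in the acceptance ratio cancel). All values below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(8)
data = rng.normal(2.0, 1.0, size=100)        # simulated observations

def log_post(theta):
    # log p(theta | Y) up to a constant: normal likelihood + N(0, 100) prior
    return -0.5 * np.sum((data - theta) ** 2) - 0.5 * theta**2 / 100.0

n_iter, step = 20_000, 0.5
theta = 0.0
chain = np.empty(n_iter)
for s in range(n_iter):
    proposal = theta + step * rng.normal()   # symmetric proposal: q terms cancel
    log_alpha = log_post(proposal) - log_post(theta)
    if np.log(rng.uniform()) < log_alpha:    # accept with probability min(1, ratio)
        theta = proposal
    chain[s] = theta

draws = chain[5000:]                         # discard burn-in
post_var = 1 / (len(data) / 1.0 + 1 / 100.0) # analytic conjugate posterior
post_mean = post_var * data.sum() / 1.0
print(draws.mean(), post_mean)
```

Working with log densities avoids numerical underflow, and comparing the chain's moments with the analytic posterior is a useful sanity check before applying the sampler to models without closed forms.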
Model comparison in a Bayesian framework uses the marginal likelihood (or evidence):

$$p(Y \mid M_i) = \int p(Y \mid \theta, M_i)\, p(\theta \mid M_i)\, d\theta.$$
The Bayes factor for model $M_1$ against $M_2$ is $BF_{12} = p(Y \mid M_1) / p(Y \mid M_2)$. A Bayes factor above 10 is conventionally considered strong evidence for $M_1$. The Deviance Information Criterion (DIC) and the Watanabe–Akaike Information Criterion (WAIC) are alternatives to the Bayes factor for model comparison that are less sensitive to prior specification.
See also Appendix B (Research Methods) for applications of these tools in macroeconomic research.