Anthony Garratt, Kevin Lee, M. Hashem Pesaran, and Yongcheol Shin
- Published in print:
- 2006
- Published Online:
- September 2006
- ISBN:
- 9780199296859
- eISBN:
- 9780191603853
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/0199296855.003.0006
- Subject:
- Economics and Finance, Econometrics
This chapter briefly reviews the econometric methods needed for the empirical analysis of cointegrating VAR models and the associated impulse response functions, including new material (on the conditions under which error-correction models are mean-reverting, for example) that is particularly useful in practical macroeconometric modelling.
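For orientation, the vector error-correction (cointegrating VAR) form underlying the methods reviewed here can be written in the standard textbook way (this is generic notation, not a formula quoted from the chapter):

\[
\Delta y_t = \alpha\beta' y_{t-1} + \sum_{i=1}^{p-1} \Gamma_i \Delta y_{t-i} + \mu + \varepsilon_t ,
\]

where the r columns of \(\beta\) are the cointegrating vectors and \(\alpha\) contains the adjustment coefficients. The error-correction terms \(\beta' y_t\) are mean-reverting when the adjustment is strong enough to offset the unit roots — the kind of condition the chapter makes precise — and impulse response functions follow from the moving-average representation implied by this system.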
Anindya Banerjee, Juan J. Dolado, John W. Galbraith, and David F. Hendry
- Published in print:
- 1993
- Published Online:
- November 2003
- ISBN:
- 9780198288107
- eISBN:
- 9780191595899
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/0198288107.003.0002
- Subject:
- Economics and Finance, Econometrics
The focus in this chapter is on the properties of linear autoregressive‐distributed lag (ADL) models for stationary data processes, in order to understand later transformations in non‐stationary models. Various equivalent transformations of ADL models are considered, especially the error‐correction, Bewley, and Bårdsen forms, and the estimation of long‐run multipliers (and their variances) from these models is discussed. The role of expectational variables in inference about long‐run multipliers is also investigated, and potential problems are shown to be related to the general issue of the absence of weak exogeneity for the parameters of interest.
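As a concrete reference point for the transformations discussed (a standard derivation supplied for orientation, not text from the chapter), take the ADL(1,1) model

\[
y_t = \alpha + \phi y_{t-1} + \beta_0 x_t + \beta_1 x_{t-1} + u_t , \qquad |\phi| < 1 .
\]

Its error-correction form is

\[
\Delta y_t = \alpha + \beta_0 \Delta x_t - (1-\phi)\bigl(y_{t-1} - \theta x_{t-1}\bigr) + u_t ,
\qquad \theta = \frac{\beta_0 + \beta_1}{1-\phi} ,
\]

with \(\theta\) the long-run multiplier, while the Bewley transform rewrites the same model as

\[
y_t = \frac{\alpha}{1-\phi} + \theta x_t - \frac{\phi}{1-\phi}\,\Delta y_t - \frac{\beta_1}{1-\phi}\,\Delta x_t + \frac{u_t}{1-\phi} ,
\]

so that \(\theta\) and its standard error can be read directly off the coefficient on \(x_t\) (estimated by instrumental variables, since \(\Delta y_t\) is endogenous).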
Søren Johansen
- Published in print:
- 1995
- Published Online:
- November 2003
- ISBN:
- 9780198774501
- eISBN:
- 9780191596476
- Item type:
- book
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/0198774508.001.0001
- Subject:
- Economics and Finance, Econometrics
This monograph is concerned with the statistical analysis of multivariate systems of non‐stationary time series of type I(1). It applies the concepts of cointegration and common trends in the framework of the Gaussian vector autoregressive model. The main result on the structure of cointegrated processes as defined by the error correction model is Granger's representation theorem. The statistical results include derivation of the trace test for cointegrating rank, tests on cointegrating relations, and tests on adjustment coefficients, together with their asymptotic distributions.
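In practice, the trace test developed in this monograph is run with standard software. A minimal sketch using the statsmodels implementation of Johansen's procedure (the simulated data and variable names are illustrative, not from the book):

```python
import numpy as np
from statsmodels.tsa.vector_ar.vecm import coint_johansen

rng = np.random.default_rng(0)
T = 500
# Two I(1) series driven by one common stochastic trend,
# so the true cointegrating rank is 1.
trend = np.cumsum(rng.normal(size=T))
y1 = trend + rng.normal(size=T)
y2 = 0.5 * trend + rng.normal(size=T)
data = np.column_stack([y1, y2])

# det_order=0 includes a constant; k_ar_diff=1 uses one lagged difference.
result = coint_johansen(data, det_order=0, k_ar_diff=1)
print("trace statistics:", result.lr1)          # tests of r = 0 and r <= 1
print("95% critical values:", result.cvt[:, 1])
```

The cointegrating rank is chosen by comparing each trace statistic to its critical value, rejecting r = 0 but not r <= 1 in this example.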
Michio Hatanaka
- Published in print:
- 1996
- Published Online:
- November 2003
- ISBN:
- 9780198773535
- eISBN:
- 9780191596360
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/0198773536.003.0012
- Subject:
- Economics and Finance, Econometrics
This chapter discusses co-integration and the Granger representation theorem. The theoretical structure of the Granger representation theorem is illustrated, with economic interpretation, by a bivariate process. The economic error-correction model is used to show the type of long-run relationships that can be dealt with by co-integration analysis. The recovery of the parameters of the economic error-correction model from those of the statistical error-correction model is also presented.
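A bivariate error-correction system of the kind used for such illustrations (standard notation, supplied here for orientation rather than copied from the chapter):

\[
\Delta y_t = \alpha_1 \,(y_{t-1} - \beta x_{t-1}) + \varepsilon_{1t}, \qquad
\Delta x_t = \alpha_2 \,(y_{t-1} - \beta x_{t-1}) + \varepsilon_{2t},
\]

where \(y_t - \beta x_t\) is the stationary long-run relation and \(\alpha_1, \alpha_2\) are adjustment coefficients; Granger's representation theorem connects this error-correction form with a common-trend moving-average representation of the levels.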
Anthony Garratt, Kevin Lee, M. Hashem Pesaran, and Yongcheol Shin
- Published in print:
- 2006
- Published Online:
- September 2006
- ISBN:
- 9780199296859
- eISBN:
- 9780191603853
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/0199296855.003.0003
- Subject:
- Economics and Finance, Econometrics
This chapter describes a framework for macroeconometric modelling, which draws out the links with economic theory relating to the long run and with theory relating to the short run. It elaborates a modelling strategy that can be employed to accommodate directly the theory of the long run, and notes the ways in which short-run theory can also be accommodated in national and global models. Recent literature on modelling short-run dynamics is reviewed, highlighting the difficulties in obtaining consensus on appropriate short-run restrictions and commenting on the approaches taken in the literature in examining policy shocks in general, and monetary policy in particular.
Anindya Banerjee and Massimiliano Marcellino
- Published in print:
- 2009
- Published Online:
- September 2009
- ISBN:
- 9780199237197
- eISBN:
- 9780191717314
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199237197.003.0009
- Subject:
- Economics and Finance, Econometrics
This chapter brings together several important strands of the econometrics literature: error-correction, cointegration, and dynamic factor models. It introduces the Factor-augmented Error Correction Model (FECM), where the factors estimated from a large set of variables in levels are jointly modelled with a few key economic variables of interest. With respect to the standard ECM, the FECM protects, at least in part, against omitted variable bias and the dependence of cointegration analysis on the specific limited set of variables under analysis. In some cases it may also be a refinement of the standard Dynamic Factor Model, since it allows the inclusion of error correction terms in the equations and, by allowing for cointegration, prevents the errors from being non-invertible moving average processes. In addition, the FECM is a natural generalization of the factor-augmented VARs (FAVARs) considered by Bernanke, Boivin and Eliasz (2005), inter alia, which are specified in first differences and are therefore misspecified in the presence of cointegration. The FECM has a vast range of applicability. A set of Monte Carlo experiments and two detailed empirical examples highlight its merits in finite samples relative to standard ECM and FAVAR models. The analysis is conducted primarily within an in-sample framework, although the out-of-sample implications are also explored.
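Schematically (a stylized rendering of the model class rather than the authors' exact notation), with \(x_t\) a large panel admitting the factor structure \(x_{it} = \lambda_i' F_t + e_{it}\), the FECM jointly models the stacked vector of factors and variables of interest:

\[
\Delta \begin{pmatrix} F_t \\ y_t \end{pmatrix}
= \alpha \beta' \begin{pmatrix} F_{t-1} \\ y_{t-1} \end{pmatrix}
+ \sum_{i=1}^{p-1} \Gamma_i\, \Delta \begin{pmatrix} F_{t-i} \\ y_{t-i} \end{pmatrix}
+ \varepsilon_t ,
\]

so that cointegration between \(y_t\) and the common trends carried by the factors enters through the error-correction terms \(\beta' (F_{t-1}', y_{t-1}')'\).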
Søren Johansen
- Published in print:
- 1995
- Published Online:
- November 2003
- ISBN:
- 9780198774501
- eISBN:
- 9780191596476
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/0198774508.003.0004
- Subject:
- Economics and Finance, Econometrics
This chapter contains the mathematical and algebraic results needed to understand the properties of I(1) and I(2) processes generated by autoregressive and moving average models. The basic result is Granger's representation theorem, which gives necessary and sufficient conditions on the coefficients of the autoregressive model for the process to be integrated of order 1 or 2. We introduce the error correction model for I(1) and I(2) processes.
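In the I(1) case the theorem's content can be summarized as follows (the canonical statement, included here as a pointer rather than a quotation): if \(\Pi = \alpha\beta'\) has reduced rank r and \(\alpha_\perp' \Gamma \beta_\perp\) is nonsingular (with \(\Gamma\) the mean-lag matrix of the autoregression), the process has the moving-average form

\[
y_t = C \sum_{i=1}^{t} \varepsilon_i + C^*(L)\,\varepsilon_t + A , \qquad
C = \beta_\perp \bigl(\alpha_\perp' \Gamma \beta_\perp\bigr)^{-1} \alpha_\perp' ,
\]

so that \(\beta' y_t\) is stationary; when the nonsingularity condition fails, the process is integrated of order 2 and the chapter's I(2) results apply.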
David F. Hendry, J. E. H. Davidson, F. Srba, and S. Yeo
- Published in print:
- 2000
- Published Online:
- November 2003
- ISBN:
- 9780198293545
- eISBN:
- 9780191596391
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/0198293542.003.0009
- Subject:
- Economics and Finance, Econometrics
Simple time‐series representations dominated quarterly permanent‐income/life‐cycle models of consumption in fit and predictive accuracy. However, an ‘error‐correction’ model (ECM, using the log consumption/income ratio) reconciled both the theories and the evidence and, treated as the DGP, explained the connection between the time‐series and econometric equations. Moreover, the ECM class was shown to have good properties in a pilot Monte Carlo study. While substantively focussed on modelling aggregate consumers’ expenditure, the study addresses key methodological issues, including modelling strategies, parameter constancy, collinearity, seasonality, and encompassing (the explanation of other models’ results). Augmented by inflation, a constant model was developed (since known as DHSY), which predicted the first half of the 1970s.
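A stylized version of the equation class developed there (a paraphrase of the DHSY form using four-quarter differences of logs, not the fitted equation itself):

\[
\Delta_4 c_t = \beta_1 \Delta_4 y_t + \beta_2 \Delta_1 \Delta_4 y_t
+ \beta_3 \,(c - y)_{t-4} + \beta_4 \Delta_4 p_t + \beta_5 \Delta_1 \Delta_4 p_t + \varepsilon_t ,
\]

where \(c\), \(y\), and \(p\) are the logs of consumption, income, and the price level, and the error-correction term \((c - y)_{t-4}\) is the lagged log consumption/income ratio.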
Søren Johansen
- Published in print:
- 1995
- Published Online:
- November 2003
- ISBN:
- 9780198774501
- eISBN:
- 9780191596476
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/0198774508.003.0005
- Subject:
- Economics and Finance, Econometrics
We define the basic reduced form error correction model for I(1) variables, where cointegration is modelled in terms of a reduced rank hypothesis on the impact matrix. This defines the cointegrating vectors and adjustment coefficients. A discussion of linear hypotheses on the cointegrating relations and the identification problem is given. We also discuss hypotheses on the adjustment coefficients and the hypothesis of Granger non‐causality. The deterministic terms give rise to a number of models describing different properties of the process.
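In this setting the model and the reduced rank hypothesis take the standard form (given here for reference, not quoted from the chapter):

\[
\Delta y_t = \Pi y_{t-1} + \sum_{i=1}^{k-1} \Gamma_i \Delta y_{t-i} + \Phi D_t + \varepsilon_t ,
\qquad H(r):\; \Pi = \alpha\beta' ,
\]

with \(\alpha\) and \(\beta\) of dimension \(p \times r\) and \(D_t\) collecting the deterministic terms; linear hypotheses on the cointegrating relations then take forms such as \(\beta = H\varphi\) (the same restrictions imposed on all relations) or specifications in which some cointegrating vectors are known.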
Youseop Shin
- Published in print:
- 2017
- Published Online:
- September 2017
- ISBN:
- 9780520293168
- eISBN:
- 9780520966383
- Item type:
- chapter
- Publisher:
- University of California Press
- DOI:
- 10.1525/california/9780520293168.003.0006
- Subject:
- Sociology, Law, Crime and Deviance
Chapter Six explains time series analysis with one or more independent variables. The dependent variable is the monthly violent crime rate, and the independent variables are unemployment rates and inflation. This chapter discusses several topics related to the robustness of estimated models, such as how to prewhiten a time series, how to deal with trends and seasonal components, how to deal with autoregressive residuals, and how to distinguish changes in the dependent variable caused by the independent variables from its simple continuity. It also discusses the concepts of co-integration and long memory, and related topics such as error correction models and autoregressive distributed lag models.
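A minimal sketch of the kind of error-correction estimation discussed here, using the two-step Engle–Granger approach (the data and variable names are made up for illustration; the chapter's own data are monthly crime, unemployment, and inflation series):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
T = 240
# An illustrative I(1) regressor and a dependent variable cointegrated with it.
x = np.cumsum(rng.normal(size=T))
y = 2.0 + 0.8 * x + rng.normal(scale=0.5, size=T)

# Step 1: long-run (cointegrating) regression in levels.
long_run = sm.OLS(y, sm.add_constant(x)).fit()
ect = long_run.resid                      # error-correction term

# Step 2: short-run dynamics with the lagged error-correction term.
dy, dx = np.diff(y), np.diff(x)
X = sm.add_constant(np.column_stack([dx, ect[:-1]]))
ecm = sm.OLS(dy, X).fit()
print(ecm.params)  # constant, short-run effect of x, adjustment coefficient
```

A significantly negative adjustment coefficient on the lagged error-correction term indicates that deviations from the long-run relation are corrected over time.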
Bernt P. Stigum
- Published in print:
- 2014
- Published Online:
- September 2015
- ISBN:
- 9780262028585
- eISBN:
- 9780262323109
- Item type:
- chapter
- Publisher:
- The MIT Press
- DOI:
- 10.7551/mitpress/9780262028585.003.0006
- Subject:
- Economics and Finance, Econometrics
Chapter VI begins with a discussion of the axioms of a formal theory-data confrontation in which the data appear as vector-valued sequences of observations of a vector-valued random process. It then describes important characteristics of I(1) ARIMA processes, one of which is their tendency to display long positive and negative sojourns. This property can be used to carry out meaningful empirical analyses of positively valued time series, e.g., spot and forward exchange rates: let the observations of the exchange rates be observations of an auxiliary I(1) ARIMA process, analyse them with currently available software, and check whether long sequences of the exchange rates have the characteristics of an I(1) ARIMA process. A second characteristic of an I(1) ARIMA process is that any multivariate version of such a process can be written as an error correction model. In present-day econometrics, statistical analyses of error correction models of vector-valued time series are used to determine the degree of cointegration of the time series. The import of such an analysis hinges on the theoretical meaningfulness of the pertinent error correction model. The chapter demonstrates that an empirically relevant error correction model need not be theoretically meaningful.
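The long-sojourn property mentioned here is easy to see by simulation (an illustrative sketch, not from the chapter):

```python
import numpy as np

rng = np.random.default_rng(42)
walk = np.cumsum(rng.normal(size=100_000))  # a simple I(1) process

# Longest stretch the walk spends on one side of zero.
signs = np.sign(walk)
change_points = np.flatnonzero(np.diff(signs) != 0)
bounds = np.concatenate(([0], change_points + 1, [walk.size]))
longest = np.max(np.diff(bounds))
print(f"longest one-sided sojourn: {longest} of {walk.size} steps")
```

By the arcsine law, a single sojourn typically occupies a large fraction of the sample, which is why an I(1) process can stay positive (like an exchange rate) for very long stretches.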
M. Hashem Pesaran
- Published in print:
- 2015
- Published Online:
- March 2016
- ISBN:
- 9780198736912
- eISBN:
- 9780191800504
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780198736912.003.0006
- Subject:
- Economics and Finance, Econometrics
Dynamic economic models typically arise as characterizations of the path of the economy around its long-run equilibrium (steady state), and involve modelling expectations, learning, and adjustment costs. A variety of dynamic specifications are used in applied time series econometrics. This chapter reviews a number of single-equation specifications suggested in the econometric literature to represent dynamics in regression models. It provides a preliminary introduction to distributed lag models, autoregressive distributed lag models, partial adjustment models, error-correction models, and adaptive and rational expectations models. Exercises are provided at the end of the chapter.
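For reference (standard forms, not quoted from the chapter), several of these specifications are nested in the ADL(1,1) model

\[
y_t = \alpha + \phi y_{t-1} + \beta_0 x_t + \beta_1 x_{t-1} + u_t .
\]

Setting \(\beta_1 = 0\) gives the partial adjustment model; setting \(\phi = 0\) gives a finite distributed lag; and rearranging terms gives the error-correction form \(\Delta y_t = \alpha + \beta_0 \Delta x_t - (1-\phi)(y_{t-1} - \theta x_{t-1}) + u_t\) with long-run coefficient \(\theta = (\beta_0 + \beta_1)/(1-\phi)\).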
Sumner La Croix
- Published in print:
- 2019
- Published Online:
- September 2019
- ISBN:
- 9780226592091
- eISBN:
- 9780226592121
- Item type:
- chapter
- Publisher:
- University of Chicago Press
- DOI:
- 10.7208/chicago/9780226592121.003.0008
- Subject:
- Economics and Finance, Economic History
During this decade, native Hawaiians began to organize more effectively to demand that the territorial and federal governments take action to address their declining welfare by returning some of the best government-owned agricultural lands to them for settlement. In 1921, the U.S. Congress responded by passing the Hawaiian Homes Commission Act (HHCA), the goal of which was to lease federal government lands to Native Hawaiians for use as ranches, farms, and house lots. This chapter analyzes why this program has struggled to fulfill its goals throughout its existence and concludes that it never had a chance of success due to the poor-quality lands assigned to the program and the restrictions placed on lessee farm and pastoral activities. Modern econometric techniques are used to test hypotheses that executive and legislative support for the program and distribution of program lands were driven by the power of Hawaiians at the ballot box and by the changing value of the lands dedicated to the program.