Duo Qin
- Published in print:
- 1997
- Published Online:
- November 2003
- ISBN:
- 9780198292876
- eISBN:
- 9780191596803
- Item type:
- book
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/0198292872.001.0001
- Subject:
- Economics and Finance, History of Economic Thought, Econometrics
This book traces the formation of econometric theory during the period 1930–1960. It focuses upon the process of how econometrics was formed from mathematical and scientific processes in order to analyse economic problems. The book deals with the advances that were achieved as well as the problems that arose in the course of the practice of econometrics as a discipline. Duo Qin examines the history of econometrics in terms of the basic issues in econometric modelling: the probability foundations, estimation, identification, testing, and model construction and specification. The book describes chronologically how these issues were formalized. Duo Qin argues that, while the probability revolution in econometrics in the early 1940s laid the basis for the systematization of econometric theory, it was actually an incomplete revolution, and its incompleteness underlay various problems and failures that occurred in applying the newly established theory to modelling practice. Model construction and hypothesis testing remained problematic because the basic problem of induction in econometrics was not properly formalized and solved. The book thus links early econometric history with many issues of interest to contemporary developments in econometrics. The story is told from the econometric perspective instead of the usual perspective in the history of economic thought (i.e. presenting the story either according to different schools or economic issues), and this approach is clearly reflected in the classification of the chapters.
Manuel Arellano
- Published in print:
- 2003
- Published Online:
- July 2005
- ISBN:
- 9780199245284
- eISBN:
- 9780191602481
- Item type:
- book
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/0199245282.001.0001
- Subject:
- Economics and Finance, Econometrics
This book reviews some of the main topics in panel data econometrics. It analyses econometric models with non-exogenous explanatory variables, and the problem of distinguishing between dynamic responses and unobserved heterogeneity in panel data models. The book is divided into three parts. Part I deals with static models. Part II discusses pure time series models. Part III considers dynamic conditional models.
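The central difficulty the book names, separating dynamic responses from unobserved heterogeneity, is usually attacked by transforming away the entity-specific effect. The sketch below (entirely illustrative, not taken from the book; the firm names and numbers are invented) shows the within-groups (fixed-effects) transformation in its simplest static form: demean each variable within its panel unit so the unobserved effect alpha_i drops out, then run pooled OLS on the demeaned data.

```python
def within_estimator(panel):
    """panel: dict entity -> list of (x, y) pairs; returns the slope estimate
    from pooled OLS on within-entity demeaned data."""
    num = den = 0.0
    for obs in panel.values():
        x_bar = sum(x for x, _ in obs) / len(obs)
        y_bar = sum(y for _, y in obs) / len(obs)
        for x, y in obs:
            # demeaning removes the entity effect alpha_i from y = alpha_i + beta*x
            num += (x - x_bar) * (y - y_bar)
            den += (x - x_bar) ** 2
    return num / den

# Toy noiseless panel: y = alpha_i + 2*x, with entity effects alpha_i that
# would bias a naive pooled regression.
panel = {
    "firm_a": [(1.0, 12.0), (2.0, 14.0), (3.0, 16.0)],   # alpha = 10
    "firm_b": [(1.0, 2.0), (2.0, 4.0), (4.0, 8.0)],      # alpha = 0
}
beta_hat = within_estimator(panel)
print(beta_hat)  # 2.0 up to float rounding
```

Because the toy data are noiseless, the estimator recovers the slope exactly; with real data the same transformation is only a consistent estimator, and the dynamic models in Part III require instrumental-variable or GMM methods instead.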
Michio Hatanaka
- Published in print:
- 1996
- Published Online:
- November 2003
- ISBN:
- 9780198773535
- eISBN:
- 9780191596360
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/0198773536.003.0004
- Subject:
- Economics and Finance, Econometrics
This chapter introduces extended asymptotic theories on the unit root developed in Fuller (1976), Dickey and Fuller (1979), Phillips (1987), and Phillips and Perron (1988), among others. The theories are explained in two steps. The first deals with the elementary but fundamental case where Δxt is i.i.d. with zero mean. The second step, given in Chapter 6, covers more advanced aspects, including the case where Δxt follows an ARMA process.
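The elementary case the chapter starts from can be sketched numerically (an illustrative simulation, not from the chapter): when x_t is a pure random walk, so that dx_t is i.i.d. with zero mean, the OLS slope from regressing dx_t on x_{t-1} collapses toward zero, and its t-ratio follows the non-standard Dickey-Fuller distribution rather than a normal one.

```python
import math
import random

random.seed(42)
T = 500
x = [0.0]
for _ in range(T):
    x.append(x[-1] + random.gauss(0.0, 1.0))  # unit-root process: dx_t i.i.d. N(0,1)

lagged = x[:-1]                                # x_{t-1}
dx = [x[t + 1] - x[t] for t in range(T)]       # dx_t

# OLS of dx_t on x_{t-1}, no intercept
sxx = sum(v * v for v in lagged)
sxy = sum(l * d for l, d in zip(lagged, dx))
phi_hat = sxy / sxx

resid = [d - phi_hat * l for l, d in zip(lagged, dx)]
s2 = sum(e * e for e in resid) / (T - 1)       # residual variance
t_stat = phi_hat / math.sqrt(s2 / sxx)         # Dickey-Fuller t-ratio

print(phi_hat, t_stat)
```

Under the unit-root null, phi_hat is close to zero (it converges at rate T rather than the usual square root of T), and the t-ratio cannot be compared against normal critical values; that is the point of the tabulated Dickey-Fuller distribution.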
Michio Hatanaka
- Published in print:
- 1996
- Published Online:
- November 2003
- ISBN:
- 9780198773535
- eISBN:
- 9780191596360
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/0198773536.003.0005
- Subject:
- Economics and Finance, Econometrics
This chapter describes the augmented Dickey-Fuller method for the case where Δxt is i.i.d. Difference stationarity is tested as the null hypothesis against trend stationarity, assuming that {xt} may possibly contain a linear deterministic trend. It also presents a method that does not work.
Lawrence R. Klein (ed.)
- Published in print:
- 1991
- Published Online:
- October 2011
- ISBN:
- 9780195057720
- eISBN:
- 9780199854967
- Item type:
- book
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780195057720.001.0001
- Subject:
- Economics and Finance, Econometrics
One of the most important, and visible, things economists do is to forecast what will happen in the economy in the future. Each year, a number of different groups in the United States use their own econometric models to forecast what will happen to the economy in the coming year. Some economic forecasts are more accurate than others. This book consists of chapters comparing the different models now being used. It is organized topically rather than by model. The contributors include Roger Brimmer, Ray Fair, Bert Hickman, F. Gerard Adams, and Albert Ando. The editor provides an introduction to the volume.
Manuel Arellano
- Published in print:
- 2003
- Published Online:
- July 2005
- ISBN:
- 9780199245284
- eISBN:
- 9780191602481
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/0199245282.003.0001
- Subject:
- Economics and Finance, Econometrics
This introductory chapter begins with a brief discussion on how the term ‘panel data’ is applied to a wide range of situations in econometrics. It describes the two main objectives of this volume: the analysis of econometric models with non-exogenous explanatory variables, and the problem of distinguishing empirically between dynamic responses and unobserved heterogeneity in panel data models. An overview of the three parts of this volume is presented.
Alfred Maizels, Robert Bacon, and George Mavrotas
- Published in print:
- 1997
- Published Online:
- October 2011
- ISBN:
- 9780198233381
- eISBN:
- 9780191678981
- Item type:
- book
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780198233381.001.0001
- Subject:
- Economics and Finance, Development, Growth, and Environmental
The collapse in commodity prices since 1980 has been a major cause of the economic crisis in a large number of developing countries. This book investigates whether the commodity-producing countries, by joint action, could have prevented the price collapse by appropriate supply management. The analysis is focused on the markets for the tropical beverage crops: coffee, cocoa, and tea. Using new econometric models for each market, the impact of alternative supply management schemes on supply, consumption, prices, and export earnings is simulated for the late 1980s. The results indicate that supply management by producing countries would, indeed, have been a viable alternative to the ‘free market’ approach favoured by the developed countries. This has important implications for current international commodity policy, and, in particular, for future joint action by producing countries to overcome persistent commodity surpluses as a complement to needed diversification.
Roger M. Barker
- Published in print:
- 2010
- Published Online:
- May 2010
- ISBN:
- 9780199576814
- eISBN:
- 9780191722509
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199576814.003.0006
- Subject:
- Business and Management, International Business, Corporate Governance and Accountability
A panel data econometric analysis of corporate governance change is undertaken utilizing a data set of fifteen nonliberal market economies covering the period 1975–2003. The results of this analysis suggest that the interaction of partisanship and competition is a highly significant determinant of corporate governance change. In particular, significant shifts in a pro‐shareholder direction are associated with Left government – but not conservative government – in the context of high levels of competition. In contrast, neither Left nor conservative government is associated with corporate governance change in a low‐competition environment.
Qin Duo
- Published in print:
- 1997
- Published Online:
- November 2003
- ISBN:
- 9780198292876
- eISBN:
- 9780191596803
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/0198292872.003.0006
- Subject:
- Economics and Finance, History of Economic Thought, Econometrics
Addresses the issue of testing, and reveals some intrinsic problems pertaining to hypothesis testing beneath the achievements of formalizing econometrics. Theory verification through applied studies was one of the main motives for formalizing methods of model estimation and identification, and the statistical theory of hypothesis testing was accepted quite early, without much dispute, as the technical vehicle for this purpose. However, as the theory was adopted into econometrics in the 1940s and 1950s, the achievable domain of verification turned out to be considerably reduced. Testing in econometrics proper gradually dwindled into part of the modelling procedure, pertaining to model evaluation using statistical testing tools, while in the applied field, empirical modellers took on the task of discriminating between and verifying economic theories against the model results, and carried this out in an ad hoc and often non‐sequitur manner. The chapter describes how the desire to test diverged into model evaluation in econometric theory on the one hand, and economic theory verification in practice on the other, as econometric testing theory took shape. The first section covers the early period prior to the formative movement; the second looks at the period in which the theme of hypothesis testing was introduced and the first test emerged in econometrics; the last two sections report, respectively, on how model testing in applied econometrics and test design in theoretical econometrics developed and moved apart.
Nancy Cartwright
- Published in print:
- 1994
- Published Online:
- November 2003
- ISBN:
- 9780198235071
- eISBN:
- 9780191597169
- Item type:
- book
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/0198235070.001.0001
- Subject:
- Philosophy, Philosophy of Science
This book on the philosophy of science argues for an empiricism, opposed to the tradition of David Hume, in which singular rather than general causal claims are primary; causal laws express facts about singular causes whereas the general causal claims of science are ascriptions of capacities or causal powers, capacities to make things happen. Taking science as measurement, Cartwright argues that capacities are necessary for science and that these can be measured, provided suitable conditions are met. There are case studies from both econometrics and quantum mechanics.
Stephen Bazen
- Published in print:
- 2011
- Published Online:
- January 2012
- ISBN:
- 9780199576791
- eISBN:
- 9780191731136
- Item type:
- book
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199576791.001.0001
- Subject:
- Economics and Finance, Econometrics
This book provides a presentation of the standard statistical techniques used by labour economists. It emphasizes both the input and the output of empirical analysis and covers five major topics concerning econometric methods used in labour economics: regression and related methods, choice modelling, selectivity issues, duration analysis, and policy evaluation techniques. Each of these is presented in terms of model specification, possible estimation problems, diagnostic checking, and interpretation of the output. The book aims to provide guidance to practitioners on how to use the techniques and how to make sense of the results that are produced. It covers methods that are considered to be ‘standard’ tools in labour economics, but which are often given only a brief and highly technical treatment in econometrics textbooks.
John E. Jackson
- Published in print:
- 1998
- Published Online:
- November 2003
- ISBN:
- 9780198294719
- eISBN:
- 9780191599361
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/0198294719.003.0032
- Subject:
- Political Science, Reference
Reviews methodological techniques available across the discipline of political science. Econometric and political science methods include structural equation estimations, time‐series analysis, and non‐linear models. Alternative approaches analyse public preferences, political institutions, and path‐dependent political economy modelling. The drawbacks of these methods are examined by questioning their underlying assumptions and examining their consequences. While there is cause for concern, solace lies in the fact that these problems are also faced across other disciplines.
Qin Duo
- Published in print:
- 1997
- Published Online:
- November 2003
- ISBN:
- 9780198292876
- eISBN:
- 9780191596803
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/0198292872.003.0005
- Subject:
- Economics and Finance, History of Economic Thought, Econometrics
Narrates the process of how identification was formalized in econometrics. The issue of identification stemmed from the quest to know the attainability of economically meaningful relationships from statistical analysis of economic data in early estimation attempts. It arose out of the ‘correspondence’ problems ‘between economic activity, the data that activity generates, the theoretical economic model and the estimated relationship’ (Morgan, 1990). When identification theory was eventually formalized, its purpose became focused on the conditions under which a certain set of values of structural parameters could be uniquely determined from the data among all the permissible sets embodied in a mathematically complete theoretical model, usually composed of a simultaneous‐equations system. These conditions are most commonly known as the order and rank conditions in the context of linear, simultaneous‐equations models in present‐day econometrics textbooks. The emergence of identification theory played a key role in the formal establishment of the structural approach of orthodox econometrics through its links to model testing and model specification. Traces the formalization of identification theory around two interwoven themes: how the identification problem was perceived and described in connection with the other issues in econometric modelling; and how the problem was formalized and tackled with mathematical and statistical means. The first section outlines the early appearance of the identification problem and some ad hoc solutions for particular cases and model forms before the mid‐1930s; the second centres upon the initial systematic work on the issue around 1940; the third is devoted to the contribution of the Cowles group; and the completion of the theoretical framework and its overlaps with other modelling issues form the subject of the last section.
Stephen Bazen
- Published in print:
- 2011
- Published Online:
- January 2012
- ISBN:
- 9780199576791
- eISBN:
- 9780191731136
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199576791.003.0001
- Subject:
- Economics and Finance, Econometrics
This introductory chapter sets out the purpose of the book, which is to provide a practical guide to understanding and applying the standard econometric tools that are used in labour economics. Emphasis is placed on both the input and the output of empirical analysis, rather than the understanding of the origins and properties of estimators and tests, topics which are more than adequately covered in recent textbooks on microeconometrics. The basic idea developed in this book is that linear regression is an important starting point for empirical analysis in labour economics.
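The "linear regression as starting point" idea can be made concrete with a minimal sketch (the wage equation, variable names, and numbers here are invented for illustration, not taken from the book): a simple log-wage regression estimated by ordinary least squares from the usual closed-form slope and intercept formulas.

```python
def ols(x, y):
    """Simple OLS for y = b0 + b1*x; returns (b0, b1)."""
    n = len(x)
    x_bar = sum(x) / n
    y_bar = sum(y) / n
    b1 = (sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))
          / sum((xi - x_bar) ** 2 for xi in x))
    b0 = y_bar - b1 * x_bar
    return b0, b1

# Toy noiseless data: log-wage rises by 0.08 per year of schooling.
schooling = [10.0, 12.0, 14.0, 16.0, 18.0]
log_wage = [1.5 + 0.08 * s for s in schooling]
b0, b1 = ols(schooling, log_wage)
print(b0, b1)  # ~ 1.5 and 0.08
```

With noiseless data the coefficients are recovered exactly (up to float rounding); the later chapters of the book are about what happens when the clean assumptions behind this formula fail, through selectivity, limited dependent variables, and so on.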
Stephen Bazen
- Published in print:
- 2011
- Published Online:
- January 2012
- ISBN:
- 9780199576791
- eISBN:
- 9780191731136
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199576791.003.0008
- Subject:
- Economics and Finance, Econometrics
This chapter presents some concluding thoughts. The aim of this book has been to present the main econometric techniques used by labour economists. It aims to serve as a platform for adapting material already encountered in econometrics classes and textbooks to the empirical analysis of labour market phenomena. There is no conventional wisdom on how to conduct empirical analysis in labour economics. Several approaches coexist, running from structural models with very tight links to economic theory, to so-called ‘model-free’ approaches based on emulating the experimental approach. In practice, most studies in empirical labour economics lie somewhere in between these two benchmarks, and consist of estimating models which are loosely based on theoretical reasoning and specified in a flexible manner so that the data can ‘talk’.
Thomas J. Sargent
- Published in print:
- 2013
- Published Online:
- October 2017
- ISBN:
- 9780691158709
- eISBN:
- 9781400847648
- Item type:
- book
- Publisher:
- Princeton University Press
- DOI:
- 10.23943/princeton/9780691158709.001.0001
- Subject:
- Economics and Finance, Economic History
This collection of essays uses the lens of rational expectations theory to examine how governments anticipate and plan for inflation, and provides insight into the pioneering research for which the author was awarded the 2011 Nobel Prize in economics. Rational expectations theory is based on the simple premise that people will use all the information available to them in making economic decisions, yet applying the theory to macroeconomics and econometrics is technically demanding. This book engages with practical problems in economics in a less formal, noneconometric way, demonstrating how rational expectations can satisfactorily interpret a range of historical and contemporary events. It focuses on periods of actual or threatened depreciation in the value of a nation's currency. Drawing on historical attempts to counter inflation, from the French Revolution and the aftermath of World War I to the economic policies of Margaret Thatcher and Ronald Reagan, the book finds that there is no purely monetary cure for inflation; rather, monetary and fiscal policies must be coordinated. This fully expanded edition includes the author's 2011 Nobel lecture, “United States Then, Europe Now.” It also features new articles on the macroeconomics of the French Revolution and government budget deficits.
Lars Peter Hansen and Thomas J. Sargent
- Published in print:
- 2013
- Published Online:
- October 2017
- ISBN:
- 9780691042770
- eISBN:
- 9781400848188
- Item type:
- book
- Publisher:
- Princeton University Press
- DOI:
- 10.23943/princeton/9780691042770.001.0001
- Subject:
- Economics and Finance, History of Economic Thought
A common set of mathematical tools underlies dynamic optimization, dynamic estimation, and filtering. This book uses these tools to create a class of econometrically tractable models of prices and quantities. The book presents examples from microeconomics, macroeconomics, and asset pricing. The models are cast in terms of a representative consumer. While the book demonstrates the analytical benefits acquired when an analysis with a representative consumer is possible, it also characterizes the restrictiveness of the assumptions under which a representative household justifies a purely aggregative analysis. The book unites economic theory with a workable econometrics while going beyond and beneath demand and supply curves for dynamic economies. It constructs and applies competitive equilibria for a class of linear-quadratic-Gaussian dynamic economies with complete markets. The book, based on the 2012 Gorman lectures, stresses heterogeneity, aggregation, and how a common structure unites what superficially appear to be diverse applications. An appendix describes MATLAB programs that apply to the book's calculations.
Yannis M. Ioannides
- Published in print:
- 2012
- Published Online:
- October 2017
- ISBN:
- 9780691126852
- eISBN:
- 9781400845385
- Item type:
- chapter
- Publisher:
- Princeton University Press
- DOI:
- 10.23943/princeton/9780691126852.003.0002
- Subject:
- Economics and Finance, Development, Growth, and Environmental
This chapter discusses the theory and empirics of social interactions, with particular emphasis on the role of social context in individual decisions. It begins by introducing a sequence of models that highlight applications in different empirical social interaction settings, including a simple static model that is used to link social interactions theory with social networks theory, notably random graph theory. A dynamic model, where the social structure accommodates a variety of social interaction motives, is then described and solved as a dynamic system of evolving individual actions. The solution links social interactions theory with spatial econometrics. The chapter examines the econometrics of social interactions in social networks and social learning in urban settings before concluding with a review of the literature on social interactions in economics.
Duo Qin
- Published in print:
- 1997
- Published Online:
- November 2003
- ISBN:
- 9780198292876
- eISBN:
- 9780191596803
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/0198292872.003.0003
- Subject:
- Economics and Finance, History of Economic Thought, Econometrics
This chapter recounts the evolution of econometric models up to the 1940s, discussing the common criteria and principles used for model choice, and the generalization of model construction as econometrics focused on the structural modelling procedure. The first section reviews the pre‐model period, and the second looks at the emergence of models and the structural method of model construction. The initial generalization (formalization) efforts of the model‐building strategy and criteria are dealt with in the third section. The chapter concludes with the establishment of the structural modelling procedure (the maturity of simultaneous‐equations model formulation).
Duo Qin
- Published in print:
- 1997
- Published Online:
- November 2003
- ISBN:
- 9780198292876
- eISBN:
- 9780191596803
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/0198292872.003.0004
- Subject:
- Economics and Finance, History of Economic Thought, Econometrics
This chapter narrates the process of how estimation was formalized. Estimation can be seen as the genesis of econometrics, since finding estimates for the coefficients of economically meaningful relationships has always been the central motive and fulfilment of applied modelling activities. The process therefore became separated out as one of the basic steps along with model construction, identification, and testing. Subsequent research activities in estimation were confined to the technical development of new optimal estimators for increasingly complicated model forms. The first section of the chapter describes the early developments in estimation methods centring around the least squares (LS) principle; how this led to the maximum‐likelihood (ML) method in a simultaneous‐equations system is the content of the second section; the third section turns to special problems in the context of time‐series analysis; other developments concerning errors‐in‐variables models are summed up in the fourth section; and the final completion of basic estimation theory in orthodox econometrics takes up the final section.