Claus Munk
- Published in print:
- 2011
- Published Online:
- September 2011
- ISBN:
- 9780199575084
- eISBN:
- 9780191728648
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199575084.003.0016
- Subject:
- Economics and Finance, Financial Economics
Numerical procedures — computer-implemented algorithms — have to be applied in order to price assets with early exercise features and assets with complicated payoff structures. This chapter introduces the three main types of numerical procedures used in fixed income modelling. The finite difference approach offers a numerical solution of the partial differential equations that prices have to satisfy in diffusion models. Monte Carlo simulation gives an approximation of the expected value of a random variable, which is useful since prices are linked to expected payoffs (appropriately risk-adjusted and discounted). Approximating trees provide an approximation of the stochastic process of the relevant state variables in a given model, and assets are typically easy to price by backwards recursion through the tree. For all three procedures, the chapter presents examples, explains how to apply the procedures to American-style options, and discusses the computation of appropriate risk measures. The procedures are compared and strengths and weaknesses of the different procedures are explained.
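The "discounted expected payoff" idea behind Monte Carlo pricing can be sketched in a few lines. The example below is an assumption for illustration only (a European call under geometric Brownian motion, not a fixed income product from the chapter); all parameter values are invented. The simulated price is checked against the Black–Scholes closed form.

```python
import math
import random

# Hypothetical parameters (illustration only, not from the chapter).
S0, K, r, sigma, T = 100.0, 100.0, 0.05, 0.2, 1.0
n_paths = 200_000

random.seed(42)
payoffs = []
for _ in range(n_paths):
    z = random.gauss(0.0, 1.0)
    # Risk-neutral terminal price: S_T = S0 * exp((r - sigma^2/2) T + sigma sqrt(T) z)
    s_t = S0 * math.exp((r - 0.5 * sigma**2) * T + sigma * math.sqrt(T) * z)
    payoffs.append(max(s_t - K, 0.0))

# Price = discounted average payoff (the risk-adjusted expectation).
mc_price = math.exp(-r * T) * sum(payoffs) / n_paths

# Black-Scholes closed form, used here only to check the simulation.
def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

d1 = (math.log(S0 / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
d2 = d1 - sigma * math.sqrt(T)
bs_price = S0 * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)
```

With 200,000 paths the Monte Carlo standard error is a few cents, so the two prices agree closely.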
David F. Hendry
- Published in print:
- 1995
- Published Online:
- November 2003
- ISBN:
- 9780198283164
- eISBN:
- 9780191596384
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/0198283164.003.0003
- Subject:
- Economics and Finance, Econometrics
Least squares and recursive methods for estimating the values of unknown parameters, and the logic of testing in empirical modelling, are discussed. The tools needed for investigating the properties of statistics in economics, namely, large‐sample distribution theory and Monte Carlo simulation techniques, are described. Ergodicity is explained, as are tools for investigating non‐stationarity due to unit roots.
Mark E. Harmon, Donald L. Phillips, John J. Battles, Andrew Rassweiler, Robert O. Hall Jr., and William K. Lauenroth
- Published in print:
- 2007
- Published Online:
- September 2007
- ISBN:
- 9780195168662
- eISBN:
- 9780199790128
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780195168662.003.0012
- Subject:
- Biology, Ecology
Because primary production usually is estimated from several variables that are themselves subject to error in measurement, these errors propagate as the variables are combined mathematically. Following a brief overview of the various sources of error and bias associated with primary production measurements, this chapter provides a detailed description of approaches for quantifying propagation of error in productivity calculations. Monte Carlo simulation approaches are described and the problem of compounding of errors is examined. Several explicit examples are provided to illustrate uncertainty quantification in a variety of biomes.
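Monte Carlo error propagation of this kind is straightforward to sketch. The toy quantity below (productivity as the product of a biomass increment and a carbon fraction, with invented means and standard deviations) is an assumption for illustration, not the chapter's example; the simulated spread is compared with the first-order (delta-method) formula for a product.

```python
import math
import random

# Toy inputs (assumed, not from the chapter): independent normal errors.
random.seed(1)
mu_b, sd_b = 500.0, 50.0   # biomass increment, g m^-2 yr^-1
mu_c, sd_c = 0.50, 0.02    # carbon fraction

n = 100_000
# Propagate by simulation: draw both inputs, combine, repeat.
draws = [random.gauss(mu_b, sd_b) * random.gauss(mu_c, sd_c) for _ in range(n)]
mean_npp = sum(draws) / n
sd_npp = math.sqrt(sum((x - mean_npp) ** 2 for x in draws) / (n - 1))

# First-order propagation for a product:
# (sd_y / y)^2 ~= (sd_b / b)^2 + (sd_c / c)^2
sd_analytic = mu_b * mu_c * math.sqrt((sd_b / mu_b) ** 2 + (sd_c / mu_c) ** 2)
```

For these mild relative errors the two estimates of the spread agree to within a few percent; for strongly nonlinear combinations the simulation remains valid where the first-order formula degrades.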
Heinz-Peter Breuer and Francesco Petruccione
- Published in print:
- 2007
- Published Online:
- February 2010
- ISBN:
- 9780199213900
- eISBN:
- 9780191706349
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199213900.003.07
- Subject:
- Physics, Theoretical, Computational, and Statistical Physics
The formulation of the dynamics of open quantum systems by means of stochastic processes in Hilbert space leads to efficient Monte Carlo simulation techniques which are introduced and examined in this chapter. Depending on whether the stochastic process is a piecewise deterministic process or a diffusion process, the corresponding individual realizations consist of intervals of deterministic evolution periods interrupted by instantaneous quantum jumps, or of continuous, nowhere differentiable paths. Various Monte Carlo algorithms for both types of processes are described in detail, and their convergence behaviour and their numerical performance are investigated for a number of applications, such as the damped harmonic oscillator, the driven two-level system, and the damped driven Morse oscillator.
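The piecewise deterministic picture can be illustrated with the simplest possible case, a damped two-level atom initially in the excited state (parameters assumed for illustration): each trajectory evolves deterministically until a single decay jump at an exponentially distributed random time with rate gamma, and averaging over trajectories recovers the master-equation population exp(-gamma t).

```python
import math
import random

# Minimal quantum-jump sketch (assumed parameters, one-jump special case).
random.seed(7)
gamma = 1.0      # decay rate
t = 1.0          # observation time
n_traj = 100_000

# A trajectory stays excited until its jump time, drawn from Exp(gamma).
still_excited = sum(1 for _ in range(n_traj) if random.expovariate(gamma) > t)
p_mc = still_excited / n_traj
p_exact = math.exp(-gamma * t)
```

The trajectory average converges to the ensemble result at the usual 1/sqrt(N) Monte Carlo rate; realistic unravelings add coherent deterministic evolution between jumps, which this one-jump sketch omits.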
Anindya Banerjee, Juan J. Dolado, John W. Galbraith, and David F. Hendry
- Published in print:
- 1993
- Published Online:
- November 2003
- ISBN:
- 9780198288107
- eISBN:
- 9780191595899
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/0198288107.003.0007
- Subject:
- Economics and Finance, Econometrics
Examines methods of testing for co‐integration in single equations via static regressions, and provides simulation estimates of the percentiles of the distributions of statistics used in these tests. The finite‐sample biases of the estimates of the co‐integrating vectors and powers of the tests based on static regressions are discussed within the framework of extensive Monte Carlo simulations. Dynamic models leading to an error‐correction mechanism based test (ECM test for co‐integration) and non‐parametrically modified estimators are also considered as better ways of estimating the co‐integrating relationships.
Owe Philipsen
- Published in print:
- 2011
- Published Online:
- January 2012
- ISBN:
- 9780199691609
- eISBN:
- 9780191731792
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199691609.003.0005
- Subject:
- Physics, Theoretical, Computational, and Statistical Physics
This chapter gives an introduction to lattice QCD at finite temperature and baryon density. After a discussion of some fundamental aspects and difficulties of quantum field theory at finite temperature in the continuum, the lattice formulation of the partition function for the grand canonical ensemble is introduced and its relation to the transfer matrix formalism is presented. As analytic tools for its evaluation, weak coupling perturbation theory on the lattice as well as the strong coupling expansion are discussed. Regarding Monte Carlo evaluations, similarities and differences to the situation in the vacuum are pointed out. All concepts are illustrated with various applications like the equation of state, screening masses, the free energy of static quark systems and phase transitions. In the second part, special emphasis is put on lattice QCD at finite baryon density. The sign problem is discussed and current techniques to deal with it at small baryon chemical potential are presented. The implications for the QCD phase diagram are summarized.
David F. Hendry
- Published in print:
- 2000
- Published Online:
- November 2003
- ISBN:
- 9780198293545
- eISBN:
- 9780191596391
- Item type:
- book
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/0198293542.001.0001
- Subject:
- Economics and Finance, Econometrics
This collection of published papers records the development of an approach to econometric modelling that has reached a highly successful stage. The methodology of modelling ‘observational data’, as opposed to experimental data, which can be replicated, is analysed to highlight the fundamental flaws in various approaches, and the possibilities of others. Criteria for model adequacy are formulated (congruence and encompassing), and alternative approaches to building empirical models are compared on their ability to deliver such models. A typology of models elucidates their properties, and a taxonomy of information sources clarifies testing. Estimation is summarized by an estimator generating equation. The value of exploring the development path is to reveal by attempted applications why many widely used approaches are inadequate. The outcome is to demonstrate the viability of a general‐to‐specific approach that commences from a specification deemed more than adequate to characterize the evidence, and simplifies to a parsimonious representation that captures the main factors. By artificial Monte Carlo simulations on experiments designed by others, the success of that approach is established, leading to automatic model selection by software that can outperform practitioners.
Gloria González‐Rivera and Emre Yoldas
- Published in print:
- 2010
- Published Online:
- May 2010
- ISBN:
- 9780199549498
- eISBN:
- 9780191720567
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199549498.003.0011
- Subject:
- Economics and Finance, Econometrics
This chapter develops a new set of specification tests for multivariate dynamic models based on the concept of autocontours. The chapter is organized as follows. Section 2 describes the battery of tests and the construction of the multivariate contours and autocontours. Section 3 presents Monte Carlo simulations to assess the size and power of the tests in finite samples. Section 4 applies the tests to the generalized residuals of GARCH models with hypothesized normal and multivariate Student-t innovations fitted to excess returns on five size portfolios; Section 5 concludes.
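Assessing the size of a test by Monte Carlo, as in Section 3, follows a standard recipe that can be sketched generically. The example below is an assumption for illustration (a plain one-sample t-test, not the autocontour tests of the chapter): simulate data under the null, apply the test, and check that the rejection frequency is close to the nominal 5% level.

```python
import math
import random

# Generic size experiment (illustrative stand-in for the chapter's tests).
random.seed(0)
n, reps = 30, 20_000
crit = 2.045  # two-sided 5% critical value, t distribution with 29 df

rejections = 0
for _ in range(reps):
    x = [random.gauss(0.0, 1.0) for _ in range(n)]       # data under the null
    m = sum(x) / n
    s2 = sum((xi - m) ** 2 for xi in x) / (n - 1)
    t_stat = m / math.sqrt(s2 / n)
    if abs(t_stat) > crit:
        rejections += 1

empirical_size = rejections / reps   # should be near 0.05
```

Power experiments use the same loop with data generated under an alternative; the rejection frequency is then the estimated power.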
David F. Hendry and Carlos Santos
- Published in print:
- 2010
- Published Online:
- May 2010
- ISBN:
- 9780199549498
- eISBN:
- 9780191720567
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199549498.003.0009
- Subject:
- Economics and Finance, Econometrics
This chapter proposes a test for ‘super exogeneity’, a concept originally developed by Rob Engle, David Hendry, and Jean-François Richard. The structure of the chapter is as follows. Section 2 reconsiders which shifts in vector autoregressions (VARs) are relatively detectable, and derives the implications for testing for breaks in conditional representations. Section 3 considers super exogeneity in a regression context in order to elucidate its testable hypotheses, and discusses how super exogeneity can fail. Section 4 describes the impulse-saturation tests in Hendry et al. (2008) and Johansen and Nielsen (2009), and considers how to extend these to test super exogeneity. Section 5 provides analytic and Monte Carlo evidence on the null rejection frequencies of that procedure. Section 6 considers the power of the first stage to determine location shifts in marginal processes. Section 7 analyzes a failure of weak exogeneity under a nonconstant marginal process. Section 8 notes a co-breaking saturation-based test which builds on Krolzig and Toro (2002) and Hendry and Massmann (2007). Section 9 investigates the powers of the proposed automatic test in Monte Carlo experiments for a bivariate data generation process based on Section 7. Section 10 tests super exogeneity in the much-studied example of UK money demand; and Section 11 concludes.
David F. Hendry and Frank Srba
- Published in print:
- 2000
- Published Online:
- November 2003
- ISBN:
- 9780198293545
- eISBN:
- 9780191596391
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/0198293542.003.0015
- Subject:
- Economics and Finance, Econometrics
Keeping econometrics operational with suitable computer programs that implement new methods and approaches is essential. The evolution of AUTOREG follows the methodological developments, from unfriendly, mainframe batch programs focused on optimal estimation to interactive, menu‐driven modelling which facilitates both teaching and research in time‐series econometrics, exploiting powerful graphics (PcGive). The approach always rigorously tested model specifications, then tests were included for mis‐specification, gradually leading to ‘model building’ procedures: throughout, Monte Carlo simulation programs evaluated the new methods in finite samples. Four central issues are considered: discovery, namely, finding a suitable model specification; estimation, based on solving the score equations; numerical optimization for maximizing the likelihood function; and statistical analyses of the properties of estimators.
Anindya Banerjee, Juan J. Dolado, John W. Galbraith, and David F. Hendry
- Published in print:
- 1993
- Published Online:
- November 2003
- ISBN:
- 9780198288107
- eISBN:
- 9780191595899
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/0198288107.003.0003
- Subject:
- Economics and Finance, Econometrics
Presents the important properties of integrated variables and sets out some of the preliminary asymptotic theories essential for the consideration of such processes. It explores the concepts of unit roots, non‐stationarity, orders of integration, and near integration, and demonstrates the use of the theory in understanding the behaviour of least‐squares estimators in spurious regressions and in models involving integrated data. The theoretical analysis is accompanied by evidence from Monte Carlo simulations. Several examples are also provided to illustrate the use of Wiener distribution theory in deriving asymptotic results for such models.
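The spurious-regression phenomenon mentioned above is easy to reproduce by Monte Carlo (design assumed for illustration): regressing one driftless random walk on another, independent one, the conventional |t| > 1.96 rule rejects the true null of no relation far more often than 5%.

```python
import math
import random

# Classic spurious-regression experiment (illustrative design).
random.seed(3)
T, reps = 100, 500

rejections = 0
for _ in range(reps):
    # Two independent driftless random walks of length T.
    y, x = [0.0], [0.0]
    for _ in range(T - 1):
        y.append(y[-1] + random.gauss(0.0, 1.0))
        x.append(x[-1] + random.gauss(0.0, 1.0))
    # OLS of y on x with intercept, and the slope t-statistic.
    mx, my = sum(x) / T, sum(y) / T
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    beta = sxy / sxx
    alpha = my - beta * mx
    ssr = sum((yi - alpha - beta * xi) ** 2 for xi, yi in zip(x, y))
    se = math.sqrt(ssr / (T - 2) / sxx)
    if abs(beta / se) > 1.96:
        rejections += 1

rejection_rate = rejections / reps   # far above the nominal 0.05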
Gisele Kamanou and Jonathan Morduch
- Published in print:
- 2004
- Published Online:
- January 2005
- ISBN:
- 9780199276837
- eISBN:
- 9780191601620
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/0199276838.003.0009
- Subject:
- Economics and Finance, Development, Growth, and Environmental
A method based on Monte Carlo bootstrap estimations of consumption changes was developed to measure vulnerability to poverty. The method was applied to data on Côte d’Ivoire in 1985–86. It revealed potential difficulties faced by households, which were obscured when historical records were used to determine the extent of vulnerabilities.
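The core of a Monte Carlo bootstrap of consumption changes can be sketched as follows. Everything in this sketch is invented for illustration (the change sample, poverty line, and one-period rule are not from the chapter): observed proportional consumption changes are resampled with replacement and applied to a household's current consumption to estimate the probability of falling below a poverty line next period.

```python
import random

# Stylized bootstrap sketch -- all numbers are hypothetical.
random.seed(11)
observed_changes = [-0.15, -0.05, 0.00, 0.02, 0.04, 0.08, -0.10, 0.03]
current_consumption = 110.0
poverty_line = 100.0
B = 10_000  # bootstrap replications

below = sum(
    1 for _ in range(B)
    if current_consumption * (1.0 + random.choice(observed_changes)) < poverty_line
)
vulnerability = below / B  # estimated probability of falling into poverty
```

A household comfortably above the poverty line today can still carry substantial estimated vulnerability if the resampled history contains large negative shocks, which is the point the chapter's method is designed to surface.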
Anindya Banerjee, Juan J. Dolado, John W. Galbraith, and David F. Hendry
- Published in print:
- 1993
- Published Online:
- November 2003
- ISBN:
- 9780198288107
- eISBN:
- 9780191595899
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/0198288107.003.0001
- Subject:
- Economics and Finance, Econometrics
Serves as an introductory overview for the rest of the book, and outlines its main aims. As a basis for the following chapters, an overview and clarification of equilibrium relationships in economic theory is presented. A preliminary discussion of testing for orders of integration and the estimation of long‐run relationships is provided. The chapter summarizes key concepts from time‐series analysis and the theory of stochastic processes and, in particular, the theory of Brownian motion processes. Several empirical examples are offered as illustration of these concepts.
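One of the stochastic-process concepts summarized above, Brownian motion, is easy to approximate numerically (parameters assumed for illustration): by the functional central limit theorem, a random walk scaled by sqrt(n) converges to a Wiener process, so the simulated value at time t = 1 should have mean near 0 and variance near 1.

```python
import math
import random

# Scaled random walk as an approximation to W(1) (illustrative sketch).
random.seed(5)
n_steps, n_paths = 200, 20_000

endpoints = []
for _ in range(n_paths):
    s = sum(random.choice((-1.0, 1.0)) for _ in range(n_steps))  # +/-1 steps
    endpoints.append(s / math.sqrt(n_steps))                     # scale by sqrt(n)

mean_w1 = sum(endpoints) / n_paths
var_w1 = sum((w - mean_w1) ** 2 for w in endpoints) / (n_paths - 1)
```

Functionals of such simulated paths are how the Wiener-distribution asymptotics used in later chapters are typically tabulated in practice.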
Richard Wigmans
- Published in print:
- 2017
- Published Online:
- January 2018
- ISBN:
- 9780198786351
- eISBN:
- 9780191828652
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/oso/9780198786351.003.0002
- Subject:
- Physics, Particle Physics / Astrophysics / Cosmology, Nuclear and Plasma Physics
The processes that play a role in the absorption of different types of particles in dense matter are described, with emphasis on the aspects that are important for calorimetry. A distinction is made between particles that develop electromagnetic showers (electrons, photons) and particles that are subject to the strong nuclear interaction, such as pions and protons. A separate section is dedicated to muons, which are typically not fully absorbed in practical calorimeters. The energy dependence of the various processes, and the consequences for the size requirements of detectors, are discussed in detail. The practical importance and limitations of Monte Carlo simulations of the shower development process are reviewed. The chapter ends with a summary of facts deriving from the physics of shower development that are important for calorimetry.
M. Vidyasagar
- Published in print:
- 2014
- Published Online:
- October 2017
- ISBN:
- 9780691133157
- eISBN:
- 9781400850518
- Item type:
- chapter
- Publisher:
- Princeton University Press
- DOI:
- 10.23943/princeton/9780691133157.003.0001
- Subject:
- Mathematics, Probability / Statistics
This chapter provides an introduction to probability and random variables. Probability theory is an attempt to formalize the notion of uncertainty in the outcome of an experiment. For instance, suppose an urn contains four balls, colored red, blue, white, and green respectively. Suppose we dip our hand in the urn and pull out one of the balls “at random.” What is the likelihood that the ball we pull out will be red? The chapter first defines a random variable and probability before discussing the function of a random variable and expected value. It then considers total variation distance, joint and marginal probability distributions, independence and conditional probability distributions, Bayes' rule, and maximum likelihood estimates. Finally, it describes random variables assuming infinitely many values, focusing on Markov and Chebycheff inequalities, Hoeffding's inequality, Monte Carlo simulation, and Cramér's theorem.
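The urn example in the text can be checked directly by Monte Carlo simulation: draw one ball "at random" from {red, blue, white, green} many times, and the relative frequency of red approaches the probability 1/4.

```python
import random

# Monte Carlo check of the four-ball urn example.
random.seed(2)
balls = ["red", "blue", "white", "green"]
n = 100_000

reds = sum(1 for _ in range(n) if random.choice(balls) == "red")
p_hat = reds / n  # relative frequency of red, close to 0.25
```

Hoeffding's inequality, discussed in the same chapter, quantifies exactly how fast such a relative frequency concentrates around the true probability.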
Rajashri (Priya) Joshi, Tom Davis, and Bill McCoy
- Published in print:
- 2016
- Published Online:
- October 2016
- ISBN:
- 9780198785774
- eISBN:
- 9780191827594
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780198785774.003.0024
- Subject:
- Economics and Finance, Financial Economics, Macro- and Monetary Economics
This chapter describes and illustrates the elements and mechanics of valuing mortgage-backed securities (MBS). It begins with the basics of valuation and then briefly reviews the sources of MBS prepayments. Next, it discusses the models and assumptions that go into generating a set of projected cash flows. Forecasting MBS prepayment speeds and, in turn, total cash flows, is a much more complex undertaking than predicting the timing of redemption of a callable corporate bond. Practitioners generally rely on econometric prepayment models and associated auxiliary models to generate speed and cash flow forecasts, which are then used to value the bond. Monte Carlo simulation is the only viable methodology for valuing mortgage-backed securities, as closed-form solutions are unavailable, and the path-dependent nature of the embedded prepayment option generally precludes the use of lattice-based approaches.
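As a rough illustration of why path dependence forces simulation, the hypothetical sketch below values a mortgage pool along simulated short-rate paths, with a toy prepayment rule whose "burnout" state depends on the entire rate history up to each month. All dynamics, parameters, and the prepayment rule here are invented for illustration and are far simpler than the econometric models practitioners actually use:

```python
import math
import random

def mbs_value(num_paths=2000, months=120, r0=0.04, coupon=0.05,
              kappa=0.1, theta=0.04, sigma=0.01, seed=0):
    # Toy path-dependent Monte Carlo valuation of a mortgage pool
    # with notional 100 (illustrative only)
    rng = random.Random(seed)
    dt = 1.0 / 12.0
    total = 0.0
    for _ in range(num_paths):
        r = r0
        balance = 100.0
        discount = 1.0
        burnout = 0   # path-dependent state: months of refinancing incentive so far
        pv = 0.0
        for _ in range(months):
            # Vasicek-style short-rate step
            r += kappa * (theta - r) * dt + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)
            discount *= math.exp(-max(r, 0.0) * dt)
            incentive = max(coupon - r, 0.0)
            if incentive > 0.0:
                burnout += 1
            # toy single-month mortality: baseline turnover plus a rate
            # incentive term damped by burnout (the path-dependent effect)
            smm = 0.005 + 2.0 * incentive * math.exp(-0.02 * burnout)
            interest = balance * coupon * dt
            sched_principal = balance / months   # crude straight-line amortization
            prepay = smm * (balance - sched_principal)
            cash = interest + sched_principal + prepay
            balance -= sched_principal + prepay
            pv += discount * cash
            if balance <= 1e-6:
                break
        total += pv
    return total / num_paths
```

Because the prepayment rate at each date depends on the accumulated burnout state, the same interest-rate level implies different cash flows on different paths, which is exactly what rules out a recombining lattice.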
Robert H. Swendsen
- Published in print:
- 2012
- Published Online:
- December 2013
- ISBN:
- 9780199646944
- eISBN:
- 9780191775123
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199646944.003.0019
- Subject:
- Physics, Condensed Matter Physics / Materials
This chapter resumes the discussion of classical statistical mechanics that was begun in Part 1 of the book. Numerical methods (molecular dynamics and Monte Carlo computer simulations) of calculating thermodynamic properties from statistical mechanics are defined and investigated in the problems at the end of the chapter. The Liouville theorem is proved, and its consequences discussed. It is shown how thermodynamic identities can be derived entirely from the formalism of statistical mechanics, as well as how new identities can be derived that go beyond those in thermodynamics. The properties of the harmonic oscillator are derived explicitly because of their importance in future chapters.
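A minimal Metropolis Monte Carlo sketch in the spirit of the chapter's simulation problems (the setup below is mine, not the book's): sampling the Boltzmann distribution of a one-dimensional classical harmonic oscillator with potential U(x) = x²/2, and checking the equipartition result ⟨U⟩ = T/2 in units with k_B = 1 and unit spring constant:

```python
import math
import random

def mean_potential_energy(T=1.0, steps=200_000, step_size=1.0, seed=0):
    # Metropolis Monte Carlo sampling of exp(-U(x)/T) with U(x) = x^2 / 2;
    # equipartition predicts <U> = T/2 (units with k_B = 1)
    rng = random.Random(seed)
    x, u = 0.0, 0.0
    total, count = 0.0, 0
    burn_in = steps // 10   # discard early, unequilibrated samples
    for i in range(steps):
        x_new = x + rng.uniform(-step_size, step_size)
        u_new = 0.5 * x_new * x_new
        # Metropolis acceptance rule
        if u_new <= u or rng.random() < math.exp(-(u_new - u) / T):
            x, u = x_new, u_new
        if i >= burn_in:
            total += u
            count += 1
    return total / count

print(mean_potential_energy(T=1.0))   # close to 0.5
```

The same accept/reject structure carries over to interacting systems; only the energy function changes.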
Gary Smith and Jay Cordes
- Published in print:
- 2019
- Published Online:
- September 2019
- ISBN:
- 9780198844396
- eISBN:
- 9780191879937
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/oso/9780198844396.003.0005
- Subject:
- Mathematics, Applied Mathematics, Numerical Analysis
Computer software, particularly deep neural networks and Monte Carlo simulations, is extremely useful for the specific tasks it has been designed to do, and it will get even better, much better. However, we should not assume that computers are smarter than us just because they can tell us the first 2000 digits of pi or show us a street map of every city in the world. One of the paradoxical things about computers is that they can excel at things that humans consider difficult (like calculating square roots) while failing at things that humans consider easy (like recognizing stop signs). They can’t pass simple tests like the Winograd Schema Challenge because they do not understand the world the way humans do. They have neither common sense nor wisdom. They are our tools, not our masters.
Diana B. Petitti
- Published in print:
- 1999
- Published Online:
- September 2009
- ISBN:
- 9780195133646
- eISBN:
- 9780199863761
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780195133646.003.10
- Subject:
- Public Health and Epidemiology, Public Health, Epidemiology
This chapter describes how to select probability estimates for a decision analysis and how to justify the choice of probabilities. Because decision analytic models are often the basis for cost-effectiveness analysis, the information on estimation of probabilities is directly relevant to cost-effectiveness analysis. The chapter explains why probability estimation should be based on systematic review. It briefly discusses Monte Carlo simulation and the confidence profile, a Bayesian method, as methods for estimating uncertainty in the expected outcome of a decision analysis.
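A hedged sketch of the Monte Carlo approach the chapter mentions: for a hypothetical two-option decision tree, each branch probability is drawn from a Beta distribution summarizing (invented) counts of the kind a systematic review might yield, and the simulation produces a mean and 95% interval for the expected outcome difference between the options:

```python
import random

def simulate_outcome_difference(num_draws=10_000, seed=0):
    # Monte Carlo propagation of probability uncertainty through a
    # hypothetical two-option decision tree (treat vs. no treatment);
    # the event counts below are invented for illustration
    rng = random.Random(seed)
    diffs = []
    for _ in range(num_draws):
        # Beta(successes + 1, failures + 1) distributions for each option
        p_treat = rng.betavariate(80 + 1, 20 + 1)   # ~0.80 success rate
        p_none = rng.betavariate(60 + 1, 40 + 1)    # ~0.60 success rate
        diffs.append(p_treat - p_none)
    diffs.sort()
    mean = sum(diffs) / len(diffs)
    lo = diffs[int(0.025 * num_draws)]
    hi = diffs[int(0.975 * num_draws)]
    return mean, (lo, hi)

mean, (lo, hi) = simulate_outcome_difference()
print(mean, lo, hi)   # mean near 0.20 with a 95% interval around it
```

The point of the exercise is that the output is a distribution over the decision-analytic result, not a single number, so uncertainty in the probability inputs is carried through to the conclusion.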
Lev Pitaevskii and Sandro Stringari
- Published in print:
- 2016
- Published Online:
- March 2016
- ISBN:
- 9780198758884
- eISBN:
- 9780191818721
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780198758884.003.0016
- Subject:
- Physics, Condensed Matter Physics / Materials
This is the first of a series of chapters devoted to interacting Fermi gases, with special focus here on the effects of superfluidity. It starts with a brief discussion of the ideal Fermi gas and then focuses on the properties of dilute interacting Fermi gases. Topics include the weakly repulsive Fermi gas; the gas of composite bosons; the BCS limit of a weakly interacting Fermi gas; the strongly interacting, but still dilute, unitary Fermi gas where the scattering length is much larger than the average interatomic distance; the BCS–BEC crossover; and the Bogoliubov–de Gennes approach. A discussion of the main physical quantities predicted by mean-field theory and by quantum Monte Carlo simulations, including, in particular, the equation of state, the momentum distribution, and the condensation of pairs, is also presented.