Fred Campano and Dominick Salvatore
- Published in print:
- 2006
- Published Online:
- May 2006
- ISBN:
- 9780195300918
- eISBN:
- 9780199783441
- Item type:
- book
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/0195300912.001.0001
- Subject:
- Economics and Finance, Development, Growth, and Environmental
Intended as an introductory textbook for advanced undergraduates and first-year graduate students, this book leads the reader from familiar basic micro- and macroeconomic concepts in the introduction to less familiar concepts relating to income distribution in the subsequent chapters. The income concept and household sample surveys are examined first, followed by the descriptive statistical techniques commonly used to present the survey results. The commonality found in the shape of the income density function leads to statistical modeling, parameter estimation, and goodness-of-fit tests. Alternative models are then introduced along with the related summary measures of income distribution, including the Gini coefficient. This is followed by a sequence of chapters that deal with normative issues such as inequality, poverty, and country comparisons. The remaining chapters cover an assortment of topics, including economic development and globalization and their impact on income distribution, redistribution of income, and the integration of macroeconomic models with income distribution models.
Željko Ivezić, Andrew J. Connolly, Jacob T. VanderPlas, and Alexander Gray
- Published in print:
- 2014
- Published Online:
- October 2017
- ISBN:
- 9780691151687
- eISBN:
- 9781400848911
- Item type:
- chapter
- Publisher:
- Princeton University Press
- DOI:
- 10.23943/princeton/9780691151687.003.0006
- Subject:
- Physics, Particle Physics / Astrophysics / Cosmology
Inferring the probability density function (pdf) from a sample of data is known as density estimation. The same methodology is often called data smoothing. Density estimation in the one-dimensional case has been discussed in the previous chapters. This chapter extends it to multidimensional cases. Density estimation is one of the most critical components of extracting knowledge from data. For example, given a pdf estimated from point data, we can generate simulated distributions of data and compare them against observations. If we can identify regions of low probability within the pdf, we have a mechanism for the detection of unusual or anomalous sources. If our point data can be separated into subsamples using provided class labels, we can estimate the pdf for each subsample and use the resulting set of pdfs to classify new points: the probability that a new point belongs to each subsample/class is proportional to the pdf of each class evaluated at the position of the point.
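The class-conditional use of density estimates sketched in this abstract amounts to a generative classifier: estimate a pdf for each labelled subsample, then compare the pdfs at a new point. A minimal Python sketch using scikit-learn kernel density estimation is given below; the synthetic two-class data, bandwidth, and equal class priors are illustrative assumptions, not the chapter's own code.

```python
# Sketch: classify new points by comparing class-conditional density estimates.
# Data, bandwidth, and equal class priors are illustrative assumptions.
import numpy as np
from sklearn.neighbors import KernelDensity

rng = np.random.default_rng(0)
X_a = rng.normal(loc=0.0, scale=1.0, size=(500, 2))   # sample labelled class A
X_b = rng.normal(loc=3.0, scale=1.5, size=(500, 2))   # sample labelled class B

kde_a = KernelDensity(bandwidth=0.5).fit(X_a)          # pdf estimate for class A
kde_b = KernelDensity(bandwidth=0.5).fit(X_b)          # pdf estimate for class B

def classify(points):
    """Assign each point to the class whose estimated pdf is larger there
    (class priors are equal here, so they cancel)."""
    log_pa = kde_a.score_samples(points)   # log pdf under class A
    log_pb = kde_b.score_samples(points)   # log pdf under class B
    return np.where(log_pa > log_pb, "A", "B")

print(classify(np.array([[0.2, -0.1], [2.8, 3.3]])))   # expect ['A' 'B']
```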
Fred Campano and Dominick Salvatore
- Published in print:
- 2006
- Published Online:
- May 2006
- ISBN:
- 9780195300918
- eISBN:
- 9780199783441
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/0195300912.003.0003
- Subject:
- Economics and Finance, Development, Growth, and Environmental
This chapter starts with Pareto’s law and the observation by other economists of the consistency of right-hand skewness in income distributions in both developing and developed countries. This leads to the modeling of income distribution with probability density functions. This is illustrated using the lognormal model.
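As a rough companion to this abstract, fitting a lognormal density to income data can be sketched in a few lines of Python with SciPy; the synthetic incomes and parameter values below are assumptions for illustration only, not data from the chapter.

```python
# Sketch: fit a lognormal model to (synthetic) household incomes.
# The generating parameters are illustrative assumptions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
incomes = rng.lognormal(mean=10.0, sigma=0.6, size=5000)   # synthetic incomes

# SciPy parameterizes the lognormal by shape (= sigma), loc, and scale (= exp(mu)).
shape, loc, scale = stats.lognorm.fit(incomes, floc=0)
mu_hat, sigma_hat = np.log(scale), shape
print(f"estimated mu = {mu_hat:.3f}, sigma = {sigma_hat:.3f}")

# Simple goodness-of-fit check against the fitted distribution.
print(stats.kstest(incomes, "lognorm", args=(shape, loc, scale)))
```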
Dennis Sherwood and Jon Cooper
- Published in print:
- 2010
- Published Online:
- January 2011
- ISBN:
- 9780199559046
- eISBN:
- 9780191595028
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199559046.003.0009
- Subject:
- Physics, Crystallography: Physics
This chapter discusses the underlying principles of X-ray scattering by a distribution of electrons. The theory is then extended to the diffraction of X-rays by an infinite lattice of molecular motifs, and the concept of the structure factor, which describes the diffraction pattern, is introduced. The theory of calculating the electron density distribution within the unit cell by Fourier inversion of the structure factors is then covered. The fact that only the amplitudes, and not the phases, of the structure factors can be measured experimentally represents the major practical problem of diffraction analysis, known as the phase problem. The process of calculating structure factors from a known structure can be simplified by treating each atom as a scattering centre that is assigned a scattering factor related to its atomic number. In the second half of the chapter, the rules relating the symmetry of the diffraction pattern to that of the crystal are derived, and the basis for the inherent inversion symmetry of the diffraction pattern, known as Friedel's law, is described. Situations where this important law breaks down are touched upon, owing to their importance in solving the phase problem. The phenomenon of systematic absences (essentially missing diffraction spots), and how they can yield key information on the symmetry of the crystal, is explained.
Stephen Figlewski
- Published in print:
- 2010
- Published Online:
- May 2010
- ISBN:
- 9780199549498
- eISBN:
- 9780191720567
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199549498.003.0015
- Subject:
- Economics and Finance, Econometrics
This chapter presents a new methodology for extracting complete, well-behaved risk-neutral density (RND) functions from options market prices, and illustrates the potential of this tool for understanding how expectations and risk preferences are incorporated into prices in the US stock market. It reviews a variety of techniques for obtaining smooth densities from a set of observed options prices and selects one that offers good performance. This procedure is then modified to incorporate the market's bid-ask spread into the estimation. The chapter shows how the tails of the RND obtained from the options market may be extended and completed by appending tails from a generalized extreme value (GEV) distribution. The procedure is used to estimate RNDs for the S&P 500 stock index from 1996 to 2008, yielding several interesting results.
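For background, the standard link between observed call prices and the risk-neutral density on which methodologies of this kind rest is the Breeden–Litzenberger relation; it is stated here as a general result, not as the chapter's specific procedure:

$$ f_{\mathrm{RND}}(K) \;=\; e^{rT}\,\frac{\partial^{2} C(K,T)}{\partial K^{2}}, $$

where C(K, T) is the price of a call with strike K and maturity T, and r is the risk-free rate.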
Bernard Van Praag
- Published in print:
- 2007
- Published Online:
- May 2008
- ISBN:
- 9780199226146
- eISBN:
- 9780191718595
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199226146.003.0007
- Subject:
- Economics and Finance, Behavioural Economics
This chapter examines the theory that the individual's current satisfaction will depend on his or her own past experience and expected future. The study assumes that both the past and the future have an effect, and that the impact distribution is described by a mass-density function on the time axis. Estimates of this function show that its position and shape depend on age and other individual variables. The young and elderly place more weight on the past, while those in mid-life give more weight to the future.
Wai-Kee Li, Gong-Du Zhou, and Thomas Chung Wai Mak
- Published in print:
- 2008
- Published Online:
- May 2008
- ISBN:
- 9780199216949
- eISBN:
- 9780191711992
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199216949.003.0001
- Subject:
- Physics, Crystallography: Physics
This chapter covers the rudiments of quantum theory, including the dual nature of light and matter, the Uncertainty Principle and probability concept, the electronic wavefunction, and the probability density function. Numerical examples are given to show that given the electronic wavefunction of a system, the probability of finding an electron in a volume element around a certain point in space can be readily calculated. Finally, the electronic wave equation, the Schrödinger equation, is introduced. This discussion is followed by the solution of a few particle-in-a-box problems, with the ‘box’ having the shape of a wire (one-dimensional), a cube, a ring, or a triangle. Where possible, the solutions of these problems are then applied to a chemical system.
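For the one-dimensional 'wire' case mentioned here, the standard particle-in-a-box results are (a textbook summary, included for reference):

$$ \psi_n(x) = \sqrt{\frac{2}{L}}\,\sin\!\left(\frac{n\pi x}{L}\right), \qquad E_n = \frac{n^2 h^2}{8 m L^2}, \qquad n = 1, 2, 3, \ldots, $$

with the probability of finding the electron between x and x + dx given by |ψ_n(x)|² dx.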
James Davidson
- Published in print:
- 1994
- Published Online:
- November 2003
- ISBN:
- 9780198774037
- eISBN:
- 9780191596117
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/0198774036.003.0008
- Subject:
- Economics and Finance, Econometrics
Specializing the concepts of Ch. 7 to the case of real variables, this chapter introduces distribution functions, discrete and continuous distributions, and describes examples such as the binomial, uniform, Gaussian, and Cauchy distributions. It then treats multivariate distributions and the concept of independence.
Bernard Van Praag
- Published in print:
- 2004
- Published Online:
- January 2005
- ISBN:
- 9780198286547
- eISBN:
- 9780191718601
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/0198286546.003.0007
- Subject:
- Economics and Finance, Microeconomics
This chapter examines the theory that the individual’s current satisfaction will depend on his or her own past experience and expected future. The study assumes that both the past and the future have an effect, and that the impact distribution is described by a mass-density function on the time axis. Estimates of this function show that its position and shape vary with age and other individual variables. The young and elderly place more weight on the past, while those in mid-life give more weight to the future.
Thomas P. Trappenberg
- Published in print:
- 2019
- Published Online:
- January 2020
- ISBN:
- 9780198828044
- eISBN:
- 9780191883873
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/oso/9780198828044.003.0006
- Subject:
- Neuroscience, Behavioral Neuroscience
The discussion provides a refresher of probability theory, in particular with respect to the formulations that build the theoretical language of modern machine learning. Probability theory is the formalism of random numbers, and this chapter outlines what these are and how they are characterized by probability density or probability mass functions. How such functions have traditionally been characterized is covered, together with a review of how to work with such mathematical objects, including transforming density functions and measuring differences between density functions. Definitions and basic operations with multiple random variables, including Bayes' law, are covered. The chapter ends with an outline of some important approximation techniques, the so-called Monte Carlo methods.
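The transformation of density functions mentioned in this abstract follows the standard change-of-variables rule, stated here for a monotonic map Y = g(X) as general background rather than in the chapter's own notation:

$$ f_Y(y) \;=\; f_X\!\bigl(g^{-1}(y)\bigr)\,\left|\frac{d\,g^{-1}(y)}{dy}\right| . $$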
Theodore R. Holford
- Published in print:
- 2002
- Published Online:
- September 2009
- ISBN:
- 9780195124408
- eISBN:
- 9780199864270
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780195124408.003.0002
- Subject:
- Public Health and Epidemiology, Public Health, Epidemiology
This chapter discusses the concept of a model for the disease process, emphasizing simple and very general models that can be used to study a variety of diseases. Quantities that are used to describe this fundamental model are the hazard function, the survival function, and the failure-time density function. Each of these quantities can be derived from one of the others, so that they are really different expressions of the same underlying concept. Two forms are often used for the hazard: a constant hazard and a Weibull hazard. In many applications, however, it is not feasible to describe the hazard adequately with a simple mathematical function, which is one reason for the availability of non-parametric or semiparametric methods.
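The statement that the hazard, survival, and failure-time density functions can each be derived from the others corresponds to the standard identities (a textbook summary, not a quotation from the chapter):

$$ h(t) = \frac{f(t)}{S(t)}, \qquad S(t) = \exp\!\left(-\int_0^t h(u)\,du\right), \qquad f(t) = h(t)\,S(t), $$

with the constant hazard h(t) = λ giving the exponential model and the Weibull hazard h(t) = λp(λt)^{p-1} as the other common special case.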
M. Hashem Pesaran
- Published in print:
- 2015
- Published Online:
- March 2016
- ISBN:
- 9780198736912
- eISBN:
- 9780191800504
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780198736912.003.0013
- Subject:
- Economics and Finance, Econometrics
Spectral analysis provides an alternative to the time-domain approach to time series analysis. This approach views a stochastic process as a weighted sum of the periodic functions sin(·) and cos(·) with different frequencies. This chapter discusses the spectral representation theorem, relates the spectral density function to the autocovariance generating function, examines the properties of the spectral density function, and derives the spectral density of distributed lag models. Exercises are provided at the end of the chapter.
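For reference, the relation between the spectral density and the autocovariances that such a chapter develops takes the standard form, for a covariance-stationary process with autocovariances γ(k):

$$ f(\omega) \;=\; \frac{1}{2\pi}\sum_{k=-\infty}^{\infty} \gamma(k)\, e^{-i\omega k}, \qquad \gamma(k) \;=\; \int_{-\pi}^{\pi} e^{i\omega k} f(\omega)\, d\omega . $$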
Timothy E. Essington
- Published in print:
- 2021
- Published Online:
- November 2021
- ISBN:
- 9780192843470
- eISBN:
- 9780191926112
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/oso/9780192843470.003.0007
- Subject:
- Biology, Biomathematics / Statistics and Data Analysis / Complexity Studies
The chapter “Random Variables and Probability” serves as both a review and a reference on probability. The random variable is the core concept in understanding probability, parameter estimation, and model selection. This chapter reviews the basic idea of a random variable and discusses the two main kinds of random variables: discrete random variables and continuous random variables. It covers the distinction between discrete and continuous random variables and outlines the most common probability mass or density functions used in ecology. Advanced sections cover distributions such as the gamma distribution, Student’s t-distribution, the beta distribution, the beta-binomial distribution, and zero-inflated models.
Daniel T. Gillespie and Effrosyni Seitaridou
- Published in print:
- 2012
- Published Online:
- January 2013
- ISBN:
- 9780199664504
- eISBN:
- 9780191748516
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199664504.003.0002
- Subject:
- Physics, Soft Matter / Biological Physics
In extending the analysis of diffusion beyond the traditional approach described in Chapter 1, the chapter here finds it necessary to work with random variables. This chapter gives a self-contained, selective introduction to random variable theory, presenting some definitions and results that will be used repeatedly throughout the rest of the book. The presentation features utilitarian definitions of probability and random variables, and derivations of the mathematical rules they obey. Topics covered include: the probability density function and the cumulative distribution function; the uniform, exponential, normal, Cauchy, and binomial random variables; moments, means, variances, standard deviations, covariances, and correlations; the Dirac delta function and sure random variables; multivariate random variables; statistical independence; the random variable transformation theorem and its many useful consequences, such as the central limit theorem and linear combination theorems for normal random variables; the bivariate normal random variable; and the computer generation of random numbers by the inversion method. The presentation assumes fluency in standard calculus, but it does not require a knowledge of more advanced topics in mathematics.
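The inversion method for generating random numbers mentioned at the end of this abstract can be sketched briefly in Python; the exponential example, rate parameter, and sample size are illustrative choices, not taken from the book.

```python
# Sketch of the inversion method: if U ~ Uniform(0, 1) and F is a cdf,
# then X = F^{-1}(U) has cdf F.  For the exponential distribution,
# F(x) = 1 - exp(-lam * x), so F^{-1}(u) = -ln(1 - u) / lam.
import numpy as np

lam = 2.0                                  # illustrative rate parameter
rng = np.random.default_rng(42)
u = rng.uniform(size=100_000)              # Uniform(0, 1) draws
x = -np.log(1.0 - u) / lam                 # inverse-cdf transform

print(x.mean())   # should be close to 1/lam = 0.5
print(x.var())    # should be close to 1/lam**2 = 0.25
```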
Steven J. Osterlind
- Published in print:
- 2019
- Published Online:
- January 2019
- ISBN:
- 9780198831600
- eISBN:
- 9780191869532
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/oso/9780198831600.003.0009
- Subject:
- Mathematics, Logic / Computer Science / Mathematical Philosophy
This chapter is all about Carl Gauss, his life, and his accomplishments, including his work in plotting the orbit of Ceres, which he did while still a teenager and which established his reputation. The chapter tells, too, how and when he invented and used his method of least squares, and of his dispute with Legendre over who invented it first. One of his most significant accomplishments is his devising (and proof) of the normal probability density function or, more familiarly, the standard normal curve. This is described, and its import and application to modern times are discussed. There is also a brief discussion of biographical events and details of his life, such as his reclusive nature in his hometown of Göttingen, and his caring for his ailing mother and then his first and second wives. Some details of his lasting accomplishments and impact today are also provided.
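In modern notation, the normal probability density function credited to Gauss here is

$$ f(x) \;=\; \frac{1}{\sigma\sqrt{2\pi}}\,\exp\!\left(-\frac{(x-\mu)^2}{2\sigma^2}\right), $$

with the standard normal curve obtained by setting μ = 0 and σ = 1.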
Marco Bittelli, Gaylon S. Campbell, and Fausto Tomei
- Published in print:
- 2015
- Published Online:
- August 2015
- ISBN:
- 9780199683093
- eISBN:
- 9780191763175
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199683093.003.0007
- Subject:
- Physics, Geophysics, Atmospheric and Environmental Physics
This chapter addresses the spatial variability of soil properties. First, it covers the classic statistical methods used in geostatistics. Frequency distributions and probability density functions are presented. The most common transformations used in soil physics are derived, and the concepts behind spatial correlations are shown. A discussion of stochastic modelling is presented and scaling methods are described. In particular, the Miller–Miller scaling approach is described, and fields of spatial variations are generated using Fourier transform methods. A numerical code is presented to apply different distribution models, such as Gaussian or exponential, for the generation of spatial fields of soil property variations.
Therese M. Donovan and Ruth M. Mickey
- Published in print:
- 2019
- Published Online:
- July 2019
- ISBN:
- 9780198841296
- eISBN:
- 9780191876820
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/oso/9780198841296.003.0011
- Subject:
- Biology, Biomathematics / Statistics and Data Analysis / Complexity Studies
This chapter introduces the gamma-Poisson conjugate. Many Bayesian analyses consider alternative parameter values as hypotheses. The prior distribution for an unknown parameter can be represented by a continuous probability density function when the number of hypotheses is infinite. There are special cases where a Bayesian prior probability distribution for an unknown parameter of interest can be quickly updated to a posterior distribution of the same form as the prior. In the “Shark Attack Problem,” a gamma distribution is used as the prior distribution of λ, the mean number of shark attacks in a given year. Poisson data are then collected to determine the number of attacks in a given year. The prior distribution is updated to the posterior distribution in light of this new information. In short, a gamma prior distribution + Poisson data → gamma posterior distribution. The gamma distribution is said to be “conjugate to” the Poisson distribution.
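The update described in this abstract has a simple closed form: a Gamma(α, β) prior (shape α, rate β) combined with n Poisson observations summing to Σx gives a Gamma(α + Σx, β + n) posterior. A minimal numerical sketch in Python follows; the prior parameters and counts are hypothetical stand-ins, not the book's "Shark Attack Problem" numbers.

```python
# Sketch of the gamma-Poisson conjugate update.
# Prior parameters and observed counts are hypothetical.
from scipy import stats

alpha_prior, beta_prior = 2.0, 1.0       # Gamma prior: shape, rate
counts = [5, 3, 4]                       # hypothetical attack counts, 3 years

alpha_post = alpha_prior + sum(counts)   # shape + total count
beta_post = beta_prior + len(counts)     # rate + number of observations

# Posterior summaries for lambda (SciPy's scale is 1 / rate).
posterior = stats.gamma(a=alpha_post, scale=1.0 / beta_post)
print(posterior.mean(), posterior.interval(0.95))
```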
Daniel T. Gillespie and Effrosyni Seitaridou
- Published in print:
- 2012
- Published Online:
- January 2013
- ISBN:
- 9780199664504
- eISBN:
- 9780191748516
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199664504.003.0003
- Subject:
- Physics, Soft Matter / Biological Physics
Before it was universally accepted that a fluid consists of many moving molecules, Fick's Law and the diffusion equation were widely regarded as statements in continuum mechanics. With the molecular theory in mind, Einstein derived the diffusion equation from a model of random molecular motion instead of from a continuity equation and Fick's Law. This chapter presents Einstein's derivation and examines its strengths and weaknesses. The chapter then deduces some of the implications of Einstein's model of diffusion: its novel probabilistic perspective on the classical diffusion equation; its predictions for the mean of the square of the displacement of a single solute molecule in a given time, and the consequent single-molecule interpretation of the diffusion coefficient; its formulas for the covariance and the correlation of the displacement; its implied formula for the diffusion coefficient of one solute molecule relative to another; and its implied notion of single-molecule probability flux. Finally, the chapter takes the first step toward deriving a formula for the stochastic rate of a diffusion-controlled bimolecular chemical reaction, a derivation that will be completed in Chapter 4.
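The single-molecule prediction referred to here is the familiar Einstein result for the mean of the squared displacement: in one dimension, over a time t,

$$ \langle X^2(t) \rangle \;=\; 2Dt, $$

which is what gives the diffusion coefficient D its single-molecule interpretation (in d dimensions the factor 2 becomes 2d).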
M. Hashem Pesaran
- Published in print:
- 2015
- Published Online:
- March 2016
- ISBN:
- 9780198736912
- eISBN:
- 9780191800504
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780198736912.003.0014
- Subject:
- Economics and Finance, Econometrics
This chapter begins with the problem of estimating the mean and autocovariances of a stationary process. It then considers the estimation of autoregressive and moving average processes as well as the estimation of spectral density functions. The analysis is also related to the standard ordinary least squares (OLS) regression models. It shows that when the errors are serially correlated, the OLS estimators of models with lagged dependent variables are inconsistent, and derives an asymptotic expression for the bias. Exercises are provided at the end of the chapter.
Peter Richmond, Jürgen Mimkes, and Stefan Hutzler
- Published in print:
- 2013
- Published Online:
- December 2013
- ISBN:
- 9780199674701
- eISBN:
- 9780191780066
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199674701.003.0021
- Subject:
- Physics, Theoretical, Computational, and Statistical Physics
This chapter discusses how income and wealth are distributed in society, examining factors such as the form of the distribution density function and whether or not it depends on time, history, or location. Vilfredo Pareto noticed that the rich end of the wealth distribution followed a power law, and that this feature seemed to be universal. To Pareto, and to most physicists, the existence of such a power law suggests that some fundamental dynamics is in play. Pareto himself proposed that people, in the course of their life, could move through the distribution in both directions, and this idea that a static distribution does not imply a static society has formed the basis for recent studies by a number of physicists.
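Pareto's observation about the rich end of the distribution is usually written as a power-law survival function (a standard statement of the law, not a quotation from the chapter):

$$ \Pr(X > x) \;=\; \left(\frac{x_{\min}}{x}\right)^{\alpha}, \qquad x \ge x_{\min}, $$

so the density falls off as x^{-(α+1)} in the upper tail.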