R. Duncan Luce
- Published in print: 1991
- Published Online: January 2008
- ISBN: 9780195070019
- eISBN: 9780199869879
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780195070019.003.0001
- Subject: Psychology, Cognitive Models and Architectures
This chapter begins with a discussion of the study of response times. Response times are treated as observations of a random variable, and the mathematics of stochastic processes is then used to understand the underlying process, which gives rise to the distributions of these random variables. Generating functions and elementary concepts of stochastic processes are discussed.
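Where generating functions enter, the key fact is that the probability generating function of a sum of independent variables factorizes. The sketch below (illustrative, with assumed names and distribution choices, not the chapter's own material) checks this numerically.

```python
import numpy as np

# A numerical check (a sketch, not the chapter's code) of a basic
# generating-function fact: for independent nonnegative integer variables
# X and Y, the PGF G(s) = E[s**X] satisfies G_{X+Y}(s) = G_X(s) * G_Y(s).

rng = np.random.default_rng(0)
x = rng.poisson(2.0, 200_000)
y = rng.poisson(3.0, 200_000)

def pgf(samples, s):
    return np.mean(s ** samples)

s = 0.7
print(pgf(x + y, s), pgf(x, s) * pgf(y, s))  # both ≈ exp(5*(s-1)) ≈ 0.223
```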
Željko Ivezić, Andrew J. Connolly, Jacob T. VanderPlas, and Alexander Gray
- Published in print: 2014
- Published Online: October 2017
- ISBN: 9780691151687
- eISBN: 9781400848911
- Item type: chapter
- Publisher: Princeton University Press
- DOI: 10.23943/princeton/9780691151687.003.0003
- Subject: Physics, Particle Physics / Astrophysics / Cosmology
This chapter reviews notation and basic concepts in probability and statistics. The coverage of the various topics cannot be complete; it is aimed at the concepts needed to understand material covered in the book. For an in-depth discussion of probability and statistics, readers are referred to the numerous readily available textbooks mentioned in the book, such as Bar89, Lup93, WJ03, and Wass10. The chapter begins with a brief overview of probability and random variables. It then reviews the most common univariate and multivariate distribution functions, and correlation coefficients. It also summarizes the central limit theorem and discusses how to generate mock samples (random number generation) for a given distribution function.
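One standard way to generate mock samples for a given distribution function, as the abstract mentions, is inverse-transform sampling. Here is a minimal sketch (the distribution choice and names are illustrative, not the book's code).

```python
import numpy as np

# Inverse-transform sampling: if U ~ Uniform(0, 1) and F is the target CDF,
# then X = F^{-1}(U) has distribution F. For an exponential variable with
# pdf f(x) = exp(-x/tau)/tau, F(x) = 1 - exp(-x/tau), so
# F^{-1}(u) = -tau * log(1 - u).

rng = np.random.default_rng(42)

def sample_exponential(tau, size):
    u = rng.uniform(0.0, 1.0, size)
    return -tau * np.log(1.0 - u)

samples = sample_exponential(tau=2.0, size=100_000)
print(samples.mean())  # ≈ tau = 2.0
```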
M. Vidyasagar
- Published in print: 2014
- Published Online: October 2017
- ISBN: 9780691133157
- eISBN: 9781400850518
- Item type: chapter
- Publisher: Princeton University Press
- DOI: 10.23943/princeton/9780691133157.003.0001
- Subject: Mathematics, Probability / Statistics
This chapter provides an introduction to probability and random variables. Probability theory is an attempt to formalize the notion of uncertainty in the outcome of an experiment. For instance, suppose an urn contains four balls, colored red, blue, white, and green respectively. Suppose we dip our hand in the urn and pull out one of the balls “at random.” What is the likelihood that the ball we pull out will be red? The chapter first defines a random variable and probability before discussing the function of a random variable and expected value. It then considers total variation distance, joint and marginal probability distributions, independence and conditional probability distributions, Bayes' rule, and maximum likelihood estimates. Finally, it describes random variables assuming infinitely many values, focusing on the Markov and Chebyshev inequalities, Hoeffding's inequality, Monte Carlo simulation, and Cramér's theorem.
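The urn example lends itself to the Monte Carlo simulation the chapter later treats. A toy sketch (mine, not the chapter's code) confirms that the empirical frequency of a red draw converges to 1/4.

```python
import numpy as np

# With four equally likely balls, the probability of drawing the red one is
# 1/4; the empirical frequency over many simulated draws converges to it.

rng = np.random.default_rng(0)
balls = ["red", "blue", "white", "green"]
draws = rng.choice(balls, size=100_000)
print((draws == "red").mean())  # ≈ 0.25
```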
Ken Binmore
- Published in print: 2007
- Published Online: May 2007
- ISBN: 9780195300574
- eISBN: 9780199783748
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780195300574.003.0021
- Subject: Economics and Finance, Microeconomics
The most successful area of game theory is auction design. This chapter offers a short review of the subject. Various traditional auctions are described. After showing that most of these generate the same expected revenue to the seller under appropriate conditions, a version of the revenue equivalence theorem is proved. Reserve prices are briefly considered. An extended example of the use of the principles of mechanism design in constructing an optimal auction is offered, and the case of common-value auctions is reviewed. The chapter ends with a discussion of the important case of multiunit auctions, and concludes that treasury bonds could be sold on a more scientific basis than at present.
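Revenue equivalence can be illustrated numerically. The sketch below (a hedged illustration under the standard independent-private-values assumptions, not the chapter's own construction) compares first-price and second-price auctions with uniform values.

```python
import numpy as np

# With n bidders whose values are i.i.d. Uniform(0, 1), the equilibrium
# first-price bid is (n-1)/n * value, while the second-price auction charges
# the second-highest value. Both yield expected revenue (n-1)/(n+1).

rng = np.random.default_rng(1)
n, trials = 4, 200_000
values = rng.uniform(0.0, 1.0, size=(trials, n))

first_price = (n - 1) / n * values.max(axis=1)   # winner pays own shaded bid
second_price = np.sort(values, axis=1)[:, -2]    # winner pays second-highest value

print(first_price.mean(), second_price.mean(), (n - 1) / (n + 1))  # all ≈ 0.6
```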
M. Vidyasagar
- Published in print: 2014
- Published Online: October 2017
- ISBN: 9780691133157
- eISBN: 9781400850518
- Item type: chapter
- Publisher: Princeton University Press
- DOI: 10.23943/princeton/9780691133157.003.0002
- Subject: Mathematics, Probability / Statistics
This chapter provides an introduction to some elementary aspects of information theory, including entropy in its various forms. Entropy refers to the level of uncertainty associated with a random variable (or more precisely, the probability distribution of the random variable). When there are two or more random variables, it is worthwhile to study the conditional entropy of one random variable with respect to another. The last concept is relative entropy, also known as the Kullback–Leibler divergence, which measures the “disparity” between two probability distributions. The chapter first considers convex and concave functions before discussing the properties of the entropy function, conditional entropy, uniqueness of the entropy function, and the Kullback–Leibler divergence.
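For reference, both central quantities can be computed directly for discrete distributions. The helper names below are assumptions of this sketch, not the book's code.

```python
import numpy as np

# Shannon entropy H(p) and the Kullback-Leibler divergence D(p || q) for
# discrete distributions (in bits). The KL computation assumes q > 0
# wherever p > 0; terms with p_i = 0 contribute zero.

def entropy(p):
    p = np.asarray(p, dtype=float)
    nz = p > 0
    return -np.sum(p[nz] * np.log2(p[nz]))

def kl_divergence(p, q):
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    nz = p > 0
    return np.sum(p[nz] * np.log2(p[nz] / q[nz]))

p = [0.5, 0.25, 0.25]
q = [1/3, 1/3, 1/3]
print(entropy(p))           # 1.5 bits
print(kl_divergence(p, q))  # > 0, and 0 only when p == q
```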
Daniel T. Gillespie and Effrosyni Seitaridou
- Published in print: 2012
- Published Online: January 2013
- ISBN: 9780199664504
- eISBN: 9780191748516
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780199664504.003.0002
- Subject: Physics, Soft Matter / Biological Physics
In extending the analysis of diffusion beyond the traditional approach described in Chapter 1, the chapter here finds it necessary to work with random variables. This chapter gives a self-contained, selective introduction to random variable theory, presenting some definitions and results that will be used repeatedly throughout the rest of the book. The presentation features utilitarian definitions of probability and random variables, and derivations of the mathematical rules they obey. Topics covered include: the probability density function and the cumulative distribution function; the uniform, exponential, normal, Cauchy, and binomial random variables; moments, means, variances, standard deviations, covariances, and correlations; the Dirac delta function and sure random variables; multivariate random variables; statistical independence; the random variable transformation theorem and its many useful consequences, such as the central limit theorem and linear combination theorems for normal random variables; the bivariate normal random variable; and the computer generation of random numbers by the inversion method. The presentation assumes fluency in standard calculus, but it does not require a knowledge of more advanced topics in mathematics.
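The inversion method mentioned at the end of the abstract applies even to the Cauchy variable covered here. A minimal sketch (not the book's code) follows.

```python
import numpy as np

# Inversion method for a standard Cauchy variable: the CDF is
# F(x) = 1/2 + arctan(x)/pi, so F^{-1}(u) = tan(pi * (u - 1/2)).

rng = np.random.default_rng(3)
u = rng.uniform(0.0, 1.0, 100_000)
x = np.tan(np.pi * (u - 0.5))

# The Cauchy distribution has no mean, so sample averages do not settle
# down, unlike for the uniform, exponential, or normal variables; the
# median, however, is well defined.
print(np.median(x))  # ≈ 0
```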
Heinz-Peter Breuer and Francesco Petruccione
- Published in print: 2007
- Published Online: February 2010
- ISBN: 9780199213900
- eISBN: 9780191706349
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780199213900.003.01
- Subject: Physics, Theoretical, Computational, and Statistical Physics
This chapter contains a survey of classical probability theory and stochastic processes. It starts with a description of the fundamental concepts of probability space and Kolmogorov axioms. These concepts are then used to define random variables and stochastic processes. The mathematical formulation of the special class of Markov processes through classical master equations is given, including deterministic processes (Liouville equation), jump processes (Pauli master equation), and diffusion processes (Fokker–Planck equation). Special stochastic processes which play an important role in the developments of the following chapters, such as piecewise deterministic processes and Lévy processes, are described in detail together with their basic physical properties and various mathematical formulations in terms of master equations, path integral representation, and stochastic differential equations.
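As a small illustration of a jump process governed by a Pauli-type master equation, the following sketch (rates and names are assumptions of this example, not the book's) simulates a two-state Markov process and checks its stationary distribution.

```python
import numpy as np

# Two-state Markov jump process with master equation
#   dp1/dt = -a*p1 + b*p2,   dp2/dt = a*p1 - b*p2,
# whose stationary distribution is (p1, p2) = (b, a) / (a + b).

rng = np.random.default_rng(7)
a, b = 1.0, 3.0                    # jump rates 1 -> 2 and 2 -> 1
dt, steps, walkers = 1e-2, 2_000, 10_000

state = np.zeros(walkers, dtype=int)          # start all walkers in state 1 (coded 0)
for _ in range(steps):
    u = rng.uniform(0.0, 1.0, walkers)
    jump = np.where(state == 0, u < a * dt, u < b * dt)
    state = np.where(jump, 1 - state, state)

print((state == 0).mean(), b / (a + b))       # both ≈ 0.75
```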
Partha P. Mitra and Hemant Bokil
- Published in print: 2007
- Published Online: May 2009
- ISBN: 9780195178081
- eISBN: 9780199864829
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780195178081.003.0014
- Subject: Neuroscience, Techniques, Molecular and Cellular Systems
In the neuroscience literature, significant use has been made of entropy as a measure of variability, and mutual information as a measure of association. Part of the attraction of these measures arises from their use in statistical physics and in communication theory. The idea is that they are free of distributional assumptions and have elevated theoretical status compared with second-moment measures that may be associated with Gaussian distributions. However, while these measures are theoretically elegant and have desirable invariance properties, apart from difficulties of estimation, they are by construction not informative about the shape of the distributions or the nature of the functional relationships between variables. This chapter presents a brief review of the relevant information theoretic approaches, including links to Gaussian processes and inhomogeneous Poisson processes.
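For a known discrete joint distribution, the mutual information the abstract refers to can be computed exactly. This sketch (illustrative numbers, not the chapter's data) shows the definition in action.

```python
import numpy as np

# Mutual information for a discrete joint distribution:
#   I(X; Y) = sum_{x,y} p(x,y) * log2[ p(x,y) / (p(x) * p(y)) ].

pxy = np.array([[0.4, 0.1],
                [0.1, 0.4]])          # joint distribution of two binary variables
px = pxy.sum(axis=1, keepdims=True)   # marginal of X
py = pxy.sum(axis=0, keepdims=True)   # marginal of Y

mi = np.sum(pxy * np.log2(pxy / (px * py)))
print(mi)  # ≈ 0.278 bits; it would be 0 if X and Y were independent
```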
Ken Binmore
- Published in print: 2007
- Published Online: May 2007
- ISBN: 9780195300574
- eISBN: 9780199783748
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780195300574.003.0003
- Subject: Economics and Finance, Microeconomics
This chapter shows how risk can be introduced into the rules of a game by admitting chance moves. The Monty Hall problem is offered as an example. Elementary probability theory is reviewed, and random variables are explained in terms of lottery tickets. The games of Duel and Parcheesi are analyzed as non-trivial examples of games with chance moves.
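The Monty Hall problem is easy to check by simulation. A quick sketch (mine, not the chapter's analysis):

```python
import numpy as np

# Monte Carlo check of the Monty Hall problem: the host always opens a goat
# door, so switching wins exactly when the initial pick was wrong, i.e.,
# with probability 2/3.

rng = np.random.default_rng(21)
trials = 100_000
prize = rng.integers(0, 3, trials)   # door hiding the prize
pick = rng.integers(0, 3, trials)    # contestant's initial pick

stay_wins = (pick == prize).mean()   # ≈ 1/3
switch_wins = (pick != prize).mean() # ≈ 2/3
print(stay_wins, switch_wins)
```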
Robert J. Marks II
- Published in print: 2009
- Published Online: November 2020
- ISBN: 9780195335927
- eISBN: 9780197562567
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/oso/9780195335927.003.0009
- Subject: Computer Science, Mathematical Theory of Computation
In this chapter, we present applications of Fourier analysis to probability, random variables, and stochastic processes [1089, 1097, 1387, 1329]. A random variable, X, is the assignment of a number to the outcome of a random experiment. We can, for example, flip a coin and assign an outcome of heads as X = 1 and tails as X = 0. Often the number is equated to the numerical outcome of the experiment, such as the number of dots on the face of a rolled die or the measurement of a voltage in a noisy circuit. The cumulative distribution function is defined by F_X(x) = Pr[X ≤ x], and the probability density function is its derivative, f_X(x) = (d/dx) F_X(x). Our treatment of random variables focuses on the use of Fourier analysis. Because of this viewpoint, the development is unconventional and begins immediately in the next section with a discussion of the properties of the probability density function.
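In the same Fourier spirit, the characteristic function phi(w) = E[exp(iwX)] is the Fourier transform of the pdf. The sketch below (my illustration, not the book's code) verifies the standard normal case numerically.

```python
import numpy as np

# For a standard normal, the characteristic function is phi(w) = exp(-w**2/2).
# We check this by direct numerical integration of exp(i*w*x) against the pdf.

x = np.linspace(-10.0, 10.0, 20_001)
dx = x[1] - x[0]
pdf = np.exp(-x**2 / 2) / np.sqrt(2.0 * np.pi)

for w in (0.5, 1.0, 2.0):
    phi = np.sum(np.exp(1j * w * x) * pdf) * dx   # numerical Fourier transform
    print(w, phi.real, np.exp(-w**2 / 2))         # real parts agree; imaginary ≈ 0
```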
Ziheng Yang
- Published in print: 2006
- Published Online: April 2010
- ISBN: 9780198567028
- eISBN: 9780191728280
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780198567028.003.0009
- Subject: Biology, Evolutionary Biology / Genetics
This chapter discusses basic techniques of computer simulation. Topics covered include random number generators, the generation of continuous random variables, the generation of discrete random variables, and the simulation of molecular evolution. Exercises are provided at the end of the chapter.
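A common technique for generating discrete random variables, of the kind this chapter covers, is table lookup on cumulative probabilities. The sketch below (the nucleotide frequencies are made up for illustration, not the book's) samples bases accordingly.

```python
import numpy as np

# Generate a discrete random variable by locating a uniform deviate within
# the cumulative probabilities, e.g., sampling nucleotides with given
# (hypothetical) equilibrium frequencies.

rng = np.random.default_rng(5)
bases = np.array(list("TCAG"))
freqs = np.array([0.22, 0.26, 0.33, 0.19])   # hypothetical frequencies

cum = np.cumsum(freqs)
u = rng.uniform(0.0, 1.0, 1000)
seq = bases[np.searchsorted(cum, u)]         # first index with cum >= u

print("".join(seq[:30]))
```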
Robert M. Mazo
- Published in print: 2008
- Published Online: January 2010
- ISBN: 9780199556441
- eISBN: 9780191705625
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780199556441.003.0002
- Subject: Physics, Condensed Matter Physics / Materials
This chapter begins with a review of elementary probability theory, conditional probability, and statistical independence. It explains the notion of a random variable and its distribution function or probability density function. It then introduces the concepts of mathematical expectation and variance and discusses several distributions often met in practice: the binomial, Gaussian, and Poisson distributions. The characteristic function of a random variable is defined and applied to the determination of the distribution of the sum of independent random variables. The central limit theorem is described but not proved.
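The application of characteristic functions to sums rests on a one-line factorization; a standard derivation (not quoted from the chapter) is:

```latex
% For independent X and Y with characteristic functions \phi_X and \phi_Y,
\begin{align*}
\phi_{X+Y}(k)
  = \mathbb{E}\bigl[e^{ik(X+Y)}\bigr]
  = \mathbb{E}\bigl[e^{ikX}\bigr]\,\mathbb{E}\bigl[e^{ikY}\bigr]
  = \phi_X(k)\,\phi_Y(k),
\end{align*}
% so the density of X + Y is the inverse Fourier transform of
% \phi_X \phi_Y, i.e., the convolution of the individual densities.
```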
Claus Munk
- Published in print: 2011
- Published Online: September 2011
- ISBN: 9780199575084
- eISBN: 9780191728648
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780199575084.003.0003
- Subject: Economics and Finance, Financial Economics
The price of an asset at a future point in time will typically be unknown, i.e. a random variable. In order to describe the uncertain evolution in the price of the asset over time, we need a collection of random variables, namely one random variable for each point in time. Such a collection of random variables is called a stochastic process. Modern finance models therefore apply stochastic processes to represent the evolution in prices — as well as interest rates and other relevant quantities — over time. This is also the case for the dynamic interest rate models presented in this book. This chapter gives an introduction to stochastic processes and the mathematical tools needed to do calculations with stochastic processes, the so-called stochastic calculus, focusing on processes and results that will become important in later chapters.
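A canonical example of such a stochastic process in finance is geometric Brownian motion. The sketch below (parameter values are illustrative, not the book's) simulates it with the exact log-normal update.

```python
import numpy as np

# Geometric Brownian motion dS = mu*S dt + sigma*S dW, simulated exactly via
# S(t+dt) = S(t) * exp((mu - sigma**2/2) dt + sigma sqrt(dt) Z).
# Over horizon T = 1, E[S_T] = S0 * exp(mu * T).

rng = np.random.default_rng(11)
mu, sigma, S0 = 0.05, 0.2, 100.0
dt, steps, paths = 1 / 252, 252, 20_000

z = rng.normal(0.0, 1.0, (paths, steps))
log_paths = np.cumsum((mu - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z, axis=1)
S_T = S0 * np.exp(log_paths[:, -1])

print(S_T.mean())  # ≈ 100 * exp(0.05) ≈ 105.13
```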
Arno Berger and Theodore P. Hill
- Published in print: 2015
- Published Online: October 2017
- ISBN: 9780691163062
- eISBN: 9781400866588
- Item type: chapter
- Publisher: Princeton University Press
- DOI: 10.23943/princeton/9780691163062.003.0008
- Subject: Mathematics, Probability / Statistics
Benford's law arises naturally in a variety of stochastic settings, including products of independent random variables, mixtures of random samples from different distributions, and iterations of random maps. This chapter provides the concepts and tools to analyze significant digits and significands for these basic random processes. Benford's law also arises in many other important fields of stochastics, such as geometric Brownian motion, random matrices, and Bayesian models, and the chapter may serve as a preparation for specialized literature on these advanced topics. By Theorem 4.2, a random variable X is Benford if and only if log |X| is uniformly distributed modulo one.
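The product-of-random-variables setting is easy to probe numerically. This sketch (mine, not the book's code) compares leading-digit frequencies of products of uniforms with Benford's law, under which the leading digit d occurs with probability log10(1 + 1/d).

```python
import numpy as np

# Products of many independent uniforms have significands close to Benford's
# law. The significand of x is 10**(log10(x) mod 1), in [1, 10); its integer
# part is the leading digit.

rng = np.random.default_rng(8)
x = rng.uniform(0.0, 1.0, (100_000, 50)).prod(axis=1)   # products of 50 uniforms

lead = (10 ** (np.log10(x) % 1)).astype(int)            # leading digit of each product
freq = np.bincount(lead, minlength=10)[1:] / len(lead)

benford = np.log10(1 + 1 / np.arange(1, 10))
print(np.round(freq, 3))
print(np.round(benford, 3))  # ≈ [0.301 0.176 0.125 ...]
```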
James Davidson
- Published in print: 1994
- Published Online: November 2003
- ISBN: 9780198774037
- eISBN: 9780191596117
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/0198774036.003.0008
- Subject: Economics and Finance, Econometrics
Specializing the concepts of Ch. 7 to the case of real variables, this chapter introduces distribution functions, discrete and continuous distributions, and describes examples such as the binomial, uniform, Gaussian, and Cauchy distributions. It then treats multivariate distributions and the concept of independence.
M. D. Edge
- Published in print: 2019
- Published Online: October 2019
- ISBN: 9780198827627
- eISBN: 9780191866463
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/oso/9780198827627.003.0006
- Subject: Biology, Biomathematics / Statistics and Data Analysis / Complexity Studies
In this chapter, the behavior of random variables is summarized using the concepts of expectation, variance, and covariance. The expectation is a measurement of the location of a random variable’s distribution. The variance and its square root, the standard deviation, are measurements of the spread of a random variable’s distribution. Covariance and correlation are measurements of the extent of linear relationship between two random variables. The chapter also describes two important theorems concerning the distribution of means of samples from a distribution. As the sample size becomes larger, the distribution of the sample mean becomes bunched more tightly around the expectation—this is the law of large numbers—and the distribution of the sample mean approaches the shape of a normal distribution—this is the central limit theorem. Finally, a model describing a linear relationship between two random variables is considered, and the properties of those two random variables are analyzed.
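Both theorems are easy to see by simulation. In the sketch below (illustrative, not the chapter's code), sample means from a skewed exponential distribution concentrate around the expectation as the sample size grows, with spread shrinking like 1/sqrt(n).

```python
import numpy as np

# Law of large numbers and central limit theorem in one picture's worth of
# numbers: means of larger samples from Exponential(1) (expectation 1)
# cluster more tightly around 1, with standard deviation ≈ 1/sqrt(n).

rng = np.random.default_rng(13)

for n in (5, 50, 500):
    means = rng.exponential(1.0, (20_000, n)).mean(axis=1)
    print(n, means.mean(), means.std(), 1 / np.sqrt(n))
```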
Nicholas J. J. Smith
- Published in print: 2010
- Published Online: May 2010
- ISBN: 9780199570386
- eISBN: 9780191722134
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780199570386.003.0029
- Subject: Philosophy, Logic/Philosophy of Mathematics, Metaphysics/Epistemology
A number of authors have noted that vagueness engenders degrees of belief, but that these degrees of belief do not behave like subjective probabilities. So should we countenance two different kinds of degree of belief: the kind arising from vagueness, and the familiar kind arising from uncertainty, which obey the laws of probability? This chapter argues that we cannot coherently countenance two different kinds of degree of belief. Instead, it presents a framework in which there is a single notion of degree of belief, which in certain circumstances behaves like a subjective probability assignment and in other circumstances does not. The core idea is that one's degree of belief in a proposition is one's expectation of its degree of truth.
Thomas P. Trappenberg
- Published in print: 2019
- Published Online: January 2020
- ISBN: 9780198828044
- eISBN: 9780191883873
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/oso/9780198828044.003.0006
- Subject: Neuroscience, Behavioral Neuroscience
The discussion provides a refresher of probability theory, in particular with respect to the formulations that build the theoretical language of modern machine learning. Probability theory is the formalism of random numbers, and this chapter outlines what these are and how they are characterized by probability density or probability mass functions. The chapter covers how such functions have traditionally been characterized, and reviews how to work with these mathematical objects, such as transforming density functions and measuring differences between density functions. Definitions and basic operations with multiple random variables, including Bayes' law, are covered. The chapter ends with an outline of some important approximation techniques, the so-called Monte Carlo methods.
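The simplest Monte Carlo method approximates an expectation by an average over random draws. A minimal sketch (the integrand is my choice, not the chapter's):

```python
import numpy as np

# Monte Carlo estimate of E[f(X)] by averaging f over random samples.
# Here f = cos and X ~ Normal(0, 1), for which the exact answer is
# E[cos(X)] = exp(-1/2).

rng = np.random.default_rng(17)
x = rng.normal(0.0, 1.0, 1_000_000)
print(np.cos(x).mean(), np.exp(-0.5))  # both ≈ 0.6065
```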
Melvin Lax, Wei Cai, and Min Xu
- Published in print: 2006
- Published Online: January 2010
- ISBN: 9780198567769
- eISBN: 9780191718359
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780198567769.003.0010
- Subject: Physics, Theoretical, Computational, and Statistical Physics
This chapter discusses the Langevin treatment of the Fokker–Planck process and diffusion. The form of the Langevin equation used differs from the stochastic differential equation based on Itô's calculus lemma. Transformations of this Langevin equation obey the rules of ordinary calculus, so they can be performed easily and some misleading results can be avoided. The difference between this approach and the one using Itô's lemma originates in the different definitions of the stochastic integral. The chapter also discusses drift velocity, an example with an exact solution, the use of the Langevin equation for a general random variable, the extension of the equation to the multidimensional case, and the means of products of random variables and noise sources.
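A discretized Langevin equation is straightforward to integrate. The sketch below (parameters and names are assumptions of this example, not the book's) checks that the stationary velocity variance matches the equipartition value kT/m.

```python
import numpy as np

# Langevin equation dv/dt = -gamma*v + sqrt(2*gamma*kT/m) * xi(t), integrated
# with a simple Euler scheme. The stationary variance is <v**2> = kT/m.

rng = np.random.default_rng(19)
gamma, kT_over_m = 2.0, 1.5
dt, steps, walkers = 1e-2, 5_000, 2_000

v = np.zeros(walkers)
for _ in range(steps):
    noise = rng.normal(0.0, 1.0, walkers)
    v += -gamma * v * dt + np.sqrt(2.0 * gamma * kT_over_m * dt) * noise

print(v.var())  # ≈ kT/m = 1.5 (up to small discretization bias)
```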
Michael R. Powers
- Published in print: 2014
- Published Online: November 2015
- ISBN: 9780231153676
- eISBN: 9780231527057
- Item type: chapter
- Publisher: Columbia University Press
- DOI: 10.7312/columbia/9780231153676.003.0002
- Subject: Economics and Finance, Development, Growth, and Environmental
Every manifestation of risk is associated with one or more unknown quantities. For instance, in the case of the inevitable death of an individual, it is primarily the time of death that is unknown. In the case of damage to a building or other piece of property, it is a combination of the incidence of damage (i.e. whether or not it occurs), along with both the timing and the amount of damage. This chapter discusses the modeling of an unknown quantity using random variables and probability functions. It explains the two principal pairings of probability interpretations and estimation methods: (1) the frequency/classical approach, often called frequentism, which requires both the possibility of repeated trials and a large number of actual repeated trials; and (2) the subjective/judgmental approach, often called Bayesianism, which works for unique trials and any number of repeated trials. Of these two, the latter approach is more flexible.
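The contrast between the two pairings can be made concrete with a toy estimate of a loss probability; the numbers and the choice of prior below are illustrative, not the chapter's.

```python
# Estimating an unknown loss probability p from n observed trials:
# the frequentist point estimate is the observed frequency k/n, which
# presupposes repeated trials; a Bayesian posterior mean under a uniform
# Beta(1, 1) prior is (k + 1)/(n + 2), which is defined even for one trial.

k, n = 3, 10                              # 3 losses observed in 10 trials
freq_estimate = k / n                     # frequency/classical approach
bayes_posterior_mean = (k + 1) / (n + 2)  # subjective/judgmental approach

print(freq_estimate, bayes_posterior_mean)  # 0.3 vs ≈ 0.417
```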