N. Thompson Hobbs and Mevin B. Hooten
- Published in print:
- 2015
- Published Online:
- October 2017
- ISBN:
- 9780691159287
- eISBN:
- 9781400866557
- Item type:
- chapter
- Publisher:
- Princeton University Press
- DOI:
- 10.23943/princeton/9780691159287.003.0003
- Subject:
- Biology, Ecology
This chapter describes the rules of probability as well as probability distributions. Because models are inherently, deliberately approximate, there comes a need to understand the approximation inherent in models in terms of uncertainty. Thus, equipped with a proper understanding of the principles of probability, ecologists can analyze the particular research problem at hand regardless of its idiosyncrasies. These analyses extend logically from first principles rather than from a particular statistical recipe. The chapter starts with the definition of probability and develops a logical progression of concepts extending from it to a fully specified and implemented Bayesian analysis appropriate for a broad range of research problems in ecology.
M. Vidyasagar
- Published in print:
- 2014
- Published Online:
- October 2017
- ISBN:
- 9780691133157
- eISBN:
- 9781400850518
- Item type:
- chapter
- Publisher:
- Princeton University Press
- DOI:
- 10.23943/princeton/9780691133157.003.0001
- Subject:
- Mathematics, Probability / Statistics
This chapter provides an introduction to probability and random variables. Probability theory is an attempt to formalize the notion of uncertainty in the outcome of an experiment. For instance, suppose an urn contains four balls, colored red, blue, white, and green. Suppose we dip our hand in the urn and pull out one of the balls “at random.” What is the likelihood that the ball we pull out will be red? The chapter first defines a random variable and probability before discussing the function of a random variable and expected value. It then considers total variation distance, joint and marginal probability distributions, independence and conditional probability distributions, Bayes' rule, and maximum likelihood estimates. Finally, it describes random variables assuming infinitely many values, focusing on the Markov and Chebyshev inequalities, Hoeffding's inequality, Monte Carlo simulation, and Cramér's theorem.
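The urn example above admits a direct numerical check. As a sketch (the function name and setup are illustrative, not from the chapter), a Monte Carlo estimate of the probability of drawing the red ball should approach the exact value 1/4:

```python
import random

def estimate_prob_red(n_draws, seed=0):
    """Estimate P(red) by drawing one ball uniformly at random n_draws times."""
    rng = random.Random(seed)
    balls = ["red", "blue", "white", "green"]
    hits = sum(1 for _ in range(n_draws) if rng.choice(balls) == "red")
    return hits / n_draws

print(estimate_prob_red(100_000))  # close to the exact value 0.25
```

With 100,000 draws the sampling error is on the order of a tenth of a percent, so the estimate sits very near 1/4.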
Steven Sloman
- Published in print:
- 2005
- Published Online:
- January 2007
- ISBN:
- 9780195183115
- eISBN:
- 9780199870950
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780195183115.003.0004
- Subject:
- Philosophy, Philosophy of Mind
According to human perception, the world is full of causal systems composed of autonomous mechanisms that generate events as effects of other events. This chapter attempts to make this idea more precise, and therefore clearer, by making it more formal. It introduces the causal model framework as an abstract language for representing causal systems. It discusses the three parts of a causal model: the causal system in the world (i.e., the system being represented); the probability distribution that describes how likely events are to happen and how likely they are to occur with other events — how certain we can be about each event and combination of events; and a graph that depicts the causal relations in the system.
M. Vidyasagar
- Published in print:
- 2014
- Published Online:
- October 2017
- ISBN:
- 9780691133157
- eISBN:
- 9781400850518
- Item type:
- chapter
- Publisher:
- Princeton University Press
- DOI:
- 10.23943/princeton/9780691133157.003.0002
- Subject:
- Mathematics, Probability / Statistics
This chapter provides an introduction to some elementary aspects of information theory, including entropy in its various forms. Entropy refers to the level of uncertainty associated with a random variable (or more precisely, the probability distribution of the random variable). When there are two or more random variables, it is worthwhile to study the conditional entropy of one random variable with respect to another. The last concept is relative entropy, also known as the Kullback–Leibler divergence, which measures the “disparity” between two probability distributions. The chapter first considers convex and concave functions before discussing the properties of the entropy function, conditional entropy, uniqueness of the entropy function, and the Kullback–Leibler divergence.
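The quantities described above admit a compact numerical illustration. As a sketch (the example distributions are illustrative, not from the chapter), entropy is maximal for a uniform distribution, and the Kullback–Leibler divergence is positive between distinct distributions and zero between identical ones:

```python
import math

def entropy(p):
    """Shannon entropy H(p) = -sum p_i log2 p_i, in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q); assumes q_i > 0 wherever p_i > 0."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

uniform = [0.25] * 4
skewed = [0.7, 0.1, 0.1, 0.1]
print(entropy(uniform))                 # 2.0 bits: maximal over 4 outcomes
print(entropy(skewed))                  # lower: less uncertainty
print(kl_divergence(skewed, uniform))   # positive: the distributions differ
print(kl_divergence(skewed, skewed))    # 0.0: no "disparity" with itself
```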
Cascos Ignacio
- Published in print:
- 2009
- Published Online:
- February 2010
- ISBN:
- 9780199232574
- eISBN:
- 9780191716393
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199232574.003.0012
- Subject:
- Mathematics, Geometry / Topology
This chapter presents several ways to measure the degree of centrality of a point with respect to a multivariate probability distribution or a data cloud. Such degree of centrality is called depth, and it can be used to extend a wide range of univariate techniques that are based on the natural order on the real line to the multivariate setting.
M. Vidyasagar
- Published in print:
- 2014
- Published Online:
- October 2017
- ISBN:
- 9780691133157
- eISBN:
- 9781400850518
- Item type:
- chapter
- Publisher:
- Princeton University Press
- DOI:
- 10.23943/princeton/9780691133157.003.0004
- Subject:
- Mathematics, Probability / Statistics
This chapter deals with Markov processes. It first defines the “Markov property” and shows that all the relevant information about a Markov process assuming values in a finite set of cardinality n can be captured by a nonnegative n × n matrix known as the state transition matrix, together with an n-dimensional probability distribution of the initial state. It then invokes the results of the previous chapter on nonnegative matrices to analyze the temporal evolution of Markov processes. It also discusses estimation of the state transition matrix and considers the dynamics of stationary Markov chains, recurrent and transient states, hitting probabilities and mean hitting times, and the ergodicity of Markov chains.
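The temporal evolution described above can be sketched for a toy two-state chain (the transition matrix below is illustrative, not from the chapter): iterating the distribution against the state transition matrix drives it toward the stationary distribution.

```python
def step(dist, P):
    """One step of a Markov chain: new_dist[j] = sum_i dist[i] * P[i][j]."""
    n = len(dist)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

# A 2-state chain; each row of the state transition matrix sums to 1.
P = [[0.9, 0.1],
     [0.5, 0.5]]
dist = [1.0, 0.0]          # initial state distribution: start in state 0
for _ in range(50):
    dist = step(dist, P)
print(dist)  # approaches the stationary distribution [5/6, 1/6]
```

The stationary distribution solves pi = pi P; for this matrix that gives pi = (5/6, 1/6), and the iterates converge to it geometrically.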
N. Thompson Hobbs and Mevin B. Hooten
- Published in print:
- 2015
- Published Online:
- October 2017
- ISBN:
- 9780691159287
- eISBN:
- 9781400866557
- Item type:
- chapter
- Publisher:
- Princeton University Press
- DOI:
- 10.23943/princeton/9780691159287.003.0004
- Subject:
- Biology, Ecology
This chapter is an overview of likelihood and maximum likelihood. Likelihood forms the fundamental link between models and data in the Bayesian framework. In addition, maximum likelihood is a widely used alternative to Bayesian methods for estimating parameters in ecological models. Though it is possible to learn Bayesian modeling with a bare-bones treatment of likelihood, the chapter emphasizes the importance of this concept in Bayesian analysis. A significant aspect of likelihood within the Bayesian framework can be found in the similarities and differences between Bayesian analysis and analysis based on maximum likelihood. In addition, the chapter also considers the relationship between a probability distribution and a likelihood function.
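The relationship between a probability distribution and a likelihood function can be sketched with a binomial example (illustrative, not from the chapter): the same formula is a probability distribution over outcomes when the parameter is fixed, and a likelihood function over parameters when the data are fixed. Maximizing the latter yields the maximum likelihood estimate.

```python
from math import comb

def binom_pmf(k, n, p):
    """P(k successes in n trials) -- a distribution over k for fixed p."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Read as a likelihood: fix the data (k=7 successes in n=10 trials), vary p.
k, n = 7, 10
grid = [i / 100 for i in range(1, 100)]
mle = max(grid, key=lambda p: binom_pmf(k, n, p))
print(mle)  # 0.7 -- the maximum likelihood estimate k/n
```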
Gary E. Bowman
- Published in print:
- 2007
- Published Online:
- January 2008
- ISBN:
- 9780199228928
- eISBN:
- 9780191711206
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199228928.003.0003
- Subject:
- Physics, Condensed Matter Physics / Materials
This chapter first discusses statistical techniques and concepts — including probability, probability distribution, average and expectation value, and uncertainty — that are relevant to quantum mechanics. The physical interpretation of probability in quantum mechanics is then discussed. It is argued that by adopting the statistical interpretation, one can think clearly about quantum mechanics without committing to any of the various physical interpretations, and without tackling the conceptual hurdles that they present. The history and substance of interpretational issues in quantum mechanics is then briefly discussed, the point being to introduce these profound, unresolved questions without taking sides in ongoing scientific debates. The chapter closes with a personal reflection on Albert Einstein, the most revered and fascinating figure in physics, and the one most famously at odds with the conventional interpretation of quantum mechanics.
Melvin Lax, Wei Cai, and Min Xu
- Published in print:
- 2006
- Published Online:
- January 2010
- ISBN:
- 9780198567769
- eISBN:
- 9780191718359
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780198567769.003.0002
- Subject:
- Physics, Theoretical, Computational, and Statistical Physics
A random or stochastic process is a random variable that evolves in time by some random mechanism (of course, the time variable can be replaced by a space variable, or some other variable, in application). The variable can have a discrete set of values at a given time, or a continuum of values may be available. Likewise, the time variable can be discrete or continuous. A stochastic process is regarded as completely described if the probability distribution is known for all possible sets of times. A stationary process is one which has no absolute time origin. All probabilities are independent of a shift in the origin of time. This chapter discusses multitime probability description, conditional probabilities, stationary, Gaussian, and Markovian processes, and the Chapman–Kolmogorov condition.
M. Vidyasagar
- Published in print:
- 2014
- Published Online:
- October 2017
- ISBN:
- 9780691133157
- eISBN:
- 9781400850518
- Item type:
- chapter
- Publisher:
- Princeton University Press
- DOI:
- 10.23943/princeton/9780691133157.003.0005
- Subject:
- Mathematics, Probability / Statistics
This chapter provides an introduction to large deviation theory. It begins with an overview of the motivation for the problem under study, focusing on probability distributions and how to construct an empirical distribution. It then considers the notion of a lower semi-continuous function and that of a lower semi-continuous relaxation before discussing the large deviation property for i.i.d. samples. In particular, it describes Sanov's theorem for a finite alphabet and proceeds by analyzing the large deviation property for Markov chains, taking into account stationary distributions, entropy and relative entropy rates, the rate function for doubleton frequencies, and the rate function for singleton frequencies.
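The construction of an empirical distribution mentioned above can be sketched as follows (the alphabet and weights are illustrative, not from the chapter). The relative entropy of the empirical distribution to the true one shrinks as the sample grows, and it is exactly this quantity that appears as the rate in Sanov's theorem:

```python
import math
import random

def empirical_distribution(samples, alphabet):
    """Fraction of samples equal to each letter of a finite alphabet."""
    n = len(samples)
    return [samples.count(a) / n for a in alphabet]

rng = random.Random(1)
alphabet = ["a", "b", "c"]
true_p = [0.5, 0.3, 0.2]
samples = rng.choices(alphabet, weights=true_p, k=20_000)
emp = empirical_distribution(samples, alphabet)
# D(emp || true_p): small for large samples; for a far-off empirical
# distribution it is the exponent governing how unlikely that outcome is.
d = sum(e * math.log(e / p) for e, p in zip(emp, true_p) if e > 0)
print(emp, d)
```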
STEPHEN M. BARNETT and PAUL M. RADMORE
- Published in print:
- 2002
- Published Online:
- February 2010
- ISBN:
- 9780198563617
- eISBN:
- 9780191714245
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780198563617.003.0004
- Subject:
- Physics, Theoretical, Computational, and Statistical Physics
This chapter considers methods for describing the quantum statistics of a single mode of the electromagnetic field. Some of these methods may be extended to multimode fields, and this is discussed in the last section of the chapter. The moment generating function is developed for studying the photon number statistics of a single field mode. The quantum properties of optical phase are described using the optical phase operator. The characteristic functions and quasi-probability distributions provide a complete statistical description of the field. These rely on the properties of the coherent states and the Glauber displacement operator.
Michael P. Clements
- Published in print:
- 2009
- Published Online:
- September 2009
- ISBN:
- 9780199237197
- eISBN:
- 9780191717314
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199237197.003.0008
- Subject:
- Economics and Finance, Econometrics
This chapter asks whether the different types of forecasts made by individual survey respondents are mutually consistent, using the SPF survey data. It compares the point forecasts and central tendencies of probability distributions matched by individual respondent, and compares the forecast probabilities of declines in output with the probabilities implied by the probability distributions. When the expected associations between these different types of forecasts do not hold for some individuals, the chapter considers whether the discrepancies observed are consistent with rational behaviour by agents with asymmetric loss functions.
Stephen J. Blundell and Katherine M. Blundell
- Published in print:
- 2009
- Published Online:
- January 2010
- ISBN:
- 9780199562091
- eISBN:
- 9780191718236
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199562091.003.0003
- Subject:
- Physics, Particle Physics / Astrophysics / Cosmology
This chapter defines some basic concepts in probability theory. It begins by stating that the probability of occurrence of a particular event, taken from a finite set of possible events, is zero if that event is impossible, is one if that event is certain, and takes a value somewhere in between zero and one if that event is possible but not certain. It considers two different types of probability distribution: discrete and continuous. Variance, linear transformation, independent variables, and binomial distribution are also discussed.
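The discrete case can be illustrated with the binomial distribution mentioned above. A minimal numerical check (the parameters are illustrative, not from the chapter) confirms that the probabilities sum to one and that the mean and variance equal np and np(1 - p):

```python
from math import comb

n, p = 10, 0.3
# Binomial probability mass function over k = 0, ..., n.
pmf = [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]

total = sum(pmf)                                           # sums to 1
mean = sum(k * pmf[k] for k in range(n + 1))               # n*p = 3.0
var = sum((k - mean) ** 2 * pmf[k] for k in range(n + 1))  # n*p*(1-p) = 2.1
print(total, mean, var)
```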
Peter Coles
- Published in print:
- 2006
- Published Online:
- January 2010
- ISBN:
- 9780198567622
- eISBN:
- 9780191718250
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780198567622.003.0004
- Subject:
- Physics, Theoretical, Computational, and Statistical Physics
There are two competing interpretations of probability. One is the view that probabilities should be interpreted as frequencies in some large ensemble of repeated experiments under identical conditions. The general term given to this interpretation of probability is frequentist, and it is favoured by experimental scientists and observational astronomers. The principal alternative to frequentism is the Bayesian interpretation, represented by Bayes' theorem. On this view, probability theory becomes not a branch of experimental science but a branch of logic, and there is no other way to reason consistently in the face of uncertainty. The maximum entropy principle involves the assignment of a measure of the lack of information contained in a probability distribution.
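The maximum entropy principle mentioned above can be illustrated numerically: with no constraints beyond normalization, the uniform distribution maximizes entropy. A small sketch (the four-outcome example is illustrative, not from the chapter) compares the uniform distribution against randomly generated ones:

```python
import math
import random

def entropy(p):
    """Shannon entropy in nats: -sum p_i ln p_i."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

rng = random.Random(0)
uniform = [0.25] * 4
for _ in range(1000):
    w = [rng.random() for _ in range(4)]
    q = [wi / sum(w) for wi in w]            # a random normalized distribution
    assert entropy(q) <= entropy(uniform) + 1e-12
print(entropy(uniform))  # ln(4): the maximum over four outcomes
```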
J. C. Gower and G. B. Dijksterhuis
- Published in print:
- 2004
- Published Online:
- September 2007
- ISBN:
- 9780198510581
- eISBN:
- 9780191708961
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780198510581.003.0012
- Subject:
- Mathematics, Probability / Statistics
This chapter discusses what little is known about probability models for Procrustes problems. For most practical purposes, it relies on data sampling procedures like permutation tests, jack-knifing, or boot-strapping.
Heinz-Peter Breuer and Francesco Petruccione
- Published in print:
- 2007
- Published Online:
- February 2010
- ISBN:
- 9780199213900
- eISBN:
- 9780191706349
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199213900.003.05
- Subject:
- Physics, Theoretical, Computational, and Statistical Physics
This chapter characterizes the statistical properties of a quantum mechanical ensemble in terms of a density matrix. However, if selective measurements of one or several observables are carried out on the ensemble, it will split into a number of sub-ensembles, each sub-ensemble being conditioned on a particular outcome of the measurements. The mathematical description of the collection of sub-ensembles thus created leads to probability distributions on projective Hilbert space. The chapter develops an appropriate mathematical framework which enables the general formulation of such a distribution, and leads to the concepts of stochastic state vectors and stochastic density matrices. These concepts are required in later chapters to construct appropriate stochastic differential equations describing the continuous monitoring of open quantum systems.
Gary E. Bowman
- Published in print:
- 2007
- Published Online:
- January 2008
- ISBN:
- 9780199228928
- eISBN:
- 9780191711206
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199228928.003.0007
- Subject:
- Physics, Condensed Matter Physics / Materials
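The abstract's point that the commutator of two operators is generally another operator (not a number) can be checked concretely with finite-dimensional operators. The sketch below is illustrative only, not drawn from the book; it uses the Pauli matrices as a standard example.

```python
import numpy as np

# Pauli matrices: concrete 2x2 operators on a spin-1/2 state space
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

def commutator(A, B):
    """[A, B] = AB - BA."""
    return A @ B - B @ A

# [sx, sy] = 2i*sz: the result is itself another operator,
# exactly as the chapter emphasizes.
C = commutator(sx, sy)
print(np.allclose(C, 2j * sz))  # True
```

Because the commutator here is proportional to sz rather than to the identity, the right-hand side of the general uncertainty relation depends on the state in which it is evaluated, which is the state-dependence the abstract mentions.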
Gidon Eshel
- Published in print:
- 2011
- Published Online:
- October 2017
- ISBN:
- 9780691128917
- eISBN:
- 9781400840632
- Item type:
- chapter
- Publisher:
- Princeton University Press
- DOI:
- 10.23943/princeton/9780691128917.003.0007
- Subject:
- Environmental Science, Environmental Studies
This chapter focuses on the relevance of statistics in deterministic science. While most physical phenomena of concern to natural sciences are governed by fundamental, mostly known, physics, their application to such complex systems as the ocean, atmosphere, or ecosystems is monumentally difficult. For most such systems, the full problems—the values of all relevant variables at all space and time locations—are essentially intractable, even with the fastest computers. Hence, there is always more to the focus of inquiry that cannot be modeled; we must somehow fill in the gaps. This is where statistics come in. Until the state of the physical system under investigation is fully quantified (i.e., until the value of every dynamical variable is perfectly known at every point in space and time), there is a certain amount of indeterminacy in every statement made about the state of the system. The remainder of the chapter discusses probability distributions and degrees of freedom.
Lawrence Leemis
- Published in print:
- 2015
- Published Online:
- October 2017
- ISBN:
- 9780691147611
- eISBN:
- 9781400866595
- Item type:
- chapter
- Publisher:
- Princeton University Press
- DOI:
- 10.23943/princeton/9780691147611.003.0004
- Subject:
- Mathematics, Probability / Statistics
This chapter switches from the traditional analysis of Benford's law using data sets to a search for probability distributions that obey Benford's law. It begins by briefly discussing the origins of Benford's law through the independent efforts of Simon Newcomb (1835–1909) and Frank Benford, Jr. (1883–1948), both of whom made their discoveries through empirical data. Although Benford's law applies to a wide variety of data sets, none of the popular parametric distributions, such as the exponential and normal distributions, agree exactly with Benford's law. The chapter thus highlights the failures of several of these well-known probability distributions in conforming to Benford's law, considers what types of probability distributions might produce data that obey Benford's law, and looks at some of the geometry associated with these probability distributions.
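The abstract's claim that common parametric distributions agree only approximately with Benford's law can be probed numerically. The sketch below (illustrative, not from the book; the unit-scale exponential is an arbitrary choice) compares the empirical first-digit frequencies of an exponential sample against the Benford probabilities log10(1 + 1/d).

```python
import numpy as np

rng = np.random.default_rng(0)

# Benford's law: P(first significant digit = d) = log10(1 + 1/d)
benford = np.log10(1 + 1 / np.arange(1, 10))

# Empirical first-digit frequencies for an exponential(1) sample
x = rng.exponential(scale=1.0, size=100_000)
mantissa = x / 10.0 ** np.floor(np.log10(x))  # in [1, 10)
first_digit = mantissa.astype(int)            # leading digit 1..9
freq = np.bincount(first_digit, minlength=10)[1:] / x.size

# Close, but not exact: the exponential only approximates Benford's law
print(np.abs(freq - benford).max())
```

The maximum discrepancy is small but does not vanish as the sample grows, which is the sense in which the exponential fails to conform exactly to Benford's law.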
Therese M. Donovan and Ruth M. Mickey
- Published in print:
- 2019
- Published Online:
- July 2019
- ISBN:
- 9780198841296
- eISBN:
- 9780191876820
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/oso/9780198841296.003.0001
- Subject:
- Biology, Biomathematics / Statistics and Data Analysis / Complexity Studies
In this chapter, the concept of probability is introduced. The rolling of a die is an example of a random process: the face that comes up is subject to chance. In probability, the goal is to quantify such a random process. That is, we want to assign a number to it. This chapter introduces some basic terms used in the study of probability; by the end of the chapter, the reader will be able to define the following terms: sample space, outcome, discrete outcome, event, probability, probability distribution, trial, empirical distribution, and Law of Large Numbers. Using an example, the chapter focuses on a single characteristic and introduces basic vocabulary associated with probability.
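The die-rolling example and the Law of Large Numbers mentioned in the abstract can be sketched in a few lines. This simulation is illustrative only (not taken from the book); it builds the empirical distribution of a fair die over increasing numbers of trials and shows the frequencies settling toward the probability distribution 1/6.

```python
import numpy as np

rng = np.random.default_rng(1)

# Sample space: the six faces; probability distribution: 1/6 each
faces = np.arange(1, 7)
p = np.full(6, 1 / 6)

# Empirical distribution from repeated trials; by the Law of Large
# Numbers the observed frequencies approach 1/6 as trials grow.
for n in (60, 6_000, 600_000):
    rolls = rng.integers(1, 7, size=n)       # n independent trials
    empirical = np.bincount(rolls, minlength=7)[1:] / n
    print(n, np.abs(empirical - p).max())
```

Each trial is one roll; the printed maximum deviation between the empirical and true distributions shrinks as n increases, which is the Law of Large Numbers at work.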