Richard Swinburne (ed.)
- Published in print:
- 2005
- Published Online:
- January 2012
- ISBN:
- 9780197263419
- eISBN:
- 9780191734175
- Item type:
- book
- Publisher:
- British Academy
- DOI:
- 10.5871/bacad/9780197263419.001.0001
- Subject:
- Philosophy, Logic/Philosophy of Mathematics
Bayes' theorem is a tool for assessing how probable evidence makes some hypothesis. The papers in this book consider the worth and applicability of the theorem. The book sets out the philosophical issues: Elliott Sober argues that there are other criteria for assessing hypotheses; Colin Howson, Philip Dawid, and John Earman consider how the theorem can be used in statistical science, in weighing evidence in criminal trials, and in assessing evidence for the occurrence of miracles; and David Miller argues for the worth of the probability calculus as a tool for measuring propensities in nature rather than the strength of evidence. The book ends with the original paper containing the theorem, presented to the Royal Society in 1763.
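For orientation, the theorem at the centre of the collection can be stated in one line: for a hypothesis $H$ and evidence $E$,
$$P(H \mid E) \;=\; \frac{P(E \mid H)\,P(H)}{P(E)} \;=\; \frac{P(E \mid H)\,P(H)}{P(E \mid H)\,P(H) + P(E \mid \neg H)\,P(\neg H)},$$
so the support the evidence lends a hypothesis depends both on the prior $P(H)$ and on how much better $H$ predicts $E$ than its negation does.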
Bernard Gert, Charles M. Culver, and K. Danner Clouser
- Published in print:
- 2006
- Published Online:
- September 2006
- ISBN:
- 9780195159066
- eISBN:
- 9780199786466
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/0195159063.003.0008
- Subject:
- Philosophy, Moral Philosophy
This chapter shows the importance of recognizing the probabilistic nature of medical diagnosis and treatment. It discusses the possibly serious effects of physicians not understanding Bayes' theorem and hence not appreciating the importance of knowing the prevalence of a disorder in the population to be treated or screened. It shows the importance of doctors knowing about volume-outcome studies, geographical variation studies, and practice guidelines.
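The prevalence point lends itself to a one-line application of Bayes' theorem. Below is a minimal sketch (the function name and all figures are illustrative assumptions, not taken from the chapter) showing that even a reasonably accurate screening test yields mostly false positives when the disorder is rare.

```python
# Illustrative only: hypothetical prevalence, sensitivity, and specificity.
def positive_predictive_value(prevalence, sensitivity, specificity):
    """P(disease | positive test), computed by Bayes' theorem."""
    true_pos = sensitivity * prevalence                  # P(positive and disease)
    false_pos = (1 - specificity) * (1 - prevalence)     # P(positive and no disease)
    return true_pos / (true_pos + false_pos)

# A disorder with 1% prevalence, screened at 90% sensitivity / 95% specificity:
print(f"{positive_predictive_value(0.01, 0.90, 0.95):.1%}")  # about 15.4%
```

On these assumed figures only about one positive screen in seven reflects actual disease; neglecting the 1% base rate is precisely the error the chapter attributes to physicians who do not understand the theorem.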
Richard Swinburne
- Published in print:
- 2003
- Published Online:
- November 2003
- ISBN:
- 9780199257461
- eISBN:
- 9780191598616
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/0199257469.003.0014
- Subject:
- Philosophy, Philosophy of Religion
Summarizes the argument of the book. Given only a moderate amount of evidence from natural theology in favour of the existence of God and his having reason to become incarnate among humans, there is far more evidence that Jesus led the sort of life that God Incarnate would lead, and that that life culminated in a super‐miracle, than there is for any other prophet in human history. In consequence, the overall balance of evidence in favour of the Resurrection having occurred is very strong. This is elucidated in the formalism of the probability calculus by Bayes's theorem.
José M. Bernardo, M. J. Bayarri, James O. Berger, A. P. Dawid, David Heckerman, Adrian F. M. Smith, and Mike West (eds)
- Published in print:
- 2011
- Published Online:
- January 2012
- ISBN:
- 9780199694587
- eISBN:
- 9780191731921
- Item type:
- book
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199694587.001.0001
- Subject:
- Mathematics, Probability / Statistics
The Valencia International Meetings on Bayesian Statistics – established in 1979 and held every four years – have been the forum for a definitive overview of current concerns and activities in Bayesian statistics. These are the edited Proceedings of the Ninth meeting, and contain the invited papers, each followed by its discussion and a rejoinder by the author(s). In the tradition of the earlier editions, this volume encompasses an enormous range of theoretical and applied research, highlighting the breadth, vitality, and impact of Bayesian thinking in interdisciplinary research across many fields, as well as the corresponding growth and vitality of core theory and methodology. The Valencia 9 invited papers cover a broad range of topics, including foundational and core theoretical issues in statistics, the continued development of new and refined computational methods for complex Bayesian modelling, substantive applications of flexible Bayesian modelling, and new developments in the theory and methodology of graphical modelling. They also describe advances in methodology for specific applied fields, including financial econometrics and portfolio decision making, public policy applications for drug surveillance, studies in the physical and environmental sciences, astronomy and astrophysics, climate change studies, molecular biosciences, statistical genetics, and stochastic dynamic networks in systems biology.
Christopher G. Small and Jinfang Wang
- Published in print:
- 2003
- Published Online:
- September 2007
- ISBN:
- 9780198506881
- eISBN:
- 9780191709258
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780198506881.003.0008
- Subject:
- Mathematics, Probability / Statistics
This chapter demonstrates that the numerical methods of earlier chapters are not constrained by statistical philosophy. The theory of Bayesian estimating functions is developed. It is shown that this theory has Bayesian analogues for many of the concepts introduced in earlier chapters. While point estimation is often considered of secondary importance to Bayesians, the Bayesian estimating function methodology does have important applications in areas such as actuarial science.
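For readers unfamiliar with the term (a gloss, not the chapter's own formulation): an estimating function is a function $g(x, \theta)$ of data and parameter that is unbiased in the sense that $\mathrm{E}_\theta[g(X, \theta)] = 0$ for all $\theta$, the estimate $\hat\theta$ being a root of $g(x, \hat\theta) = 0$; the score $\partial \log f(x; \theta)/\partial\theta$ is the canonical example. The Bayesian theory developed in the chapter supplies analogues of this unbiasedness requirement for the case in which $\theta$ itself carries a prior distribution.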
Colin Howson
- Published in print:
- 2000
- Published Online:
- November 2003
- ISBN:
- 9780198250371
- eISBN:
- 9780191597749
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/0198250371.003.0012
- Subject:
- Philosophy, Philosophy of Science
The coda discusses Hume's famous account of miracles in the Enquiry, and his criterion for belief in miracles, and shows that it amounts to a simple argument in the probability calculus.
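One standard probabilistic rendering of Hume's maxim (offered for orientation; the coda's own reconstruction may differ in detail): testimony $T$ makes a miracle $M$ more likely than not exactly when $P(M \mid T) > P(\neg M \mid T)$, which by Bayes' theorem holds iff $P(T \mid M)\,P(M) > P(T \mid \neg M)\,P(\neg M)$: the joint occurrence of the testimony with no miracle must be less probable than its occurrence with one.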
Robert E. Goodin
- Published in print:
- 2003
- Published Online:
- November 2003
- ISBN:
- 9780199256174
- eISBN:
- 9780191599354
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/0199256179.003.0006
- Subject:
- Political Science, Political Theory
Shows how Bayesian thinking should make democratic outcomes rationally compelling. Bayes's formula provides a mathematical expression for specifying exactly how we ought rationally to update our a priori beliefs in light of subsequent evidence, and the proposal is that voters are modelled in like fashion: votes, let us suppose, constitute (among other things) ‘reports’ of the voter's experiences and perceptions; further suppose that voters accord ‘evidentiary value’ to the reports they receive from one another through those votes; and further suppose that voters are rational, and that part and parcel of their being rational is being prepared to revise their opinions in light of further evidence (including evidence emanating from one another's votes‐cum‐reports). In this process, each of us treats our own experiences and perceptions as one source of evidence, and regards our own report as right; in that sense, we are perfectly sincere when we vote in a particular way, although we also acknowledge that our own experiences and perspectives are particular and peculiar, and hence our own perceptions are themselves inconclusive; because of that, voters striving to behave rationally should sincerely want to adjust their a priori beliefs in the light of all other experiences and perceptions that are reported at an election. Bayesian updating of that sort may well lead people who started out believing (and voting) one way to end up believing (and genuinely wanting implemented) the opposite way, just so long as sufficiently many votes‐cum‐reports point in that different direction; in other words, Bayesian reasoning can, and in politically typical cases ought to, provide people with a compelling reason to accede to the majority verdict. In this way, Bayesianism ‘rationalizes’ majority rule in a pretty strong sense; indeed, if anything, it underwrites majoritarianism too strongly.
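A toy numerical model makes the updating story concrete. The sketch below is a minimal rendering of the vote-as-report idea, not Goodin's own formal apparatus: each vote is treated as an independent report that is correct with probability q > 1/2, and a voter updates by Bayes's formula.

```python
# Toy model (assumptions mine): each vote-cum-report is independently
# correct with probability q; update P(A is the better outcome) by Bayes.
def posterior_after_votes(prior_a, votes_a, votes_b, q=0.6):
    like_a = q ** votes_a * (1 - q) ** votes_b   # P(observed votes | A better)
    like_b = q ** votes_b * (1 - q) ** votes_a   # P(observed votes | B better)
    return like_a * prior_a / (like_a * prior_a + like_b * (1 - prior_a))

# A voter 80% sure of B (prior_a = 0.2) who then observes a 60-40 split for A:
print(posterior_after_votes(prior_a=0.2, votes_a=60, votes_b=40))  # ~0.999
```

The posterior swings almost completely to the majority view, which illustrates the chapter's closing worry: on this reasoning, Bayesianism underwrites majoritarianism too strongly.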
Ken Binmore
- Published in print:
- 2007
- Published Online:
- May 2007
- ISBN:
- 9780195300574
- eISBN:
- 9780199783748
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780195300574.003.0013
- Subject:
- Economics and Finance, Microeconomics
This chapter is about Bayesian decision theory. It explains why game theorists model players' beliefs using subjective probability distributions, and how these beliefs are updated using Bayes' rule as further information is received during the play of a game. A skeptical assessment of Bayesian decision theory as a solution to the general problem of scientific induction is then offered, suggesting that we stick to Leonard Savage's view that his theory properly applies only in the context of a small world. The chapter ends with a brief review of the common prior assumption and the idea of subjective equilibria.
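In the sequential setting described here, Bayes' rule is most transparent in odds form: for rival hypotheses $H_1$ and $H_2$ and a new observation $E$,
$$\frac{P(H_1 \mid E)}{P(H_2 \mid E)} = \frac{P(E \mid H_1)}{P(E \mid H_2)} \cdot \frac{P(H_1)}{P(H_2)},$$
so a player's posterior odds after each move are the prior odds multiplied by the likelihood ratio of what was just observed.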
Ken Binmore
- Published in print:
- 2007
- Published Online:
- May 2007
- ISBN:
- 9780195300574
- eISBN:
- 9780199783748
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780195300574.003.0015
- Subject:
- Economics and Finance, Microeconomics
This chapter describes John Harsanyi's theory of so-called games of incomplete information using Poker as a motivating example. The chapter begins by analyzing a simplified version of Von Neumann's second Poker model. The general theory of incomplete information is then described. Russian Roulette and Cournot Duopoly with incomplete information about costs are used as examples. Harsanyi's purification of mixed strategies is briefly described. The finitely repeated Prisoner's Dilemma, in which the number of repetitions is not common knowledge, is given as an example with incomplete information about the rules of a game.
Jie W Weiss and David J Weiss
- Published in print:
- 2008
- Published Online:
- January 2009
- ISBN:
- 9780195322989
- eISBN:
- 9780199869206
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780195322989.003.0010
- Subject:
- Psychology, Cognitive Psychology
Bayes Nets (BNs) and Influence Diagrams (IDs), new tools that use graphical user interfaces to facilitate representation of complex inference and decision structures, are the core elements of new computer technologies that make the 21st century the Century of Bayes. BNs are a way of representing a set of related uncertainties. They facilitate Bayesian inference by separating structural information from parameters. Hailfinder is a BN that predicts severe summer weather in Eastern Colorado. Its design led to a number of novel ideas about how to build such BNs. This chapter argues that psychologists should care about these new tools because the rapid development of BN and ID programs will create a market for skilled elicitors and for ways of teaching domain experts how to make appropriate judgments. Psychologists should care about these new tools if they would prefer not to be left out of that market. Psychologists should also care because these new tools constitute a new, normatively appropriate way of performing important intellectual tasks. Psychologists who want to explain how people perform intellectual tasks should not be indifferent to the development of tools that will help people to do well in three tasks that, if we are to believe a generation of researchers in cognitive psychology, they now do poorly: evaluation, inference, and decision.
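A toy example helps fix the "structure versus parameters" distinction (the two-node network below is for illustration only, not Hailfinder): the graph says which conditional probability tables exist, the tables supply the numbers, and inference is Bayes' theorem applied to the factored joint distribution.

```python
# Structure: Rain -> WetGrass.  Parameters: P(Rain) and the CPT P(WetGrass | Rain).
p_rain = 0.2
p_wet_given_rain = {True: 0.9, False: 0.1}

# Diagnostic query P(Rain | WetGrass = true), by enumerating the factored joint:
joint_rain = p_rain * p_wet_given_rain[True]            # P(Rain, Wet)
joint_no_rain = (1 - p_rain) * p_wet_given_rain[False]  # P(no Rain, Wet)
print(joint_rain / (joint_rain + joint_no_rain))        # ~0.69
```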
Jie W Weiss and David J Weiss
- Published in print:
- 2008
- Published Online:
- January 2009
- ISBN:
- 9780195322989
- eISBN:
- 9780199869206
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780195322989.003.0032
- Subject:
- Psychology, Cognitive Psychology
This chapter focuses on decision technology—the rules and tools that help us make wiser decisions. It begins by reviewing the three rules that are at the heart of most traditional decision technology: multiattribute utility, Bayes' theorem, and subjective expected utility maximization. A comprehensive nineteen-step model is presented to show how to make best use of all three rules. The remainder of the chapter explores recently developed tools of decision technology.
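For reference, the first and third rules have standard one-line forms: an additive multiattribute utility scores an option $x$ as $U(x) = \sum_i w_i\,u_i(x_i)$, with weights $w_i$ over attributes, and subjective expected utility maximization picks the act $a$ maximizing $\sum_s P(s)\,U(a, s)$ over possible states $s$; Bayes' theorem, the second rule, keeps the probabilities $P(s)$ updated as evidence arrives.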
Shoutir Kishore Chatterjee
- Published in print:
- 2003
- Published Online:
- September 2007
- ISBN:
- 9780198525318
- eISBN:
- 9780191711657
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780198525318.003.0007
- Subject:
- Mathematics, Probability / Statistics
Around the middle of the 18th century, Bayes conceived the idea of treating an unknown parameter as a subjective random variable distributed according to a prior, and inferring about it from its conditional (posterior) distribution given the observations. He considered the particular case of a binomial parameter subject to a uniform prior, and following the pro-subjective approach used the posterior to derive an interval estimate. Later, Laplace stated the result in its general form and employed it extensively for pro-subjective inference of various types in different situations, often basing his computation on the asymptotic normality of the posterior distribution. In a novel application, Laplace used pro-subjective reasoning and the data from a sample survey to estimate the size of the population of France.
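In modern notation, the particular case Bayes treated works out as follows: with $s$ successes in $n$ binomial trials and a uniform prior on the success probability $\theta$,
$$\pi(\theta \mid s) \propto \theta^{s}(1 - \theta)^{n - s}, \qquad \theta \mid s \sim \mathrm{Beta}(s + 1,\; n - s + 1),$$
so interval estimates can be read off the Beta posterior; its mean, $(s+1)/(n+2)$, is Laplace's rule of succession.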
Richard Swinburne
- Published in print:
- 2004
- Published Online:
- September 2007
- ISBN:
- 9780199271672
- eISBN:
- 9780191709357
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199271672.003.0004
- Subject:
- Philosophy, Philosophy of Religion
An explanatory hypothesis (whether of the personal or scientific kind) is probable in so far as it makes probable the occurrence of many observed phenomena, the occurrence of which is not probable otherwise; and in so far as it is simple, and fits with background knowledge. This account of the probability of a hypothesis is given precise form by Bayes's Theorem.
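The correspondence with the theorem is direct. Writing $e$ for the observed phenomena, $h$ for the hypothesis, and $k$ for background knowledge,
$$P(h \mid e \,\&\, k) = \frac{P(e \mid h \,\&\, k)\,P(h \mid k)}{P(e \mid k)},$$
where $P(e \mid h \,\&\, k)$ measures how strongly the hypothesis leads us to expect the phenomena, $P(e \mid k)$ their improbability otherwise, and the prior $P(h \mid k)$ reflects simplicity and fit with background knowledge.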
N. Thompson Hobbs and Mevin B. Hooten
- Published in print:
- 2015
- Published Online:
- October 2017
- ISBN:
- 9780691159287
- eISBN:
- 9781400866557
- Item type:
- chapter
- Publisher:
- Princeton University Press
- DOI:
- 10.23943/princeton/9780691159287.003.0005
- Subject:
- Biology, Ecology
This chapter lays out the basic principles of Bayesian inference, building on the concepts of probability developed in Chapter 3. It seeks to use the rules of probability to show how Bayes' theorem works, by making use of the conditional rule of probability and the law of total probability. The chapter begins with the central, underpinning tenet of the Bayesian view: the world can be divided into quantities that are observed and quantities that are unobserved. Unobserved quantities include parameters in models, latent states predicted by models, missing data, effect sizes, future states, and data before they are observed. We wish to learn about these quantities using observations. The Bayesian framework for achieving that understanding is applied in exactly the same way regardless of the specifics of the research problem at hand or the nature of the unobserved quantities.
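Spelled out, the two rules combine as follows: for unobserved quantities $\theta$ and observations $y$,
$$P(\theta \mid y) = \frac{P(y \mid \theta)\,P(\theta)}{P(y)}, \qquad P(y) = \int P(y \mid \theta)\,P(\theta)\,d\theta,$$
with the conditional rule of probability supplying the numerator and the law of total probability the denominator that normalizes it.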
Jon Williamson
- Published in print:
- 2010
- Published Online:
- September 2010
- ISBN:
- 9780199228003
- eISBN:
- 9780191711060
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199228003.003.0002
- Subject:
- Mathematics, Probability / Statistics, Logic / Computer Science / Mathematical Philosophy
This chapter reviews the various interpretations of probability (§2.2), assessing them in the light of certain desiderata for a philosophical theory of probability that are presented in §2.1. It is suggested that present‐day objective Bayesianism, which is characterized in §2.3, is close to early views of probability, such as those of Jakob Bernoulli and Thomas Bayes.
M. Vidyasagar
- Published in print:
- 2014
- Published Online:
- October 2017
- ISBN:
- 9780691133157
- eISBN:
- 9781400850518
- Item type:
- chapter
- Publisher:
- Princeton University Press
- DOI:
- 10.23943/princeton/9780691133157.003.0001
- Subject:
- Mathematics, Probability / Statistics
This chapter provides an introduction to probability and random variables. Probability theory is an attempt to formalize the notion of uncertainty in the outcome of an experiment. For instance, suppose an urn contains four balls, colored red, blue, white, and green respectively. Suppose we dip our hand in the urn and pull out one of the balls “at random.” What is the likelihood that the ball we pull out will be red? The chapter first defines a random variable and probability before discussing the function of a random variable and expected value. It then considers total variation distance, joint and marginal probability distributions, independence and conditional probability distributions, Bayes' rule, and maximum likelihood estimates. Finally, it describes random variables assuming infinitely many values, focusing on Markov and Chebycheff inequalities, Hoeffding's inequality, Monte Carlo simulation, and Cramér's theorem.
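As a sketch of how the chapter's opening urn example meets the Monte Carlo method it later introduces (the code is illustrative, not from the book): simulate the draw many times and compare the empirical frequency of red with the exact probability 1/4.

```python
import random

balls = ["red", "blue", "white", "green"]
n = 100_000
reds = sum(random.choice(balls) == "red" for _ in range(n))
print(reds / n)  # hovers near the exact value 0.25 for large n
```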
David Hodgson
- Published in print:
- 2012
- Published Online:
- May 2012
- ISBN:
- 9780199845309
- eISBN:
- 9780199932269
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199845309.003.0004
- Subject:
- Philosophy, Philosophy of Mind, General
In Chapter 3, I consider one very important aspect of our rationality, namely our ability to engage in plausible reasoning; that is, reasoning in which the premises or data do not entail the conclusions by virtue of applicable rules, but rather support them as a matter of reasonable albeit fallible judgment. I argue that even the scientific method depends on plausible reasoning, and that plausible reasoning cannot be fully explained in terms of rules for good reasoning. I discuss Bayes’ theorem, and its merits and limitations. I conclude by introducing the possibility that underlying plausible reasoning there are physical structures and algorithmic processes selected by evolution, and foreshadowing my contention that this cannot fully account for plausible reasoning.
Bas C. van Fraassen
- Published in print:
- 1989
- Published Online:
- November 2003
- ISBN:
- 9780198248606
- eISBN:
- 9780191597459
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/0198248601.003.0013
- Subject:
- Philosophy, Philosophy of Science
While it was argued earlier in the book that no rule‐governed notion of rational opinion change could be adequate, there are certainly patterns of normal opinion change (updating in response to new data or new constraints accepted in response to experience), which have a rule‐following form. The basic example is Simple Conditionalization (often characterized as the application of Bayes's rule or Bayes's theorem, sometimes called Bayesian Conditionalization, and sometimes accepted as the sole admissible form of opinion change), but more advanced patterns (beginning with Jeffrey Conditionalization) have been described in the literature, as well as challenged there, e.g. by Isaac Levi. The question of what can justify such rules is addressed using symmetry arguments, and the (hidden or explicit) premises of such arguments analysed. Probability kinematics, as formulated initially by Richard Jeffrey, is the general theory of rules for changing a (‘prior’) probability function, subject to given or imposed constraints, into a new (‘updated’, ‘posterior’) function. Such constraints can take various forms, and the rules offered for them can be limited by symmetry considerations but may not be uniquely determined.
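The two updating patterns named here have compact statements. Simple Conditionalization sets the new probability on learning evidence $E$ with certainty to $P_{\text{new}}(A) = P(A \mid E) = P(A \wedge E)/P(E)$; Jeffrey Conditionalization covers uncertain learning over a partition $\{E_i\}$ whose new probabilities are constrained to be $q_i$, setting $P_{\text{new}}(A) = \sum_i q_i\,P(A \mid E_i)$. The former is the special case in which one $q_i$ equals 1.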
Colin Howson
- Published in print:
- 2000
- Published Online:
- November 2003
- ISBN:
- 9780198250371
- eISBN:
- 9780191597749
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/0198250371.003.0009
- Subject:
- Philosophy, Philosophy of Science
Applies the results of Ch. 7 to scientific methodology and shows that they give a logical interpretation of the subjective Bayesian theory of inductive inference. This theory is therefore no more necessarily subjective than deductive logic, consisting as both do of objective logical rules for proceeding from premises to conclusion. In the Bayesian case, the premises are prior probability assignments. It is shown that familiar rules of scientific method are endorsed, and, in particular, the rule that unless there is prior support for a hypothesis, its overall probability will be very small however good the fit with current evidence.
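The closing rule is a one-line consequence of Bayes' theorem: since $P(e \mid h) \le 1$,
$$P(h \mid e) = \frac{P(e \mid h)\,P(h)}{P(e)} \le \frac{P(h)}{P(e)},$$
so however well the hypothesis fits the current evidence, a negligible prior $P(h)$ keeps the posterior small unless the evidence itself was extremely improbable.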
Guido Consonni and Luca La Rocca
- Published in print:
- 2011
- Published Online:
- January 2012
- ISBN:
- 9780199694587
- eISBN:
- 9780191731921
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199694587.003.0004
- Subject:
- Mathematics, Probability / Statistics
We propose a new method for the objective comparison of two nested models based on non‐local priors. More specifically, starting with a default prior under each of the two models, we construct a moment prior under the larger model, and then use the fractional Bayes factor for a comparison. Non‐local priors have been recently introduced to obtain a better separation between nested models, thus accelerating the learning behaviour, relative to currently used local priors, when the smaller model holds. Although the argument showing the superior performance of non‐local priors is asymptotic, the improvement they produce is already apparent for small to moderate sample sizes, which makes them a useful and practical tool. As a by‐product, it turns out that routinely used objective methods, such as ordinary fractional Bayes factors, are alarmingly slow in learning that the smaller model holds. On the downside, when the larger model holds, non‐local priors exhibit a weaker discriminatory power against sampling distributions close to the smaller model. However, this drawback becomes rapidly negligible as the sample size grows, because the learning rate of the Bayes factor under the larger model is exponentially fast, whether one uses local or non‐local priors. We apply our methodology to directed acyclic graph models having a Gaussian distribution. Because of the recursive nature of the joint density, and the assumption of global parameter independence embodied in our prior, calculations need only be performed for individual vertices admitting a distinct parent structure under the two graphs; additionally, we obtain closed‐form expressions as in the ordinary conjugate case. We provide illustrations of our method for a simple three‐variable case, as well as for a more elaborate seven‐variable situation. Although we concentrate on pairwise comparisons of nested models, our procedure can be implemented to carry out a search over the space of all models.
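For orientation, in definitions standard in this literature rather than specific to the paper: the Bayes factor for $M_1$ against $M_0$ is $B_{10} = m_1(y)/m_0(y)$, where $m_i(y) = \int f_i(y \mid \theta_i)\,\pi_i(\theta_i)\,d\theta_i$ is the marginal likelihood under model $i$; a prior on the larger model is non‐local for the comparison when its density vanishes on the smaller model, the moment prior $\pi_M(\theta) \propto (\theta - \theta_0)^{2h}\,\pi_D(\theta)$, built from a default prior $\pi_D$, being the leading example.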