Jon Williamson
- Published in print:
- 2004
- Published Online:
- September 2007
- ISBN:
- 9780198530794
- eISBN:
- 9780191712982
- Item type:
- book
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780198530794.001.0001
- Subject:
- Mathematics, Logic / Computer Science / Mathematical Philosophy
This book provides an introduction to, and analysis of, the use of Bayesian nets in causal modelling. It puts forward new conceptual foundations for causal network modelling: The book argues that probability and causality need to be interpreted as epistemic notions in order for the key assumptions behind causal models to hold. Under the epistemic view, probability and causality are understood in terms of the beliefs an agent ought to adopt. The book develops an objective Bayesian notion of probability and a corresponding epistemic theory of causality. This yields a general framework for causal modelling, which is extended to cope with recursive causal relations, logically complex beliefs and changes in an agent's language.
Jon Williamson
- Published in print:
- 2010
- Published Online:
- September 2010
- ISBN:
- 9780199228003
- eISBN:
- 9780191711060
- Item type:
- book
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199228003.001.0001
- Subject:
- Mathematics, Probability / Statistics, Logic / Computer Science / Mathematical Philosophy
Bayesian epistemology aims to answer the following question: How strongly should an agent believe the various propositions expressible in her language? Subjective Bayesians hold that it is largely (though not entirely) up to the agent as to which degrees of belief to adopt. Objective Bayesians, on the other hand, maintain that appropriate degrees of belief are largely (though not entirely) determined by the agent's evidence. This book states and defends a version of objective Bayesian epistemology. According to this version, objective Bayesianism is characterized by three norms: (i) Probability: degrees of belief should be probabilities; (ii) Calibration: they should be calibrated with evidence; and (iii) Equivocation: they should otherwise equivocate between basic outcomes. Objective Bayesianism has been challenged on a number of different fronts: for example, it has been accused of being poorly motivated, of failing to handle qualitative evidence, of yielding counter-intuitive degrees of belief after updating, of suffering from a failure to learn from experience, of being computationally intractable, of being susceptible to paradox, of being language dependent, and of not being objective enough. The book argues that these criticisms can be met and that objective Bayesianism is a promising theory with an exciting agenda for further research.
Flavio M. Menezes and Paulo K. Monteiro
- Published in print:
- 2004
- Published Online:
- April 2005
- ISBN:
- 9780199275984
- eISBN:
- 9780191602214
- Item type:
- book
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/019927598X.001.0001
- Subject:
- Economics and Finance, Microeconomics
This book presents an in-depth discussion of auction theory. It introduces the concept of Bayesian Nash equilibrium and the idea of studying auctions as games. Private, common, and affiliated values models and multi-object auction models are described. A general version of the Revenue Equivalence Theorem is derived and the optimal auction is characterized to relate the field of mechanism design to auction theory.
Željko Ivezić, Andrew J. Connolly, Jacob T. VanderPlas, and Alexander Gray
- Published in print:
- 2014
- Published Online:
- October 2017
- ISBN:
- 9780691151687
- eISBN:
- 9781400848911
- Item type:
- chapter
- Publisher:
- Princeton University Press
- DOI:
- 10.23943/princeton/9780691151687.003.0005
- Subject:
- Physics, Particle Physics / Astrophysics / Cosmology
This chapter introduces the most important aspects of Bayesian statistical inference and techniques for performing such calculations in practice. It first reviews the basic steps in Bayesian inference in the early sections of the chapter, and then illustrates them with several examples in the sections that follow. Numerical techniques for solving complex problems are discussed next, and the final section provides a summary of pros and cons for classical and Bayesian methods. It argues that most users of Bayesian estimation methods are likely to use a mix of Bayesian and frequentist tools. The reverse is also true: frequentist data analysts, even if they stay formally within the frequentist framework, are often influenced by “Bayesian thinking,” referring to “priors” and “posteriors.” The most advisable position is to know both paradigms well, in order to make informed judgments about which tools to apply in which situations.
H. Peyton Young
- Published in print:
- 2004
- Published Online:
- October 2011
- ISBN:
- 9780199269181
- eISBN:
- 9780191699375
- Item type:
- book
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199269181.001.0001
- Subject:
- Economics and Finance, Econometrics
This book is based on the Arne Ryde Lectures in 2002. The book suggests a conceptual framework for studying strategic learning and highlights theoretical developments in the area. It discusses the interactive learning problem; reinforcement and regret; equilibrium; conditional no-regret learning; prediction, postdiction, and calibration; fictitious play and its variants; Bayesian learning; and hypothesis testing. The book’s framework emphasizes the amount of information required to implement different types of learning rules, criteria for evaluating their performance, and alternative notions of equilibrium to which they converge. The book also stresses the limits of what can be achieved: for a given type of game and a given amount of information, there may exist no learning procedure that satisfies certain reasonable criteria of performance and convergence.
N. Thompson Hobbs and Mevin B. Hooten
- Published in print:
- 2015
- Published Online:
- October 2017
- ISBN:
- 9780691159287
- eISBN:
- 9781400866557
- Item type:
- book
- Publisher:
- Princeton University Press
- DOI:
- 10.23943/princeton/9780691159287.001.0001
- Subject:
- Biology, Ecology
Bayesian modeling has become an indispensable tool for ecological research because it is uniquely suited to deal with complexity in a statistically coherent way. This book provides a comprehensive and accessible introduction to the latest Bayesian methods. It emphasizes the principles behind the computations, giving ecologists a big-picture understanding of how to implement this powerful statistical approach, and is an essential primer for non-statisticians. It begins with a definition of probability and develops a step-by-step sequence of connected ideas, including basic distribution theory, network diagrams, hierarchical models, Markov chain Monte Carlo, and inference from single and multiple models. The book places less emphasis on computer coding, favoring instead a concise presentation of the mathematical statistics needed to understand how and why Bayesian analysis works. It also explains how to write out properly formulated hierarchical Bayesian models and use them in computing, research papers, and proposals. This book enables ecologists to understand the statistical principles behind Bayesian modeling and apply them to research, teaching, policy, and management.
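The Markov chain Monte Carlo machinery mentioned in this abstract can be sketched in a few lines. The following is a minimal Metropolis sampler (an illustrative sketch, not code from the book): it draws from the posterior of a binomial success probability under a uniform prior, a case where the exact posterior is Beta(k+1, n-k+1) and its mean (k+1)/(n+2) can serve as a check.

```python
import math
import random

random.seed(42)

# Hypothetical data: k successes in n trials; uniform prior on theta.
n, k = 20, 14

def log_posterior(theta):
    """Unnormalized log posterior: binomial likelihood times uniform prior."""
    if not 0.0 < theta < 1.0:
        return float("-inf")
    return k * math.log(theta) + (n - k) * math.log(1.0 - theta)

def metropolis(n_iter=50_000, step=0.1):
    theta = 0.5                    # starting value
    samples = []
    for _ in range(n_iter):
        proposal = theta + random.gauss(0.0, step)   # symmetric random-walk proposal
        # Accept with probability min(1, posterior ratio); work on the log scale.
        if math.log(random.random()) < log_posterior(proposal) - log_posterior(theta):
            theta = proposal
        samples.append(theta)
    return samples[5_000:]         # discard burn-in

draws = metropolis()
print(f"posterior mean ~ {sum(draws) / len(draws):.3f}")  # analytic mean: (k+1)/(n+2) = 15/22
```

The random-walk step size is a tuning choice; too small and the chain explores slowly, too large and most proposals are rejected.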
Robert E. Goodin
- Published in print:
- 2003
- Published Online:
- November 2003
- ISBN:
- 9780199256174
- eISBN:
- 9780191599354
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/0199256179.003.0006
- Subject:
- Political Science, Political Theory
Shows how Bayesian thinking should make democratic outcomes so rationally compelling. Bayes's formula provides a mathematical expression for specifying exactly how we ought rationally to update our a priori beliefs in light of subsequent evidence, and the proposal is that voters are modelled in like fashion: votes, let us suppose, constitute (among other things) ‘reports’ of the voter's experiences and perceptions; further suppose that voters accord ‘evidentiary value’ to the reports they receive from one another through those votes; and further suppose that voters are rational, and that part and parcel of their being rational is being prepared to revise their opinions in light of further evidence (including evidence emanating from one another's votes‐cum‐reports). In this process, each of us treats our own experiences and perceptions as one source of evidence, and regards our own report as right; in that sense, we are perfectly sincere when we vote in a particular way, although we also acknowledge that our own experiences and perspectives are particular and peculiar, and hence our own perceptions are themselves inconclusive; because of that, voters striving to behave rationally should sincerely want to adjust their a priori beliefs in the light of all other experiences and perceptions that are reported at an election. Bayesian updating of that sort may well lead people who started out believing (and voting) one way to end up believing (and genuinely wanting implemented) the opposite way, just so long as sufficiently many votes‐cum‐reports point in that different direction; in other words, Bayesian reasoning can, and in politically typical cases ought to, provide people with a compelling reason to accede to the majority verdict. In this way, Bayesianism ‘rationalizes’ majority rule in a pretty strong sense; indeed if anything, it underwrites majoritarianism too strongly.
Ziheng Yang
- Published in print:
- 2006
- Published Online:
- April 2010
- ISBN:
- 9780198567028
- eISBN:
- 9780191728280
- Item type:
- book
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780198567028.001.0001
- Subject:
- Biology, Evolutionary Biology / Genetics
The field of molecular evolution has experienced explosive growth in recent years due to the rapid accumulation of genetic sequence data, continuous improvements to computer hardware and software, and the development of sophisticated analytical methods. The increasing availability of large genomic data sets requires powerful statistical methods to analyse and interpret them, generating both computational and conceptual challenges for the field. This book provides comprehensive coverage of modern statistical and computational methods used in molecular evolutionary analysis, such as maximum likelihood and Bayesian statistics. It describes the models, methods and algorithms that are most useful for analysing the ever-increasing supply of molecular sequence data, with a view to furthering our understanding of the evolution of genes and genomes. The book emphasizes essential concepts rather than mathematical proofs. It includes detailed derivations and implementation details, as well as numerous illustrations, worked examples, and exercises.
Luc Bovens and Stephan Hartmann
- Published in print:
- 2004
- Published Online:
- January 2005
- ISBN:
- 9780199269754
- eISBN:
- 9780191601705
- Item type:
- book
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/0199269750.001.0001
- Subject:
- Philosophy, Metaphysics/Epistemology
Probabilistic models have much to offer to epistemology and philosophy of science. Arguably, the coherence theory of justification claims that the more coherent a set of propositions is, the more confident one ought to be in its content, ceteris paribus. An impossibility result shows that there cannot exist a coherence ordering. A coherence quasi-ordering can be constructed that respects this claim and is relevant to scientific-theory choice. Bayesian-Network models of the reliability of information sources are made applicable to Condorcet-style jury voting, Tversky and Kahneman’s Linda puzzle, the variety-of-evidence thesis, the Duhem–Quine thesis, and the informational value of testimony.
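The Condorcet-style jury setting mentioned in this abstract has a simple quantitative core: if n voters judge independently and each is correct with probability p > 1/2, the probability that the majority verdict is correct rises with n. A small illustrative computation (the numbers are hypothetical, not from the book):

```python
from math import comb

def majority_correct(n: int, p: float) -> float:
    """P(a strict majority of n independent voters is correct), each right with prob p."""
    return sum(comb(n, j) * p**j * (1 - p)**(n - j)
               for j in range(n // 2 + 1, n + 1))

# With p = 0.6, the majority's reliability climbs quickly as the jury grows.
for n in (1, 11, 101):
    print(n, round(majority_correct(n, 0.6), 4))
```

For odd n the strict-majority sum covers every decisive outcome; even n would need a tie-breaking convention.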
Jie W Weiss and David J Weiss
- Published in print:
- 2008
- Published Online:
- January 2009
- ISBN:
- 9780195322989
- eISBN:
- 9780199869206
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780195322989.003.0012
- Subject:
- Psychology, Cognitive Psychology
This chapter introduces psychologists to the Bayesian outlook in statistics. Bayesian statistics is based on a definition of probability as a particular measure of the opinions of ideally consistent people. Statistical inference is modification of these opinions in the light of evidence, and Bayes' theorem specifies how such modifications should be made. The tools of Bayesian statistics include the theory of specific distributions and the principle of stable estimation, which specifies when actual prior opinions may be satisfactorily approximated by a uniform distribution.
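The updating rule this abstract describes, Bayes' theorem applied to prior opinions, can be shown in a toy computation (a hypothetical example, not one from the chapter). The equal priors below play the role of the uniform distribution invoked by stable estimation.

```python
def bayes_update(priors, likelihoods):
    """Revise prior opinions P(H) by likelihoods P(E|H); return posteriors P(H|E)."""
    joint = {h: priors[h] * likelihoods[h] for h in priors}
    evidence = sum(joint.values())          # P(E), by the law of total probability
    return {h: joint[h] / evidence for h in joint}

# Hypothetical question: is a coin fair, or biased 0.8 towards heads?
priors = {"fair": 0.5, "biased": 0.5}                # uniform prior opinion
likelihoods = {"fair": 0.5**8, "biased": 0.8**8}     # P(8 heads in 8 flips | H)
posterior = bayes_update(priors, likelihoods)
print(posterior)
```

Eight straight heads shift the opinion almost entirely to the biased hypothesis, illustrating how evidence modifies opinion through the theorem.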
José M. Bernardo, M. J. Bayarri, James O. Berger, A. P. Dawid, David Heckerman, Adrian F. M. Smith, and Mike West (eds)
- Published in print:
- 2011
- Published Online:
- January 2012
- ISBN:
- 9780199694587
- eISBN:
- 9780191731921
- Item type:
- book
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199694587.001.0001
- Subject:
- Mathematics, Probability / Statistics
The Valencia International Meetings on Bayesian Statistics – established in 1979 and held every four years – have been the forum for a definitive overview of current concerns and activities in Bayesian statistics. These are the edited Proceedings of the Ninth meeting, and contain the invited papers each followed by their discussion and a rejoinder by the author(s). In the tradition of the earlier editions, this encompasses an enormous range of theoretical and applied research, highlighting the breadth, vitality and impact of Bayesian thinking in interdisciplinary research across many fields as well as the corresponding growth and vitality of core theory and methodology. The Valencia 9 invited papers cover a broad range of topics, including foundational and core theoretical issues in statistics, the continued development of new and refined computational methods for complex Bayesian modelling, substantive applications of flexible Bayesian modelling, and new developments in the theory and methodology of graphical modelling. They also describe advances in methodology for specific applied fields, including financial econometrics and portfolio decision making, public policy applications for drug surveillance, studies in the physical and environmental sciences, astronomy and astrophysics, climate change studies, molecular biosciences, statistical genetics or stochastic dynamic networks in systems biology.
Jon Williamson
- Published in print:
- 2004
- Published Online:
- September 2007
- ISBN:
- 9780198530794
- eISBN:
- 9780191712982
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780198530794.003.0003
- Subject:
- Mathematics, Logic / Computer Science / Mathematical Philosophy
This chapter introduces Bayesian networks and probabilistic independence, and shows how Bayesian nets are used to represent probability functions. Inference in Bayesian nets is discussed and the problem of constructing Bayesian nets is introduced. One of the simplest methods for constructing Bayesian nets — sequentially adding arrows — is explored in some detail in order to highlight some of the key features of the construction problem.
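The representational idea summarized in this abstract can be made concrete: a Bayesian net represents a probability function by factorizing it into one conditional distribution per node given its parents. A sketch using the familiar rain/sprinkler/wet-grass net (the numbers are hypothetical, not from the chapter):

```python
from itertools import product

# Net structure: Rain -> Sprinkler, and both -> WetGrass, so the net encodes
#   P(R, S, W) = P(R) * P(S | R) * P(W | R, S).
p_r = {True: 0.2, False: 0.8}                 # P(R)
p_s_given_r = {True: 0.01, False: 0.4}        # P(S=True | R)
p_w_given_rs = {(True, True): 0.99, (True, False): 0.8,
                (False, True): 0.9, (False, False): 0.0}   # P(W=True | R, S)

def joint(r, s, w):
    """P(R=r, S=s, W=w) read off the net's factorization."""
    p = p_r[r]
    p *= p_s_given_r[r] if s else 1 - p_s_given_r[r]
    pw = p_w_given_rs[(r, s)]
    return p * (pw if w else 1 - pw)

# The factorization defines a full probability function: it sums to 1 ...
total = sum(joint(r, s, w) for r, s, w in product([True, False], repeat=3))
print(round(total, 10))  # prints 1.0

# ... and supports inference by marginalization, e.g. P(Rain | WetGrass):
pw = sum(joint(r, s, True) for r, s in product([True, False], repeat=2))
prw = sum(joint(True, s, True) for s in (True, False))
print(round(prw / pw, 3))  # prints 0.358
```

Brute-force marginalization is exponential in the number of nodes; the point of the net is that specialized inference algorithms exploit the factorization to do better.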
Steven Sloman
- Published in print:
- 2005
- Published Online:
- January 2007
- ISBN:
- 9780195183115
- eISBN:
- 9780199870950
- Item type:
- book
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780195183115.001.0001
- Subject:
- Philosophy, Philosophy of Mind
Human beings are active agents who can think. To understand how thought serves action requires understanding how people conceive of the relation between cause and effect, between action and outcome. This book presents the question, in cognitive terms: how do people construct and reason with the causal models we use to represent our world? A revolution is occurring in how statisticians, philosophers, and computer scientists answer this question. Those fields have ushered in new insights about causal models by thinking about how to represent causal structure mathematically, in a framework that uses graphs and probability theory to develop what are called causal Bayesian networks. The framework starts with the idea that the purpose of causal structure is to understand and predict the effects of intervention. How does intervening on one thing affect other things? This is not a question merely about probability (or logic), but about action. The framework offers a new understanding of mind: thought is about the effects of intervention and cognition is thus intimately tied to actions that take place either in the actual physical world or in imagination, in counterfactual worlds. This book offers a conceptual introduction to the key mathematical ideas, presenting them in a non-technical way, focusing on the intuitions rather than the theorems. It tries to show why the ideas are important to understanding how people explain things and why thinking not only about the world as it is but the world as it could be is so central to human action. The book reviews the role of causality, causal models, and intervention in the basic human cognitive functions: decision making, reasoning, judgment, categorization, inductive inference, language, and learning. In short, the book offers a discussion about how people think, talk, learn, and explain things in causal terms, in terms of action and manipulation.
N. Thompson Hobbs and Mevin B. Hooten
- Published in print:
- 2015
- Published Online:
- October 2017
- ISBN:
- 9780691159287
- eISBN:
- 9781400866557
- Item type:
- chapter
- Publisher:
- Princeton University Press
- DOI:
- 10.23943/princeton/9780691159287.003.0005
- Subject:
- Biology, Ecology
This chapter lays out the basic principles of Bayesian inference, building on the concepts of probability developed in Chapter 3. It seeks to use the rules of probability to show how Bayes' theorem works, by making use of the conditional rule of probability and the law of total probability. The chapter begins with the central, underpinning tenet of the Bayesian view: the world can be divided into quantities that are observed and quantities that are unobserved. Unobserved quantities include parameters in models, latent states predicted by models, missing data, effect sizes, future states, and data before they are observed. We wish to learn about these quantities using observations. The Bayesian framework for achieving that understanding is applied in exactly the same way regardless of the specifics of the research problem at hand or the nature of the unobserved quantities.
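The mechanics this abstract describes — Bayes' theorem assembled from the conditional rule of probability and the law of total probability — can be sketched in a few lines. The disease-screening numbers below are hypothetical, chosen purely for illustration:

```python
# Bayes' theorem: P(H | D) = P(D | H) P(H) / sum_h P(D | h) P(h)
# Hypothetical setup: an unobserved state (infected or healthy)
# and an observed datum (a positive test result).

prior = {"infected": 0.01, "healthy": 0.99}        # P(H)
likelihood = {"infected": 0.95, "healthy": 0.05}   # P(positive | H)

# Law of total probability: marginal probability of the datum.
evidence = sum(likelihood[h] * prior[h] for h in prior)

# Conditional rule rearranged into Bayes' theorem.
posterior = {h: likelihood[h] * prior[h] / evidence for h in prior}

print(posterior)  # posterior["infected"] is about 0.161
```

Even with a 95%-sensitive test, the posterior probability of infection stays modest because the prior is small — the kind of result the conditional machinery makes transparent.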
N. Thompson Hobbs and Mevin B. Hooten
- Published in print:
- 2015
- Published Online:
- October 2017
- ISBN:
- 9780691159287
- eISBN:
- 9781400866557
- Item type:
- chapter
- Publisher:
- Princeton University Press
- DOI:
- 10.23943/princeton/9780691159287.003.0006
- Subject:
- Biology, Ecology
This chapter seeks to explain hierarchical models and how they differ from simple Bayesian models and to illustrate building hierarchical models using mathematically correct expressions. It begins with the definition of hierarchical models. Next, the chapter introduces four general classes of hierarchical models that have broad application in ecology. These classes can be used individually or in combination to attack virtually any research problem. Examples are used to show how to draw Bayesian networks that portray stochastic relationships between observed and unobserved quantities. The chapter furthermore shows how to use network drawings as a guide for writing posterior and joint distributions.
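The kind of factorization such network drawings guide can be shown schematically. The three-level structure and bracket notation below are a generic illustration, not a model from the chapter: $y$ denotes observed data, $z$ a latent state, $\theta$ process parameters, and $\sigma$ an observation parameter:

```latex
[\,z, \theta, \sigma \mid y\,] \;\propto\;
  \underbrace{[\,y \mid z, \sigma\,]}_{\text{data model}}\,
  \underbrace{[\,z \mid \theta\,]}_{\text{process model}}\,
  \underbrace{[\,\theta\,]\,[\,\sigma\,]}_{\text{priors}}
```

Each bracketed factor corresponds to one arrow (or root node) in the network drawing, which is why the drawing serves as a recipe for writing the joint distribution.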
N. Thompson Hobbs and Mevin B. Hooten
- Published in print:
- 2015
- Published Online:
- October 2017
- ISBN:
- 9780691159287
- eISBN:
- 9781400866557
- Item type:
- chapter
- Publisher:
- Princeton University Press
- DOI:
- 10.23943/princeton/9780691159287.003.0010
- Subject:
- Biology, Ecology
This chapter offers a general set of steps for writing models to assist the researcher in formulating their own approach to the Bayesian model. The crucial skill of specifying models is often neglected in statistical texts in general and texts on Bayesian modeling in particular. The central importance of model specification also motivates this chapter. The overarching challenge in building models is to specify the components of the posterior distribution and the joint distribution and to factor the joint distribution into sensible parts. This chapter first lays out a framework for doing just that, albeit in somewhat abstract terms, before moving on to a more concrete example — the effects of grazing by livestock and wild ungulates on structure and function of a sagebrush steppe ecosystem.
Ludwig Fahrmeir and Thomas Kneib
- Published in print:
- 2011
- Published Online:
- September 2011
- ISBN:
- 9780199533022
- eISBN:
- 9780191728501
- Item type:
- book
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199533022.001.0001
- Subject:
- Mathematics, Probability / Statistics, Biostatistics
Several recent advances in smoothing and semiparametric regression are presented in this book from a unifying, Bayesian perspective. Simulation-based full Bayesian Markov chain Monte Carlo (MCMC) inference, as well as empirical Bayes procedures closely related to penalized likelihood estimation and mixed models, are considered here. Throughout, the focus is on semiparametric regression and smoothing based on basis expansions of unknown functions and effects, in combination with smoothness priors for the basis coefficients. Beginning with a review of basic methods for smoothing and mixed models, longitudinal data, spatial data, and event history data are treated in separate chapters. Worked examples from various fields such as forestry, development economics, medicine, and marketing are used to illustrate the statistical methods covered in this book. Most of these examples have been analysed using implementations in the Bayesian software BayesX, and some with R code.
Brian J. Scholl
- Published in print:
- 2005
- Published Online:
- January 2007
- ISBN:
- 9780195179675
- eISBN:
- 9780199869794
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780195179675.003.0003
- Subject:
- Philosophy, Metaphysics/Epistemology
This chapter explores a way in which visual processing may involve innate constraints and attempts to show how such processing overcomes one enduring challenge to nativism. In particular, many challenges to nativist theories in other areas of cognitive psychology (e.g., ‘theory of mind’, infant cognition) have focused on the later development of such abilities, and have argued that such development is in conflict with innate origins (since those origins would have to be somehow changed or overwritten). Innateness, in these contexts, is seen as antidevelopmental, associated instead with static processes and principles. In contrast, certain perceptual models demonstrate how the very same mental processes can both be innately specified and yet develop richly in response to experience with the environment. This process is entirely unmysterious, as shown in certain formal theories of visual perception, including those that appeal to spontaneous endogenous stimulation and those based on Bayesian inference.
Colin Crouch
- Published in print:
- 2005
- Published Online:
- September 2007
- ISBN:
- 9780199286652
- eISBN:
- 9780191713354
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199286652.003.0004
- Subject:
- Business and Management, Political Economy
Path dependence is a major instrument of neo-institutionalist theory. This chapter discusses the position of the economically rational actor and the scope left to the actor by the assumptions of structural determinism. This is done within the framework of the most heavily deterministic set of arguments found in neo-institutionalism: path dependence theory. The chapter establishes a model of the actor as a Bayesian decision-maker in order to capture the relationship between actors and their environments. It is then shown how such actors, as institutional entrepreneurs seeking ways out of increasingly unsuccessful but still constraining paths, might behave. Finally, the possibility of several viable alternatives is introduced, only one of which is likely to be discovered, to model how ideas of ‘one best way’ solutions become established.
Jon Williamson
- Published in print:
- 2010
- Published Online:
- September 2010
- ISBN:
- 9780199228003
- eISBN:
- 9780191711060
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199228003.003.0006
- Subject:
- Mathematics, Probability / Statistics, Logic / Computer Science / Mathematical Philosophy
Objective Bayesianism has been criticized on the grounds that representing and reasoning with maximum entropy probability functions appears to be computationally infeasible for all but small problems (see, e.g., Pearl 1988, p. 463). In this chapter we develop computational machinery that permits efficient representation and inference for objective Bayesian probability. This is the machinery of objective Bayesian nets and objective credal nets. Bayesian and credal nets are introduced in §6.1. They are adapted to handle objective Bayesian probability in §6.2. Then, by way of example, the formalism is applied to the problem of cancer prognosis in §6.3.
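The link between maximum entropy and network representations that such machinery exploits can be glimpsed in a toy computation. The two-variable setup and the probabilities below are invented for illustration: when the only constraint fixes a single marginal, the entropy-maximizing joint distribution factorizes, and it is this factorization that a net can represent compactly:

```python
from itertools import product
from math import log2

def entropy(p):
    """Shannon entropy (in bits) of a distribution given as {outcome: prob}."""
    return -sum(q * log2(q) for q in p.values() if q > 0)

# Sole constraint on two binary variables A, B: P(A=1) = 0.7.
# Factorized candidate: A and B independent, B uniform.
p_indep = {(a, b): (0.7 if a == 1 else 0.3) * 0.5
           for a, b in product((0, 1), repeat=2)}

# A correlated candidate satisfying the same constraint.
p_corr = {(1, 1): 0.6, (1, 0): 0.1, (0, 1): 0.2, (0, 0): 0.1}

# The independent distribution has strictly higher entropy, so the
# maximum entropy function respects the edge-free net over A and B.
print(entropy(p_indep) > entropy(p_corr))  # True
```

With sparse constraints the comparison always comes out this way, which is the intuition behind representing maximum entropy functions as Bayesian nets rather than as full joint tables.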