Keith Banting, Richard Johnston, Will Kymlicka, and Stuart Soroka
- Published in print:
- 2006
- Published Online:
- May 2007
- ISBN:
- 9780199289172
- eISBN:
- 9780191711084
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199289172.003.0002
- Subject:
- Political Science, Political Economy
This chapter introduces a new framework for testing the recognition/redistribution hypothesis. It develops an index of twenty-three different types of MCPs that have been adopted for three different types of minority groups (immigrants, national minorities, and indigenous peoples), and categorizes Western countries by their level of MCPs. The chapter then tests whether countries with higher levels of MCPs have faced an erosion of the welfare state compared to countries with lower levels, and shows that there is no negative correlation between the strength of a country's commitment to MCPs and its ability to sustain welfare spending or economic redistribution. It also examines the heterogeneity/redistribution hypothesis, and shows that it too is overstated. In general, the size of immigrant groups, national minorities, and indigenous peoples in Western countries does not affect a country's ability to sustain its welfare commitments, although a rapid change in the size of immigrant groups does seem to have an effect. Yet even here, the authors argue, there are hints that adopting MCPs can help to mitigate whatever negative effect a rapidly increasing immigrant population may have.
Richard Swinburne (ed.)
- Published in print:
- 2005
- Published Online:
- January 2012
- ISBN:
- 9780197263419
- eISBN:
- 9780191734175
- Item type:
- book
- Publisher:
- British Academy
- DOI:
- 10.5871/bacad/9780197263419.001.0001
- Subject:
- Philosophy, Logic/Philosophy of Mathematics
Bayes' theorem is a tool for assessing how probable evidence makes some hypothesis. The papers in this book consider the worth and applicability of the theorem. The book sets out the philosophical issues: Elliott Sober argues that there are other criteria for assessing hypotheses; Colin Howson, Philip Dawid, and John Earman consider how the theorem can be used in statistical science, in weighing evidence in criminal trials, and in assessing evidence for the occurrence of miracles; and David Miller argues for the worth of the probability calculus as a tool for measuring propensities in nature rather than the strength of evidence. The book ends with the original paper containing the theorem, presented to the Royal Society in 1763.
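For orientation, the theorem in its simplest form (standard notation, not drawn from any particular paper in the volume) relates the probability of a hypothesis h given evidence e to the reverse conditional probability:

$$
P(h \mid e) = \frac{P(e \mid h)\,P(h)}{P(e)}, \qquad P(e) = P(e \mid h)\,P(h) + P(e \mid \neg h)\,P(\neg h).
$$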
Richard Swinburne
- Published in print:
- 2004
- Published Online:
- September 2007
- ISBN:
- 9780199271672
- eISBN:
- 9780191709357
- Item type:
- book
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199271672.001.0001
- Subject:
- Philosophy, Philosophy of Religion
This book assesses the worth of arguments for and against the existence of God. Evidence confirms (makes more probable) an explanatory hypothesis in so far as (1) given the hypothesis the evidence is to be expected, that is, the hypothesis makes the evidence probable; (2) the evidence is not otherwise to be expected; (3) the hypothesis is simple; and (4) it fits with background knowledge (i.e., knowledge about how things behave in neighbouring fields of enquiry). When we are assessing hypotheses (such as theism, the hypothesis that there is a God) purporting to explain everything, there will be no background knowledge. Theism is a very simple hypothesis. If there is a God, there is some reason to expect that he will create a universe, with laws of nature, leading to the evolution of humans (bodies connected to souls), who often have experiences which seem to them experiences of God. It is most improbable that all this evidence would exist if there were no God. Taken together, therefore, all this evidence makes it probable that there is a God. The occurrence of evil, whether produced by humans or natural processes, does not significantly diminish that probability.
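These four criteria correspond to the familiar terms of Bayes' theorem; as a sketch of the correspondence (standard in confirmation theory rather than quoted from the book), with h the hypothesis, e the evidence, and k background knowledge:

$$
P(h \mid e \wedge k) = \frac{P(e \mid h \wedge k)\,P(h \mid k)}{P(e \mid h \wedge k)\,P(h \mid k) + P(e \mid \neg h \wedge k)\,P(\neg h \mid k)}
$$

Criterion (1) asks that P(e | h ∧ k) be high, criterion (2) that P(e | ¬h ∧ k) be low, and criteria (3) and (4) bear on the prior P(h | k); where h purports to explain everything and background knowledge falls away, simplicity carries most of the prior's weight.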
John L. Bell
- Published in print:
- 2005
- Published Online:
- September 2007
- ISBN:
- 9780198568520
- eISBN:
- 9780191717581
- Item type:
- book
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780198568520.001.0001
- Subject:
- Mathematics, Logic / Computer Science / Mathematical Philosophy
This is the third edition of a well-known graduate textbook on Boolean-valued models of set theory. The aim of the first and second editions was to provide a systematic and adequately motivated exposition of the theory of Boolean-valued models as developed by Scott and Solovay in the 1960s, deriving along the way the central set theoretic independence proofs of Cohen and others in the particularly elegant form that the Boolean-valued approach enables them to assume. In this edition, the background material has been augmented to include an introduction to Heyting algebras. It includes chapters on Boolean-valued analysis and Heyting-algebra-valued models of intuitionistic set theory.
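As a brief orientation (a standard sketch, not specific to this edition): in a Boolean-valued model V^B, each sentence takes a truth value in a complete Boolean algebra B rather than in {0, 1}, with the connectives and quantifiers evaluated by the algebra's own operations:

$$
[[\varphi \wedge \psi]] = [[\varphi]] \wedge [[\psi]], \qquad [[\neg \varphi]] = \neg [[\varphi]], \qquad [[\exists x\, \varphi(x)]] = \bigvee_{a \in V^B} [[\varphi(a)]].
$$

An independence proof then amounts to choosing B so that the sentence of interest receives a value strictly between 0 and 1 while every ZFC axiom keeps value 1.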
Mark L. Latash
- Published in print:
- 2008
- Published Online:
- May 2009
- ISBN:
- 9780195333169
- eISBN:
- 9780199864195
- Item type:
- book
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780195333169.001.0001
- Subject:
- Neuroscience, Sensory and Motor Systems, Techniques
This book discusses a general problem in biology: the lack of an adequate language for formulating biologically specific problems. It describes recent progress in the control and coordination of human movement, beginning with a brief history of movement studies and a review of the current central controversies in the area of motor control, with an emphasis on the equilibrium-point hypothesis. An operational definition of synergy is introduced, and a method of analysis of synergies is described based on the uncontrolled manifold hypothesis. This method is then used to characterize synergies in a variety of tasks, including such common motor tasks as standing, pointing, reaching, standing up, and manipulating hand-held objects. Applications of the method to movements by persons with neurological disorders, persons with atypical development, and healthy elderly persons are illustrated, as well as changes in motor synergies with practice. Possible neurophysiological mechanisms of synergies are also discussed, focusing on such conspicuous structures as the spinal cord, the cerebellum, the basal ganglia, and the cerebral cortex. A variety of models based on different computational and neurophysiological principles are discussed, as are possible applications of the introduced definition of synergies to other areas such as perception and language.
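As a hedged illustration of the uncontrolled manifold (UCM) analysis described above (synthetic data and a deliberately minimal two-finger total-force task; the code is illustrative, not from the book): trial-to-trial variance is split into a component within the null space of the task Jacobian, which leaves total force unchanged, and a component orthogonal to it, which is task error.

```python
import numpy as np

rng = np.random.default_rng(1)
n_trials = 200
# Task: f1 + f2 = constant, so the Jacobian is J = [1, 1].
ucm_dir = np.array([1.0, -1.0]) / np.sqrt(2)  # null space of J: total force unchanged
ort_dir = np.array([1.0, 1.0]) / np.sqrt(2)   # task-relevant direction

# Simulated finger forces: wide scatter along the UCM, tight control across it.
along = rng.normal(0.0, 2.0, (n_trials, 1))
across = rng.normal(0.0, 0.5, (n_trials, 1))
forces = 5.0 + along * ucm_dir + across * ort_dir  # shape (n_trials, 2)

dev = forces - forces.mean(axis=0)
v_ucm = np.mean((dev @ ucm_dir) ** 2)  # per-DOF variance within the UCM
v_ort = np.mean((dev @ ort_dir) ** 2)  # per-DOF variance orthogonal to it
print(f"V_ucm = {v_ucm:.2f}, V_ort = {v_ort:.2f}")
```

A force-stabilizing synergy shows V_ucm well above V_ort, which is exactly the pattern this simulation builds in.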
Rein Taagepera
- Published in print:
- 2008
- Published Online:
- September 2008
- ISBN:
- 9780199534661
- eISBN:
- 9780191715921
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199534661.003.0006
- Subject:
- Political Science, Comparative Politics, Political Economy
The null hypothesis offers close to null prediction. Directional hypotheses are relatively easy to satisfy and offer correspondingly vague predictions. Quantitative hypotheses (models) are hard to satisfy and offer quantitatively falsifiable predictions. The customary hypothesis-testing recipe in social sciences, “hypothesis → data collection → testing → acceptance/rejection,” is only a single cycle in an ascending spiral. Having “p=.01” does NOT mean confirmation in 99% of replications.
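The last sentence is easy to check by simulation (a hypothetical two-sample design, not an example from the chapter): suppose the observed effect of a study with p ≈ .01 happens to equal the true effect; replications then reach p < .05 much less often than 99% of the time.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 30     # per-group sample size (hypothetical)
d = 0.68   # true standardized effect; with n = 30 a study at this effect lands near p = .01

reps = 10_000
pvals = np.array([
    stats.ttest_ind(rng.normal(d, 1, n), rng.normal(0, 1, n)).pvalue
    for _ in range(reps)
])
print(f"replications with p < .05: {np.mean(pvals < 0.05):.0%}")  # roughly 70-75%, not 99%
```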
Daron Acemoglu, Simon Johnson, and James Robinson
- Published in print:
- 2006
- Published Online:
- September 2006
- ISBN:
- 9780195305197
- eISBN:
- 9780199783519
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/0195305191.003.0002
- Subject:
- Economics and Finance, Development, Growth, and Environmental
Geography and institutions are the two main contenders to explain the fundamental causes of cross-country differences in prosperity. The geography hypothesis — which has a large following both in the popular imagination and in academia — maintains that the geography, climate, and ecology of a society’s location shape both its technology and the incentives of its inhabitants. This essay argues that differences in institutions are more important than geography for understanding the divergent economic and social conditions of nations. While the geography hypothesis emphasizes forces of nature as a primary factor in the poverty of nations, the institutions hypothesis is about man-made influences. A case is developed for the importance of institutions which draws on the history of European colonization.
H. Peyton Young
- Published in print:
- 2004
- Published Online:
- October 2011
- ISBN:
- 9780199269181
- eISBN:
- 9780191699375
- Item type:
- book
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199269181.001.0001
- Subject:
- Economics and Finance, Econometrics
This book is based on the Arne Ryde Lectures in 2002. The book suggests a conceptual framework for studying strategic learning and highlights theoretical developments in the area. It discusses the interactive learning problem; reinforcement and regret; equilibrium; conditional no-regret learning; prediction, postdiction, and calibration; fictitious play and its variants; Bayesian learning; and hypothesis testing. The book’s framework emphasizes the amount of information required to implement different types of learning rules, criteria for evaluating their performance, and alternative notions of equilibrium to which they converge. The book also stresses the limits of what can be achieved: for a given type of game and a given amount of information, there may exist no learning procedure that satisfies certain reasonable criteria of performance and convergence.
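As one concrete instance of the rules the book surveys, here is a minimal fictitious-play sketch (the 2x2 pure-coordination payoffs are illustrative, not taken from the lectures): each player best-responds to the empirical frequencies of the other's past actions.

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 1.0]])  # row player's payoffs: pure coordination
B = A                       # column player's payoffs are identical

counts_col = np.ones(2)  # row's counts of the column player's past actions
counts_row = np.ones(2)  # column's counts of the row player's past actions
for _ in range(1_000):
    a_row = int(np.argmax(A @ (counts_col / counts_col.sum())))    # row best-responds
    a_col = int(np.argmax(B.T @ (counts_row / counts_row.sum())))  # column best-responds
    counts_col[a_col] += 1
    counts_row[a_row] += 1

print(counts_row / counts_row.sum(), counts_col / counts_col.sum())
```

Here empirical play settles on one of the pure equilibria; the book's point is that such convergence cannot be taken for granted across games and learning rules.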
Patrick R. Laughlin
- Published in print:
- 2011
- Published Online:
- October 2017
- ISBN:
- 9780691147918
- eISBN:
- 9781400836673
- Item type:
- chapter
- Publisher:
- Princeton University Press
- DOI:
- 10.23943/princeton/9780691147918.003.0005
- Subject:
- Psychology, Social Psychology
This chapter discusses collective induction, the cooperative search for descriptive, predictive, and explanatory generalizations, rules, and principles. As a psychological process, induction begins with the perception of some pattern, regularity, or relationship. The two basic processes in induction are hypothesis formation and hypothesis evaluation. This inductive process occurs for both single individuals and cooperative groups such as scientific research teams, auditing teams, securities and intelligence analysts, art experts, or air crash investigators. Theoretically, collective induction is a divisible and complementary group task in which groups may perform better than individuals by dividing the task into subtasks and combining the different insights, understandings, strategies, and other cognitive processes of the group members.
Steffen Kühnel
- Published in print:
- 1998
- Published Online:
- November 2003
- ISBN:
- 9780198292371
- eISBN:
- 9780191600159
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/0198292376.003.0004
- Subject:
- Political Science, Reference
Extends the regression model to path analysis, in which all the variables are randomly distributed, treating the variances and covariances as exogenous. The worked example demonstrates the use of PRELIS, LISREL 8, and GLIM software, and shows how to interpret the resulting statistics.
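The chapter's software is PRELIS, LISREL 8, and GLIM; purely to illustrate what a recursive path model estimates (hypothetical variables and coefficients), the same logic can be sketched as two least-squares fits:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1_000
x = rng.normal(size=n)                                  # exogenous variable
m = 0.6 * x + rng.normal(scale=0.8, size=n)             # mediator: path x -> m
y = 0.3 * x + 0.5 * m + rng.normal(scale=0.7, size=n)   # outcome

def ols(X, z):
    """Least-squares slopes of z on the columns of X (intercept dropped)."""
    X1 = np.column_stack([np.ones(len(z)), X])
    return np.linalg.lstsq(X1, z, rcond=None)[0][1:]

a = ols(x, m)[0]                        # estimated path x -> m
b, c = ols(np.column_stack([x, m]), y)  # direct path x -> y and path m -> y
print(f"x->m {a:.2f}; x->y direct {b:.2f}; m->y {c:.2f}; indirect x->y {a*c:.2f}")
```

The indirect effect is the product of the coefficients along the chain, which is the decomposition path analysis adds on top of plain regression.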
Edouard Machery
- Published in print:
- 2009
- Published Online:
- May 2009
- ISBN:
- 9780195306880
- eISBN:
- 9780199867950
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780195306880.003.0001
- Subject:
- Philosophy, Philosophy of Mind
The Introduction discusses the purpose of the book, which is to attempt to rejuvenate the philosophy of concepts by steering it toward a new course. The book argues, the Introduction explains, that progress in the psychology of concepts and in the budding neuropsychology of concepts is conditional on psychologists and neuropsychologists eliminating the notion of concept from their theoretical vocabulary. This eliminativist proposal is the last tenet of the hypothesis that is developed at length in this book — the Heterogeneity Hypothesis. The Introduction then outlines the contents of the chapters that follow.
Fiona Cowie
- Published in print:
- 2003
- Published Online:
- October 2011
- ISBN:
- 9780195159783
- eISBN:
- 9780199849529
- Item type:
- book
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780195159783.001.0001
- Subject:
- Philosophy, Philosophy of Mind
This book reconsiders the influential nativist position toward the mind. Nativists assert that some concepts, beliefs, or capacities are innate or inborn: “native” to the mind rather than acquired. The author argues that this view is mistaken, demonstrating that nativism is an unstable amalgam of two quite different—and probably inconsistent—theses about the mind. Unlike empiricists, who postulate domain-neutral learning strategies, nativists insist that some learning tasks require special kinds of skills, and that these skills are hard-wired into our brains at birth. This “faculties hypothesis” finds its modern expression in the views of Noam Chomsky. The author, marshalling recent empirical evidence from developmental psychology, psycholinguistics, computer science, and linguistics, provides a critique of Chomsky's nativism and defends in its place a moderately nativist approach to language acquisition. Also, in contrast to empiricists, who view the mind as simply another natural phenomenon susceptible to scientific explanation, nativists suspect that the mental is ineluctably mysterious. The author addresses this second strand in nativist thought, taking on the view articulated by Jerry Fodor and other nativists that learning, particularly concept acquisition, is a fundamentally inexplicable process. She challenges this explanatory pessimism, and argues that concept acquisition is psychologically explicable.
Justin London
- Published in print:
- 2004
- Published Online:
- September 2007
- ISBN:
- 9780195160819
- eISBN:
- 9780199786763
- Item type:
- book
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780195160819.001.0001
- Subject:
- Psychology, Cognitive Psychology
This book develops a theory of musical meter based on psychological research in temporal perception, cognition, and motor behavior. Meter is regarded as a kind of entrainment, a synchronization of attention and actions to the rhythms of the environment. Drawing on research on the ability to make durational discriminations and categorizations at various tempos, the “speed limits” for meter are given: the inter-onset interval for metric elements must be greater than 100 ms (no more than ten per second) and less than 1.5–2.0 seconds. Care is taken to distinguish rhythms, or patterns of duration, from meters, the listener/performer's complex patterns of expectation and attention. It is thus shown that metric behaviors are highly tempo-dependent. Ambiguities may arise when a rhythmic pattern may be regarded under more than one meter, and conflicts may arise when a pattern of durations contradicts the ongoing meter. The theoretical core of the book is its development of a set of metric well-formedness constraints, which limit the temporal range and organization of patterns of metric entrainment. A consideration of the rhythmic practices of various non-Western cultures, including some African and Indian music, leads to an additional well-formedness constraint, that of maximal evenness. This allows for meters that involve uneven (i.e., non-isochronous) beats. The book concludes with the many meters hypothesis, which proposes that a large number of expressively timed temporal templates are acquired, which are readily used when listening in familiar musical contexts.
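The quoted “speed limits” are just tempo arithmetic; a toy helper (hypothetical function names, bounds taken from the abstract) shows when a subdivision falls outside the metric window:

```python
def ioi_ms(bpm: float, subdivision: int = 1) -> float:
    """Inter-onset interval in milliseconds at `bpm`, split into equal parts."""
    return 60_000.0 / (bpm * subdivision)

def metrically_viable(ioi: float, upper_ms: float = 2_000.0) -> bool:
    """True if the interval sits inside the ~100 ms to ~1.5-2 s window."""
    return 100.0 < ioi < upper_ms

print(ioi_ms(120, 4), metrically_viable(ioi_ms(120, 4)))  # 125.0 ms -> True
print(ioi_ms(160, 4), metrically_viable(ioi_ms(160, 4)))  # 93.75 ms -> too fast to be metric
```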
Elliott Sober
- Published in print:
- 1975
- Published Online:
- October 2011
- ISBN:
- 9780198244073
- eISBN:
- 9780191680724
- Item type:
- book
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780198244073.001.0001
- Subject:
- Philosophy, Philosophy of Science
The diversity of our intuitions about simplicity is matched only by the tenacity with which these intuitions refuse to yield to formal characterization. Our intuitions seem unanimous in favour of sparse ontologies, smooth curves, homogeneous universes, invariant equations, and impoverished assumptions. Yet recent theorizing about simplicity presents a veritable chaos of opinion. Here one finds arguments that simplicity is high probability, that it is low probability, and that it is not a probability at all. Indeed, the complexities of the problem of simplicity have led some to question the possibility and the fruitfulness of trying to define the notion of simplicity that seems to be involved in hypothesis choice. This book tries to show that the simplicity of a hypothesis can be measured by attending to how well it answers certain kinds of questions. The more informative a hypothesis is in answering these questions, the simpler it is. The informativeness of hypotheses relative to questions is characterized by the amount of extra information they need to yield answers. The more additional information a hypothesis needs to answer a question, the less informative it is relative to that question.
Bryan Frances
- Published in print:
- 2005
- Published Online:
- October 2005
- ISBN:
- 9780199282135
- eISBN:
- 9780191602917
- Item type:
- book
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/0199282137.001.0001
- Subject:
- Philosophy, Metaphysics/Epistemology
The nagging voice of the sceptic has always been present in epistemology. Over the last thirty years or so, philosophers have thought of several promising ways to counter the radical sceptic: for instance, facts about the reliability of cognitive processes, principles determining which possibilities must be ruled out in order to have knowledge, and principles regarding the context-sensitivity of knowledge attributions. In this research monograph, Bryan Frances presents a new argument template for generating new kinds of radical scepticism, ones that hold even if all the clever anti-sceptical fixes such as contextualism, relevant alternatives theory, and reliabilism defeat the traditional sceptic. However, the new sceptical conclusions are quite different from traditional scepticism. Although the new sceptic concludes that people don’t know that fire engines are red, that people sometimes have pains in their knees, or even that people believe that fire engines are red or that knees sometimes throb, the new sceptic admits that people know millions of exotic truths, such as the fact that black holes exist. One can know about the existence of black holes, but not about the colour of one’s shirt or even about what one believes regarding the colour of one’s shirt. The new sceptical arguments proceed in the usual way (here is a sceptical hypothesis; one cannot neutralize it; one has to be able to neutralize it to know P; so one does not know P), but the sceptical hypotheses plugged into them are “real, live” scientific-philosophical hypotheses often thought to be actually true, such as error theories about belief, colour, pain location, and character traits. Frances investigates the questions, ‘Under what conditions do we need to rule out these error theories in order to know things inconsistent with them?’ and ‘Can we rule them out?’
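Set out as a schema (a paraphrase of the parenthetical template above, not a quotation), the new sceptical argument runs:

1. H is a live sceptical hypothesis inconsistent with P (e.g., an error theory about belief, colour, or pain location).
2. To know P, one must be able to neutralize H.
3. One cannot neutralize H.
4. Therefore, one does not know P.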
Joshua D. Duntley and David M. Buss
- Published in print:
- 2005
- Published Online:
- January 2007
- ISBN:
- 9780195179675
- eISBN:
- 9780199869794
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780195179675.003.0017
- Subject:
- Philosophy, Metaphysics/Epistemology
This chapter presents a new theory of homicide — homicide adaptation theory — which proposes that humans evolved adaptations to facilitate killing. The new theory is contrasted with two competing conceptions of why people kill: the by-product hypothesis and the evolved goal hypothesis. The concept of ‘innateness’ in relation to the conception of evolved homicide adaptations presented in this chapter is discussed.
Nikolas Rose and Joelle M. Abi-Rached
- Published in print:
- 2013
- Published Online:
- October 2017
- ISBN:
- 9780691149608
- eISBN:
- 9781400846337
- Item type:
- chapter
- Publisher:
- Princeton University Press
- DOI:
- 10.23943/princeton/9780691149608.003.0006
- Subject:
- Neuroscience, Development
This chapter looks at the social brain hypothesis. The term social brain has come to stand for the argument that the human brain, and indeed that of some other animals, is specialized for a collective form of life. One part of this argument is evolutionary: that the size and complexity of the brains of primates, including humans, are related to the size and complexity of their characteristic social groups. However, the social brain hypothesis is more than a general account of the role of brain size: for in this thesis, the capacities for sociality are neurally located in a specific set of brain regions shaped by evolution, notably the amygdala, orbital frontal cortex, and temporal cortex—regions that have the function of facilitating an understanding of what one might call the “mental life” of others.
Francis G. Castles
- Published in print:
- 2004
- Published Online:
- November 2004
- ISBN:
- 9780199270170
- eISBN:
- 9780191601514
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/0199270171.003.0001
- Subject:
- Political Science, Political Economy
Explores the themes of what follows. It argues that the welfare state literature of the past quarter century has been strong on crisis theories of the welfare state and much weaker on facts. The chapter suggests that, in order to say something meaningful about the future of the welfare state, we need to test crisis accounts, such as those built around globalization and population ageing, against facts drawn from comparative analysis to establish which are myths and which are realities.
Bryan Frances
- Published in print:
- 2005
- Published Online:
- October 2005
- ISBN:
- 9780199282135
- eISBN:
- 9780191602917
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/0199282137.003.0013
- Subject:
- Philosophy, Metaphysics/Epistemology
Evaluates the importance of what was argued in the book, with comments on which elements are of lasting significance for epistemology as a discipline. Notions treated include epistemic deference, liveness of hypotheses, mere mortality with respect to a hypothesis, epistemic superiority, responsibility to one’s epistemic community, the epistemic significance of expert disagreement, epistemic externalism, and content externalism.
Ernest Nicholson
- Published in print:
- 2002
- Published Online:
- September 2011
- ISBN:
- 9780199257836
- eISBN:
- 9780191698484
- Item type:
- book
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199257836.001.0001
- Subject:
- Religion, Biblical Studies
Despite innumerable studies from at least the time of the Reformation, it was not until little more than a century ago that one hypothesis concerning the origin of the Pentateuch, the so-called ‘Documentary Theory’ formulated by Julius Wellhausen, established itself as the point of departure for all subsequent study of this topic. This has remained so until recently, but during the past twenty-five years the study of the Pentateuch has been once more in turmoil, and new theories have proliferated. This book arises from the conviction that much in current Pentateuchal research needs to be subjected to rigorous scrutiny and that much, indeed, is radically mistaken. The author argues that the work of Wellhausen, for all that it needs revision and development in detail, remains the securest basis for understanding the Pentateuch. The book is not a mere call to go ‘back to Wellhausen’, however, for the author also shows that much in the intervening debate has significantly modified his conclusions, as well as asking questions that were not on Wellhausen's agenda. But the Documentary Hypothesis should remain our primary point of reference, and it alone provides the most dependable perspective from which to approach this most difficult of areas in the study of the Old Testament.