Amie L. Thomasson
- Published in print:
- 2007
- Published Online:
- September 2007
- ISBN:
- 9780195319910
- eISBN:
- 9780199869602
- Item type:
- book
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780195319910.001.0001
- Subject:
- Philosophy, Metaphysics/Epistemology
Arguments that ordinary inanimate objects such as tables and chairs, sticks and stones, simply do not exist have become increasingly common. Some arguments for eliminativism are based on demands for parsimony or for a non-arbitrary answer to the special composition question; others arise from prohibitions against causal redundancy, ontological vagueness, or colocation; and others still come from worries that a common sense ontology would be a rival to a scientific one. This book makes the case that the mistakes behind all of these superficially diverse eliminativist arguments may be traced to a common source, and may be successfully resisted by adopting a small cluster of interrelated and independently plausible theses about reference, analyticity, and modality. By adopting these theses, we can make sense of our common sense world view without internal contradiction, violation of plausible metaphysical principles, or rivalry with a scientific ontology. In the end, however, the most important result of addressing these eliminativist arguments is not merely avoiding their conclusions. It also leads to important metaontological results, bringing into question widely held assumptions about which uses of metaphysical principles are appropriate, which metaphysical demands are answerable, and how we incur ontological commitments. As a result, the work of this book hopes to provide not only the route to a reflective understanding of our unreflective common sense world view, but also a better understanding of the proper methods and limits of metaphysics.
David A. Liberles (ed.)
- Published in print:
- 2007
- Published Online:
- September 2008
- ISBN:
- 9780199299188
- eISBN:
- 9780191714979
- Item type:
- book
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199299188.001.0001
- Subject:
- Biology, Evolutionary Biology / Genetics
Ancestral sequence reconstruction is a technique of growing importance in molecular evolutionary biology and comparative genomics. As a powerful tool for testing evolutionary and ecological hypotheses, as well as uncovering the link between sequence and molecular phenotype, there are potential applications in almost all fields of applied molecular biology. This book starts with a historical overview of the field, before discussing the potential applications in drug discovery and the pharmaceutical industry. This is followed by a section on computational methodology, which provides a detailed discussion of the available methods for reconstructing ancestral sequences (including their advantages, disadvantages, and potential pitfalls). Purely computational applications of the technique are then covered, including whole proteome reconstruction. Further chapters provide a detailed discussion on taking computationally reconstructed sequences and synthesizing them in the laboratory. The book concludes with a description of the scientific questions where experimental ancestral sequence reconstruction has been utilized to provide insights and inform future research.
Mark C. Baker
- Published in print:
- 2005
- Published Online:
- January 2007
- ISBN:
- 9780195179675
- eISBN:
- 9780199869794
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780195179675.003.0010
- Subject:
- Philosophy, Metaphysics/Epistemology
This chapter examines two different views of universal grammar. Most linguists assume that universal grammar is underspecified — providing us with an incomplete grammar to be elaborated by learning. But the alternative is that it is overspecified — providing us with a full range of possible grammars from which we select one on the basis of environmental input. Underspecification is now the dominant view in the developmental sciences, and is often treated as the null hypothesis on grounds of greater possibility, parsimony, and simplicity. The chapter questions whether the underspecification view is really feasible and whether it is more parsimonious than the overspecification view, drawing on examples from certain African languages. It also shows that the perplexity evoked by overspecification theories disappears if language has a concealing purpose as well as a communicating purpose, similar to a code.
Ádám Miklósi
- Published in print:
- 2007
- Published Online:
- January 2008
- ISBN:
- 9780199295852
- eISBN:
- 9780191711688
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199295852.003.0001
- Subject:
- Biology, Animal Biology
Dogs have always been a focus of human interest in nature. In many cultures they even won the ‘prestigious’ title of man's best friend, whilst in others dogs did not receive such sympathy from humans. This chapter reviews the history of dogs in science and puts the study of dogs into an ethological perspective, one interested in questions about the function and evolution of behaviour in parallel with understanding behavioural mechanisms and development. Comparative investigations lie at the heart of the study of dogs, and the chapter provides an overview of the theoretical problems associated with the comparative method.
Vanessa Barker
- Published in print:
- 2009
- Published Online:
- September 2009
- ISBN:
- 9780195370027
- eISBN:
- 9780199871315
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780195370027.003.0004
- Subject:
- Political Science, American Politics
This chapter details the case study of Washington State. It shows how a democratic process based on deliberative democracy led to a relatively mild penal regime, with low rates of imprisonment and high rates of community sanctions. It shows how high rates of civic engagement and norms of reciprocity and mutual obligation made polity members less willing to infringe upon the rights and liberties of others, creating a more inclusive political community, one that incorporated racial and ethnic minorities and criminal offenders. It analyzes how Washington's history of farmer cooperatives provided the cultural and institutional support for the use of community sanctions based on the disciplinary power of work and community service. The chapter also analyzes how the deliberative process moderated the harsh demands of an outraged crime victims movement.
Philip Stratton-Lake and Brad Hooker
- Published in print:
- 2006
- Published Online:
- May 2010
- ISBN:
- 9780199269914
- eISBN:
- 9780191710032
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199269914.003.0008
- Subject:
- Philosophy, Moral Philosophy, General
This chapter offers a partial defence of Scanlon's buck-passing account of the relation between base properties, goodness, and practical reasons. Jonathan Dancy and Roger Crisp have both argued that even if Scanlon's buck-passing account is superior to the Moorean account, there are other contending accounts that Scanlon does not consider. Against Dancy and Crisp, Stratton-Lake and Hooker argue that these proposed accounts, although genuine alternatives to the Moorean and buck-passing accounts, are nevertheless deeply problematic and do nothing to harm the case for Scanlon's account. Regarding Scanlon's two arguments, the authors find that the parsimony argument, once clarified, does offer some support for the buck-passing view, but that the appeal to value pluralism does not. Finally, they defend Scanlon's account against an ‘open question’ worry about the relation between the fact that something has reason-giving properties and its goodness.
Tal Pupko, Adi Doron-Faigenboim, David A. Liberles, and Gina M. Cannarozzi
- Published in print:
- 2007
- Published Online:
- September 2008
- ISBN:
- 9780199299188
- eISBN:
- 9780191714979
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199299188.003.0004
- Subject:
- Biology, Evolutionary Biology / Genetics
Modeling of sequence evolution is fundamental to ancestral sequence reconstruction. Care must be taken in choosing a model, however, as the use of unrealistic models can lead to erroneous conclusions. The choice of model and the effects of the assumptions inherent in it are discussed in this chapter in terms of their consequences for probabilistic ancestral sequence reconstruction. This chapter discusses standard probabilistic models, the addition of site rate variation to these models, and deviations from the standard (homogeneous, stationary, reversible) models. Model selection (choosing one model from many, given data) and the comparison of different models are included, as well as covarion models, the use of outside information when modeling, and the treatment of gaps.
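As a hedged illustration of the kind of standard probabilistic model the chapter surveys (not code drawn from the chapter itself), the Jukes-Cantor (1969) model is the simplest homogeneous, stationary, reversible substitution model, and its transition probabilities have a closed form; the function name below is a hypothetical convenience:

```python
import math

def jc69_transition_prob(t, same=True):
    """Jukes-Cantor (1969) model: probability that a nucleotide site
    ends in the same state (same=True) or in one particular different
    state (same=False) after a branch of length t, measured in
    expected substitutions per site."""
    decay = math.exp(-4.0 * t / 3.0)
    if same:
        return 0.25 + 0.75 * decay   # stays at 1.0 when t = 0, decays to 1/4
    return 0.25 - 0.25 * decay       # starts at 0, rises to 1/4

# Sanity check: the four outcomes always sum to one.
p_same = jc69_transition_prob(0.5)
p_diff = jc69_transition_prob(0.5, same=False)
total = p_same + 3 * p_diff
```

Under this model the probabilities converge to the uniform stationary distribution (1/4 per base) as t grows, which is exactly the sense in which such standard models are stationary.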
K.M. Jaszczolt
- Published in print:
- 2005
- Published Online:
- September 2007
- ISBN:
- 9780199261987
- eISBN:
- 9780191718656
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199261987.003.0003
- Subject:
- Linguistics, Semantics and Pragmatics
This chapter contains the exposition of the new theory of Default Semantics. It is proposed that the problems with compositionality of meaning can be solved when compositionality is viewed ‘one level higher’ as pertaining to merger representations — semantic representations that are a product of the synthesis of information about meaning coming from various sources, such as word meaning, sentence structure, conscious pragmatic inference, cognitive defaults, and social-cultural defaults. Cognitive principles underlying such mergers are proposed, and the foundations of a metalanguage for formalization in relational semantics are specified.
Jerrold I. Davis, Kevin C. Nixon, and Damon P. Little
- Published in print:
- 2006
- Published Online:
- September 2007
- ISBN:
- 9780199297306
- eISBN:
- 9780191713729
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199297306.003.0007
- Subject:
- Biology, Evolutionary Biology / Genetics
Software for cladistic analysis has been widely available for more than twenty years, and a series of advances made during this time have facilitated the analysis of matrices of ever-increasing size. This chapter provides an overview of the development of parsimony methods for cladistic analysis, describes strategies that have allowed large data matrices to be analysed by conventional methods, and in doing so, demonstrates that data sets historically considered intractable could in fact have been readily approached using then-available hardware and software. Preliminary analyses, even when unsuccessful at discovering most-parsimonious trees, can be used to identify appropriate software settings for use during thorough analyses. A useful indicator of the settings that yield the most efficient searches is the excess branch swapping ratio, which is the ratio between the number of tree rearrangements conducted during a particular phase of branch swapping in which shorter trees are being discovered, and the minimum possible number of rearrangements during this phase. It is concluded that two-stage search strategies, with intensive branch swapping conducted on a small percentage of the most optimal sets of trees obtained by a large number of relatively short searches, are more efficient than one-stage searches.
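The excess branch swapping ratio defined in the abstract can be sketched as a direct computation; the function name is a hypothetical convenience, not from the chapter:

```python
def excess_branch_swapping_ratio(rearrangements_conducted, minimum_rearrangements):
    """Ratio of tree rearrangements actually conducted during a phase of
    branch swapping in which shorter trees are being discovered to the
    minimum possible number of rearrangements for that phase.  A value
    near 1.0 indicates an efficient search; larger values indicate
    wasted swapping effort under the current software settings."""
    if minimum_rearrangements <= 0:
        raise ValueError("minimum rearrangement count must be positive")
    return rearrangements_conducted / minimum_rearrangements
```

Comparing this ratio across trial runs with different settings is how the preliminary analyses described above can guide the settings chosen for a thorough search.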
David A. Liberles
- Published in print:
- 2006
- Published Online:
- September 2007
- ISBN:
- 9780199297306
- eISBN:
- 9780191713729
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199297306.003.0010
- Subject:
- Biology, Evolutionary Biology / Genetics
As genome sequencing projects have propagated, comparative genomics has emerged as a method of choice for understanding protein function. There are simple approaches for comparing sequences, like relative entropy or binary transformations of gene content comparisons. However, phylogenetic methods that explicitly consider evolutionary history are not only more powerful, but enable additional types of analysis drawing on knowledge in parallel fields, such as ecology, anthropology, and geology. This chapter focuses both on methodological issues and on their application to real genomic-scale problems. Parsimony and maximum likelihood are two phylogenetic approaches that are used and often compared side-by-side. While the choice between them has been contentious at times, they frequently give similar results and where they don't, they can complement each other. Maximum likelihood works well when a good model is available. Parsimony works well when a good model does not or cannot exist, as for very complex processes, and also along very short branches where multiple events per position (as in a sequence) are extremely infrequent. Both methods can be used to estimate ancestral states in a phylogenetic tree. Parsimony-based ancestral character reconstruction is fast and can be performed easily in large-scale genomic applications.
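As one illustrative sketch of parsimony-based ancestral state reconstruction (the chapter does not prescribe this code), the bottom-up pass of Fitch's algorithm computes candidate state sets at internal nodes and the minimum number of changes; the function name and nested-tuple tree encoding are hypothetical conveniences:

```python
def fitch_ancestral_sets(tree, leaf_states):
    """Bottom-up (first) pass of Fitch parsimony on a rooted binary tree.

    `tree` is a nested tuple (left, right) whose leaves are taxon names;
    `leaf_states` maps each taxon name to its observed character state.
    Returns a dict of candidate ancestral state sets per node, and the
    parsimony score (minimum number of state changes on the tree)."""
    changes = 0
    sets = {}

    def visit(node):
        nonlocal changes
        if isinstance(node, str):            # leaf: the observed state
            sets[node] = {leaf_states[node]}
            return sets[node]
        left, right = node
        a, b = visit(left), visit(right)
        inter = a & b
        if inter:                            # children agree: keep intersection
            sets[node] = inter
        else:                                # disagreement: union, one change
            sets[node] = a | b
            changes += 1
        return sets[node]

    visit(tree)
    return sets, changes

# Four taxa, one variable site: three A's and one G.
sets, score = fitch_ancestral_sets(
    (("t1", "t2"), ("t3", "t4")),
    {"t1": "A", "t2": "A", "t3": "G", "t4": "A"},
)
```

On this toy input a single change suffices, and the root's candidate set is {"A"}, matching the speed and simplicity claimed for parsimony-based reconstruction above.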
Yu-Feng Hsu
Ronald Carter and William Hayes (eds)
- Published in print:
- 2005
- Published Online:
- March 2012
- ISBN:
- 9780520098473
- eISBN:
- 9780520916067
- Item type:
- book
- Publisher:
- University of California Press
- DOI:
- 10.1525/california/9780520098473.001.0001
- Subject:
- Biology, Animal Biology
Heliodinids are tiny, brightly colored day-flying moths. This book proposes phylogenetic relationships among genera of Heliodinidae using parsimony and character compatibility, and describes and illustrates 45 North and Central American species (25 newly named) assigned to five genera (two new, two exhumed from synonymy). Larval host plants are recorded for 33 species (14 newly discovered), about 45% of the known fauna; 90% of these are specialists on Caryophyllales, primarily Nyctaginaceae.
Donna Harrington
- Published in print:
- 2008
- Published Online:
- January 2009
- ISBN:
- 9780195339888
- eISBN:
- 9780199863662
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780195339888.003.0004
- Subject:
- Social Work, Research and Evaluation
This chapter examines how to determine whether a confirmatory factor analysis model fits well, including a discussion of the various fit indices available, which ones to use, and thresholds for determining acceptable fit. Assessment of model fit involves considering a number of indices of model fit, including absolute fit, parsimony correction, comparative fit, and predictive fit indices. Recommendations for identifying acceptable model fit are presented, and methods of finding sources of poor fit are discussed. Model revision, including the use and testing of nested models, modification indices, localized areas of strain, and specification search, is discussed. The chapter also addresses how to revise a model that does not fit well, including incorporating theory-based changes and the use of modification indices. Finally, a detailed confirmatory factor analysis (CFA) example is presented that includes a discussion of all the aspects of specifying, testing, assessing, and revising the model.
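As a hedged illustration of two index families named in the abstract (absolute fit with a parsimony correction, and comparative fit), RMSEA and CFI can be computed from the model and baseline chi-square statistics; the formulas are the standard ones, but the function names and the cited cutoffs are conventions, not thresholds taken from the chapter:

```python
import math

def rmsea(chi2, df, n):
    """Root mean square error of approximation: absolute fit with a
    parsimony correction.  Values of roughly .06 or below are
    conventionally read as good fit (n = sample size)."""
    return math.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))

def cfi(chi2, df, chi2_null, df_null):
    """Comparative fit index: improvement of the fitted model over the
    baseline (independence) model.  Values of roughly .95 or above are
    conventionally read as good fit."""
    d_model = max(chi2 - df, 0.0)
    d_null = max(chi2_null - df_null, 0.0)
    if d_null == 0.0:
        return 1.0          # baseline already fits; nothing to improve on
    return 1.0 - d_model / d_null
```

A model whose chi-square does not exceed its degrees of freedom yields RMSEA of 0 and CFI of 1, which is why these indices are reported together with the raw chi-square test.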
Ekkehart Schlicht
- Published in print:
- 1998
- Published Online:
- November 2003
- ISBN:
- 9780198292241
- eISBN:
- 9780191596865
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/0198292244.003.0008
- Subject:
- Economics and Finance, Microeconomics, History of Economic Thought
Custom consists of a set of regularities that may be described as rules. This chapter explores the nature of rules as cognitive entities, and what distinguishes rules from arbitrary associations (‘random rules’). Our cognitive make‐up induces discontinuity in rule formation and gives rise to rigidity and hysteresis on a purely cognitive level.
Hud Hudson
- Published in print:
- 2005
- Published Online:
- January 2009
- ISBN:
- 9780199282579
- eISBN:
- 9780191712463
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199282579.003.0005
- Subject:
- Philosophy, Metaphysics/Epistemology
A substantivalist dualist who believes in spacetime and material objects is committed to a relation of occupation or location. This chapter explores different candidate descriptions of this relation and investigates some of the philosophical difficulties which arise for the resulting conceptions. One reason to focus on the metaphysics of occupation relations is that some of them seem to make room for the possibility of extended mereological simples. The chapter critically evaluates several philosophical attempts to promote the cause of extended simples. Four problems are then advanced to yield the conclusion that extended mereological simples are conceptually possible yet metaphysically impossible: the problem of spatial intrinsics, the problem of shapes, the problem of parsimony, and the problem of diachoric identity.
Xavier Gabaix and David Laibson
- Published in print:
- 2008
- Published Online:
- October 2011
- ISBN:
- 9780195328318
- eISBN:
- 9780199851768
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780195328318.003.0012
- Subject:
- Economics and Finance, Economic History
Models matter because they purport to provide a reliable description or representation of the world. Most of the models that scientists construct and analyze rest on assumptions that are merely believed to be true, since such models set aside irregularities and inconsistencies with accepted theory. This chapter introduces seven key properties, some already widely accepted and some not yet accepted at all, that a good economic model should possess: 1) parsimony, 2) tractability, 3) conceptual insightfulness, 4) generalizability, 5) falsifiability, 6) empirical consistency, and 7) predictive precision. The analysis argues that classical optimization assumptions are not necessary for building economic models, and that they should instead be regarded as hypotheses that require testing.
Allison B. Kaufman and James C. Kaufman (eds)
- Published in print:
- 2018
- Published Online:
- September 2018
- ISBN:
- 9780262037426
- eISBN:
- 9780262344814
- Item type:
- book
- Publisher:
- The MIT Press
- DOI:
- 10.7551/mitpress/9780262037426.001.0001
- Subject:
- Psychology, Cognitive Psychology
In a post-truth, fake news world, we are particularly susceptible to the claims of pseudoscience. When emotions and opinions are more widely disseminated than scientific findings, and self-proclaimed experts get their expertise from Google, how can the average person distinguish real science from fake? This book examines pseudoscience from a variety of perspectives, through case studies, analysis, and personal accounts that show how to recognize pseudoscience, why it is so widely accepted, and how to advocate for real science. Contributors examine the basics of pseudoscience, including issues of cognitive bias; the costs of pseudoscience, with accounts of naturopathy and logical fallacies in the anti-vaccination movement; perceptions of scientific soundness; the mainstream presence of “integrative medicine,” hypnosis, and parapsychology; and the use of case studies and new media in science advocacy.
John Braithwaite and Philip Pettit
- Published in print:
- 1992
- Published Online:
- October 2011
- ISBN:
- 9780198240563
- eISBN:
- 9780191680205
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780198240563.003.0006
- Subject:
- Philosophy, Political Philosophy
This chapter shows where the republican target of promoting dominion is likely to lead the criminal justice system. It illustrates what the comprehensiveness requirement means in practical policy terms, looking at the key questions of criminal justice in a systemic way. It considers ten questions in turn. The treatment given to each question is determined by the need: (a) to show how the theory recommends different policies from its various better-known competitors—liberalism, retributivism, utilitarianism, and preventionism; (b) to show how the theory recommends policies different from contemporary practice; and (c) to show the transformation in the research agenda of criminology required by the theory. Before coming to the consideration of the ten questions, the chapter identifies four general presumptions the republican stance supports (parsimony, the checking of power, reprobation, and reintegration). These presumptions serve as middle-range principles for interpreting the abstract goal endorsed by republicans: the promotion of dominion.
Ziheng Yang
- Published in print:
- 2006
- Published Online:
- April 2010
- ISBN:
- 9780198567028
- eISBN:
- 9780191728280
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780198567028.003.0003
- Subject:
- Biology, Evolutionary Biology / Genetics
This chapter provides an overview of phylogeny reconstruction methods. It introduces some basic concepts used to describe trees and discusses general features of tree-reconstruction methods. Distance and parsimony methods are also discussed.
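As an illustration of the parsimony criterion the chapter discusses, Fitch's small-parsimony algorithm scores a fixed tree by counting the minimum number of character-state changes it requires. The sketch below is purely illustrative (the tree shape and states are invented, and the code is not drawn from the book):

```python
# Minimal sketch of Fitch's small-parsimony algorithm: given tip states on a
# rooted binary tree (nested tuples, leaves are state strings), count the
# minimum number of state changes the tree requires for one character.
def fitch(tree):
    changes = 0

    def post(node):
        nonlocal changes
        if isinstance(node, str):         # leaf: its state set is just itself
            return {node}
        left, right = node
        a, b = post(left), post(right)
        inter = a & b
        if inter:                         # states can be reconciled: no change
            return inter
        changes += 1                      # disjoint sets: one change needed
        return a | b

    post(tree)
    return changes

# Tips A, A, C, G on the tree ((A,A),(C,G)) need a minimum of 2 changes.
print(fitch((("A", "A"), ("C", "G"))))   # → 2
```

In practice the score is summed over all sites in an alignment, and the tree with the lowest total is the maximum-parsimony estimate.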
Ziheng Yang
- Published in print:
- 2006
- Published Online:
- April 2010
- ISBN:
- 9780198567028
- eISBN:
- 9780191728280
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780198567028.003.0006
- Subject:
- Biology, Evolutionary Biology / Genetics
This chapter discusses two problems: the evaluation of statistical properties of tree reconstruction methods and tests of the significance of estimated phylogenies. Section 6.1 discusses criteria for assessing the statistical properties of tree reconstruction methods. A summary of simulation studies conducted to evaluate different methods is provided, as well as some recommendations concerning the use of those methods in practical data analysis. Sections 6.2 and 6.3 deal with the likelihood versus parsimony debate from the likelihood and parsimony perspectives, respectively. Section 6.4 provides an overview of methods for assessing the reliability of estimated phylogenies.
Ziheng Yang
- Published in print:
- 2014
- Published Online:
- August 2014
- ISBN:
- 9780199602605
- eISBN:
- 9780191782251
- Item type:
- book
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199602605.001.0001
- Subject:
- Biology, Biomathematics / Statistics and Data Analysis / Complexity Studies, Evolutionary Biology / Genetics
This book summarizes the statistical models and computational algorithms for comparative analysis of genetic sequence data in the fields of molecular evolution, molecular phylogenetics, and statistical phylogeography. The book presents and explains the models of nucleotide, amino acid, and codon substitution, and their use in calculating pairwise sequence distances and in reconstruction of phylogenetic trees. All major methods for phylogeny reconstruction are covered in detail, including neighbour joining, maximum parsimony, maximum likelihood, and Bayesian methods. Using motivating examples, the book includes a comprehensive introduction to Bayesian computation using Markov chain Monte Carlo (MCMC). Advanced topics include estimation of species divergence times using the molecular clock, detection of molecular adaptation, simulation of molecular evolution, as well as species tree estimation and species delimitation using genomic sequence data.
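The MCMC machinery mentioned above rests on the Metropolis acceptance rule. A minimal, generic sketch (a toy standard-normal target, not the book's phylogenetic models; function names and tuning values here are invented for illustration):

```python
# Random-walk Metropolis sampler for an arbitrary log-posterior.
# Illustrative only: the target below is a standard normal, chosen
# so the chain's behaviour is easy to check.
import math
import random

def metropolis(logpost, x0, steps=10000, scale=1.0, seed=0):
    rng = random.Random(seed)
    x, lp = x0, logpost(x0)
    samples = []
    for _ in range(steps):
        prop = x + rng.gauss(0.0, scale)          # symmetric proposal
        lp_prop = logpost(prop)
        # Accept with probability min(1, posterior ratio)
        if math.log(rng.random()) < lp_prop - lp:
            x, lp = prop, lp_prop
        samples.append(x)
    return samples

# Target: log density of N(0, 1) up to a constant.
draws = metropolis(lambda x: -0.5 * x * x, x0=0.0)
mean = sum(draws) / len(draws)                    # should be near 0
```

In phylogenetic applications the state is a tree plus model parameters rather than a scalar, and proposals change branch lengths or tree topology, but the accept/reject logic is the same.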