Neil Abell, David W. Springer, and Akihito Kamata
- Published in print: 2009
- Published Online: September 2009
- ISBN: 9780195333367
- eISBN: 9780199864300
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780195333367.003.0004
- Subject: Social Work, Research and Evaluation
This chapter provides a theoretical overview of reliability, as well as pragmatic considerations in establishing different types of reliability. To illustrate key points, it draws from two scales: the Family Responsibility Scale and the Parental Self-Care Scale. Various forms of reliability are addressed, including interrater, test-retest, and internal consistency. Guidelines for interpreting reliability coefficients for clinical and research purposes are provided, including computation of stratified alpha for multidimensional measures. Computation of the standard error of measurement (SEM) is illustrated. The chapter concludes by asserting that a solid reliability coefficient is indispensable as a primary principle in assessing the quality of scores from a scale or test.
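The quantities this chapter covers can be illustrated in a few lines of code. The sketch below computes coefficient alpha, stratified alpha for a two-dimensional measure, and the standard error of measurement as SD * sqrt(1 - reliability); the item data, scale sizes, and noise levels are invented for the example and are not the Family Responsibility or Parental Self-Care Scales.

```python
# Minimal sketch of alpha, stratified alpha, and SEM; all data invented.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: (n_respondents, n_items) matrix of item scores."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

def sem(total_scores: np.ndarray, reliability: float) -> float:
    """Standard error of measurement: SD * sqrt(1 - reliability)."""
    return total_scores.std(ddof=1) * np.sqrt(1 - reliability)

def stratified_alpha(subscales: list[np.ndarray]) -> float:
    """Stratified alpha for a multidimensional measure.
    Each element is the item matrix for one subscale."""
    totals = [s.sum(axis=1) for s in subscales]
    grand_total_var = np.sum(totals, axis=0).var(ddof=1)
    unreliable_var = sum(t.var(ddof=1) * (1 - cronbach_alpha(s))
                         for s, t in zip(subscales, totals))
    return 1 - unreliable_var / grand_total_var

rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 1))
scale_a = latent + rng.normal(scale=0.8, size=(200, 6))  # 6-item subscale
scale_b = latent + rng.normal(scale=0.8, size=(200, 5))  # 5-item subscale

all_items = np.hstack([scale_a, scale_b])
a = cronbach_alpha(all_items)
print(f"alpha = {a:.3f}")
print(f"stratified alpha = {stratified_alpha([scale_a, scale_b]):.3f}")
print(f"SEM = {sem(all_items.sum(axis=1), a):.2f}")
```

For a multidimensional measure, stratified alpha pools the unreliable variance of each subscale rather than treating all items as one homogeneous pool, which is why it is the appropriate coefficient here.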
Mette Elise Jolly
- Published in print: 2007
- Published Online: September 2007
- ISBN: 9780199213078
- eISBN: 9780191707155
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780199213078.003.0005
- Subject: Political Science, European Union
This chapter outlines the methods used in the empirical part of the book and addresses issues such as validity and reliability, both in general and also with specific reference to the empirical analysis of the book. Furthermore, the link is made between the theoretical definitions of the central concepts and the variables investigated in Chapter 6.
A. M. C. Casiday
- Published in print: 2006
- Published Online: January 2007
- ISBN: 9780199297184
- eISBN: 9780191711381
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780199297184.003.0004
- Subject: Religion, Early Christian Studies
This chapter places Cassian's monastic programme within the milieu of Nitrian asceticism and redresses the arguments that have recently been advanced against Cassian's reliability as an historian. It brings to bear in the study of Cassian's works the significant results from contemporary scholarship that have effectively demonstrated that the Fathers of the Egyptian desert were possessed of a theological culture of considerable vitality, and that stereotypes about the pursuit of intellectual activity (not least allegorical interpretation of Scripture in the manner of Philo, Origen, and others) cleaving along ethnic lines are simply unsound.
Luc Bovens and Stephan Hartmann
- Published in print: 2004
- Published Online: January 2005
- ISBN: 9780199269754
- eISBN: 9780191601705
- Item type: book
- Publisher: Oxford University Press
- DOI: 10.1093/0199269750.001.0001
- Subject: Philosophy, Metaphysics/Epistemology
Probabilistic models have much to offer to epistemology and philosophy of science. Arguably, the coherence theory of justification claims that the more coherent a set of propositions is, the more confident one ought to be in its content, ceteris paribus. An impossibility result shows that there cannot exist a coherence ordering. A coherence quasi-ordering can be constructed that respects this claim and is relevant to scientific-theory choice. Bayesian-Network models of the reliability of information sources are made applicable to Condorcet-style jury voting, Tversky and Kahneman’s Linda puzzle, the variety-of-evidence thesis, the Duhem–Quine thesis, and the informational value of testimony.
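The reliability-of-information-sources theme lends itself to a worked example. The sketch below uses a common textbook randomization model (a reliable witness reports the truth; an unreliable one answers at random), which is a simplification assumed for illustration and not necessarily the book's exact Bayesian-network model.

```python
# Posterior for a hypothesis given n concordant, partially reliable
# witnesses, under the simple randomization model described above.
def posterior(prior: float, reliability: float, n_reports: int) -> float:
    p_report_if_true = reliability + (1 - reliability) / 2
    p_report_if_false = (1 - reliability) / 2
    like_true = prior * p_report_if_true ** n_reports
    like_false = (1 - prior) * p_report_if_false ** n_reports
    return like_true / (like_true + like_false)

for n in range(1, 5):
    print(n, round(posterior(prior=0.1, reliability=0.7, n_reports=n), 3))
# Independent corroboration drives the posterior up quickly,
# even starting from a low prior.
```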
Kathleen M. McGraw
- Published in print: 1998
- Published Online: November 2003
- ISBN: 9780198294719
- eISBN: 9780191599361
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/0198294719.003.0034
- Subject: Political Science, Reference
Provides a discussion of issues relating to research design and experimental methods in political science. Issues are elaborated relating to control and random case selection, internal and external validity, identification of mediating variables, and replication. Examples of experimental contributions in political science are outlined in the fields of public opinion research, decision‐making and information processing, collective action, public choice, and public policy. Experimentation represents a burgeoning cutting‐edge approach in the future of political science research.
C. A. J. Coady
- Published in print: 1994
- Published Online: November 2003
- ISBN: 9780198235514
- eISBN: 9780191597220
- Item type: book
- Publisher: Oxford University Press
- DOI: 10.1093/0198235518.001.0001
- Subject: Philosophy, Metaphysics/Epistemology
This book is about a topic in epistemology that had been much neglected until its publication, but has subsequently become much more discussed. That topic is testimony, or, less technically, the conveying of information by telling. Coady argues that reliance upon the word of others plays a crucial role in the economy of knowledge, though the extent and depth of this reliance have gone largely unrecognized in the philosophical tradition. He discusses those efforts that have been made to explain and justify the role of testimony in the getting and sustaining of knowledge or reliable belief, and concludes that, with the partial exception of Thomas Reid's discussion in the eighteenth century, they have been unsuccessful. This widespread failure, he argues, stems from a reductive approach with an individualist bias that fails to appreciate just how fundamental are our cognitive debts to one another. Indeed, he argues, the very possibility of linguistic communication rests upon some basic reliability of testimony. He spells out an alternative to the reductive way of understanding the links between testimony, perception, memory, and inference. In the latter part of the book, Coady explores several puzzles generated by our reliance on testimony, including those created by the tension between prior probabilities and testimony to astonishing events, the supposed increase in unreliability of testimonial chains of transmission as they expand, and a puzzle about competence and transmission of knowledge. He also discusses certain implications of his view of testimony for important issues in history, psychology, mathematics, and the law.
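The puzzle about testimonial chains has a simple quantitative core. In the toy model below (my simplification, not Coady's), a binary claim passes through n informants, each relaying it faithfully with probability p and inverting it otherwise; the probability that the final report matches the original is 0.5 + 0.5*(2p - 1)^n, which decays geometrically toward chance.

```python
# Toy testimonial chain: per-link fidelity p, chain length n.
def p_correct(p: float, n: int) -> float:
    return 0.5 + 0.5 * (2 * p - 1) ** n

for n in (1, 2, 5, 10, 20):
    print(n, round(p_correct(0.9, n), 3))
# Fidelity decays toward 0.5 (pure chance) as the chain grows,
# the intuition behind the 'increasing unreliability' puzzle.
```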
Kieran Setiya
- Published in print: 2012
- Published Online: January 2013
- ISBN: 9780199657452
- eISBN: 9780191745539
- Item type: book
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780199657452.001.0001
- Subject: Philosophy, Moral Philosophy, Metaphysics/Epistemology
Can we have objective knowledge of right and wrong, of how we should live and what there is reason to do? The thought that we can is beset by sceptical problems. In the face of radical disagreement, can we be sure that we are not deceived? If the facts are independent of what we think, is our reliability a mere coincidence? Can it be anything but luck when our beliefs are true? This book confronts these questions in their most compelling and articulate forms: the argument from ethical disagreement; the argument from reliability and coincidence; and the argument from accidental truth. In order to resist the inference from disagreement to scepticism, we must reject epistemologies of intuition, coherence, and reflective equilibrium. The problem of disagreement can be solved only if the basic standards of epistemology in ethics are biased towards the truth. In order to solve the problem of coincidence, we must embrace arguments for reliability in ethics that rely on ethical beliefs. Such arguments do not beg the question in an epistemically damaging way. And in order to make sense of ethical knowledge as non-accidental truth, we must deny the independence of ethical fact and belief. We can do so without implausible predictions of convergence or relativity if the facts are bound to us through the natural history of human life. If there is objective ethical knowledge, human nature is its source.
Emily White, Bruce K. Armstrong, and Rodolfo Saracci
- Published in print: 2008
- Published Online: September 2009
- ISBN: 9780198509851
- eISBN: 9780191723827
- Item type: book
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780198509851.001.0001
- Subject: Public Health and Epidemiology, Public Health, Epidemiology
The accurate measurement of exposure to putative causes of disease is essential to the validity of epidemiologic research. This book covers general principles and methods that can be applied to accurately measure a wide range of exposures (risk factors) in epidemiology, including demographic, anthropometric, nutritional, medical, reproductive, genetic, metabolic, and environmental factors. It covers the methods and quality control approaches for the most commonly used data collection methods in epidemiology, including personal interviews, self-administered questionnaires, abstraction of records, keeping of diaries, measurements in blood and other body products, and measurements of the environment. The emphasis is on general methods and examples, but not on detailed reviews of the measurement methods for specific exposures. This book also covers three other major topics relevant to exposure measurement. The first is methods to design, analyze, and interpret validity and reliability studies that quantify the degree of measurement error for a specific exposure. This topic is included because such ancillary studies are important in understanding the effects of exposure measurement error on the ‘parent’ epidemiologic study. The second is methods to maximize response rates. While this topic falls under the construct of reducing selection bias, and most of the rest of the book is focused on reducing misclassification bias, it is included because it is an important aspect of the data collection phase of most epidemiologic studies. The third additional topic, ethical issues in the conduct of epidemiologic research, is included for the same reason.
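The effect that validity and reliability studies are designed to quantify can be shown with a short simulation. The sketch below (all numbers invented) demonstrates regression dilution: with classical measurement error in the exposure, the observed slope is attenuated by roughly the reliability ratio var(true)/var(observed).

```python
# Classical measurement error attenuating an exposure-outcome slope.
import numpy as np

rng = np.random.default_rng(1)
n, beta_true = 100_000, 0.5
x_true = rng.normal(size=n)
y = beta_true * x_true + rng.normal(size=n)

for error_sd in (0.0, 0.5, 1.0):
    x_obs = x_true + rng.normal(scale=error_sd, size=n)
    reliability = 1 / (1 + error_sd ** 2)     # var(T) / var(T + E)
    beta_obs = np.cov(x_obs, y)[0, 1] / x_obs.var(ddof=1)
    print(f"reliability={reliability:.2f}  observed beta={beta_obs:.3f} "
          f"~= {beta_true * reliability:.3f}")
```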
David Brady
- Published in print: 2009
- Published Online: September 2009
- ISBN: 9780195385878
- eISBN: 9780199870066
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780195385878.003.0002
- Subject: Political Science, Comparative Politics
This chapter begins by reviewing the shortcomings of the official U.S. measure of poverty, arguing that it is unreliable and invalid. Then, the chapter reviews major theoretical and methodological advances in poverty measurement and advocates five criteria in the measurement of poverty: (1) to measure comparative historical variation effectively, (2) to be relative rather than absolute, (3) to conceptualize poverty as social exclusion and capability deprivation, (4) to incorporate taxes and transfers, and (5) to integrate the depth of poverty. Overall, the aim is to facilitate the integration of theoretical and methodological advances into the empirical measurement of poverty. Also, criticisms are made of absolute measures of poverty, the measurement of poverty before taxes and transfers, and measures of redistribution. This chapter makes a theoretical argument regarding how poverty should be measured for the study of affluent democracies.
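Two of the five criteria, relativity and depth, translate directly into computable measures. The sketch below uses invented incomes (a real analysis would use post-tax-and-transfer income, per criterion 4) to compute a relative headcount at 50% of median income, one common convention, and the mean poverty gap.

```python
# Relative poverty headcount and depth (poverty gap); data invented.
import numpy as np

income = np.array([4_000, 9_000, 12_000, 18_000, 25_000,
                   31_000, 40_000, 55_000, 80_000, 140_000])
line = 0.5 * np.median(income)                 # relative poverty line
poor = income < line
headcount = poor.mean()                        # share below the line
gap = np.where(poor, (line - income) / line, 0).mean()  # mean shortfall ratio

print(f"line={line:.0f}  headcount={headcount:.0%}  poverty gap={gap:.3f}")
```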
Paul Schulman and Emery Roe
- Published in print: 2016
- Published Online: January 2017
- ISBN: 9780804793933
- eISBN: 9780804798624
- Item type: book
- Publisher: Stanford University Press
- DOI: 10.11126/stanford/9780804793933.001.0001
- Subject: Business and Management, Organization Studies
High-reliability management of critical infrastructures (the safe and continued provision of electricity, natural gas, telecommunications, transportation, and water) is a social imperative. Loss of service in interconnected critical infrastructure systems (ICISs) after hurricanes, earthquakes, floods, and tsunamis and their delayed large-scale recovery have turned these events into catastrophes. Reliability and Risk reveals a neglected management dimension and provides a new framework for understanding interconnected infrastructures, their potential for cascading failure, and how to improve their reliability and reduce risk of system failure. The book answers two questions: How are modern interconnected infrastructures managed and regulated for reliability? How can policy makers, analysts, managers, and citizenry better promote reliability in interconnected systems whose failures can scarcely be imagined? The current consensus is that the answers lie in better design, technology, and regulation, but the book argues that these have inevitable shortfalls and that it is dangerous to stop there. The framework developed in Reliability and Risk draws from first-of-its-kind research at the infrastructure crossroads of California, the California Delta, in the San Francisco Bay region. The book demonstrates that infrastructure reliability in an interconnected world must be managed by system professionals in real time.
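The stakes of interconnection can be illustrated with back-of-envelope reliability arithmetic. The sketch below contrasts a series chain, where every link must work, with simple redundancy; the 0.99 figures are invented, and real ICISs are far more complex than independent components.

```python
# Series vs. redundant (parallel) reliability for independent components.
def series(reliabilities):
    out = 1.0
    for r in reliabilities:
        out *= r                  # every component must work
    return out

def parallel(reliabilities):
    fail = 1.0
    for r in reliabilities:
        fail *= (1 - r)           # fails only if all components fail
    return 1 - fail

links = [0.99] * 10
print(f"10 interdependent links at 0.99 each: {series(links):.3f}")   # ~0.904
print(f"one link with a redundant twin: {parallel([0.99, 0.99]):.4f}")
```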
Graeme D. Ruxton, Thomas N. Sherratt, and Michael P. Speed
- Published in print: 2004
- Published Online: September 2007
- ISBN: 9780198528609
- eISBN: 9780191713392
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780198528609.003.0013
- Subject: Biology, Animal Biology
This chapter examines the phenomenon of automimicry, where individuals within a population may share the same warning signal but differ in their investment in defence. It seeks to explain the evolution of this variability and how the predator’s continued appropriate response to the warning signal can be maintained in the face of this potential decrease in signal reliability. It also considers the use of mimicry by predators (aggressive mimicry), floral mimicry that attracts pollinators, and intraspecific sexual mimicry.
Pedro Rosas and Felix A. Wichmann
- Published in print: 2011
- Published Online: September 2012
- ISBN: 9780195387247
- eISBN: 9780199918379
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780195387247.003.0008
- Subject: Psychology, Cognitive Neuroscience, Cognitive Psychology
This chapter briefly introduces the robust-weak-fusion model, which offers an exceptionally clear and elegant framework within which to understand empirical studies on cue combination. Research on cue combination is an area in the cognitive neurosciences where quantitative models and predictions are the norm rather than the exception, a development that this book welcomes wholeheartedly. What the authors view critically, however, is the strong emphasis on so-called optimal cue combination. Optimal in the context of human cue combination typically refers to the minimum-variance unbiased estimator for multiple sources of information, corresponding to maximum-likelihood estimation when the probability distributions of the estimates based on each cue are Gaussian and independent and the observer's prior is uniform (noninformative). The central aim of this chapter is to spell out worries about both the term optimality and the use of the minimum-variance unbiased estimator as the statistical tool for going from the reliability of a cue to its weight in robust weak fusion.
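The estimator the chapter critiques is easy to state concretely: each cue receives a weight proportional to the inverse of its variance. A minimal sketch, with invented cue values and variances:

```python
# Inverse-variance (minimum-variance unbiased) cue combination.
import numpy as np

def combine(estimates, variances):
    w = 1.0 / np.asarray(variances, dtype=float)
    w /= w.sum()                                   # normalized weights
    combined = (w * estimates).sum()
    combined_var = 1.0 / (1.0 / np.asarray(variances)).sum()
    return combined, combined_var, w

est, var, w = combine(estimates=np.array([10.0, 14.0]),  # e.g. texture, stereo
                      variances=np.array([1.0, 4.0]))
print(f"weights={w}  estimate={est:.2f}  variance={var:.2f}")
# The reliable cue (variance 1) gets weight 0.8, and the combined
# variance (0.8) is lower than either single cue's variance.
```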
Graeme D. Ruxton, Thomas N. Sherratt, and Michael P. Speed
- Published in print: 2004
- Published Online: September 2007
- ISBN: 9780198528609
- eISBN: 9780191713392
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780198528609.003.0007
- Subject: Biology, Animal Biology
Predators that rely on surprise may be persuaded to desist from attacking if prey use reliable signals that the predator has been detected. Prey may also be able to signal reliably to a predator that they are difficult to catch or subdue, causing the predator to desist from attacking or to switch its attack to another prey individual. The theory underlying such signals is considered and compared with the available empirical data to assess the evolution of such signals and their ecological prevalence.
Rama Natarajan and Richard S. Zemel
- Published in print: 2011
- Published Online: September 2012
- ISBN: 9780195387247
- eISBN: 9780199918379
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780195387247.003.0020
- Subject: Psychology, Cognitive Neuroscience, Cognitive Psychology
This chapter considers how a neural system could properly combine cues even as their reliabilities change dynamically and continuously in time. It describes a framework for characterizing how uncertain information may be represented in population codes, and how the representation may be optimized to facilitate optimal decoding. It then motivates and presents a coding scheme within this framework that applies to time-varying stimulus variables. Next, it describes a hierarchical network setup that utilizes this coding approach and demonstrates an application to a dynamic cue-combination task, a novel adaptation of a sensorimotor task. Specifically, it considers how a neural population might recursively integrate dynamic inputs from multiple sensory modalities with learned prior information to determine appropriate motor commands that control behavior; the results of simulated experiments with this network are presented. Finally, the implications of this scheme are discussed and some future directions suggested.
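Recursive integration of dynamic, reliability-weighted inputs can be sketched with a scalar Kalman-style filter. The example below is a generic illustration of that computation, not the population-code network the chapter develops; all parameters are invented, and one cue's noise level drifts over the trial to mimic changing reliability.

```python
# Scalar Kalman-style recursive integration of two time-varying cues.
import numpy as np

rng = np.random.default_rng(2)
T, q = 50, 0.05                 # time steps, process noise variance
x = 0.0                         # true time-varying stimulus
mu, var = 0.0, 1.0              # posterior mean and variance

for t in range(T):
    x += rng.normal(scale=np.sqrt(q))          # stimulus drifts
    var += q                                   # predict step
    # two cues whose noise levels differ (one drifts over the trial)
    for cue_var in (0.5 + 0.4 * np.sin(t / 5), 1.0):
        z = x + rng.normal(scale=np.sqrt(cue_var))
        k = var / (var + cue_var)              # gain = reliability weight
        mu, var = mu + k * (z - mu), (1 - k) * var

print(f"true={x:.3f}  estimate={mu:.3f}  posterior var={var:.4f}")
```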
Paul Horwich
- Published in print: 2010
- Published Online: February 2010
- ISBN: 9780199268900
- eISBN: 9780191708459
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780199268900.003.0010
- Subject: Philosophy, Metaphysics/Epistemology, Philosophy of Language
We respect certain epistemic norms — including (roughly) that one should tentatively believe things to have the colors they seem to have, should conform to modus ponens and the principle of non-contradiction, and should reason in accord with induction. But what could make these (or certain alternative norms, perhaps) the right ones? What could explain their correctness? This chapter scrutinizes the most commonly offered answers to this question. Special attention is devoted to the ‘semantogenetic’ strategy whereby the correctness of such norms is grounded in the meaning-engendering features of words. (For example, it's sometimes said that the axioms of arithmetic should be accepted because they implicitly define the primitive arithmetical terms). But amongst the competing strategies briefly examined here are proposals that invoke rational intuition, those that are militantly internalistic, those that prioritize considerations of reliability, and those that aim to ground the facts of epistemic rationality ‘constructively’ in normative commitments implicit in our linguistic activity. It is argued that none of these approaches is remotely adequate. And it is suggested, in conclusion, both that the correctness of the above-mentioned norms cannot be explained, and that this result should be neither surprising nor worrying.
Peter Lyons and Howard J. Doueck
- Published in print: 2009
- Published Online: February 2010
- ISBN: 9780195373912
- eISBN: 9780199865604
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780195373912.003.0005
- Subject: Social Work, Research and Evaluation
This chapter examines participant selection, sampling design, sample size, and sampling error, as well as the importance of statistical power, effect size, confidence levels, and confidence intervals. Types of sampling, including probability and nonprobability sampling methods, are discussed in relation to both quantitative and qualitative research designs. The measurement properties of instruments, including requirements of validity and reliability, as well as issues in measurement with human measures (credibility, inquiry audits, and triangulation), are presented.
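Two of the quantities the chapter discusses, confidence intervals and the sample sizes they imply, are quick to compute. A minimal sketch with invented survey numbers:

```python
# 95% CI for a sample proportion, and the n needed for a target margin.
import math

def proportion_ci(p_hat: float, n: int, z: float = 1.96):
    half = z * math.sqrt(p_hat * (1 - p_hat) / n)
    return p_hat - half, p_hat + half

def n_for_margin(margin: float, p: float = 0.5, z: float = 1.96) -> int:
    """Conservative n (p = 0.5 maximizes the variance)."""
    return math.ceil((z / margin) ** 2 * p * (1 - p))

lo, hi = proportion_ci(p_hat=0.42, n=400)
print(f"95% CI: ({lo:.3f}, {hi:.3f})")               # roughly +/- 0.048
print(f"n for +/-3% margin: {n_for_margin(0.03)}")   # 1068
```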
A. Aldo Faisal
- Published in print: 2009
- Published Online: February 2010
- ISBN: 9780199235070
- eISBN: 9780191715778
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780199235070.003.0011
- Subject: Mathematics, Biostatistics
Variability is inherent in neurons. To account for variability we have to make use of stochastic models. We will take a look at this biologically more rigorous approach by studying the fundamental signal of our brain’s neurons: the action potential and the voltage-gated ion channels mediating it. We will discuss how to model and simulate the action potential stochastically. We review the methods and show that classic stochastic approximation methods fail at capturing important properties of the highly nonlinear action potential mechanism, making the use of accurate models and simulation methods essential for understanding the neural code. We will review what stochastic modelling has taught us about the function, structure, and limits of action potential signalling in neurons, the most surprising insight being that stochastic effects of individual signalling molecules become relevant for whole-cell behaviour. We suggest that most of the experimentally observed neuronal variability can be explained from the bottom-up as generated by molecular sources of thermodynamic noise.
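The kind of stochastic channel model the chapter discusses can be sketched in miniature. The example below simulates N two-state channels with fixed opening and closing rates; the rates and counts are invented, and real channel models are voltage-dependent with more states (the chapter's point being that crude approximations can miss key behaviour). The fluctuations around the deterministic mean are the channel noise contributed by individual molecules.

```python
# N two-state ion channels, discrete-time binomial update; all values invented.
import numpy as np

rng = np.random.default_rng(3)
N, dt, steps = 100, 0.01, 1000        # channels, time step (ms), iterations
alpha, beta = 5.0, 15.0               # opening / closing rates (1/ms)
n_open = 0

trace = []
for _ in range(steps):
    opening = rng.binomial(N - n_open, 1 - np.exp(-alpha * dt))
    closing = rng.binomial(n_open, 1 - np.exp(-beta * dt))
    n_open += opening - closing
    trace.append(n_open)

trace = np.array(trace)
print(f"mean open fraction: {trace.mean() / N:.3f} "
      f"(deterministic prediction {alpha / (alpha + beta):.3f})")
print(f"fluctuation SD: {trace.std() / N:.3f}")  # the 'channel noise'
```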
J. Patrick Meyer
- Published in print: 2010
- Published Online: March 2012
- ISBN: 9780195380361
- eISBN: 9780199847914
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780195380361.003.0006
- Subject: Psychology, Social Psychology
Absolutely no judgment or evaluation should be included when reporting a reliability analysis. Estimates should speak for themselves, and it is the readers' task to judge their values. The discussion should focus on the limitations of the analysis. In this chapter, example discussions for the Benchmark Assessment of English Language Arts (ELA), the Palmetto Achievement Challenge Test (PACT) of mathematics, and the MSCAN are provided. Several books for future reading are recommended, including works on classical test theory such as Theory of Mental Tests by Gulliksen, Statistical Theories of Mental Test Scores by Lord and Novick, and Reliability for the Social Sciences: Theory and Applications by Traub. The chapter also advises reading the original references so that one can read the authors' own words. Other references are given after this chapter.
Stephen Handel
- Published in print: 2006
- Published Online: September 2007
- ISBN: 9780195169645
- eISBN: 9780199786732
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780195169645.003.0009
- Subject: Psychology, Cognitive Psychology
The proximal stimulation at ear and eye is neutral, the excitations from different sources are intermixed, and the excitations can be organized in several ways to yield different percepts. The Gestalt psychologists proposed laws of organization due to cortical field forces that acted to knit the visual excitations into continuous enclosed objects. The resulting objects were described as the “simplest” possible, termed prägnanz, although prägnanz was never clearly defined. Similar laws of organization have been proposed for auditory excitation. The most important is onset synchrony: auditory excitations that start at the same time are assumed to come from one source. When there is simultaneous auditory and visual excitation, the normal unity assumption is that both excitations come from the same source. If there is a conflict between the two excitations, their compellingness and reliability will determine how each is weighted in importance.
Gregory Currie
- Published in print: 2010
- Published Online: May 2010
- ISBN: 9780199282609
- eISBN: 9780191712432
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780199282609.003.0002
- Subject: Philosophy, Aesthetics, Philosophy of Mind
To distinguish narratives from other representational vehicles we need to say something about what distinguishes the contents of narratives from the contents of other things: theories, lists, annals, rambling conversational remarks, instruction manuals. The book does not seek to define narrative, choosing instead to focus on the graded notion of narrativity. We can then think of things high in narrativity as combining certain features which make for the detailed representation of particulars, especially agents, in their causal and temporal relations. While it needs careful handling, this idea is defensible against recent criticism due to Velleman. An appendix to the chapter speculates on the evolutionary background which makes representations of just these kinds so very important to us as indices of reliability.