Roger M. Barker
- Published in print:
- 2010
- Published Online:
- May 2010
- ISBN:
- 9780199576814
- eISBN:
- 9780191722509
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199576814.003.0007
- Subject:
- Business and Management, International Business, Corporate Governance and Accountability
A variety of statistical robustness tests confirm that the conclusions of Chapter 6 are not sensitive to the inclusion of particular countries or observations in the data set, or the choice of individual control variables. The reestimation of the model in terms of first‐differences (i.e., a dynamic model specification) also gives rise to consistent results.
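The two checks this abstract describes — leave-one-country-out re-estimation and a first-differences respecification — can be sketched in a few lines. The panel below is synthetic and the variable names are invented for illustration; nothing here reproduces the chapter's actual data:

```python
import numpy as np

# Hypothetical illustration of two robustness checks: (1) re-estimate the model
# dropping each country in turn, (2) re-estimate in first differences.
rng = np.random.default_rng(0)
countries, years = 10, 12
x = rng.normal(size=(countries, years)).cumsum(axis=1)   # persistent regressor
beta_true = 0.5
y = beta_true * x + rng.normal(scale=0.3, size=x.shape)

def ols_slope(xv, yv):
    """Slope from a simple OLS of y on x (with an intercept)."""
    X = np.column_stack([np.ones_like(xv), xv])
    return np.linalg.lstsq(X, yv, rcond=None)[0][1]

full = ols_slope(x.ravel(), y.ravel())

# (1) Leave-one-country-out: a robust conclusion keeps its sign and magnitude.
loo = [ols_slope(np.delete(x, i, axis=0).ravel(),
                 np.delete(y, i, axis=0).ravel())
       for i in range(countries)]

# (2) First differences remove country-level trends (a dynamic respecification).
fd = ols_slope(np.diff(x, axis=1).ravel(), np.diff(y, axis=1).ravel())
```

When the result is genuinely robust, every leave-one-out slope and the first-differenced slope stay close to the full-sample estimate.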
Domitilla Del Vecchio and Richard M. Murray
- Published in print:
- 2014
- Published Online:
- October 2017
- ISBN:
- 9780691161532
- eISBN:
- 9781400850501
- Item type:
- book
- Publisher:
- Princeton University Press
- DOI:
- 10.23943/princeton/9780691161532.001.0001
- Subject:
- Biology, Biochemistry / Molecular Biology
This book provides an accessible introduction to the principles and tools for modeling, analyzing, and synthesizing biomolecular systems. It begins with modeling tools such as reaction-rate equations, reduced-order models, stochastic models, and specific models of important core processes. It then describes in detail the control and dynamical systems tools used to analyze these models. These include tools for analyzing stability of equilibria, limit cycles, robustness, and parameter uncertainty. Modeling and analysis techniques are then applied to design examples from both natural systems and synthetic biomolecular circuits. In addition, the book addresses the problem of modular composition of synthetic circuits, the tools for analyzing the extent of modularity, and the design techniques for ensuring modular behavior. It also looks at design trade-offs, focusing on perturbations due to noise and competition for shared cellular resources. Featuring numerous exercises and illustrations throughout, the book is the ideal textbook for advanced undergraduates and graduate students. For researchers, it can also serve as a self-contained reference on the feedback control techniques that can be applied to biomolecular systems.
Laura Valentini
- Published in print:
- 2011
- Published Online:
- January 2012
- ISBN:
- 9780199593859
- eISBN:
- 9780191731457
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199593859.003.0007
- Subject:
- Political Science, Political Theory
This chapter discusses how to move from a general concern with the justification of coercion to particular substantive principles of justice. It argues that a social system is just only so long as it respects the right to freedom of those subject to it, namely their right to the social conditions necessary to lead autonomous lives. For this to be the case, the distribution of freedom engendered by the system has to be justifiable in the eyes of all those who are subject to it. Focusing on domestic societies in particular, the chapter concludes that a multiplicity of principles of economic justice might instantiate mutually justifiable distributions of freedom, not all of which are egalitarian in form. In other words, on the view defended in this chapter, and contrary to the arguments of most contemporary liberal theorists, economic equality is not a fundamental, non-negotiable demand of justice.
Andrea Rotnitzky
- Published in print:
- 2005
- Published Online:
- September 2007
- ISBN:
- 9780198566540
- eISBN:
- 9780191718038
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780198566540.003.0006
- Subject:
- Mathematics, Probability / Statistics
This chapter reviews some key elements of semiparametric theory and the contributions of semiparametric inference to the challenge posed by high-dimensional data, for which the specification of realistic parametric models for the mechanism generating the data may be difficult, if not impossible. The usefulness of semiparametric modeling is illustrated by a number of examples. A non-technical account is given of the formulation of the semiparametric variance bound and of ways of calculating it. The consequences of the curse of dimensionality for estimation are set out, the possibilities for approaching inference when estimation of irregular parameters is inevitable are discussed, and some unresolved questions are raised.
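As one concrete (and entirely illustrative) instance of the semiparametric ideas sketched above, the toy below estimates a population mean when outcomes are missing at random given an observed covariate. Inverse-probability weighting corrects the selection without requiring a parametric model of the full outcome distribution. The simulation design is an assumption of this sketch, not an example taken from the chapter:

```python
import numpy as np

# Estimating E[y] when y is missing at random given an observed covariate z.
# For clarity, the observation probability p(z) is treated as known; in practice
# it would itself be estimated (e.g., by logistic regression).
rng = np.random.default_rng(1)
n = 50_000
z = rng.normal(size=n)                    # always-observed covariate
y = 2.0 + z + rng.normal(size=n)          # outcome, true mean 2.0
p = 1 / (1 + np.exp(-z))                  # P(observed | z)
r = rng.random(n) < p                     # missingness indicator

naive = y[r].mean()                       # biased: observed units have larger z
ipw = np.mean(r * y / p)                  # inverse-probability-weighted mean
```

The naive complete-case mean overstates the truth because units with large z are more likely to be observed; the weighted estimator recovers it.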
Günter P. Wagner
- Published in print:
- 2014
- Published Online:
- October 2017
- ISBN:
- 9780691156460
- eISBN:
- 9781400851461
- Item type:
- chapter
- Publisher:
- Princeton University Press
- DOI:
- 10.23943/princeton/9780691156460.003.0005
- Subject:
- Biology, Evolutionary Biology / Genetics
This chapter examines the evolutionary processes that led to the origin of body parts, with particular emphasis on the concept of “novelties.” It first considers the distinction between the evolution of adaptations and the origin of novelties, and more specifically innovations, before proposing a perspective of what evolutionary novelties are. To this end, a definition of morphological novelty is given, followed by a discussion of phenomenological modes for the origin of Type I novelties such as the differentiation of repeated elements. The chapter also describes how natural selection creates character individuality and concludes with an analysis of modularity, functional specialization, and robustness and canalization.
Jennifer A. Dunne, Ulrich Brose, Richard J. Williams, and Neo D. Martinez
- Published in print:
- 2005
- Published Online:
- September 2007
- ISBN:
- 9780198564836
- eISBN:
- 9780191713828
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780198564836.003.0011
- Subject:
- Biology, Aquatic Biology
One of the challenges in the study of complex systems is how to integrate recent network structure discoveries with advances in modeling the dynamics of large non-linear systems. This is important for ecology, where the study of ecological networks — particularly in the form of food webs — is a central organizing principle for research into the relationships between ecosystem complexity and diversity, and ecosystem stability, robustness, and persistence. Ecologists and physicists have applied recent advances in the statistical mechanics of network topology to food web data. Such studies have uncovered general properties of food web structure, extended earlier generalizations about ‘real-world’ network structure, and yielded new insights into complex network structure and robustness. However, although large complex networks of interacting species are observed in nature and their broad-scale structure is well described, few, if any, biologically plausible models have been able to simulate the persistent dynamics of such networks. Advanced theoretical insights into the dynamics of ecological networks are confined to relatively low-dimensional subsystems of fewer than a half dozen species, or to high-dimensional models that make a priori, biologically implausible stability assumptions. This chapter reviews efforts to characterize food web network structure as well as research into ecological non-linear dynamics. It also discusses approaches that seek to integrate food web structure and dynamics, highlighting factors that appear critical for the persistence and stability of complex, species-rich ecosystems.
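The topological robustness analyses mentioned above can be illustrated with a toy cascade-style web: delete some basal species and count the consumers that starve, directly or through a cascade of secondary extinctions. The web construction below is a simplified assumption for illustration, not one of the chapter's models:

```python
import random

# A toy food web: species 0-9 are basal (primary producers); each consumer
# c >= 10 eats three species with lower indices (a cascade-model-like topology).
random.seed(0)
S = 60
basal = set(range(10))
prey = {c: set(random.sample(range(c), 3)) for c in range(10, S)}

def secondary_extinctions(removed):
    """Consumers that starve, directly or in cascade, once `removed` are deleted."""
    gone = set(removed)
    changed = True
    while changed:
        changed = False
        for c, diet in prey.items():
            if c not in gone and not (diet - gone):  # no prey left
                gone.add(c)
                changed = True
    return gone - set(removed)

# Robustness probe: knock out half the basal species and count the cascade.
cascade = secondary_extinctions(random.sample(sorted(basal), 5))
```

In this topology, removing every basal species provably extinguishes all consumers, since each consumer's prey all have lower indices.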
Domitilla Del Vecchio and Richard M. Murray
- Published in print:
- 2014
- Published Online:
- October 2017
- ISBN:
- 9780691161532
- eISBN:
- 9781400850501
- Item type:
- chapter
- Publisher:
- Princeton University Press
- DOI:
- 10.23943/princeton/9780691161532.003.0003
- Subject:
- Biology, Biochemistry / Molecular Biology
This chapter turns to some of the tools from dynamical systems and feedback control theory that will be used in the rest of the text to analyze and design biological circuits. It first models the dynamics of a system using the input/output modeling formalism described in Chapter 1 and then studies the “robustness” of a given function of the circuit. The chapter then discusses some of the underlying ideas for how to model biological oscillatory behavior, focusing on those types of oscillations that are most common in biomolecular systems. Thereafter, the chapter explores how the location of equilibrium points, their stability, their regions of attraction, and other dynamic phenomena vary with the values of the parameters in a model. Finally, the chapter reviews methods for reducing the complexity of models.
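A minimal sketch of the equilibrium-and-stability analysis the chapter describes, on a standard negatively autoregulated gene-expression model (the model form and parameter values are assumptions of this sketch, not taken from the text):

```python
# Negative autoregulation:  dx/dt = beta / (1 + (x/K)**n) - gamma * x
# Locate the equilibrium numerically, then test stability from the sign of the
# derivative of the vector field at the equilibrium (the 1-D "Jacobian").
beta, K, n, gamma = 10.0, 1.0, 2, 1.0

def f(x):
    return beta / (1 + (x / K) ** n) - gamma * x

# f is strictly decreasing on [0, beta/gamma], so the equilibrium is unique;
# bisection finds the root f(x*) = 0.
lo, hi = 0.0, beta / gamma
for _ in range(80):
    mid = 0.5 * (lo + hi)
    lo, hi = (mid, hi) if f(mid) > 0 else (lo, mid)
x_star = 0.5 * (lo + hi)

# Asymptotically stable iff f'(x*) < 0 (central finite difference).
eps = 1e-6
df = (f(x_star + eps) - f(x_star - eps)) / (2 * eps)
```

With these parameters the equilibrium solves x³ + x = 10, i.e., x* = 2, and f'(x*) = −2.6 < 0, so it is stable — the kind of parameter-dependent conclusion the chapter's tools systematize.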
Gregory Currie
- Published in print:
- 2010
- Published Online:
- May 2010
- ISBN:
- 9780199282609
- eISBN:
- 9780191712432
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199282609.003.0010
- Subject:
- Philosophy, Aesthetics, Philosophy of Mind
This chapter examines the relation between narrative and the psychological notion of Character, as exemplified in regularities of motive and behaviour which are robust under change of circumstance. Narratives often focus on more than simply the intentions that determine a particular action; they postulate more or less settled Characters for the people who perform those actions. It is argued that there is a natural connection between narrative and Character which makes the latter the natural mode of representation for the former, and gives Character a stabilizing and clarifying role in narrative. The twentieth century saw literary theorists turn against Character, originating with Knights' attack on Bradley's treatment of Shakespearean tragedy; it is argued that the literary case against character is weak.
Anastasios Xepapadeas
- Published in print:
- 2012
- Published Online:
- May 2012
- ISBN:
- 9780199692873
- eISBN:
- 9780191738371
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199692873.003.0006
- Subject:
- Economics and Finance, Development, Growth, and Environmental
Robust policy rules and precaution might be called for, under conditions of scientific uncertainty, when a policy maker is concerned about possible misspecification of the natural system that is used to model pollution dynamics. Precaution, however, could be costly. The present chapter develops a conceptual framework for designing robust policy rules and estimating the cost of being precautious in the context of an international pollution control problem. Cooperative and non-cooperative robust policy rules are determined and the cost, in terms of value loss, of being robust relative to conventional policy rules is estimated.
Peter Dayan
- Published in print:
- 2012
- Published Online:
- May 2016
- ISBN:
- 9780262018081
- eISBN:
- 9780262306027
- Item type:
- chapter
- Publisher:
- The MIT Press
- DOI:
- 10.7551/mitpress/9780262018081.003.0009
- Subject:
- Psychology, Social Psychology
Animals are extremely robust decision makers. They make seemingly good choices in a very wide range of circumstances, using neural hardware that is noisy, labile, and error prone. This chapter considers dimensions of robustness that go beyond fault tolerance, including the effects of outliers and various forms of uncertainty, and discusses the multiple scales of robustness afforded by the rich complexities of neural control.
Peter M. Todd and Gerd Gigerenzer
- Published in print:
- 2012
- Published Online:
- May 2012
- ISBN:
- 9780195315448
- eISBN:
- 9780199932429
- Item type:
- book
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780195315448.001.0001
- Subject:
- Psychology, Cognitive Psychology, Human-Technology Interaction
The idea that more information and more computation yield better decisions has long shaped our vision of rationality. Yet humans and other animals typically rely on simple heuristics or rules of thumb to solve adaptive problems, focusing on one or a few important cues and ignoring the rest, and shortcutting computation rather than striving for as much as possible. In this book, the authors argue that in an uncertain world, more information and computation are not always better, and instead ask when, and why, less can be more. The answers to these questions constitute the idea of ecological rationality, as explored in the chapters in this book: how people can be effective decision makers by using simple heuristics that fit well to the structure of their environment. When people wield the right tool from the mind’s adaptive toolbox for a particular situation, they can make good choices with little information or computation—enabling simple strategies to excel by exploiting the reliable patterns in the world to do some of the work. Heuristics are not good or bad, “biased” or “unbiased,” on their own, but only in relation to the setting in which they are used. The authors show heuristics and environments fitting together to produce good decisions in domains including sports competitions, the search for a parking space, business group meetings, and doctor/patient interactions. The message of Ecological Rationality is to study mind and environment in tandem. Intelligence is not only in the mind but also in the world, captured in the structures of information inherent in our physical, biological, social, and cultural surroundings.
Ricard Solé and Santiago F. Elena
- Published in print:
- 2018
- Published Online:
- May 2019
- ISBN:
- 9780691158846
- eISBN:
- 9780691185118
- Item type:
- chapter
- Publisher:
- Princeton University Press
- DOI:
- 10.23943/princeton/9780691158846.003.0003
- Subject:
- Biology, Evolutionary Biology / Genetics
This chapter begins by discussing the fitness landscape, an idea first introduced by evolutionary geneticist Sewall Wright and later extended by several other authors. The fitness landscape is defined in terms of some particular traits that are implicit in the virus particle phenotype and are usually described in terms of replication rate or infectivity. The landscape appears in most textbook plots as a multi-peaked surface. Local maxima represent optimal fitness values, which can be reached through mutation from a subset of lower-fitness neighbors. Given an initial condition defined by a quasi-species distribution localized somewhere in the sequence space, the population will evolve by exploring nearest positions through mutation. The remainder of the chapter deals with symmetric competition, epistasis in RNA viruses, experimental virus landscapes, the survival of the flattest effect, and virus robustness.
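The mutational exploration of a multi-peaked landscape described above can be caricatured in a few lines: genotypes are bit strings, fitness is the best match to one of several "peak" sequences (so the surface is rugged), and a mutation-only adaptive walk climbs until it is trapped near a peak. Everything below is an invented toy, not one of the chapter's experimental landscapes:

```python
import random

# A rugged landscape over length-L binary genotypes: fitness of g is its best
# Hamming similarity to one of three randomly chosen "peak" sequences.
random.seed(2)
L = 20
peaks = [[random.randint(0, 1) for _ in range(L)] for _ in range(3)]

def fitness(g):
    return max(sum(a == b for a, b in zip(g, p)) for p in peaks)

g = [random.randint(0, 1) for _ in range(L)]
start_fit = fitness(g)

# Adaptive walk: flip one random site per step; keep neutral or better mutants.
for _ in range(500):
    m = g[:]
    m[random.randrange(L)] ^= 1
    if fitness(m) >= fitness(g):
        g = m

local_peak = fitness(g)
```

Because only neutral-or-better mutants are accepted, fitness never decreases; the walk climbs toward a local maximum and, once there, can only drift along neutral ridges — a crude analogue of the quasi-species dynamics the chapter treats properly.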
Alfred Wagenhofer
- Published in print:
- 2004
- Published Online:
- January 2005
- ISBN:
- 9780199260621
- eISBN:
- 9780191601668
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/0199260621.003.0001
- Subject:
- Economics and Finance, Financial Economics
The chapter's objective is to discuss the merits of analytical models in financial accounting research. Analytical models are particularly useful for gaining insights into situations that are characterized by strategic interactions of various decision-makers with information asymmetry and conflicting interests. The chapter describes the common model structures used in this kind of research, typical assumptions, and the major results of the models. It provides examples, including aggregation, conservatism, earnings management, and auditing, as a basis for an evaluation of the costs and benefits of analytical research. A major advantage is the ability to derive results that run counter to common wisdom, thus enhancing our understanding of real phenomena as well as identifying the conditions under which certain results hold or do not hold. Finally, the chapter considers robustness issues, empirical testing of analytical results, and policy recommendations based on them.
Peter Svedberg
- Published in print:
- 2000
- Published Online:
- November 2003
- ISBN:
- 9780198292685
- eISBN:
- 9780191596957
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/0198292686.003.0005
- Subject:
- Economics and Finance, Development, Growth, and Environmental
In this chapter, the most well‐known model used for estimating the prevalence of undernutrition worldwide is presented. This is the model proposed by the FAO and applied for monitoring the progress towards the UN Millennium objective: to reduce undernutrition by half before 2015. The model comprises three main building blocks: the national food (calorie) supply, a function for the distribution of calories across households, and a norm for what is the lowest acceptable per person calorie intake in households. The main statistical data used by the FAO to estimate the model are presented. Finally, a simple robustness test is provided, demonstrating that the FAO estimates of undernutrition are highly sensitive even to small variations in the values attached to the three key parameters.Less
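The three building blocks named in this abstract translate directly into a small calculation: a mean calorie supply, a distribution of intake across households (taken here to be lognormal, a common assumption), and a minimum-intake cutoff. The parameter values below are illustrative, not the FAO's figures, but they reproduce the qualitative sensitivity the chapter demonstrates:

```python
import math

def prevalence(mean_kcal, cv, cutoff_kcal):
    """P(intake < cutoff) for lognormal intake with given mean and coefficient of variation."""
    sigma2 = math.log(1 + cv ** 2)
    mu = math.log(mean_kcal) - sigma2 / 2
    z = (math.log(cutoff_kcal) - mu) / math.sqrt(sigma2)
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))   # standard normal CDF

base = prevalence(2100, 0.25, 1800)          # baseline prevalence estimate
low_supply = prevalence(2000, 0.25, 1800)    # ~5% lower mean supply
high_cutoff = prevalence(2100, 0.25, 1900)   # ~100 kcal higher cutoff
```

Shifting the mean supply or the cutoff by only a few percent moves the estimated prevalence by several percentage points, which is exactly the kind of parameter sensitivity the chapter's robustness test exposes.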
Peter M. Todd and Gerd Gigerenzer
- Published in print:
- 2012
- Published Online:
- May 2012
- ISBN:
- 9780195315448
- eISBN:
- 9780199932429
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780195315448.003.0011
- Subject:
- Psychology, Cognitive Psychology, Human-Technology Interaction
In our uncertain world, more information and computation do not always yield better decisions—simple heuristics that employ few cues and little processing can often match or beat optimizing strategies. This chapter explores when and why less can be more in decision making, through the study of the ecological rationality of decision mechanisms used in appropriate contexts. Ecological rationality appears when the structure of boundedly rational decision mechanisms matches the structure of information in the environment. The chapter introduces the ways that heuristic mechanisms are constructed, the types of information structure they can be applied to, and how to study the intelligent, adaptive behavior that emerges from the interaction of both mind and world.
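One of the simple heuristics from the authors' research program, take-the-best, makes the "few cues, little processing" idea concrete. The sketch below is illustrative; the objects and cue values are hypothetical.

```python
def take_the_best(option_a, option_b, cues):
    """Take-the-best: inspect cues in order of validity and choose the
    option favored by the first cue that discriminates between the two;
    if no cue discriminates, return None (i.e., guess)."""
    for cue in cues:  # cues assumed ordered from most to least valid
        va, vb = cue(option_a), cue(option_b)
        if va != vb:
            return option_a if va > vb else option_b
    return None

# Hypothetical city-size task: which city has the larger population?
cities = {
    "A": {"capital": 1, "soccer_team": 1},
    "B": {"capital": 0, "soccer_team": 1},
}
cues = [lambda c: c["capital"], lambda c: c["soccer_team"]]
choice = take_the_best(cities["A"], cities["B"], cues)  # decided by the first cue
```

The heuristic stops searching as soon as one cue discriminates, which is why it needs so little information and computation relative to strategies that weigh and integrate all cues.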
Søren Johansen and Bent Nielsen
- Published in print:
- 2009
- Published Online:
- September 2009
- ISBN:
- 9780199237197
- eISBN:
- 9780191717314
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199237197.003.0001
- Subject:
- Economics and Finance, Econometrics
This chapter analyzes an algorithm suggested by Hendry (1999) for estimation in a regression with more regressors than observations, with the purpose of finding an estimator that is robust to outliers and structural breaks. This estimator is an example of a one-step M-estimator based on Huber's skip function. The asymptotic theory is derived, using empirical process techniques, for the situation in which there are no outliers or structural breaks. Stationary processes, trend-stationary autoregressions, and unit root processes are considered.
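The one-step Huber-skip idea can be sketched in a few lines: fit an initial regression, then "skip" (discard) observations with large standardized residuals and refit once. This is an illustrative sketch, not the chapter's exact algorithm; the cutoff value and function name are assumptions.

```python
import numpy as np

def one_step_huber_skip(X, y, c=2.576):
    """One-step M-estimator based on Huber's skip function: fit OLS,
    drop observations whose absolute residual exceeds c times the
    residual standard error, and refit OLS on the retained sample."""
    X, y = np.asarray(X, float), np.asarray(y, float)
    beta0, *_ = np.linalg.lstsq(X, y, rcond=None)   # initial full-sample OLS
    resid = y - X @ beta0
    sigma = np.sqrt(resid @ resid / (len(y) - X.shape[1]))
    keep = np.abs(resid) <= c * sigma               # Huber's skip: keep or drop
    beta1, *_ = np.linalg.lstsq(X[keep], y[keep], rcond=None)
    return beta1, keep
```

Because the skip function sets the influence of flagged observations to zero rather than merely downweighting them, a single gross outlier is removed entirely from the second-stage fit.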
Helen Steward
- Published in print:
- 2012
- Published Online:
- May 2012
- ISBN:
- 9780199552054
- eISBN:
- 9780191738838
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199552054.003.0007
- Subject:
- Philosophy, Metaphysics/Epistemology, Philosophy of Mind
This chapter deals with various objections to the solution to the Challenge from Chance that is offered in the previous chapter. The suggestion that a newly formulated version of the Challenge can simply be raised again against the position offered is considered and rebutted. Objections stemming from so-called ‘Frankfurt-style’ examples are also considered. It is argued that Frankfurt-style examples do not succeed in showing that a power of refrainment is not essential to agency. The idea that the wanted power of refrainment is insufficiently ‘robust’ to constitute anything more than an insignificant ‘flicker of freedom’, as charged by Fischer, is also considered and rejected.
Michael Weisberg
- Published in print:
- 2013
- Published Online:
- May 2013
- ISBN:
- 9780199933662
- eISBN:
- 9780199333004
- Item type:
- book
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199933662.001.0001
- Subject:
- Philosophy, Philosophy of Science
In the 1950s John Reber convinced many Californians that the best way to solve the state’s water shortages was to dam up the San Francisco Bay. Against massive political pressure, his opponents convinced lawmakers that this would lead to disaster—not by empirical measurement alone, but by constructing a model. Simulation and Similarity explains why this was a good strategy. It provides an account of modeling and idealization in modern scientific practice, focusing on concrete, mathematical, and computational models. The book has three main themes: the nature of models, the practice of modeling, and the nature of the relationship between models and real-world phenomena. In addition to its careful analysis of physical, computational, and mathematical models, one of Simulation and Similarity’s most novel features is Weisberg’s account of the model–world relationship. Breaking with the dominant tradition, which says this relation should be analyzed using logical notions such as isomorphism, Weisberg presents a similarity-based account called weighted feature-matching. This account of the model–world relationship is developed with an eye to understanding how modeling is actually practiced, and hence takes into account how scientists’ theoretical goals shape the ways in which their models are applied and analyzed.Less
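A similarity-based score of the weighted feature-matching kind can be illustrated with a Tversky-style set comparison: shared features raise the score, features unique to the model or to the target lower it, and per-feature weights encode the modeler's theoretical goals. The functional form and names below are assumptions for illustration, not Weisberg's exact formulation.

```python
def weighted_feature_match(model_feats, target_feats, weights,
                           theta=1.0, alpha=1.0, beta=1.0):
    """Illustrative weighted feature-matching score: weighted shared
    features over weighted shared plus weighted unmatched features."""
    f = lambda feats: sum(weights.get(x, 1.0) for x in feats)  # weighted measure
    shared = f(model_feats & target_feats)
    model_only = f(model_feats - target_feats)      # features the model adds
    target_only = f(target_feats - model_feats)     # features the model omits
    denom = theta * shared + alpha * model_only + beta * target_only
    return theta * shared / denom if denom else 0.0

# A model sharing two of three features with its target phenomenon:
score = weighted_feature_match({"pressure", "volume", "temperature"},
                               {"pressure", "volume", "viscosity"}, {})
```

Adjusting `weights`, or the `alpha`/`beta` penalties, changes which mismatches matter—one simple way to capture the idea that theoretical goals shape how a model's fidelity is judged.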
Robert Hudson
- Published in print:
- 2013
- Published Online:
- September 2013
- ISBN:
- 9780199303281
- eISBN:
- 9780199367627
- Item type:
- book
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199303281.001.0001
- Subject:
- Philosophy, Philosophy of Science
The main goal of Seeing Things is to criticize a common way of arguing called ‘robustness reasoning’. With robustness reasoning, one claims that an observation report is more likely to be true (or, is better justified) if this report is produced by multiple, independent sources (such as by multiple scientists, or by means of multiple experimental strategies). In the book it is argued that robustness reasoning lacks the special, informative value it is often claimed to have. This result is defended both by exposing key flaws in various popular, philosophical defences of robustness reasoning and by recounting five episodes in the history of science where robustness reasoning was not used. These episodes include research into experimental microbiology, dark matter (and its possible constitution as WIMPs), Jean Perrin’s proof of the atomic nature of matter, and the accelerating expansion of the universe (dark energy). In addition to criticizing robustness reasoning, the book relates its analysis of the failure of robustness reasoning to an assessment of a highly popular approach to scientific realism called ‘(theoretical) preservationism’, arguing that those who defend this approach to realism commit errors similar to those made by advocates of robustness reasoning. In turn, a new form of realism is formulated and defended, called ‘methodological preservationism’, that recognizes the fundamental value to scientists (and the rest of us) of naked-eye observation, along with closely related technological and reason-based enhancements of such observation.
Peter Hammerstein and Jeffrey R. Stevens
- Published in print:
- 2012
- Published Online:
- May 2016
- ISBN:
- 9780262018081
- eISBN:
- 9780262306027
- Item type:
- chapter
- Publisher:
- The MIT Press
- DOI:
- 10.7551/mitpress/9780262018081.003.0001
- Subject:
- Psychology, Social Psychology
How do organisms make decisions? The study of games and decisions has long been guided by a philosophical discourse on concepts of rationality and their implications. This discourse has led to a large body of mathematical work and kept generations of researchers busy, but few serious attempts have been made to understand decision making in the real world. Over the last few decades, however, decision theory has moved toward the sciences and developed its “taste for the facts.” Research is now guided by experimental economics, cognitive psychology, behavioral biology, and—most recently—neuroscience. Despite the increasingly empirical leanings of decision science, the explanatory power of evolutionary theory has been neglected. This Strüngmann Forum was convened to rectify this oversight, with the goal of initiating an alternative to the existing axiom-based decision theory by developing a theory of decision making founded on evolutionary principles.