Rein Taagepera
- Published in print: 2008
- Published Online: September 2008
- ISBN: 9780199534661
- eISBN: 9780191715921
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780199534661.003.0011
- Subject: Political Science, Comparative Politics, Political Economy
The number of communication channels may well turn out to be a major building block in constructing quantitatively predictive logical models in social sciences. This number does determine representative assembly sizes and mean durations of cabinets. Some physical and social processes involve minimization or maximization of some quantities. Models for various processes can be formulated as differential equations, especially those that express rates of change in time, space, etc. Quantities that are conserved during changes play an important role in physics and might do so in social sciences.
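The channel-count idea can be made concrete. As an illustration (my own sketch, not code from the chapter): a body of n members contains n(n − 1)/2 potential two-way communication channels, and Taagepera's well-known cube-root law relates assembly size to roughly the cube root of population.

```python
# Illustrative sketch: communication channels and the cube-root law of
# assembly sizes (the specific numbers are hypothetical, not from the book).

def channels(n):
    """Number of two-way communication channels among n members."""
    return n * (n - 1) // 2

def cube_root_assembly_size(population):
    """Cube-root law: assembly size grows roughly as population**(1/3)."""
    return round(population ** (1 / 3))

# A 100-member assembly already has 4,950 potential channels, which is one
# reason assembly sizes cannot scale linearly with population.
print(channels(100))                       # 4950
print(cube_root_assembly_size(1_000_000))  # 100
```

Because the channel count grows quadratically in n, communication load rather than headcount becomes the binding constraint, which is why predicted assembly sizes grow far more slowly than the populations they represent.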
W. A. Bogart
- Published in print: 2010
- Published Online: January 2011
- ISBN: 9780195379877
- eISBN: 9780199869060
- Item type: book
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780195379877.001.0001
- Subject: Law, Legal Profession and Ethics
This book is about problem gambling and its regulation, and situates this analysis in the larger context of regulating excessive consumption. It analyzes the effectiveness of the law in controlling such consumption, and engages theoretical discussions concerning the effectiveness of legal intervention, especially regarding “normativity”, the relationship between law and norms. It also argues that various forms of overconsumption (alcohol, smoking, non-nutritious eating) can be controlled more effectively by altering the norms surrounding them, so that excesses are suppressed to a greater extent. Regulatory efforts aim not at forbidding consumption but at suppressing its excessive aspects. In the case of tobacco, this means zero consumption, since there is no safe level of smoking. For alcohol, in contrast, it means encouraging consumption of only moderate amounts. Addictive drugs are generally prohibited and their use criminalized, but a significant measure of public opinion holds that prohibition does more harm than good, and that “permit but discourage” would produce better results. The battle against obesity, itself a contested concept, focuses on encouraging nutritious eating and physical activity. The book then focuses on one form of consumption associated with major social issues: problem gambling. Regulation to date has concentrated mostly on ensuring honesty in the various games and on enhancing revenue for owners (often governments). However, in the face of mounting evidence regarding the damage caused by those with impaired control, there are increasing calls for regulatory frameworks to make “harm minimization” and related concepts a priority. “Harm minimization” brings “permit but discourage” to the fore for gambling and problem gambling.
James Bergin
- Published in print: 2005
- Published Online: July 2005
- ISBN: 9780199280292
- eISBN: 9780191602498
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/0199280290.003.0020
- Subject: Economics and Finance, Microeconomics
This chapter studies evolution and learning. The discussion includes fictitious play and replicator dynamics, and stochastic stability is considered at length, including the computation of invariant distributions and minimum-cost trees. Blackwell approachability is used to define strategies that minimize regret across all actions; this is then connected to correlated equilibrium. Calibrated forecasts are defined, and a connection to correlated equilibria is also noted. The chapter closes with a brief discussion of Bayesian learning and the key role of the martingale convergence theorem.
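As one concrete instance of the dynamics discussed, here is a minimal replicator-dynamics simulation (an assumed illustration, not code from the chapter), using a standard Hawk-Dove game with value V = 2 and cost C = 4:

```python
# Replicator dynamics for a two-strategy Hawk-Dove game (V=2, C=4).
# The game matrix and starting point are my assumptions for illustration.

A = [[-1.0, 2.0],   # Hawk's payoff vs (Hawk, Dove)
     [ 0.0, 1.0]]   # Dove's payoff vs (Hawk, Dove)

def replicator_step(x, dt=0.1):
    """One Euler step of dx/dt = x * (f_hawk - f_bar), x = share of Hawks."""
    f_hawk = A[0][0] * x + A[0][1] * (1 - x)
    f_dove = A[1][0] * x + A[1][1] * (1 - x)
    f_bar = x * f_hawk + (1 - x) * f_dove   # population-average fitness
    return x + dt * x * (f_hawk - f_bar)

x = 0.2                     # initial share of Hawks
for _ in range(1000):
    x = replicator_step(x)

# The population converges to the mixed equilibrium x* = V/C = 0.5.
print(round(x, 4))          # 0.5
```

Strategies earning above-average payoffs grow in frequency; here the dynamic settles at the interior rest point where Hawk and Dove payoffs are equal.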
John P. Burkett
- Published in print: 2006
- Published Online: October 2011
- ISBN: 9780195189629
- eISBN: 9780199850778
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780195189629.003.0002
- Subject: Economics and Finance, Microeconomics
This chapter examines simple production processes with only two inputs and one output. It describes methods for solving a cost minimization problem, that is, finding the cheapest combination of inputs that produces a given output. To calculate the cost-minimizing combination of inputs, the isoquant's equation is solved together with a formula equating the marginal rate of substitution (MRS) to the relative price of the input on the horizontal axis. If the input quantities obtained are both positive, they constitute an interior solution; if one of them is negative, a corner solution is required.
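A hypothetical worked example of this procedure (the Cobb-Douglas functional form, prices, and output level are my assumptions, not the chapter's):

```python
# Cost minimization for a Cobb-Douglas isoquant x1**a * x2**b = q,
# combining the isoquant equation with the tangency condition MRS = w1/w2
# (input 1 on the horizontal axis). Parameter values are hypothetical.

def cost_minimizing_inputs(a, b, w1, w2, q):
    """Return the interior solution (x1, x2) and its total cost."""
    # MRS = (a*x2)/(b*x1) = w1/w2  =>  x2 = k * x1  with  k = b*w1/(a*w2)
    k = (b * w1) / (a * w2)
    # Substitute into the isoquant: x1**a * (k*x1)**b = q
    x1 = (q / k**b) ** (1.0 / (a + b))
    x2 = k * x1
    return x1, x2, w1 * x1 + w2 * x2

x1, x2, cost = cost_minimizing_inputs(a=0.5, b=0.5, w1=1.0, w2=4.0, q=10.0)
print(x1, x2, cost)   # 20.0 5.0 40.0
```

Since both quantities come out positive, this is an interior solution; a negative value would signal that a corner solution is needed instead.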
John P. Burkett
- Published in print: 2006
- Published Online: October 2011
- ISBN: 9780195189629
- eISBN: 9780199850778
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780195189629.003.0003
- Subject: Economics and Finance, Microeconomics
This chapter examines the use of linear programming in cost minimization in production processes. Most economists have turned to linear programming to explain the convexity of isoquants, explore substitution possibilities among large sets of inputs, and predict substitution possibilities involving new inputs. Simple linear programming problems can be solved by geometric reasoning, while more complicated ones can be solved by algebraic methods.
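The geometric reasoning can be sketched as follows (a hypothetical two-input cost-minimization problem, not one from the chapter): because a linear program's optimum lies at a vertex of the feasible region, it suffices to enumerate the vertices and compare costs.

```python
from itertools import combinations

# Lines stored as (p, q, r) meaning p*x + q*y = r; each production
# requirement is the constraint p*x + q*y >= r. Numbers are hypothetical.
constraints = [(1.0, 2.0, 4.0),   # x + 2y >= 4
               (2.0, 1.0, 4.0)]   # 2x + y >= 4
axes = [(1.0, 0.0, 0.0),          # x = 0
        (0.0, 1.0, 0.0)]          # y = 0
cost = (2.0, 3.0)                 # minimize 2x + 3y

def intersect(l1, l2):
    """Intersection point of two lines, or None if they are parallel."""
    (a1, b1, c1), (a2, b2, c2) = l1, l2
    det = a1 * b2 - a2 * b1
    if abs(det) < 1e-12:
        return None
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

def feasible(pt):
    x, y = pt
    return (x >= -1e-9 and y >= -1e-9 and
            all(p * x + q * y >= r - 1e-9 for p, q, r in constraints))

vertices = [v for l1, l2 in combinations(constraints + axes, 2)
            if (v := intersect(l1, l2)) is not None and feasible(v)]
best = min(vertices, key=lambda v: cost[0] * v[0] + cost[1] * v[1])
print(best)   # approximately (1.333, 1.333), cost 20/3
```

Vertex enumeration mirrors reading the solution off a diagram; algebraic methods such as the simplex algorithm do the same search systematically when the input set is too large to draw.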
Wanja Wiese
- Published in print: 2018
- Published Online: September 2018
- ISBN: 9780262036993
- eISBN: 9780262343275
- Item type: book
- Publisher: The MIT Press
- DOI: 10.7551/mitpress/9780262036993.001.0001
- Subject: Philosophy, Philosophy of Mind
The unity of the experienced world and the experienced self have puzzled humanity for centuries. How can we understand this and related types of phenomenal (i.e., experienced) unity? This book develops an interdisciplinary account of phenomenal unity. It focuses on examples of experienced wholes such as perceived objects (chairs and tables, but also groups of objects), bodily experiences, successions of events, and the attentional structure of consciousness. As a first step, the book investigates how the unity of consciousness can be characterized phenomenologically: what is it like to experience wholes, what is the experiential contribution of phenomenal unity? This raises conceptual and empirical questions. In addressing these questions, connections are drawn to phenomenological accounts and research on Gestalt theory. As a second step, the book suggests how phenomenal unity can be analyzed computationally, by drawing on concepts and ideas of the framework of predictive processing. The result is a conceptual framework, as well as an interdisciplinary account of phenomenal unity: the regularity account of phenomenal unity. According to this account, experienced wholes correspond to a hierarchy of connecting regularities. The brain tracks these regularities by hierarchical prediction error minimization, which approximates hierarchical Bayesian inference.
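As a toy illustration of the computational idea (my sketch under strong simplifying assumptions: one-dimensional Gaussian variables, identity mappings, and unit precisions, not the book's model): two levels of estimates updated by gradient descent on squared prediction errors converge to the hierarchical Bayesian posterior mode.

```python
# Hierarchical prediction error minimization, toy version: a top-level
# estimate v predicts a mid-level estimate x, which predicts the input u.
# Gradient descent on the summed squared errors finds the posterior mode.

u = 2.0          # sensory input
v_prior = 0.0    # top-level prior expectation
x, v = 0.0, 0.0  # mid-level and top-level estimates

lr = 0.1
for _ in range(2000):
    e_sense = u - x        # level-1 prediction error (input vs mid-level)
    e_mid = x - v          # level-2 prediction error (mid vs top-level)
    e_top = v - v_prior    # prior prediction error at the top
    x += lr * (e_sense - e_mid)   # descend the gradient of the total error
    v += lr * (e_mid - e_top)

print(round(x, 4), round(v, 4))   # 1.3333 0.6667
```

These are the exact posterior means for this Gaussian chain, so minimizing hierarchical prediction error here literally performs hierarchical Bayesian inference.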
Jakob Hohwy
- Published in print: 2013
- Published Online: January 2014
- ISBN: 9780199682737
- eISBN: 9780191766350
- Item type: book
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780199682737.001.0001
- Subject: Philosophy, Philosophy of Mind
A new theory is taking hold in neuroscience. The theory is increasingly being used to interpret and drive experimental and theoretical studies, and it is finding its way into many other domains of research on the mind. It is the theory that the brain is a sophisticated hypothesis-testing mechanism, which is constantly involved in minimizing the error of its predictions about the sensory input it receives from the world. This mechanism is meant to explain perception and action and everything mental in between. It is an attractive theory because powerful theoretical arguments support it. It is also attractive because more and more empirical evidence is beginning to point in its favour. It has enormous unifying power and yet it can explain in detail too. This book explores this theory. It explains how the theory works and how it applies; it sets out why the theory is attractive; and it shows why and how the central ideas behind the theory profoundly change how we should conceive of perception, action, attention, and other central aspects of the mind. The central argument of the book is that the simple idea of prediction error minimization offers a surprisingly good, explanatory fit with our actual perceptual phenomenology, and that it throws new light on core, intriguing aspects of the nature of mind.
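A minimal, assumed illustration of why prediction error minimization can stand in for hypothesis testing (not an example from the book): for Gaussian beliefs, updating by a precision-weighted prediction error reproduces the exact Bayes posterior.

```python
# Precision-weighted prediction-error update for Gaussian beliefs.
# The prior and observation values below are hypothetical.

def update(mu, var, obs, obs_var):
    """Revise a belief (mu, var) given an observation with noise obs_var."""
    error = obs - mu                 # prediction error
    gain = var / (var + obs_var)    # precision weighting of the error
    mu_post = mu + gain * error
    var_post = var * obs_var / (var + obs_var)
    return mu_post, var_post

# Prior belief N(0, 1); observe 2.0 with noise variance 1.0.
mu, var = update(0.0, 1.0, 2.0, 1.0)
print(mu, var)   # 1.0 0.5
```

The gain term is why the theory predicts that attention, understood as precision weighting, modulates how strongly a given error revises the brain's hypotheses.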
Marianne Mason
- Published in print: 2020
- Published Online: September 2020
- ISBN: 9780226647654
- eISBN: 9780226647821
- Item type: chapter
- Publisher: University of Chicago Press
- DOI: 10.7208/chicago/9780226647821.003.0004
- Subject: Linguistics, Sociolinguistics / Anthropological Linguistics
This chapter examines the discursive features of three of the interrogation strategies most commonly used in the Reid interrogation method: the sympathetic-detective/minimization strategy, confronting the suspect with evidence of guilt, and appealing to the suspect’s self-interest. The data for the chapter includes the police interrogations of two suspects who were charged with murder and rape respectively. The analysis shows how the police officers in each case dismissed the invocations of the right to counsel of both suspects and proceeded to use the three aforementioned strategies to direct the suspects to provide a confession, while taking ‘innocence off the table’ and ignoring the suspects’ frequent denials. Removing the option of a suspect’s innocence, particularly if it leads to a suspect providing information or a confession, may substantiate (partially or fully) the police officers’ construction of a suspect’s alleged guilt.
Philip Gaines
- Published in print: 2020
- Published Online: September 2020
- ISBN: 9780226647654
- eISBN: 9780226647821
- Item type: chapter
- Publisher: University of Chicago Press
- DOI: 10.7208/chicago/9780226647821.003.0005
- Subject: Linguistics, Sociolinguistics / Anthropological Linguistics
Most American police interviewers have traditionally employed a range of manipulative interrogation techniques on suspects whom they deem to be guilty in order to extract a confession. Scholars of police interrogation and confession have identified two overarching approaches—maximization and minimization—used by police to, respectively, induce a sense of hopelessness in view of the dire consequences of not confessing and provide mitigating justifications for the suspect in order to make confession more palatable. One such minimization technique is the mitigation of blame—the focus of this chapter. Discourse analysis of the patterns of questioning in the interrogation of a 16-year-old accused of smothering her baby shows the development of a number of themes that are combined to mitigate the suspect’s blame for the alleged murder: the discursive construction of provocation, spontaneity, and responsibility. In summary, interrogators mitigated blame by constructing a narrative in which the suspect 1) was provoked to act because of resentment toward her mother over being burdened with excessive child care, 2) acted spontaneously rather than premeditatedly, and 3) suffered from mental and relational dysfunction for which she was responsible.
Jakob Hohwy
- Published in print: 2015
- Published Online: May 2016
- ISBN: 9780262029346
- eISBN: 9780262330213
- Item type: chapter
- Publisher: The MIT Press
- DOI: 10.7551/mitpress/9780262029346.003.0012
- Subject: Philosophy, Moral Philosophy
Jakob Hohwy seeks to recover an approach to consciousness from a general theory of brain function, namely the prediction error minimization theory. The way this theory applies to mental and developmental disorder demonstrates its relevance to consciousness. The resulting view is discussed in relation to a contemporary theory of consciousness, namely, the idea that conscious perception depends on Bayesian metacognition which is also supported by considerations of psychopathology. This Bayesian theory is first disconnected from the higher-order thought theory, and then, via a prediction error conception of action, connected instead to the global workspace theory. Considerations of mental and developmental disorder therefore show that a very general theory of brain function is relevant to explaining the structure of conscious perception. Furthermore, Hohwy argues that this theory can unify two contemporary approaches to consciousness in a move that seeks to elucidate the fundamental mechanism for the selection of representational content into consciousness.
F. M. Kamm
- Published in print: 2001
- Published Online: November 2003
- ISBN: 9780195144024
- eISBN: 9780199870998
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/0195144023.003.0011
- Subject: Philosophy, Moral Philosophy
Examines the question of whether it is morally permissible to treat people in ways ruled out by the Principle of Permissible Harm (PPH; introduced in Ch. 7, it provides an account of certain restrictions/constraints on killing) only for the sake of minimizing violations of the PPH itself, or whether there is a constraint on doing this. Having considered alternative grounds for a constraint in Ch. 9, Ch. 10 fleshes out a victim‐focussed, agent‐neutral, rights‐based view founded in a strengthened PPH right (constraint), which protects against minimizing violations of the right by violating the right; consideration is given to whether and in what sense minimizing violations of PPH rights by violating them would be both strictly irrational and also exhibit lack of concern for the right. This route to founding the constraint is rejected, on the grounds that strict irrationality would arise only if there were already a constraint (or an absolute right not to be killed); instead, a constraint is generated by focusing on a concern at the heart of the PPH and applying it to the pursuit of any goal (utility or the minimization of rights violations). The chapter considers how the permissibility of minimization would alter every person's status, and examines the distinction between eliminating a right, violating it, and infringing it, focussing on the significance of negative residues of, and compensation for, rights violations. An exploration is made of whether the structure of deontological and consequentialist theories can be brought closer together via the agent‐neutral value of an inviolable status (of a certain sort), though a distinction is made between the irrationality argument against minimizing the violation of constraints and support for a concept of the person as strongly inviolable.
It is also considered whether creatures who are inviolable are therefore more important entities whose existence makes the world a better place and whether belief in a constraint affects both how good the world is and the effect of acts done in accord with or in opposition to the constraint; further examination is made of the futility of permitting minimization of rights violations by violating rights (‘futilitarianism’) by contrasting the role of utility vs rights per se in motivating minimizing.
John A. Hawkins
- Published in print: 2004
- Published Online: January 2010
- ISBN: 9780199252695
- eISBN: 9780191719301
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780199252695.003.006
- Subject: Linguistics, Syntax and Morphology
This chapter continues the discussion of domain minimization by examining the impact of reduced formal marking on relative positioning. Section 6.1 first presents some data from performance, principally from corpus studies of alternating structures in English. Section 6.2 then presents corresponding cross-linguistic data from grammars that test the Performance–Grammar Correspondence Hypothesis. Section 6.3 considers classical morphological typology from the processing perspective of this chapter.
Jakob Hohwy
- Published in print:
- 2013
- Published Online:
- January 2014
- ISBN:
- 9780199682737
- eISBN:
- 9780191766350
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199682737.003.0014
- Subject:
- Philosophy, Philosophy of Mind
A brief concluding section summarizes the main points that have emerged from considering the predictive mind. The mind exists in prediction. Our perceptual experience of the world arises in our attempts at predicting our own current sensory input. This notion spreads to attention and agency. Perception, attention and agency are three different ways of doing the same thing: accounting for sensory input as well as we can from inside the confines of the skull. We are good at this, mostly, but it is a precarious and fragile process because we are hostages to our prior beliefs, our noisy brains, the uncertain sensory deliverances from the world, and to the brain’s urge to rid itself efficiently of prediction error. Intriguing approaches to a number of different mental phenomena arise from applying to them the simple idea that the brain does nothing but minimize its prediction error.
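The claim that perception, attention, and agency all amount to prediction-error minimization can be illustrated with a toy gradient-descent sketch (the function, names, and numbers below are hypothetical, not from the book): a single estimate is nudged until it best reconciles a prior belief with the current sensory input, each weighted by its precision.

```python
# Toy sketch of perceptual inference as prediction-error minimization:
# a belief mu is updated by gradient descent on the precision-weighted
# squared errors between mu and (a) sensory input, (b) a prior expectation.
def perceive(mu, sensory_input, prior, pi_sense=1.0, pi_prior=1.0,
             lr=0.1, steps=100):
    for _ in range(steps):
        eps_s = sensory_input - mu   # sensory prediction error
        eps_p = prior - mu           # prior prediction error
        # step that reduces the total precision-weighted squared error
        mu += lr * (pi_sense * eps_s + pi_prior * eps_p)
    return mu

# With equal precisions, the estimate settles at the precision-weighted
# average of prior and input (here, halfway between 0.0 and 2.0).
print(perceive(mu=0.0, sensory_input=2.0, prior=0.0))  # -> approx. 1.0
```

Raising `pi_sense` relative to `pi_prior` pulls the estimate toward the input, which is one way this family of models casts attention as precision-weighting.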
Matt Ryan
- Published in print:
- 2021
- Published Online:
- January 2022
- ISBN:
- 9781529209921
- eISBN:
- 9781529209952
- Item type:
- chapter
- Publisher:
- Policy Press
- DOI:
- 10.1332/policypress/9781529209921.003.0008
- Subject:
- Political Science, Comparative Politics
We can learn much from independently explaining the absence of citizen control. Where programmes have failed to empower, there are at least two sufficient explanations. One is the absence of participatory leadership alone, which resonates with the idea that participatory leadership is a necessary condition for success. The other is the combined absence of bureaucratic support, autonomous civil society, and financial freedoms. Designers and adopters of participatory programmes are reminded that the otherwise beneficial features of political leadership can be immaterial where civil society and organisational staff participation is desultory and funds are lacking. A key remaining puzzle is what distinguishes citizen control from its absence in cases where participatory leadership is present. The answers to this question remain inconclusive. Future research may need to investigate more closely whether the presence of some conditions is beneficial in the early stages of a PB's development but less so later, and vice versa.
Natasha Du Rose
- Published in print:
- 2015
- Published Online:
- January 2016
- ISBN:
- 9781847426727
- eISBN:
- 9781447307839
- Item type:
- chapter
- Publisher:
- Policy Press
- DOI:
- 10.1332/policypress/9781847426727.003.0004
- Subject:
- Social Work, Crime and Justice
Chapter 5 is concerned with the public health strand of drug policy discourse and the technology of medicalisation underpinning it. Medicalisation operates as a form of control and regulation whereby social structural issues such as poverty and social inequalities are individualised and deflected into the realm of disease. The chapter discusses the disease model of addiction and the construction of drug users, and drug-using women in particular, as pathological. It explores how female users are situated in contemporary drug policy: on the one hand as irresponsible, bad choice-makers, and on the other as responsible for their predicament. Harm minimisation and recovery discourse, coercive policies, and the marginalisation of women in treatment are discussed. It is argued that medicalisation has been deployed in part to facilitate and reinforce punishment regimes and to widen the net of social control, while failing to address the social problems female users face.
Kenneth McLaughlin
- Published in print:
- 2008
- Published Online:
- March 2012
- ISBN:
- 9781847420459
- eISBN:
- 9781447303572
- Item type:
- chapter
- Publisher:
- Policy Press
- DOI:
- 10.1332/policypress/9781847420459.003.0005
- Subject:
- Social Work, Social Policy
This chapter is about statutory mental health social work, particularly in relation to the contemporary concern with risk management and risk minimisation. Social work here is seen as having a primary role in the assessment of risk. By discussing the wider societal preoccupation with risk avoidance, its incorporation into social policy and social work can be highlighted, providing a clear example of where social work practice cannot be divorced from wider societal trends. A new Mental Health Act that amends the 1983 Act has been influenced, to a large extent, by high-profile tragedies where psychiatric patients have killed themselves or others. By discussing the actual threat posed by such people, it can be shown that the fear outweighs the reality of danger, and the dangers to civil liberties inherent in the proposed changes can be highlighted. This chapter also focuses on the increase in coercive measures by social workers under the Mental Health Act of 1983.
Gillian Abel and Lisa Fitzgerald
- Published in print:
- 2010
- Published Online:
- March 2012
- ISBN:
- 9781847423344
- eISBN:
- 9781447303664
- Item type:
- chapter
- Publisher:
- Policy Press
- DOI:
- 10.1332/policypress/9781847423344.003.0001
- Subject:
- Social Work, Crime and Justice
Prior to the passing of the Prostitution Reform Act of 2003 (PRA), sex work in New Zealand was not itself illegal, but the activities associated with it, such as soliciting, brothel keeping, living on the earnings of prostitution, and procuring, were criminalised. This criminalisation of sex-work-related activities led to violence, coercion, and exploitation. For nearly two decades the New Zealand Prostitutes' Collective (NZPC), together with politicians, women's rights activists, academics, and other volunteers, advocated and lobbied for legislative change. In June 2003, New Zealand became the first country to decriminalise sex work when the PRA was passed. This legislative approach differs from other international approaches in that it represents a shift from regulating sex work from a moral perspective to acknowledging the human rights of this section of the population. Through decriminalisation, prostitution and sex work were acknowledged as service work, and through the legislative reform, sex workers in New Zealand were able to operate under the same employment and legal rights accorded to any other occupational group. This book examines the decriminalisation of prostitution in New Zealand. It looks at the particularities of the history of prostitution in the country, how it evolved, and how it has gained public acceptance. Chapters Two to Six examine the passing of the PRA in 2003. Chapter Seven outlines the statutory authority for the Prostitution Law Review Committee, its membership, and its role. Chapter Eight presents a research project commissioned by the Ministry of Justice for the review of the PRA. Chapters Ten to Fourteen provide a detailed review of the research done by the Christchurch School of Medicine; these chapters focus on the methodological approach, the public health authorities' experiences, the role of the media, decriminalisation and harm minimisation, and the ongoing perceptions of stigma.
The concluding chapter brings together the material covered in the book by summarising the effects of the decriminalisation of the sex industry in New Zealand.
Gillian Abel and Lisa Fitzgerald
- Published in print:
- 2010
- Published Online:
- March 2012
- ISBN:
- 9781847423344
- eISBN:
- 9781447303664
- Item type:
- chapter
- Publisher:
- Policy Press
- DOI:
- 10.1332/policypress/9781847423344.003.0013
- Subject:
- Social Work, Crime and Justice
In the sex-worker population, the priority of harm minimisation has been to minimise disease transmission through educating workers on safe-sex practices. This harm-minimisation approach assumes that by educating sex workers about HIV/AIDS and informing them of their responsibility in preventing transmission, they will make rational choices to protect themselves and others. However, the assumption that sex workers are vectors of disease marginalises and blames them without considering the implications of poverty, gender, public fear, and law. For a more effective health approach in the sex industry, it is hence vital to consider structural and political issues such as poverty and law. Within this context, public-health workers have been challenged to take a more holistic approach to health promotion for sex workers. In addition to the health of the workers, their human rights and the need to create a safer working environment were also considered in the risk-management and harm-minimisation assessment of the sex industry. It was within this broad context of harm minimisation that advocacy for the decriminalisation of the sex industry commenced. It was envisaged that repealing the laws which criminalised sex-work activities would increase the autonomy of sex workers and their capability to protect themselves. It was theorised that under a decriminalised system, human rights and set standards for working environments would improve the health and safety of the workers. This chapter examines whether there have been gains for sex-worker health and safety in a decriminalised environment. It examines the main threats to health and safety identified by the participants: risks to sexual health, risks of violence and exploitation, and risks to emotional health.
K. Shrader-Frechette
- Published in print:
- 1998
- Published Online:
- March 2012
- ISBN:
- 9780198765295
- eISBN:
- 9780191695292
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780198765295.003.0009
- Subject:
- Law, Philosophy of Law
In cases of uncertainty, choosing a maximin strategy typically minimizes public risk (to citizens) and maximizes industry risk (to those responsible for dangerous technology). But this raises the question of whether, in a situation of uncertainty where we must do one or the other, we ought to minimize industry risk or public risk. This chapter argues that rational risk evaluation and management often requires us to minimize public risk.
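The maximin rule appealed to here can be made concrete with a small sketch (the policy labels and payoff numbers below are hypothetical, purely for illustration): each act is scored by its worst-case outcome, and the act whose worst case is least bad is chosen.

```python
# Toy illustration of a maximin choice under uncertainty. Each act maps to
# its outcomes under two possible states of the world (hypothetical values).
payoffs = {
    "minimize_public_risk":   [-2, -4],    # costs fall mainly on industry
    "minimize_industry_risk": [-1, -20],   # rare catastrophe falls on the public
}

def maximin_choice(payoffs):
    # For each act, take its worst outcome; pick the act maximizing that.
    return max(payoffs, key=lambda act: min(payoffs[act]))

print(maximin_choice(payoffs))  # -> minimize_public_risk
```

Under these numbers the maximin chooser avoids the small chance of catastrophe even though the other act looks better in the favourable state, which mirrors the chapter's point that maximin shifts risk away from the public.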
Masashi Sugiyama and Motoaki Kawanabe
- Published in print:
- 2012
- Published Online:
- September 2013
- ISBN:
- 9780262017091
- eISBN:
- 9780262301220
- Item type:
- chapter
- Publisher:
- The MIT Press
- DOI:
- 10.7551/mitpress/9780262017091.003.0002
- Subject:
- Computer Science, Machine Learning
This chapter discusses function learning methods under covariate shift. Ordinary empirical risk minimization learning is not consistent under covariate shift for misspecified models, and this inconsistency issue can be resolved by considering importance-weighted loss functions. Here, various importance-weighted empirical risk minimization methods are introduced, including least squares and Huber’s method for regression, and Fisher discriminant analysis, logistic regression, support vector machines, and boosting for classification. Their adaptive and regularized variants are also described. The numerical behavior of these importance-weighted learning methods is illustrated through experiments.
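The importance-weighting idea can be sketched in a few lines (a toy illustration with assumed Gaussian train and test input densities, not code from the book): weighting each training loss by w(x) = p_test(x)/p_train(x) makes the weighted empirical risk an unbiased estimate of the test risk, so even a misspecified linear model gets fitted where the test distribution puts its mass.

```python
import numpy as np

rng = np.random.default_rng(0)

# Covariate shift: training inputs ~ N(0.5, 0.5^2), test inputs ~ N(1.5, 0.3^2).
# The true target is nonlinear, so a straight-line model is misspecified.
def f(x):
    return np.sinc(x)  # sin(pi x) / (pi x)

x_tr = rng.normal(0.5, 0.5, 500)
y_tr = f(x_tr) + rng.normal(0.0, 0.1, x_tr.size)

def gauss(x, mu, sigma):
    return np.exp(-(x - mu) ** 2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

# Importance weights w(x) = p_test(x) / p_train(x); known by construction here.
w = gauss(x_tr, 1.5, 0.3) / gauss(x_tr, 0.5, 0.5)

# Design matrix for a straight-line fit.
X = np.column_stack([np.ones_like(x_tr), x_tr])

# Ordinary least squares vs importance-weighted least squares,
# solving X^T W X theta = X^T W y with W = diag(w).
theta_ols = np.linalg.solve(X.T @ X, X.T @ y_tr)
W = np.diag(w)
theta_iw = np.linalg.solve(X.T @ W @ X, X.T @ W @ y_tr)

# Compare squared-error risk under the test input distribution.
x_te = rng.normal(1.5, 0.3, 10000)
X_te = np.column_stack([np.ones_like(x_te), x_te])
risk_ols = np.mean((X_te @ theta_ols - f(x_te)) ** 2)
risk_iw = np.mean((X_te @ theta_iw - f(x_te)) ** 2)
print(risk_ols, risk_iw)  # the weighted fit should generalise better here
```

In this sketch the densities are known by construction; in practice the importance weights must themselves be estimated, and the adaptive and regularized variants the chapter mentions trade the unbiasedness of full weighting for lower variance.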