Adrienne Lehrer
- Published in print:
- 2009
- Published Online:
- September 2009
- ISBN:
- 9780195307931
- eISBN:
- 9780199867493
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780195307931.003.0014
- Subject:
- Linguistics, Theoretical Linguistics
The treatment of language by philosophers has been biased toward scientific language, requiring precision in reference, denotation, and truth conditions. Scientific language, however, is special, and most conversation does not require such precision. Wine scientists, especially enologists writing scientific articles, must agree on terminology if communication is to succeed. Ann Noble, among others, has developed olfactory standards so that others can learn to discriminate and label accurately.
Kerwin Lee Klein
- Published in print:
- 2011
- Published Online:
- March 2012
- ISBN:
- 9780520268814
- eISBN:
- 9780520948297
- Item type:
- book
- Publisher:
- University of California Press
- DOI:
- 10.1525/california/9780520268814.001.0001
- Subject:
- History, Historiography
This book describes major changes in the conceptual language of the humanities, particularly in the discourse of history. The chapters trace the development of academic vocabularies through the dynamically shifting cultural, political, and linguistic landscapes of the twentieth century. It considers the rise and fall of the “philosophy of history” and discusses past attempts to imbue historical discourse with scientific precision. The book explores the development of the “meta-narrative” and the post-Marxist view of history and shows how the present resurgence of old words—such as “memory”—in new contexts is providing a way to address marginalized peoples. In analyzing linguistic changes in the North American academy, this book ties semantic shifts in academic discourse to key trends in American society, culture, and politics.
Alexa Riehle, Sébastien Roux, Bjørg Elisabeth Kilavik, and Sonja Grün
- Published in print:
- 2010
- Published Online:
- January 2011
- ISBN:
- 9780195395273
- eISBN:
- 9780199863518
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780195395273.003.0006
- Subject:
- Neuroscience, Sensory and Motor Systems
The temporal coding hypothesis suggests that not only changes in firing rate but also precise spike timing, especially synchrony, constitutes an important part of the representational substrate for perception and action. In this framework, the concept of cell assemblies uses synchrony as an additional dimension to firing rate, as a candidate for information processing. Consequently, the observation of spike synchrony between neurons might be interpreted as the activation of a functional cell assembly. When, in an instructed delay task, prior information is provided about movement parameters, such as movement direction (spatial parameters) or the moment when to move (temporal parameters), movement initiation is faster. Cortical neurons selectively modulate their activity in relation to this information. To indicate the end of an instructed delay, motor cortical neurons significantly synchronize their activity at the moment of signal expectancy, often without any detectable modulation in firing rate. The observed increase in the temporal precision of synchrony toward the end of an instructed delay is interpreted as facilitating the efficiency of the motor output, leading to an increase in performance speed. Finally, the chapter shows that the timing of the task is dynamically represented in the temporal structure of significant spike synchrony at the population level, which is shaped by learning and practice. The emergence of significant synchrony becomes more structured; that is, it becomes stronger and more localized in time with practice, in parallel with a decrease in firing rate and an improvement in behavioral performance. Performance optimization through practice might therefore be achieved by boosting the computational contribution of spike synchrony, allowing an overall reduction in population activity.
Nicole Bolleyer
- Published in print:
- 2009
- Published Online:
- September 2009
- ISBN:
- 9780199570607
- eISBN:
- 9780191721953
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199570607.003.0007
- Subject:
- Political Science, Comparative Politics, European Union
This chapter argues that the interplay of institutionalization and integration of intergovernmental arrangements affects the precision, substantial depth, and function of informal intergovernmental agreements, which, in turn, affect the effectiveness of intergovernmental cooperation. It comparatively assesses non-binding intergovernmental agreements issued in 2004 and 2005 by both generalist and policy-specific arrangements in Canada, Switzerland, and the United States. The findings indicate that intragovernmental dynamics simultaneously affect the set-up of arrangements and the nature of agreements. While institutionalization has a direct impact on agreements as well, the findings imply that it matters most in those contexts where the intragovernmental incentives are favourable towards strong arrangements in the first place. Federal reforms – as far as reforms are supposed to counteract dominant intragovernmental incentive structures – are unlikely to have a strong impact on intergovernmental cooperation.
Donald Laming
- Published in print:
- 1997
- Published Online:
- January 2008
- ISBN:
- 9780198523420
- eISBN:
- 9780191712425
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780198523420.003.0008
- Subject:
- Psychology, Cognitive Neuroscience
Different stimulus continua have different Weber fractions. If Weber's Law holds, matching jnds across continua generates a power law relation. If numbers be regarded as an artificial continuum, Stevens' Power Law results; so too does Ekman's Law (Weber's Law applied to sensation). This chapter looks at the relationship between the Weber fraction and the power law exponent, at the magnitude estimation of 1 kHz tones (a stimulus continuum that deviates from Weber's Law), and at the precision of magnitude estimates in relation to thresholds. The idea that the power law results from a matching of jnds across continua cannot be sustained.
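The jnd-matching argument summarized in this abstract can be sketched as a short derivation (a standard textbook reconstruction, not text from the chapter; the Weber fractions $w_A, w_B$ and absolute thresholds $S_{0A}, S_{0B}$ are assumed notation):

```latex
% Weber's Law on continuum A: one jnd at intensity S has size \Delta S = w_A S.
% Counting jnds up from the absolute threshold gives a Fechnerian scale:
\[
\Delta S = w_A S
\quad\Longrightarrow\quad
n_A(S) = \frac{\ln(S/S_{0A})}{\ln(1+w_A)} .
\]
% Matching jnd counts across two continua, n_A(S_A) = n_B(S_B), yields
\[
\ln\frac{S_B}{S_{0B}}
  = \frac{\ln(1+w_B)}{\ln(1+w_A)}\,\ln\frac{S_A}{S_{0A}},
\qquad\text{i.e.}\qquad
S_B \propto S_A^{\beta},
\quad
\beta = \frac{\ln(1+w_B)}{\ln(1+w_A)} \approx \frac{w_B}{w_A}.
\]
```

When continuum $B$ is the artificial "number" continuum, the matched relation has the form of Stevens' Power Law with exponent roughly the ratio of Weber fractions; the chapter's conclusion is that the observed exponents and Weber fractions do not in fact fit this relation.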
Thomas J. Stohlgren
- Published in print:
- 2006
- Published Online:
- September 2007
- ISBN:
- 9780195172331
- eISBN:
- 9780199790395
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780195172331.003.0003
- Subject:
- Biology, Plant Sciences and Forestry
This chapter suggests an experimental approach to vegetation sampling in general, and to plant diversity sampling in particular. It is assumed that most available sampling techniques are like hypotheses that must be proven — proven accurate, precise, complete, and cost-efficient. Only after careful observation, repeated trials, and comparisons with other techniques can the hypotheses (methods) be accepted or rejected. A framework for sampling plant diversity includes initial decisions on goals, objectives, scale, and sampling design. Sampling design is further complicated by decisions on plot size and shape, sample size, intensity of sampling, and pattern of sampling, which interact and affect the results of plant diversity studies. The generalized framework that follows may help in planning landscape-scale plant diversity studies, and in evaluating the strengths and weaknesses of alternative study designs and field techniques.
Nicholas J. J. Smith
- Published in print:
- 2008
- Published Online:
- January 2009
- ISBN:
- 9780199233007
- eISBN:
- 9780191716430
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199233007.003.0007
- Subject:
- Philosophy, Metaphysics/Epistemology, Logic/Philosophy of Mathematics
This chapter continues the examination begun in the previous chapter of objections to the fuzzy view of vagueness in particular, and to degree theoretic treatments of vagueness in general. It covers the major remaining objections to the fuzzy view: the problems of artificial precision and sharp boundaries. In response, a new version of the fuzzy view is proposed, called fuzzy plurivaluationism, which combines fuzzy models with semantic indeterminacy of the sort involved in plurivaluationism. The chapter concludes that fuzzy plurivaluationism is the correct theory of vagueness, on the grounds that, first, it is a degree theory — and so satisfies the positive requirement on a theory of vagueness — and, second, it withstands all known objections to degree theories.
William L. Harper
- Published in print:
- 2011
- Published Online:
- May 2012
- ISBN:
- 9780199570409
- eISBN:
- 9780191728679
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199570409.003.0006
- Subject:
- Philosophy, History of Philosophy, Philosophy of Science
Part I argues that the precision of Newton’s moon-test calculation goes beyond what modern least squares assessment can support from his cited data and that his data afford no support for his precession correction to offset the action of the sun, but that Newton is innocent of Westfall’s main accusation of data fudging in the moon-test. Part II argues that Newton’s inference does not depend on his precession correction or on his selection of which lunar distance estimates to include. It argues that a correction for syzygy distances can defend the larger lunar distance Newton assigns in his moon-test of corollary 7 of proposition 37. Appendix 1 discusses the details of Newton’s moon-test calculation from corollary 7 of proposition 37 of book 3. It shows that Newton’s moon-test inference continues to hold up when simplifying assumptions of his basic calculation are replaced by more realistic approximations.
Michael H. Best
- Published in print:
- 2001
- Published Online:
- November 2003
- ISBN:
- 9780198297451
- eISBN:
- 9780191595967
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/0198297459.003.0005
- Subject:
- Economics and Finance, Economic Systems
Boston's Route 128 illustrates the concepts of regional technology capability, technology genealogy, and technology roadmap. Precision machining and complex product systems are regional technological capabilities that have been redefined, and product applications have evolved through a series of technology domain transitions from the mechanical to the electrical and electronic, and via an extension of Moore's Law into the age of nanotechnology and self‐assembly processes. Old industries have gone and new industries have emerged through a regional systems integration process reminiscent of Schumpeterian ‘creative destruction’. The recent transition from a vertical integration to an open‐systems business model has fostered a regional capability to rapidly integrate and reintegrate activities and technologies required for rapid new product development in complex product systems.
Patrick Dattalo
- Published in print:
- 2008
- Published Online:
- January 2009
- ISBN:
- 9780195315493
- eISBN:
- 9780199865475
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780195315493.003.0005
- Subject:
- Social Work, Research and Evaluation
This chapter is organized into two sections. First, additional considerations that can affect sample size are discussed, including ethical concerns, costs, and synthesis of power and precision. Second, recommendations concerning future efforts to refine tactics and techniques for determining sample size are presented.
Xavier Gabaix and David Laibson
- Published in print:
- 2008
- Published Online:
- October 2011
- ISBN:
- 9780195328318
- eISBN:
- 9780199851768
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780195328318.003.0012
- Subject:
- Economics and Finance, Economic History
Models are significant because they are used to provide a supposedly reliable description or representation of the world. Most of the models that scientists attempt to generate and analyze are based on assumptions that are only believed to be true, since such models would not consider irregularities and inconsistencies with common theory. This chapter introduces seven key properties, whether already widely accepted or not yet accepted at all, that a good economic model should possess: 1) parsimony, 2) tractability, 3) conceptual insightfulness, 4) generalizability, 5) falsifiability, 6) empirical consistency, and 7) predictive precision. For this analysis, it is argued that classical optimization assumptions are not necessary for constructing economic models, and that such assumptions should be regarded as hypotheses that require testing.
Paul Glennie and Nigel Thrift
- Published in print:
- 2009
- Published Online:
- October 2011
- ISBN:
- 9780199278206
- eISBN:
- 9780191699979
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199278206.003.0001
- Subject:
- History, British and Irish Medieval History, Social History
As told by Vicenzio Viviani — Galileo Galilei's first biographer and former student — Galileo took an interest in the movement of pendulums some time before 1600 because of the swinging of the oil lamps of the cathedral in Pisa. In his attempts to come up with laws of motion to explain the rhythmic movements of the pendulum, Galileo compared varied lengths of string, weights of the ‘bob’, heights, and different combinations of these in other such experiments. Because of the lack of accurate clocks, Galileo turned to human biology, specifically to pulse rates, in measuring time. Since Galileo was able to use other mechanisms to measure time, all of his practices were labeled ‘modern’. This chapter gives attention to the importance of precision as an indicator of time and to how Galileo is involved with several communities of practice, and how this affects his conduct in analysing clock time.
Paul Glennie and Nigel Thrift
- Published in print:
- 2009
- Published Online:
- October 2011
- ISBN:
- 9780199278206
- eISBN:
- 9780191699979
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199278206.003.0006
- Subject:
- History, British and Irish Medieval History, Social History
‘Everyday life’ and ‘ordinary people’ refer to a broad collection of non-elite people, actions, and places that involve large amounts of temporal specialization in conducting activities out of necessity. There is a need to re-establish the societal and geographical dimensions of everyday clock time, such as time as a resource for expressing time and other related purposes, since this is often taken for granted. This chapter focuses on the use of clock-time in everyday situations and the precision attributed to clock-time practices in early modern England. Attention is drawn to everyday temporal communities wherein clock-time practices contribute to ‘non-disciplinary’ aspects of time and society, instead of to the specialized communities that make use of specific clock times or to disciplinary organizations that produce formal documentation.
Paul Glennie and Nigel Thrift
- Published in print:
- 2009
- Published Online:
- October 2011
- ISBN:
- 9780199278206
- eISBN:
- 9780191699979
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199278206.003.0007
- Subject:
- History, British and Irish Medieval History, Social History
Precision faces a problem similar to that attributed to clock time, but to a lesser degree, since the precise uses of clock times are likely to be more explicit. Superior precision generated documentation, as precision supposedly meant rediscovering matters that had been taken for granted. Precision, in this chapter, is treated broadly rather than through a self-consciously rational analysis. Attention is drawn to how people are able to distinguish and identify the specific times of the day at which they have to perform what most people consider necessary yet ‘everyday’ activities. The chapter examines questions concerned with temporal referencing, the impulses involved in precise timing, and the positive or negative moral connotations that can be associated with precision.
Paul Glennie and Nigel Thrift
- Published in print:
- 2009
- Published Online:
- October 2011
- ISBN:
- 9780199278206
- eISBN:
- 9780191699979
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199278206.003.0009
- Subject:
- History, British and Irish Medieval History, Social History
As seen in the attempts of early societies to keep track of, and establish exact correspondence between, their calendar year and the celestial year, issues regarding precise time-measurement existed even before precise mechanical timekeepers were made available. Although such situations may have brought about new definitions of precision, the attempt to achieve superior precision in any case expanded the horizons of technology and calculation in counting and keeping track of time. This chapter elaborates on the following components of precision: the definition of specific times of the day, the precise measurement of periods of time, the ability to specify the time of an event, the recording of the duration of an event, and the coordination of various events or actions.
Barbara J. Evans
- Published in print:
- 2015
- Published Online:
- May 2016
- ISBN:
- 9780231171182
- eISBN:
- 9780231540070
- Item type:
- chapter
- Publisher:
- Columbia University Press
- DOI:
- 10.7312/columbia/9780231171182.003.0007
- Subject:
- Law, Medical Law
This chapter explores special challenges FDA will face when assessing the safety and effectiveness of 21st-century preventive medical products that aim to avert future disease (as opposed to treating manifest disease). The Food and Drug Administration Amendments Act of 2007 gave the agency important new powers to address these challenges, but to date the agency has not fully tapped them.
Michael Bordag, Galina Leonidovna Klimchitskaya, Umar Mohideen, and Vladimir Mikhaylovich Mostepanenko
- Published in print:
- 2009
- Published Online:
- September 2009
- ISBN:
- 9780199238743
- eISBN:
- 9780191716461
- Item type:
- book
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199238743.001.0001
- Subject:
- Physics, Condensed Matter Physics / Materials, Atomic, Laser, and Optical Physics
The subject of this book is the Casimir effect, i.e., a manifestation of zero-point oscillations of the quantum vacuum in the form of forces acting between closely spaced bodies. It is a purely quantum effect; there is no force acting between neutral bodies in classical electrodynamics. The Casimir effect has become an interdisciplinary subject. It plays an important role in various fields of physics such as condensed matter physics, quantum field theory, atomic and molecular physics, gravitation and cosmology, and mathematical physics. Most recently, the Casimir effect has been applied in nanotechnology and used to obtain constraints on the predictions of unification theories beyond the Standard Model. The book brings together the field-theoretical foundations of this phenomenon, the application of the general theory to real materials, and a comprehensive description of all recently performed measurements of the Casimir force, including the comparison between experiment and theory. There is increasing interest in forces of vacuum origin. Numerous new results obtained during the last few years are not yet reflected in the literature but are very promising for fundamental science and nanotechnology. The book provides a critical assessment of the main results and approaches contained in published journal papers. It also proposes new ideas which are not yet universally accepted but are finding increasing support from experiment.
Stewart Gordon
- Published in print:
- 2006
- Published Online:
- January 2010
- ISBN:
- 9780195177435
- eISBN:
- 9780199864690
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780195177435.003.07
- Subject:
- Music, Performing Practice/Studies
Although we memorize constantly in daily living, four characteristics attend memorization in performance that may not ordinarily be present: complexity of material, level of precision, time pressure, and anxiety. Each of these is discussed, along with mnemonic techniques for achieving a high level of secure memorization.
George Anastaplo
- Published in print:
- 2009
- Published Online:
- September 2011
- ISBN:
- 9780813125336
- eISBN:
- 9780813135243
- Item type:
- chapter
- Publisher:
- University Press of Kentucky
- DOI:
- 10.5810/kentucky/9780813125336.003.0012
- Subject:
- Political Science, American Politics
This chapter deals with the steady pounding to which the by-then virtually undefended German cities were subjected by the American and British air forces. It notes that the civilian casualties from these air raids could not help but be substantial. It cites an article titled “The Morality of Obliteration Bombing”, published by John C. Ford, a New England Jesuit. Ford did not, in this article, speak as a pacifist; he was willing to consider the war against Nazi Germany a just war. He nevertheless condemned as unlawful the systematic killing of noncombatants that necessarily resulted from the air raids on German cities, distinguishing obliteration (or area) bombing from the precision bombing consistent with the long-accepted rules of war.
Cheryl B. Welch
- Published in print:
- 2000
- Published Online:
- October 2011
- ISBN:
- 9780198781318
- eISBN:
- 9780191695414
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780198781318.003.0004
- Subject:
- Philosophy, History of Philosophy, Political Philosophy
This chapter examines the contributions to social and political analysis of Tocqueville's major published texts, Democracy in America I and Democracy in America II. In the first text, he argues that revolution temporarily perverts democracy, as exhibited in France by class warfare, political extremism, and distrust. In the second, he notes the tendency of the revolutionary spirit to intensify mutual distrust and increase individualism. The chapter also considers Tocqueville's loose terminology, which forces the reader to read carefully to discern his specific intentions and the relevant content of discussion. The failure of others to grasp the true significance of his texts is attributed to his lack of precision in the use of important concepts.