Anthony Garratt, Kevin Lee, M. Hashem Pesaran, and Yongcheol Shin
- Published in print:
- 2006
- Published Online:
- September 2006
- ISBN:
- 9780199296859
- eISBN:
- 9780191603853
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/0199296855.003.0013
- Subject:
- Economics and Finance, Econometrics
The final chapter provides some concluding comments, including a summary of the main contributions of the book and an invitation to others to apply the methods in new contexts using the data and code provided in the Appendices.
Pierluigi Frisco
- Published in print:
- 2009
- Published Online:
- September 2009
- ISBN:
- 9780199542864
- eISBN:
- 9780191715679
- Item type:
- book
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199542864.001.0001
- Subject:
- Mathematics, Applied Mathematics, Mathematical Biology
How could we use living cells to perform computation? Would our definition of computation change as a consequence? Could such a cell-computer outperform digital computers? These are some of the questions that the study of Membrane Computing tries to answer, and they underlie what this monograph treats. The focus here is on two lines of research: the descriptional and the computational complexity of models in Membrane Computing. The book reports results for only some of the models present in this framework, but those considered represent a very relevant part of all the models introduced so far in the study of Membrane Computing. They are among the most studied models in the field, and they cover a broad range of features (using symbol objects or string objects, based only on communication, inspired by intra- and intercellular processes, having or not having a tree as underlying structure, etc.) that conveys the enormous flexibility of this framework. Links with biology and Petri nets recur throughout the book. The book also aims to inspire research: it gives suggestions for research at various levels of difficulty and clearly indicates their importance and the relevance of the possible outcomes. Readers new to this field of research will find the examples provided particularly useful in understanding the topics treated.
Patrick Dattalo
- Published in print:
- 2008
- Published Online:
- January 2009
- ISBN:
- 9780195315493
- eISBN:
- 9780199865475
- Item type:
- book
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780195315493.001.0001
- Subject:
- Social Work, Research and Evaluation
Sample size determination is an important and often difficult step in planning an empirical study. From a statistical perspective, sample size depends on the following factors: the type of analysis to be performed, the desired precision of estimates, the kind and number of comparisons to be made, the number of variables to be examined, and the heterogeneity of the population to be sampled. Other important considerations include feasibility, such as ethical limitations on access to a population of interest and the availability of time and money. The primary assumption of the book is that, within the context of ethical and practical limitations, efforts to obtain samples of appropriate size and quality remain an important and viable component of social science research. This text describes the available approaches for estimating sample size in social work research and discusses their strengths and weaknesses: power analysis; heuristics or rules of thumb; confidence intervals; computer-intensive strategies; and ethical and cost considerations. In addition, it discusses strategies for mitigating pressures to increase sample size, such as an emphasis on model parsimony (e.g., fewer dependent and independent variables), simpler study designs, an emphasis on replication, and careful planning of analyses. The book covers sample-size determination for advanced and emerging statistical strategies, such as structural equation modeling, multi-level analysis, repeated measures MANOVA, and repeated measures ANOVA, which are not discussed in other texts.
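As a concrete illustration of the power-analysis approach mentioned in the abstract, here is a minimal sketch (not from the book) that approximates the per-group sample size for a two-sided, two-sample t-test using the normal approximation; the default values and the example effect size are illustrative assumptions.

```python
# Minimal power-analysis sketch (illustrative, not from the book):
# approximate sample size per group for a two-sided, two-sample t-test
# via the normal approximation.
import math
from scipy.stats import norm

def n_per_group(delta, sigma, alpha=0.05, power=0.80):
    """Approximate n per group to detect a mean difference `delta`
    given a common standard deviation `sigma`."""
    z_alpha = norm.ppf(1 - alpha / 2)  # two-sided critical value
    z_beta = norm.ppf(power)           # quantile for the desired power
    n = 2 * ((z_alpha + z_beta) * sigma / delta) ** 2
    return math.ceil(n)                # round up to be conservative

# Example: detecting a half-standard-deviation effect needs ~63 per group
print(n_per_group(delta=0.5, sigma=1.0))
```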
Mark Newman
- Published in print:
- 2010
- Published Online:
- September 2010
- ISBN:
- 9780199206650
- eISBN:
- 9780191594175
- Item type:
- book
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199206650.001.0001
- Subject:
- Physics, Theoretical, Computational, and Statistical Physics
The scientific study of networks, including computer networks, social networks, and biological networks, has received an enormous amount of interest in the last few years. The rise of the Internet and the wide availability of inexpensive computers have made it possible to gather and analyze network data on a large scale, and the development of a variety of new theoretical tools has allowed us to extract new knowledge from many different kinds of networks. The study of networks is broadly interdisciplinary, and important developments have occurred in many fields, including mathematics, physics, computer and information sciences, biology, and the social sciences. This book brings together the most important breakthroughs in each of these fields and presents them in a coherent fashion, highlighting the strong interconnections between work in different areas. Subjects covered include: the measurement and structure of networks in many branches of science; methods for analyzing network data, including methods developed in physics, statistics, and sociology; the fundamentals of graph theory, computer algorithms, and spectral methods; mathematical models of networks, including random graph models and generative models; and theories of dynamical processes taking place on networks.
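By way of illustration of two of the subjects listed above, the following short sketch (not from the book) generates an Erdős–Rényi random graph and measures its degree distribution; it assumes the networkx package, and the graph size and edge probability are arbitrary choices.

```python
# Quick illustration (not from the book): a random-graph model and a
# basic network measurement. Requires the networkx package.
import networkx as nx
from collections import Counter

G = nx.erdos_renyi_graph(n=1000, p=0.005, seed=1)  # the G(n, p) model

degrees = [d for _, d in G.degree()]
mean_degree = sum(degrees) / G.number_of_nodes()   # expected ~ (n - 1) * p
degree_distribution = Counter(degrees)

print(f"mean degree = {mean_degree:.2f}")
print(sorted(degree_distribution.items())[:5])     # lowest-degree counts
```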
B. Jack Copeland (ed.)
- Published in print:
- 2005
- Published Online:
- January 2008
- ISBN:
- 9780198565932
- eISBN:
- 9780191714016
- Item type:
- book
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780198565932.001.0001
- Subject:
- Mathematics, History of Mathematics
The mathematical genius Alan Turing (1912-1954) was one of the greatest scientists and thinkers of the 20th century. Now well known for his crucial wartime role in breaking the ENIGMA code, he was the first to conceive of the fundamental principle of the modern computer — the idea of controlling a computing machine's operations by means of coded instructions, stored in the machine's ‘memory’. In 1945, Turing drew up his revolutionary design for an electronic computing machine — his Automatic Computing Engine (‘ACE’). A pilot model of the ACE ran its first programme in 1950 and the production version, the ‘DEUCE’, went on to become a cornerstone of the fledgling British computer industry. The first ‘personal’ computer was based on Turing's ACE. This book describes Turing's struggle to build the modern computer. It contains first-hand accounts by Turing and by the pioneers of computing who worked with him. The book describes the hardware and software of the ACE and contains chapters describing Turing's path-breaking research in the fields of Artificial Intelligence (AI) and Artificial Life (A-Life).
David Robey
- Published in print:
- 2000
- Published Online:
- October 2011
- ISBN:
- 9780198184980
- eISBN:
- 9780191674419
- Item type:
- book
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780198184980.001.0001
- Subject:
- Literature, European Literature
The importance of sound in poetry is indisputable, yet it is not at all an easy subject to discuss, and is rarely treated systematically by literary scholars. This book uses a variety of computer-based processes to construct a systematic analytical description of the sounds of Dante's Divine Comedy in the sense of their overall distribution within the text. The description is developed through a comparative treatment of the same features in a range of related texts, with a view to defining the distinctive characteristics of Dante's practice; and by a discussion of the function and effect of sounds in the work, with special attention to unusually high incidences of particular features. The book is thus both a contribution to the scholarly debate about Dante's poem, and an illustration and discussion of the ways in which new electronic technology can be used for this kind of purpose.
James W. Cortada
- Published in print:
- 2007
- Published Online:
- January 2008
- ISBN:
- 9780195165869
- eISBN:
- 9780199868025
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780195165869.003.0004
- Subject:
- Business and Management, Business History
This chapter discusses technologies adopted by the law enforcement community over a half century. Specifically, it looks at the use of computing by policing agencies, courts, and corrections, with a brief introduction to the early history of computer crime, which now represents a new class of criminal activity made possible by the existence of the digital hand.
Michael Heim
- Published in print:
- 1994
- Published Online:
- October 2011
- ISBN:
- 9780195092585
- eISBN:
- 9780199852987
- Item type:
- book
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780195092585.001.0001
- Subject:
- Philosophy, Metaphysics/Epistemology
Computers have dramatically altered life in the late 20th century. Today we can draw on worldwide computer links, speeding up communications for radio, newspapers, and television. Ideas fly back and forth and circle the globe at the speed of electricity. And just around the corner lurks full-blown virtual reality, in which we will be able to immerse ourselves in a computer simulation not only of the actual physical world, but of any imagined world. As we begin to move in and out of a computer-generated world, this book asks, how will the way we perceive our world change? This book considers this and other philosophical issues of the Information Age. With an eye for the dark as well as the bright side of computer technology, it explores the logical and historical origins of our computer-generated world and speculates about the future direction of our computerized lives. The book discusses such topics as the effect of word-processing on the English language. The book also looks into the new kind of literacy promised by Hypertext. And it also probes the notion of virtual reality, “cyberspace”—the computer-simulated environments that have captured the popular imagination and may ultimately change the way we define reality itself. Just as the definition of interface itself has evolved from the actual adaptor plug used to connect electronic circuits into human entry into a self-contained cyberspace, so too will the notion of reality change with the current technological drive. Like the introduction of the automobile, the advent of virtual reality will change the whole context in which our knowledge and awareness of life are rooted. And along the way, the book covers such intriguing topics as how computers have altered our thought habits, how we will be able to distinguish virtual from real reality, and the appearance of virtual reality in popular culture (as in Star Trek's holodeck, William Gibson's Neuromancer, and Stephen King's Lawnmower Man).
Gary A. Glatzmaier
- Published in print:
- 2013
- Published Online:
- October 2017
- ISBN:
- 9780691141725
- eISBN:
- 9781400848904
- Item type:
- book
- Publisher:
- Princeton University Press
- DOI:
- 10.23943/princeton/9780691141725.001.0001
- Subject:
- Physics, Particle Physics / Astrophysics / Cosmology
This book provides readers with the skills they need to write computer codes that simulate convection, internal gravity waves, and magnetic field generation in the interiors and atmospheres of rotating planets and stars. Using a teaching method perfected in the classroom, the book begins by offering a step-by-step guide on how to design codes for simulating nonlinear time-dependent thermal convection in a 2D box using Fourier expansions in the horizontal direction and finite differences in the vertical direction. It then describes how to implement more efficient and accurate numerical methods and more realistic geometries in two and three dimensions. The third part of the book demonstrates how to incorporate more sophisticated physics, including the effects of magnetic field, density stratification, and rotation. The book features numerous exercises throughout, and is an ideal textbook for students and an essential resource for researchers. It explains how to create codes that simulate the internal dynamics of planets and stars, and builds on basic concepts and simple methods. The book shows how to improve the efficiency and accuracy of the numerical methods. It considers more relevant geometries and boundary conditions.
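To give a flavor of the hybrid discretization the abstract describes, here is a minimal sketch (not the book's code) that differentiates a 2D field spectrally in the periodic horizontal direction and with second-order centered finite differences in the vertical; the grid sizes, domain, and sample field are placeholder assumptions.

```python
# Minimal sketch of a hybrid discretization: Fourier (spectral)
# derivatives in the periodic horizontal direction, centered finite
# differences in the bounded vertical direction. Illustrative only.
import numpy as np

nx, nz = 64, 65                    # horizontal points, vertical levels
Lx, Lz = 2.0, 1.0                  # domain size (assumed)
x = np.linspace(0, Lx, nx, endpoint=False)   # periodic in x
z = np.linspace(0, Lz, nz)                   # bounded in z
dz = z[1] - z[0]
X, Z = np.meshgrid(x, z, indexing="ij")

T = np.sin(2 * np.pi * X / Lx) * np.sin(np.pi * Z / Lz)  # sample field

# d/dx via FFT: multiply each Fourier mode by i*k
k = 2 * np.pi * np.fft.fftfreq(nx, d=Lx / nx)            # wavenumbers
dTdx = np.real(np.fft.ifft(1j * k[:, None] * np.fft.fft(T, axis=0), axis=0))

# d/dz via second-order centered differences on the interior
dTdz = np.empty_like(T)
dTdz[:, 1:-1] = (T[:, 2:] - T[:, :-2]) / (2 * dz)
dTdz[:, 0] = (T[:, 1] - T[:, 0]) / dz        # one-sided at boundaries
dTdz[:, -1] = (T[:, -1] - T[:, -2]) / dz
```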
Željko Ivezić, Andrew J. Connolly, Jacob T. VanderPlas, and Alexander Gray
- Published in print:
- 2014
- Published Online:
- October 2017
- ISBN:
- 9780691151687
- eISBN:
- 9781400848911
- Item type:
- book
- Publisher:
- Princeton University Press
- DOI:
- 10.23943/princeton/9780691151687.001.0001
- Subject:
- Physics, Particle Physics / Astrophysics / Cosmology
As telescopes, detectors, and computers grow ever more powerful, the volume of data at the disposal of astronomers and astrophysicists will enter the petabyte domain, providing accurate measurements for billions of celestial objects. This book provides a comprehensive and accessible introduction to the cutting-edge statistical methods needed to efficiently analyze complex data sets from astronomical surveys such as the Panoramic Survey Telescope and Rapid Response System, the Dark Energy Survey, and the upcoming Large Synoptic Survey Telescope. It serves as a practical handbook for graduate students and advanced undergraduates in physics and astronomy, and as an indispensable reference for researchers. The book presents a wealth of practical analysis problems, evaluates techniques for solving them, and explains how to use various approaches for different types and sizes of data sets. For all applications described in the book, Python code and example data sets are provided. The supporting data sets have been carefully selected from contemporary astronomical surveys (for example, the Sloan Digital Sky Survey) and are easy to download and use. The accompanying Python code is publicly available, well documented, and follows uniform coding standards. Together, the data sets and code enable readers to reproduce all the figures and examples, evaluate the methods, and adapt them to their own fields of interest.
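As a small taste of the kind of likelihood-based analysis the book covers, here is a sketch in plain NumPy (deliberately not drawn from the book's accompanying code) of the maximum-likelihood, inverse-variance combination of heteroscedastic measurements; the simulated values are illustrative assumptions.

```python
# Illustrative sketch (not the book's code): combining heteroscedastic
# measurements of one quantity via the maximum-likelihood
# (inverse-variance weighted) mean.
import numpy as np

rng = np.random.default_rng(42)
true_flux = 10.0
errors = rng.uniform(0.5, 3.0, size=200)   # per-point sigma_i (assumed)
fluxes = rng.normal(true_flux, errors)     # simulated measurements

weights = 1.0 / errors**2
flux_ml = np.sum(weights * fluxes) / np.sum(weights)
flux_err = np.sqrt(1.0 / np.sum(weights))  # uncertainty of the ML mean

print(f"{flux_ml:.3f} +/- {flux_err:.3f}")
```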
Gary A. Glatzmaier
- Published in print:
- 2013
- Published Online:
- October 2017
- ISBN:
- 9780691141725
- eISBN:
- 9781400848904
- Item type:
- chapter
- Publisher:
- Princeton University Press
- DOI:
- 10.23943/princeton/9780691141725.003.0006
- Subject:
- Physics, Particle Physics / Astrophysics / Cosmology
This chapter focuses on internal gravity waves in a stable thermal stratification. When the amplitude of the fluid velocity is small relative to the amplitude of the phase velocity, a linear analysis, which neglects advection, provides insight into the relation between the wavelength and frequency of internal gravity waves. Furthermore, when thermal and viscous diffusion play relatively minor roles, the system can be further simplified by neglecting diffusion. The chapter first describes the linear dispersion relation before discussing the computer code modifications and simulations. In particular, it explains what modifications would be needed to convert one's thermal convection code to a code that simulates internal gravity waves, including the nonlinear and diffusive terms. Finally, it considers the computer analysis of wave energy.
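For orientation, the linear, non-diffusive analysis the abstract refers to leads, in the standard Boussinesq treatment, to the dispersion relation omega^2 = N^2 kx^2 / (kx^2 + kz^2), where N is the buoyancy frequency; the short sketch below evaluates it for illustrative wavenumbers (the numeric values are assumptions, not taken from the chapter).

```python
# Linear, non-diffusive dispersion relation for internal gravity waves:
# omega^2 = N^2 * kx^2 / (kx^2 + kz^2), N = buoyancy frequency.
import numpy as np

def gravity_wave_frequency(kx, kz, N):
    """Frequency of a linear internal gravity wave with horizontal
    wavenumber kx and vertical wavenumber kz in a stratification with
    buoyancy (Brunt-Vaisala) frequency N."""
    return N * np.abs(kx) / np.sqrt(kx**2 + kz**2)

N = 1.0e-2                                       # s^-1 (illustrative)
kx, kz = 2 * np.pi / 1.0e4, 2 * np.pi / 2.0e3    # 10 km and 2 km waves
omega = gravity_wave_frequency(kx, kz, N)
print(omega, "<=", N)   # wave frequencies are always bounded above by N
```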
Susan W. Brenner
- Published in print:
- 2007
- Published Online:
- January 2009
- ISBN:
- 9780195333480
- eISBN:
- 9780199855353
- Item type:
- book
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780195333480.001.0001
- Subject:
- Law, Intellectual Property, IT, and Media Law
In this book, Susan Brenner analyzes the complex and evolving interactions between law and technology and provides a thorough and detailed account of the law in technology at the beginning of the 21st century. She draws upon recent technological advances, evaluating how developing technologies may alter how humans interact with each other and with their environment. She analyzes the development of technology as shifting from one of “use” to one of “interaction,” and argues that this interchange requires us to reconceptualize our approach to legal rules, which were originally designed to prevent the “misuse” of older technologies. Brenner argues that as technologies continue to evolve, the laws targeting the relationship between humans and technology must become, and should remain, neutral. She explains how older technologies rely on human implementation, but new, “smart” technologies are intelligent and autonomous, in varying degrees. This, she notes, will eventually lead to the ultimate progression in our relationship with technology: the fusion of human physiology and technology. Law in an Era of “Smart” Technology provides a detailed, historically-grounded analysis of why our traditional relationship with technology is evolving in ways that require a corresponding shift in our law.
Averil Cameron (ed.)
- Published in print:
- 2003
- Published Online:
- January 2012
- ISBN:
- 9780197262924
- eISBN:
- 9780191734434
- Item type:
- book
- Publisher:
- British Academy
- DOI:
- 10.5871/bacad/9780197262924.001.0001
- Subject:
- History, Historiography
This book presents an interdisciplinary discussion of the important methodological tool known as prosopography — the collection of all known information about individuals within a given period. With the advent of computer technology it is now possible to gather and store such information in increasingly sophisticated and searchable databases, which can bring a new dimension to traditional historical research. The book surveys the transition in prosopographical research from more traditional methods to the new technology, and discusses the central role of the British Academy, as well as that of French, German and Austrian academic institutions, in developing prosopographical research on the Later Roman Empire, Byzantium and now Anglo-Saxon and other periods. The chapters discuss both national histories of the discipline and its potential for future research. The book demonstrates mutual benefits and complementarity in such studies between the use of new technology and the highest standards of traditional scholarship, and in doing so it sets forth new perspectives and methodologies for future work.
Vaclav Smil
- Published in print:
- 2006
- Published Online:
- September 2006
- ISBN:
- 9780195168754
- eISBN:
- 9780199783601
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/0195168755.003.0005
- Subject:
- Economics and Finance, Economic History
Private transportation was transformed by mass ownership of automobiles while long-distance public transport benefited from new high-speed trains and from affordable flying. Freight transportation was transformed by containers moved by ships, trains, and trucks. Communication and the processing and dissemination of information were revolutionized first by transistors, then by integrated circuits and microprocessors, the key components of mainframe and personal computers, televisions, and a multitude of electronic devices, many of them now taking advantage of the Internet.
Franco Malerba, Richard Nelson, Luigi Orsenigo, and Sidney Winter
- Published in print:
- 2006
- Published Online:
- May 2006
- ISBN:
- 9780199290475
- eISBN:
- 9780191603495
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/0199290474.003.0007
- Subject:
- Economics and Finance, Economic Systems
This chapter analyzes the changing boundaries of firms in terms of vertical integration and dis-integration (specialization) in dynamic and uncertain technological and market environments. In particular, it addresses the question of stability and change in firms’ decisions to ‘make or buy’ in contexts characterized by periods of technological revolutions punctuating periods of relative technological stability and smooth technical progress. The chapter is inspired by the case of the computer and semiconductor industries, and proposes the building blocks of a model in the ‘history-friendly’ style, showing how alternative dynamics of demand and technical change might generate profoundly different patterns of evolution in the two industries. The main argument proposed concerns the role of co-evolution in the upstream and downstream industries in explaining the changing boundaries of firms.
James W. Cortada
- Published in print:
- 2007
- Published Online:
- January 2008
- ISBN:
- 9780195165869
- eISBN:
- 9780199868025
- Item type:
- book
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780195165869.001.0001
- Subject:
- Business and Management, Business History
This book, the third of three volumes, completes the sweeping survey of the effect of computers on American industry begun in the first volume and continued in the second volume. It turns finally to the public sector, examining how computers have fundamentally changed the nature of work in government and education. This book goes far beyond generalizations about the Information Age to the specifics of how industries have functioned, now function, and will function in the years to come. The book provides a broad overview of computing's and telecommunications' role in the entire public sector, including federal, state, and local governments, and in K-12 and higher education. Beginning in 1950, when commercial applications of digital technology began to appear, the book examines the unique ways different public sector industries adopted new technologies, showcasing the manner in which their innovative applications influenced other industries, as well as the US economy as a whole. The book builds on the surveys presented in the first volume, which examined sixteen manufacturing, process, transportation, wholesale and retail industries, and the second volume, which examined over a dozen financial, telecommunications, media, and entertainment industries. This book completes the trilogy and provides a picture of what the infrastructure of the Information Age really looks like and how we got there.
John Horty
- Published in print:
- 2009
- Published Online:
- October 2011
- ISBN:
- 9780199732715
- eISBN:
- 9780199852628
- Item type:
- book
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199732715.001.0001
- Subject:
- Philosophy, Philosophy of Language
This book explores the difficulties presented for Gottlob Frege's semantic theory, as well as its modern descendants, by the treatment of defined expressions. The book begins by focusing on the psychological constraints governing Frege's notion of sense, or meaning, and argues that, given these constraints, even the treatment of simple stipulative definitions led Frege to important difficulties. This book suggests ways out of these difficulties that are both philosophically and logically plausible and Fregean in spirit. This discussion is then connected to a number of more familiar topics, such as indexicality and the discussion of concepts in recent theories of mind and language. The latter part of the book, after introducing a simple semantic model of senses as procedures, considers the problems that definitions present for Frege's idea that the sense of an expression should mirror its grammatical structure. The requirement can be satisfied, the book argues, only if defined expressions—and incomplete expressions as well—are assigned senses of their own, rather than treated contextually. The book then explores one way in which these senses might be reified within the procedural model, drawing on ideas from work in the semantics of computer programming languages. With its combination of technical semantics and history of philosophy, the book tackles some of the hardest questions in the philosophy of language.
Mike Finnis
- Published in print:
- 2003
- Published Online:
- January 2010
- ISBN:
- 9780198509776
- eISBN:
- 9780191709180
- Item type:
- book
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780198509776.001.0001
- Subject:
- Physics, Atomic, Laser, and Optical Physics
There is a continuing growth of interest in the computer simulation of materials at the atomic scale, using a variety of academic and commercial computer programs. In all such programs there is some physical model of the interatomic forces. For a student or researcher, the basis of such models is often shrouded in mystery. It is usually unclear how well founded they are, since it is hard to find a discussion of the physical assumptions that have been made in their construction. The lack of clear understanding of the scope and limitations of a given model may lead to its innocent misuse, resulting either in unfair criticism of the model or in the dissemination of nonsensical results. In this book, models of interatomic forces are derived from a common physical basis, namely the density functional theory. The book includes the detailed derivation of pairwise potentials in simple metals, tight-binding models from the simplest to the most sophisticated (self-consistent) kind, and ionic models. It provides a critical appreciation of the broad range of models in current use, and provides the tools for understanding other variants that are described in the literature. Some of the material is new, and some pointers are given to possible future avenues of model development.
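To make concrete what a pairwise interatomic potential model computes, here is a generic sketch using a Lennard-Jones form purely for illustration (the book itself derives metal-specific potentials from density functional theory; the parameters and geometry below are arbitrary assumptions).

```python
# Generic pair-potential sketch (Lennard-Jones form chosen only for
# concreteness): total energy of a cluster as a sum over atom pairs,
# V(r) = 4*eps*[(sigma/r)^12 - (sigma/r)^6].
import numpy as np

def lj_energy(positions, epsilon=1.0, sigma=1.0):
    """Total Lennard-Jones energy of a set of atomic positions."""
    energy = 0.0
    n = len(positions)
    for i in range(n):
        for j in range(i + 1, n):
            r = np.linalg.norm(positions[i] - positions[j])
            sr6 = (sigma / r) ** 6
            energy += 4.0 * epsilon * (sr6 * sr6 - sr6)
    return energy

# Three atoms near the vertices of a triangle (arbitrary reduced units)
atoms = np.array([[0.0, 0.0, 0.0], [1.1, 0.0, 0.0], [0.55, 0.95, 0.0]])
print(lj_energy(atoms))
```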
James W. Cortada
- Published in print:
- 2004
- Published Online:
- September 2007
- ISBN:
- 9780195165883
- eISBN:
- 9780199789672
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780195165883.003.0004
- Subject:
- Business and Management, Business History
This chapter is a history of key computer applications in manufacturing across three periods of time, beginning in the 1940s and extending to the early 2000s. Key uses included business and accounting, numerical control, computer-integrated manufacturing, CAD/CAM (Computer-Aided Design/Computer-Aided Manufacturing), robotics, and flexible manufacturing systems (FMS). It concludes with a description of supply chains and the extent of deployment of all these uses in manufacturing.
Ziheng Yang
- Published in print:
- 2006
- Published Online:
- April 2010
- ISBN:
- 9780198567028
- eISBN:
- 9780191728280
- Item type:
- book
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780198567028.001.0001
- Subject:
- Biology, Evolutionary Biology / Genetics
The field of molecular evolution has experienced explosive growth in recent years due to the rapid accumulation of genetic sequence data, continuous improvements to computer hardware and software, and the development of sophisticated analytical methods. The increasing availability of large genomic data sets requires powerful statistical methods to analyse and interpret them, generating both computational and conceptual challenges for the field. This book provides a comprehensive coverage of modern statistical and computational methods used in molecular evolutionary analysis, such as maximum likelihood and Bayesian statistics. It describes the models, methods and algorithms that are most useful for analysing the ever-increasing supply of molecular sequence data, with a view to furthering our understanding of the evolution of genes and genomes. The book emphasizes essential concepts rather than mathematical proofs. It includes detailed derivations and implementation details, as well as numerous illustrations, worked examples, and exercises.
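As a minimal example of the kind of calculation this field builds on (not taken from the book's text), the following computes the Jukes-Cantor (JC69) evolutionary distance between two aligned DNA sequences, d = -(3/4) ln(1 - 4p/3), where p is the observed proportion of differing sites; the sample sequences are made up for illustration.

```python
# Jukes-Cantor (JC69) distance between two aligned DNA sequences:
# expected substitutions per site, correcting for multiple hits.
import math

def jc69_distance(seq1, seq2):
    """JC69 evolutionary distance between two aligned sequences."""
    assert len(seq1) == len(seq2), "sequences must be aligned"
    diffs = sum(a != b for a, b in zip(seq1, seq2))
    p = diffs / len(seq1)          # observed proportion of differences
    if p >= 0.75:                  # saturation: distance is undefined
        return float("inf")
    return -0.75 * math.log(1 - 4 * p / 3)

print(jc69_distance("GATTACAGATTACA", "GATTGCAGTTTACA"))  # ~0.159
```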