Melissa Terras
- Published in print: 2006
- Published Online: September 2007
- ISBN: 9780199204557
- eISBN: 9780191708121
- Item type: book
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780199204557.001.0001
- Subject: Classical Studies, British and Irish History: BCE to 500CE
The ink and stylus tablets discovered at the Roman fort of Vindolanda are a unique resource for scholars of ancient history. However, the stylus tablets in particular are extremely difficult to read. This book details the development of what appears to be the first system constructed to aid experts in the process of reading an ancient document, exploring the extent to which techniques from artificial intelligence can be used to develop a system that could aid historians in reading the stylus texts. Using knowledge elicitation techniques (borrowed from artificial intelligence and engineering science), a model is proposed for how experts construct a reading of a text. A prototype system is presented that can read in image data and produce realistic and plausible textual interpretations of the writing that appears on the documents. Incorporating knowledge elicited from experts working on the texts, and utilizing image processing techniques developed in engineering science to analyze the stylus tablets, the book includes a corpus of letter forms generated from the Vindolanda text corpus, and a detailed description of the architecture of the system. This research presents the first stages towards developing a cognitive visual system that can propagate realistic interpretations from image data.
Jon Williamson
- Published in print: 2004
- Published Online: September 2007
- ISBN: 9780198530794
- eISBN: 9780191712982
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780198530794.003.0001
- Subject: Mathematics, Logic / Computer Science / Mathematical Philosophy
This chapter describes the central claims of the book. From a philosophical point of view, the book argues for an objective Bayesian interpretation of probability and an epistemic interpretation of causality, and claims that these offer a firm foundation for causal modelling. From the computational point of view, the book investigates the relationship between Bayesian nets and maximum entropy methods, and develops a general computational framework for probabilistic and causal reasoning.
Pierluigi Frisco
- Published in print: 2009
- Published Online: September 2009
- ISBN: 9780199542864
- eISBN: 9780191715679
- Item type: book
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780199542864.001.0001
- Subject: Mathematics, Applied Mathematics, Mathematical Biology
How could we use living cells to perform computation? Would our definition of computation change as a consequence? Could such a cell-computer outperform digital computers? These are some of the questions that the study of Membrane Computing tries to answer, and they underlie what this monograph treats. The focus here is on two lines of research: the descriptional and the computational complexity of models in Membrane Computing. In this context the book reports results for only some of the models present in this framework. The models considered represent a very relevant part of all the models introduced so far in the study of Membrane Computing. They are among the most studied models in the field, and they cover a broad range of features (using symbol objects or string objects, based only on communication, inspired by intra- and intercellular processes, having or not having a tree as underlying structure, etc.) that conveys the enormous flexibility of this framework. Links with biology and Petri nets run throughout the book. The book also aims to inspire research: it gives suggestions for research at various levels of difficulty and clearly indicates their importance and the relevance of the possible outcomes. Readers new to this field of research will find the examples provided particularly useful for understanding the topics treated.
Pierluigi Frisco
- Published in print: 2009
- Published Online: September 2009
- ISBN: 9780199542864
- eISBN: 9780191715679
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780199542864.003.0001
- Subject: Mathematics, Applied Mathematics, Mathematical Biology
This chapter gives a very brief introduction to computability, emphasising concepts that play an important role in the book. The chapter describes how in the 1930s the interest of Alan Turing in describing in mathematical terms the activity of computers (the clerks who then performed computations) led to the definition of an abstract device called a Turing machine, to the start of the study of computability, and to the enunciation of the Church-Turing thesis. In a similar way, the chapter indicates how in 2000 the internal organization of eukaryotic cells inspired Gheorghe Păun to define membrane systems, also called P systems, where ‘P’ stands for ‘Păun’. Moreover, the chapter explains some of the advantages offered by membrane computing, the field of research that uses membrane systems to define computability models in order to study computation and computational complexity issues and to model processes in biology, linguistics, economics, etc., with respect to more classical approaches.
John Garrison and Raymond Chiao
- Published in print: 2008
- Published Online: September 2008
- ISBN: 9780198508861
- eISBN: 9780191708640
- Item type: book
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780198508861.001.0001
- Subject: Physics, Atomic, Laser, and Optical Physics
Quantum optics is the field of physics describing the interaction of individual photons with matter, but in recent years it has expanded beyond pure physics to become an important driving force for technological innovation. This book starts with an elementary description of the underlying physics and then builds up a more advanced treatment. The theory begins with the quantum description of the simple harmonic oscillator, and is subsequently extended to provide the tools required to discuss coherent states, the interaction of light with atoms, entangled states, quantum noise and dissipation, linear optical amplifiers, and the fundamental issues associated with Bell's theorem. There is an equally strong emphasis on experimental methods. A quantum description of lenses, mirrors, beam splitters, Y-junctions, circulators, and stops is applied to a collection of important experiments in linear optics. A description of the most important methods of primary photon detection is followed by an explanation of heterodyne and homodyne techniques. Spontaneous down conversion and quantum tomography are discussed, together with important experimental applications. These experimental and theoretical considerations come together in a chapter briefly discussing quantum noise and its suppression in telecommunications; the limitations and possibilities for quantum cloning; the principles and techniques of quantum cryptography; and the physical basis for quantum computing.
Sadamichi Maekawa (ed.)
- Published in print: 2006
- Published Online: September 2007
- ISBN: 9780198568216
- eISBN: 9780191718212
- Item type: book
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780198568216.001.0001
- Subject: Physics, Condensed Matter Physics / Materials
Nowadays, information technology is based on semiconductor and ferromagnetic materials. Information processing and computation are performed using electron charge in semiconductor transistors and integrated circuits, and the information is stored by electron spins on magnetic high-density hard disks. Recently, a new branch of physics and nanotechnology, called magneto-electronics, spintronics, or spin electronics, has emerged, which aims to exploit both the charge and the spin of electrons in the same device. A broader goal is to develop new functionality that does not exist separately in a ferromagnet or a semiconductor. This book presents new directions in the development of spin electronics in both the basic physics and the technology which will become the foundation of future electronics.
Neil Rhodes
- Published in print: 2004
- Published Online: September 2007
- ISBN: 9780199245727
- eISBN: 9780191715259
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780199245727.003.0001
- Subject: Literature, Shakespeare Studies
This introductory chapter presents the book both as a study of English before English existed as a subject in its own right (comparable with what were known as ‘conjectural histories’ in the 18th century), and as an explanation of Shakespeare’s role in the origins of the subject. The method used will involve making analogies between modern and early modern cultural practices: terms such as ‘media studies’ and ‘creative writing’, and terms associated with the technology of computing will be transferred to earlier cultural contexts to establish some continuity between the modern subject of English and its earlier manifestations. At the same time and for the same purpose, the older study of rhetoric and the practices associated with it will be shown to persist in different forms within English Studies today.
Vaclav Smil
- Published in print: 2006
- Published Online: September 2006
- ISBN: 9780195168754
- eISBN: 9780199783601
- Item type: book
- Publisher: Oxford University Press
- DOI: 10.1093/0195168755.001.0001
- Subject: Economics and Finance, Economic History
This book is a systematic interdisciplinary account of two epochal trends: the history of the 20th century’s technical transformation based on the unprecedented surge of innovation that took place in Europe and North America during the three pre-WWI generations (1867-1914), and the history of new fundamental inventions during the period 1914-2000. Mass consumption of fossil fuels provided the energetic foundation of this progress. New ways of making steel — the leading metal of our civilization — and new materials, including plastics and silicon, opened entirely new technical possibilities. Rationalized production, be it in agriculture or manufacturing, benefited from advancing mechanization, automation, and robotization. New epochal inventions included the discovery of nuclear fission, followed by the rapid development of nuclear weapons and the commercial generation of nuclear electricity; the development of gas turbines (and their use in jet airplanes as well as in stationary applications); and the invention of solid-state electronics based on semiconductors, used to make transistors, integrated circuits, and microprocessors, the key components of modern computing. The new economy based on unprecedented levels of energy consumption brought not only mass consumption and a higher quality of life, but also some worrisome social problems and environmental changes; its prospects remain uncertain.
Arthur L. Norberg
- Published in print: 2004
- Published Online: September 2007
- ISBN: 9780199241057
- eISBN: 9780191714290
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780199241057.003.0002
- Subject: Business and Management, Information Technology
This chapter examines the dominant role of government agencies in providing support for computing during the last four decades. By extending the discussion over three phases, the chapter enlarges the historical appreciation of the role of government in R&D for computing. This 50-year analysis also illustrates similarities in approach to selecting problems and funding R&D over the three phases. The changes discussed resulted both from sophistication within computing and from altered attitudes and circumstances in the society around the computing enterprise.
James W. Cortada
- Published in print: 2007
- Published Online: January 2008
- ISBN: 9780195165869
- eISBN: 9780199868025
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780195165869.003.0010
- Subject: Business and Management, Business History
This chapter addresses the various issues raised in this book. It also identifies patterns of behavior and extracts lessons, finally situating the role of IT in the public sector within the broader experiences of modern American society. Topics discussed include the public sector as a galaxy of industries, how the digital hand changed the work of government and education, IT adoption patterns, and the role of the public sector as creator of today's economy.
James W. Cortada
- Published in print: 2007
- Published Online: January 2008
- ISBN: 9780195165869
- eISBN: 9780199868025
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780195165869.003.0002
- Subject: Business and Management, Business History
One of the largest and most pervasive uses of computers by all governments across the American economy has been for accounting applications. While the fundamental missions and tasks have not changed over time, the way the work of accounting, financial, and tax departments is accomplished has changed. This chapter discusses the introduction and use of computing in tax filing, collections, and compliance. Topics covered include the Internal Revenue Service, state tax and financial applications, local government tax applications, and the adoption of software tools by tax preparers and payers.
James W. Cortada
- Published in print: 2007
- Published Online: January 2008
- ISBN: 9780195165869
- eISBN: 9780199868025
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780195165869.003.0004
- Subject: Business and Management, Business History
This chapter discusses technologies adopted by the law enforcement community over a half century. Specifically, it looks at the use of computing by policing agencies, courts, and corrections, with a brief introduction to the early history of computer crime, which now represents a new class of criminal activity made possible by the existence of the digital hand.
James W. Cortada
- Published in print: 2007
- Published Online: January 2008
- ISBN: 9780195165869
- eISBN: 9780199868025
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780195165869.003.0005
- Subject: Business and Management, Business History
This chapter discusses the impact of IT on the work of the Social Security Administration (SSA), the Bureau of the Census, and the US Postal Service (USPS). All three organizations use information technology so extensively that it would be difficult to imagine how they could function in the future without it. How each came to this point reflects experiences unique to each agency. The rate of adoption and extent of deployment reflect internal operational and managerial issues, and, as with other federal agencies and departments, digital tools had to be configured in ways specific to each agency's needs.
James W. Cortada
- Published in print: 2007
- Published Online: January 2008
- ISBN: 9780195165869
- eISBN: 9780199868025
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780195165869.003.0008
- Subject: Business and Management, Business History
This chapter discusses the deployment and use of information technology in public schools. Topics covered include computing in educational administration from the 1950s to 2000s, the use of computers in teaching from the 1960s to 1980s, debate about the role of computing in education, and the role of IT in education. It is argued that looking at the educational sector demonstrates some of the limits of the nation's transformation to a digitally rich economy. The technology had to enhance, then transform, how the daily core tasks of an industry are done. When that has not been the case, deployment has proved limited.
James W. Cortada
- Published in print: 2007
- Published Online: January 2008
- ISBN: 9780195165869
- eISBN: 9780199868025
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780195165869.003.0009
- Subject: Business and Management, Business History
This chapter discusses the deployment and use of computing in higher education. Topics covered include administrative uses, teaching and computers, the role of computing in academic research, IT in the library, the personal use of computers, and the special role of the Internet. It is shown that in addition to being a supplier of computer science and technology, the higher education industry trained (or educated) tens of millions of people, equipping many of them with the values, work practices, and skills that have defined the economy and society of modern America and, indeed, of many individuals and firms around the world. Its use of all manner of technology also reflects patterns of application evident in many parts of the nation's economy, including its use of the digital hand. Because it educates so many workers, and influences the values and activities of so many individuals, its use of computing is influential and important.
B. Jack Copeland
- Published in print: 2005
- Published Online: January 2008
- ISBN: 9780198565932
- eISBN: 9780191714016
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780198565932.003.0001
- Subject: Mathematics, History of Mathematics
This introductory chapter discusses the development of Alan Turing's ‘universal computing machine’, better known as the universal Turing Machine. The earliest large-scale electronic digital computers, the British Colossus (1943) and American ENIAC (1945), did not store programmes in memory. In 1936, Turing came up with an idea for a machine with limitless memory, in which both data and instructions were to be stored. By 1945, groups in Britain and the US began developing hardware for a universal Turing machine. Turing headed a group at the National Physical Laboratory in London that designed the Automatic Computing Engine (ACE), the first relatively complete specification of an electronic stored-programme digital computer.
Mary Croarken
- Published in print: 2005
- Published Online: January 2008
- ISBN: 9780198565932
- eISBN: 9780191714016
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780198565932.003.0003
- Subject: Mathematics, History of Mathematics
In April 1945, the journal Nature announced that the National Physical Laboratory would ‘extend its activities by the establishment of a Mathematics Division’. The new Mathematics Division was intended to act as a ‘central mathematics station’ and was the first of the three main centres of early electronic computer development in Britain. The Division had two main functions: to undertake research into new computing methods and machines, and to provide computing services and advice to government departments and industry. It was soon providing a national computing service, and became a leading centre for numerical analysis. This chapter sets the stage for these developments in computing, focusing on the circumstances surrounding the creation of the NPL Mathematics Division. Four questions are addressed: why was a central mathematics station needed? Why was it established at the NPL? Why was John Womersley chosen as superintendent? And finally, to what extent did the NPL Mathematics Division succeed as a central mathematics station?
Luciano Floridi
- Published in print: 2011
- Published Online: May 2011
- ISBN: 9780199232383
- eISBN: 9780191594809
- Item type: book
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780199232383.001.0001
- Subject: Philosophy, General, Logic/Philosophy of Mathematics
This book brings together the outcome of ten years of research. It is based on a simple project, which was begun towards the end of the 1990s: information is a crucial concept, which deserves a thorough philosophical investigation. So the book lays down the conceptual foundations of a new area of research: the philosophy of information. It does so systematically, by pursuing three goals. The first is metatheoretical. The book describes what the philosophy of information is, its problems, and its method of levels of abstraction. These are the topics of the first part, which comprises chapters one, two and three. The second goal is introductory. In chapters four and five, the book explores the complex and diverse nature of several informational concepts and phenomena. The third goal is constructive. In the remaining ten chapters, the book answers some classic philosophical questions in information-theoretical terms. As a result, the book provides the first, unified and coherent research programme for the philosophy of information, understood as a new, independent area of research, concerned with (1) the critical investigation of the conceptual nature and basic principles of information, including its dynamics, utilization, and sciences; and (2) the elaboration and application of information-theoretic and computational methodologies to philosophical problems.
Wendell Wallach and Colin Allen
- Published in print: 2009
- Published Online: January 2009
- ISBN: 9780195374049
- eISBN: 9780199871889
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780195374049.003.0002
- Subject: Philosophy, Moral Philosophy
Artificial moral agents are necessary and inevitable. Innovative technologies are converging on sophisticated systems that will require some capacity for moral decision making. With the implementation of driverless trains, the “trolley cases” invented by ethicists to study moral dilemmas may come to represent actual challenges for artificial moral agents. Among the difficult tasks for designers of such systems is specifying what the goals should be, i.e., what is meant by a “good” artificial moral agent. Computer viruses are among the software agents that already cause harm. Credit card approval systems are among the examples of autonomous systems that already affect daily life in ethically significant ways but are “ethically blind” because they lack moral decision-making capacities. Pervasive and ubiquitous computing, the introduction of service robots into the home to care for the elderly, and the deployment of machine-gun-carrying military robots all increase the possibility that software and robots lacking sensitivity to ethical considerations will harm people.
B. Jack Copeland
- Published in print: 2005
- Published Online: January 2008
- ISBN: 9780198565932
- eISBN: 9780191714016
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780198565932.003.0004
- Subject: Mathematics, History of Mathematics
This chapter details the history of the Automatic Computing Engine (ACE) project. The story of the ACE begins with John Womersley's appointment as superintendent of the newly created Mathematics Division of the National Physical Laboratory. Womersley's proposed research programme for his new division included the goals ‘To explore the application of switching methods (mechanical, electrical and electronic) to computations of all kinds’, ‘Investigation of the possible adaptation of automatic telephone equipment to scientific computing’, and ‘Development of electronic counting device suitable for rapid computing’. Womersley convinced Turing to join the ACE project.