Anthony Garratt, Kevin Lee, M. Hashem Pesaran, and Yongcheol Shin
- Published in print: 2006
- Published Online: September 2006
- ISBN: 9780199296859
- eISBN: 9780191603853
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/0199296855.003.0013
- Subject: Economics and Finance, Econometrics
The final chapter provides some concluding comments, including a summary of the main contributions of the book and an invitation to others to apply the methods in new contexts using the data and code provided in the Appendices.
Eric T. Olson
- Published in print: 2007
- Published Online: September 2007
- ISBN: 9780195176421
- eISBN: 9780199872008
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780195176421.003.0006
- Subject: Philosophy, Philosophy of Mind
This chapter considers Hume's proposal that we are made up entirely of particular mental states and events: the bundle view. An argument for the bundle view, based on the claim that the traditional idea of substance should be dismissed, is examined. The bundle view is then shown to follow naturally from widely held claims about diachronic and synchronic personal identity. Reid's objection that bundles of thoughts cannot be thinkers is elaborated and endorsed. It is then argued that the bundle view cannot easily avoid the thinking-animal problem. There follows a critical discussion of two related views: that we are bundles of universals and that we are something like computer programs. Both are found to be hopeless.
ANGELO GAVEZZOTTI
- Published in print: 2006
- Published Online: January 2010
- ISBN: 9780198570806
- eISBN: 9780191718779
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780198570806.003.0010
- Subject: Physics, Atomic, Laser, and Optical Physics
Chemical applications of force field simulations, of quantum mechanics, as well as X-ray data processing, lattice dynamics, and molecular dynamics simulations are all made possible by fast and reliable numerical computation. Therefore, electronic computers are a theoretical chemist's vital tool, and very few — if any — quantitative results can be obtained without them. Computers handle a very large and very diversified range of tasks on a surprisingly small fundamental basis: the electric representation of only two numbers, zero and one, called binary digits (or bits). Computers use bits to represent numbers in binary notation. In a very successful metaphoric style, all items of the computer world that have to do with programs are called software, while all the rest (electronic parts, wires, input-output devices) are called hardware. This chapter provides an overview of computers, operating systems, computer programming, bugs, program checking and validation, and reproducibility.
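The chapter's point that everything a computer does rests on the binary representation of numbers can be made concrete with a short example. The following Python snippet (illustrative only, not taken from the book) shows an integer rendered as bits and then rebuilt from them:

```python
# Illustrative sketch: an integer written in binary notation (bits),
# and the same integer reconstructed from those bits.
n = 202

bits = format(n, "b")   # built-in binary rendering
print(bits)             # -> 11001010

# Each bit position i carries the weight 2**i, counted from the right.
value = sum(int(b) << i for i, b in enumerate(reversed(bits)))
assert value == n
```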
Rein Taagepera
- Published in print: 2008
- Published Online: September 2008
- ISBN: 9780199534661
- eISBN: 9780191715921
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780199534661.003.0014
- Subject: Political Science, Comparative Politics, Political Economy
Routine statistical approaches are essentially descriptive, giving answers within a narrow range of questions. Quantitatively formulated logical models force us to ask further questions and are predictive in an explanatory way. Descriptive approaches are not conducive to the detection of social laws, especially if one simultaneously feeds in variables that actually connect sequentially. Rather than a single sequence of “hypothesis testing,” scientific procedure involves repeated cycles in which predictive and descriptive approaches enter intermixed. Directional models and reams of numbers ground out by canned computer programs must make room for quantitative logical models and sparse, conceptually grounded constants.
Marina Umaschi Bers
- Published in print: 2012
- Published Online: May 2012
- ISBN: 9780199757022
- eISBN: 9780199933037
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780199757022.003.0006
- Subject: Psychology, Developmental Psychology, Social Psychology
This chapter focuses on technologies as spaces to create content and, in the process, to promote the development of competence. Competence is defined as the possession of a skill, knowledge, or capacity. In this context, competence refers to the ability to use technology to create or design personally meaningful projects, to accomplish a goal, and to debug projects and problem-solve. When one uses technology to create content, one also learns how to organize the content domain and how to choose the core ideas. This set of new ideas will be referred to in the book as powerful ideas from the content domain. This chapter highlights how children can create digital content, especially through computer programming. Aside from programming, children can engage in the participatory culture of the internet, creating content that is shared with millions. As children become competent by creating digital projects, they are prepared to enter a world in which technological fluency is key.
Nathan Ensmenger
- Published in print: 2010
- Published Online: August 2013
- ISBN: 9780262050937
- eISBN: 9780262289351
- Item type: chapter
- Publisher: The MIT Press
- DOI: 10.7551/mitpress/9780262050937.003.0002
- Subject: Information Science, Information Science
This chapter traces the history of computer programming from its origins as low-status clerical work, often performed by women, into one of the highest-paid technical occupations of the late 1950s and early 1960s. It explores the emergence of the computer programmer as a well-compensated technical expert and explains that, while programmers continued to struggle with questions of status and identity, they were generally considered to be anything but routine clerical workers by the end of the 1950s. It also highlights the widespread perception during the 1950s and 1960s that programming was a black art.
Mike Finnis
- Published in print: 2003
- Published Online: January 2010
- ISBN: 9780198509776
- eISBN: 9780191709180
- Item type: book
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780198509776.001.0001
- Subject: Physics, Atomic, Laser, and Optical Physics
There is a continuing growth of interest in the computer simulation of materials at the atomic scale, using a variety of academic and commercial computer programs. In all such programs there is some physical model of the interatomic forces. For a student or researcher, the basis of such models is often shrouded in mystery. It is usually unclear how well founded they are, since it is hard to find a discussion of the physical assumptions that have been made in their construction. The lack of a clear understanding of the scope and limitations of a given model may lead to its innocent misuse, resulting either in unfair criticism of the model or in the dissemination of nonsensical results. In this book, models of interatomic forces are derived from a common physical basis, namely density functional theory. The book includes the detailed derivation of pairwise potentials in simple metals, tight-binding models from the simplest to the most sophisticated (self-consistent) kind, and ionic models. It provides a critical appreciation of the broad range of models in current use and supplies the tools for understanding other variants that are described in the literature. Some of the material is new, and some pointers are given to possible future avenues of model development.
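As a minimal sketch of what "a physical model of the interatomic forces" looks like in simulation code, here is a generic pairwise potential of the Lennard-Jones form in Python. This is an illustration only: the book derives its potentials from density functional theory, and the epsilon and sigma values below are standard textbook numbers for argon, not anything taken from the book.

```python
import math

def lj_energy(r, epsilon=0.0103, sigma=3.40):
    """Lennard-Jones pair energy V(r) = 4*eps*((sigma/r)**12 - (sigma/r)**6).

    epsilon (eV) and sigma (angstroms) are rough textbook values for argon.
    """
    sr6 = (sigma / r) ** 6
    return 4.0 * epsilon * (sr6 * sr6 - sr6)

# With a pairwise model, the energy of a configuration is a sum over
# distinct pairs of atoms.
atoms = [(0.0, 0.0, 0.0), (3.8, 0.0, 0.0), (0.0, 3.8, 0.0)]
total = sum(
    lj_energy(math.dist(atoms[i], atoms[j]))
    for i in range(len(atoms))
    for j in range(i + 1, len(atoms))
)
print(f"cluster energy: {total:.5f} eV")
```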
John Horty
- Published in print: 2009
- Published Online: October 2011
- ISBN: 9780199732715
- eISBN: 9780199852628
- Item type: book
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780199732715.001.0001
- Subject: Philosophy, Philosophy of Language
This book explores the difficulties presented for Gottlob Frege's semantic theory, as well as its modern descendants, by the treatment of defined expressions. The book begins by focusing on the psychological constraints governing Frege's notion of sense, or meaning, and argues that, given these constraints, even the treatment of simple stipulative definitions led Frege to important difficulties. This book suggests ways out of these difficulties that are both philosophically and logically plausible and Fregean in spirit. This discussion is then connected to a number of more familiar topics, such as indexicality and the discussion of concepts in recent theories of mind and language. The latter part of the book, after introducing a simple semantic model of senses as procedures, considers the problems that definitions present for Frege's idea that the sense of an expression should mirror its grammatical structure. The requirement can be satisfied, the book argues, only if defined expressions—and incomplete expressions as well—are assigned senses of their own, rather than treated contextually. The book then explores one way in which these senses might be reified within the procedural model, drawing on ideas from work in the semantics of computer programming languages. With its combination of technical semantics and history of philosophy, the book tackles some of the hardest questions in the philosophy of language.
Christine Ogan, Jean C. Robinson, Manju Ahuja, and Susan C. Herring
- Published in print: 2006
- Published Online: August 2013
- ISBN: 9780262033459
- eISBN: 9780262255929
- Item type: chapter
- Publisher: The MIT Press
- DOI: 10.7551/mitpress/9780262033459.003.0009
- Subject: Business and Management, Information Technology
This chapter reports on a study that compares the demographics, attitudes, and computing-related behaviors of undergraduate and graduate students majoring in computer science with those majoring in applied IT disciplines. The results show that women do not feel as good about their abilities related to computers and computer programming as men do. The lack of confidence might stem from a lack of encouragement from teachers, friends, and family since half of women in the applied IT group and one-quarter of women in the computer science group said nobody had encouraged them to go into an IT field. The biggest differences between men and women in the two groups are demographic: men and women in the applied IT units tend to be older, and men and women in computer science tend to fall into traditional age groups for undergraduate and graduate students.
Nathan Ensmenger
- Published in print: 2010
- Published Online: August 2013
- ISBN: 9780262050937
- eISBN: 9780262289351
- Item type: chapter
- Publisher: The MIT Press
- DOI: 10.7551/mitpress/9780262050937.003.0007
- Subject: Information Science, Information Science
This chapter examines the history of the professionalization of computer programming. It describes the need of programmers to establish the institutional structures associated with professionalism, including professional societies, certification programs, educational standards, and codes of ethics, and suggests that the professionalization of computer programming represented a potential solution to the looming software crisis perceived by programmers and employers alike. It highlights the struggle of computer programmers to prove that they possessed a unique set of skills and training that allowed them to lay claim to professional autonomy.
Sandra Katz, John Aronis, Christine Wilson, David Allbritton, and Mary Lou Soffa
- Published in print: 2006
- Published Online: August 2013
- ISBN: 9780262033459
- eISBN: 9780262255929
- Item type: chapter
- Publisher: The MIT Press
- DOI: 10.7551/mitpress/9780262033459.003.0012
- Subject: Business and Management, Information Technology
The underrepresentation of females among computer science Bachelor of Science degree recipients is attributed to two main forces: (i) fewer women than men enroll in computer science programs at the undergraduate level; and (ii) attrition, i.e., more women than men leave these programs. This chapter focuses on the attrition aspect. It demonstrates the strong impact of performance on both student persistence in an undergraduate program and the gender gap in persistence. Performance alone does not explain this gap; rather, it is performance at an expected level that is the important factor, and women appear to be more negatively affected than men when their achieved grades do not measure up to their desired grades.
Noam Shemtov
- Published in print: 2017
- Published Online: October 2017
- ISBN: 9780198716792
- eISBN: 9780191848377
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/oso/9780198716792.003.0003
- Subject: Law, Intellectual Property, IT, and Media Law
This chapter examines reverse engineering and the decompilation of computer programs, both of which are highly regulated under the current copyright regime. It begins with a practical overview of reverse engineering and decompilation of software, focusing on the types of reverse engineering prevalent in the software industry, the various stages of reverse engineering, and the motivations and methods for reverse engineering. It then looks at the reasons for and benefits of decompilation, which is a category of reverse engineering, and examines software interoperability. At this stage the chapter considers what EU and US copyright laws say about decompilation, with particular emphasis on the role that the idea-expression dichotomy plays in decompilation scenarios. It also discusses the problem of entitlement with respect to intellectual property rules, and more specifically in the case of decompilation of computer programs. It provides a critical evaluation of Article 6 of the Software Directive, which enables decompilation in order to achieve interoperability. The chapter concludes with a commentary on reverse engineering in the cloud environment under copyright law.
Steven W. Usselman
- Published in print: 2007
- Published Online: August 2013
- ISBN: 9780262122894
- eISBN: 9780262277884
- Item type: chapter
- Publisher: The MIT Press
- DOI: 10.7551/mitpress/9780262122894.003.0009
- Subject: Economics and Finance, Economic History
This chapter adds certain moderate adjustments to the body of work promulgated by the National Academy of Sciences, and also to the work of David Mowery, dealing with the notion that modern computing is the product of massive public investment and government funding. The goal of the chapter, however, is to suggest that private enterprise and private capital, and not just government funding, played significant roles in influencing computing. IBM, in particular, is examined closely to determine how the firm contributed to the emergence and refinement of the storage capacity of the computer from the end of World War II until the development of the System/360. The conclusion is that while certain activities of the government may have benefited IBM, the government also drew IBM away from various opportunities that might have allowed it to blossom without intervention from the public sector.
Per-Erik Werner
- Published in print: 2006
- Published Online: January 2010
- ISBN: 9780199205530
- eISBN: 9780191718076
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780199205530.003.0007
- Subject: Physics, Condensed Matter Physics / Materials
Assigning hkl indices to the peaks in the powder diffraction pattern is the essential first stage of the data analysis for structure solution. This chapter begins by outlining the basic relationships linking the positions of the Bragg peaks to the underlying crystal lattice and shows why translating observed peak positions into unit cell parameters is a non-trivial operation. Three quite distinct indexing strategies, as used in the programs ITO, DICVOL, and TREOR, are discussed in detail and practical reasons for utilising more than one indexing program are given. Considerable attention is paid to the effect that measurement errors can have upon the indexing process and figures of merit for assessing potential solutions are clearly explained. The impact of impurity lines is explained and modern strategies for handling such impurity lines discussed. The importance of databases for relating determined unit cell parameters to known phases within a sample is also assessed.
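For the cubic case, the "basic relationships" the chapter starts from reduce to 1/d² = (h² + k² + l²)/a² combined with Bragg's law λ = 2d sin θ. The hedged Python sketch below (not taken from the chapter; the wavelength and cell parameter are arbitrary illustrative choices, and systematic absences are ignored) generates peak positions in the forward direction; indexing is the harder inverse problem of recovering a, and in lower-symmetry cases up to six cell parameters, from the observed peak list:

```python
import math

wavelength = 1.5406   # Cu K-alpha (angstroms), an illustrative choice
a = 4.05              # cubic cell parameter (angstroms), illustrative

peaks = []
for h in range(4):
    for k in range(h + 1):
        for l in range(k + 1):
            if (h, k, l) == (0, 0, 0):
                continue
            d = a / math.sqrt(h * h + k * k + l * l)  # 1/d^2 = (h^2+k^2+l^2)/a^2
            s = wavelength / (2.0 * d)                # sin(theta) via Bragg's law
            if s <= 1.0:                              # geometrically observable
                peaks.append((2.0 * math.degrees(math.asin(s)), (h, k, l)))

for two_theta, hkl in sorted(peaks):
    print(f"2theta = {two_theta:6.2f} deg   hkl = {hkl}")
```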
Mireille Hildebrandt
- Published in print: 2020
- Published Online: July 2020
- ISBN: 9780198860877
- eISBN: 9780191892936
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/oso/9780198860877.003.0007
- Subject: Law, Intellectual Property, IT, and Media Law
This chapter is an introduction to the domain of intellectual property (IP) rights, notably copyright. For computer scientists, the most relevant part of copyright law concerns copyright on computer programs, or software. Copyright on software is the enabling precondition for the General Public Licence (GPL) and the open source initiative. Before discussing copyright on software, however, this chapter first investigates the position of IP law in the context of constitutional democracy and clarifies that IP law is private law. From there, the chapter provides an overview of the various types of IP that are most relevant, after which it turns to the history, objectives, and scope of copyright protection. Finally, this chapter discusses EU copyright law and the issues of open source and free access.
Wendy Hui Kyong Chun
- Published in print: 2008
- Published Online: August 2013
- ISBN: 9780262062749
- eISBN: 9780262273343
- Item type: chapter
- Publisher: The MIT Press
- DOI: 10.7551/mitpress/9780262062749.003.0032
- Subject: Society and Culture, Media Studies
This chapter briefly discusses the different conceptions of program and programmability in the digital computer field; the discussion continues with two different grammatical definitions of the term “program,” verb and noun. It also focuses on ENIAC, the first working electronic digital computer. The chapter describes the requirements and the process of programming the analog and digital computer machines along with the works and arguments of various computer scientists. It furthermore states that programming an analog computer is descriptive while programming a digital one is prescriptive. The conclusion explains how the programmability concept affected the world of computers, from quantum computers to biology computing fields like DNA and RNA computing.
Stephen M. Kosslyn
- Published in print: 2006
- Published Online: March 2012
- ISBN: 9780195311846
- eISBN: 9780199847075
- Item type: book
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780195311846.001.0001
- Subject: Psychology, Cognitive Psychology
Graphs have become a fixture of everyday life, used in scientific and business publications, in magazines and newspapers, on television, on billboards, and even on cereal boxes. Nonetheless, surprisingly few graphs communicate effectively, and most graphs fail because they do not take into account the goals, needs, and abilities of the viewers. This book addresses these problems by presenting eight psychological principles for constructing effective graphs. Each principle is solidly rooted both in the scientific literature on how we perceive and comprehend graphs and in general facts about how our eyes and brains process visual information. The author uses these eight psychological principles as the basis for hundreds of specific recommendations that serve as a concrete, step-by-step guide to deciding whether a graph is an appropriate display to use, choosing the correct type of graph for a specific type of data and message, and then constructing graphs that will be understood at a glance. The book includes a complete review of the scientific literature on graph perception and comprehension, appendices that provide a quick tutorial on basic statistics, and a checklist for evaluating computer-graphics programs.
Joasia Krysa and Grzesiek Sedek
- Published in print: 2008
- Published Online: August 2013
- ISBN: 9780262062749
- eISBN: 9780262273343
- Item type: chapter
- Publisher: The MIT Press
- DOI: 10.7551/mitpress/9780262062749.003.0034
- Subject: Society and Culture, Media Studies
Source code is a set of human-readable computer commands written in high-level programming languages; it also shares some of the parameters of the natural-language concept. This chapter starts with a source code program and explains the principle and concept of source code; it also describes the importance and advantages of source code in the field of software. Furthermore, the chapter explains that source code is a non-executable form of a computer program, stored in source files whose collection can be called a repository; it also focuses on the process of compiling source code. The chapter describes the two conditions of source code, open and closed, and their contribution to the field of software licensing.
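A tiny, hedged illustration of the source-versus-compiled distinction the chapter describes, using Python's own byte-compiler in the compiler role (the file names are arbitrary; nothing here comes from the chapter):

```python
import pathlib
import py_compile

# Write a small program as human-readable source code.
src = pathlib.Path("hello.py")
src.write_text('print("hello from source code")\n')

# Compile it: the resulting .pyc file is bytecode for the interpreter,
# not meant to be read by humans.
compiled = py_compile.compile(str(src), cfile="hello.pyc")

print("source   :", src.read_text().strip())
print("compiled :", compiled, "starts with bytes",
      pathlib.Path(compiled).read_bytes()[:8])
```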
Jeffrey A. Summit
- Published in print: 2016
- Published Online: August 2016
- ISBN: 9780199844081
- eISBN: 9780190497071
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780199844081.003.0010
- Subject: Music, History, American
Students who were born digital are not asking whether or not they should use computers, smartphones and online resources when learning how to chant Torah. Technology is so deeply integrated into our lives that no one is seriously questioning whether computer programs, tablet computers, smartphone apps, YouTube videos or Skype are appropriate venues with which to teach and learn sacred chant. Instead, this chapter examines how the use of these technologies impacts Jewish religious and cultural life in the twenty-first century. I consider how digital media changes the transmission of tradition, affects the relationship between students and teachers, challenges community authority and the primacy of minhag, local tradition, in contemporary Judaism.
Chris Bleakley
- Published in print: 2020
- Published Online: October 2020
- ISBN: 9780198853732
- eISBN: 9780191888168
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/oso/9780198853732.003.0005
- Subject: Mathematics, History of Mathematics, Logic / Computer Science / Mathematical Philosophy
Chapter 5 delves into the origins of artificial intelligence (AI). By the end of the 1940s, a few visionaries realised that computers were more than mere automatic calculators. They believed that computers running the right algorithms could perform tasks previously thought to require human intelligence. Christopher Strachey completed the first artificially intelligent computer program in 1952. The program played the board game checkers. Arthur Samuel of IBM extended and improved on Strachey's program by adding machine learning: the ability of a program to learn from experience. A team from Carnegie Mellon University developed the first computer program that could perform algebra. The program eventually reproduced 38 of the 52 proofs in a classic mathematics textbook. Flushed with these successes, serious scientists made wildly optimistic pronouncements about the future of AI. In the event, project after project failed to deliver, and the first “AI winter” set in.