Seiichiro Yonekura
- Published in print:
- 2004
- Published Online:
- September 2007
- ISBN:
- 9780199241057
- eISBN:
- 9780191714290
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199241057.003.0005
- Subject:
- Business and Management, Information Technology
This chapter argues that Japan's Ministry of International Trade and Industry (MITI), whether fortuitously or otherwise, hit upon a strategy for the computer industry that combined appropriate administrative guidance and intervention on the one hand with autonomy and self-determination for private companies on the other. By heeding the advice of industry and cooperating positively with private companies, MITI adopted either a ‘planned coordination’ approach or a ‘market coordination’ approach according to industry function. This intervention-by-function approach worked well for the computer industry.
Benjamin H. Bratton
- Published in print:
- 2016
- Published Online:
- September 2016
- ISBN:
- 9780262029575
- eISBN:
- 9780262330183
- Item type:
- book
- Publisher:
- The MIT Press
- DOI:
- 10.7551/mitpress/9780262029575.001.0001
- Subject:
- Society and Culture, Cultural Studies
Planetary-scale computation presents a fundamental challenge to Modern geopolitical architectures. As calculative reason and as global infrastructure, it not only deforms and distorts Westphalian political geography, it also creates new territories in its own image, ones that don’t necessarily replace the old but are superimposed on them, each grinding against the other. These thickened and noisy jurisdictions are our new normal. They are the scaffolds through which our cultures evolve, and they represent our most difficult and important design challenge. Computation is changing not only how governments govern, but what government even is in the first place: less governance of computation than computation as governance. Global cloud platforms take on roles that have traditionally been the domain of States, cities become hardware/software platforms organized by physical and virtual interfaces, and strange new political subjects (some not even human) gain unforeseen sovereignties as the users of those interfaces. To understand (and to design) these transformations, we need to see them as part of a whole, an accidental megastructure called The Stack. This book examines each layer of The Stack (Earth, Cloud, City, Address, Interface, and User) as a dynamic technology that is restructuring some part of our world at its particular scale and as part of the whole. The Stack is a platform, and so combines logics of both States and Markets, and produces forms of sovereignty that are unique to this technical and institutional form. Fortunately, stack platforms are made to be re-made. How the Stack-we-have becomes the Stack-to-come depends on how well we understand it as a totality. By seeing the whole, we stand a better chance of designing a system we will want to inhabit. To formulate the “design brief” for that project, as this book does, requires a perspective that blends philosophical, geopolitical, and technological understandings and methods.
Lothar Determann and David Nimmer
- Published in print:
- 2015
- Published Online:
- May 2016
- ISBN:
- 9780262029407
- eISBN:
- 9780262331166
- Item type:
- chapter
- Publisher:
- The MIT Press
- DOI:
- 10.7551/mitpress/9780262029407.003.0008
- Subject:
- Computer Science, Programming Languages
Clouds are on the horizon for software copyrights. The open source movement has worked to turn copyright into “copyleft”. Courts around the world are reshaping the first sale doctrine. Software manufacturers flee from distribution to service models, into the Cloud. A perfect storm for software copyrights is brewing. The Cloud promises to enable software publishers to place their code outside the framework of copyright exhaustion under the first sale doctrine and the “distribution trigger” in open source code license terms. Users' inability, in the Cloud context, to directly access the underlying software threatens to produce various side effects, notably affecting software interoperability. New kids on the block lose the ability to reverse-engineer hosted software. Established platform providers gain the ability to prevent interoperability, based on laws prohibiting interference with computers and technical protection measures. These developments risk upsetting the delicate balance between exclusive rights for copyright owners and access rights for the public, a balance that courts and legislatures have carefully established over the years in order to foster creativity and innovation. With unprecedented pressure on traditional distribution models, how will copyright law cope? This chapter illuminates the immediate path ahead, presents possible answers, and asks more questions.
Seb Franklin
- Published in print:
- 2015
- Published Online:
- May 2016
- ISBN:
- 9780262029537
- eISBN:
- 9780262331135
- Item type:
- chapter
- Publisher:
- The MIT Press
- DOI:
- 10.7551/mitpress/9780262029537.003.0003
- Subject:
- Society and Culture, Technology and Society
This chapter considers exclusion as an unmarked but fundamental principle of control, and thus as a central concern for contemporary theories of representation. Beginning from Neferti X.M. Tadiar’s critique of the totalizing concept of life that grounds much recent critical work on post-Fordism, the chapter works through the history of the black box concept and its relation to recent socioeconomic imaginaries. The chapter then addresses the methodological limitations of privileging computational media such as network diagrams and video games when seeking to theorize the social, political, and cultural implications of control.
Nathan Cortez
- Published in print:
- 2015
- Published Online:
- May 2016
- ISBN:
- 9780231171182
- eISBN:
- 9780231540070
- Item type:
- chapter
- Publisher:
- Columbia University Press
- DOI:
- 10.7312/columbia/9780231171182.003.0031
- Subject:
- Law, Medical Law
This chapter examines how the FDA has regulated computer hardware and software over the last 40 years, finding several recurrent problems that neither the agency nor Congress has effectively addressed. The FDA needs updated statutory authority and increased in-house resources to modernize its approach to computerized devices, which now comprise over half of all medical devices on the market.
Richard Stallman and Adolfo Plasencia
- Published in print:
- 2017
- Published Online:
- January 2018
- ISBN:
- 9780262036016
- eISBN:
- 9780262339308
- Item type:
- chapter
- Publisher:
- The MIT Press
- DOI:
- 10.7551/mitpress/9780262036016.003.0022
- Subject:
- Society and Culture, Technology and Society
This dialogue is preceded by an introduction about Richard Stallman and the power of “code” by Lawrence Lessig, as well as a detailed biography of Stallman revised by himself. In the conversation that follows, Stallman analyzes the origin and validity of the ‘hacking’ and ‘hack’ concepts and the differences between ‘hackers’ and ‘crackers’. He then describes in detail the concept, dimensions, and forms of creation and development of software code, especially free software and its implementation framework. He later reflects on and outlines his vision of the relationship between the use of technology and ethics, and of ethical hackers. He also talks about the good and bad behavior of companies and, in this context, his criticism of corporatocracy. Afterward, he describes how the creation of software code compares with other creative arts, such as literature. He goes on to analyze the mechanisms by which ideas are patented in the industrial world, in particular in the case of software development. He finally talks about why his vision of free software remains valid and how it should be addressed in education.
Eaton E. Lattman, Thomas D. Grant, and Edward H. Snell
- Published in print:
- 2018
- Published Online:
- September 2018
- ISBN:
- 9780199670871
- eISBN:
- 9780191749575
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/oso/9780199670871.003.0015
- Subject:
- Physics, Soft Matter / Biological Physics
This chapter summarizes the book, describes publications that provide complementary information, and sets the tone for the future.
Mark Selikowitz
- Published in print:
- 1993
- Published Online:
- November 2020
- ISBN:
- 9780192622990
- eISBN:
- 9780191918391
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/oso/9780192622990.003.0016
- Subject:
- Education, Teaching of Specific Groups and Special Educational Needs
This chapter deals with two separate areas of learning: attention and sequential organization. Difficulties in either area can occur in isolation or in combination with other forms of specific learning difficulty. The ability to ignore distractions and to focus on one activity at a time is a skill that children usually develop gradually as they grow. It is quite normal for toddlers and pre-school-aged children to be easily distractible, but the ability to channel attention selectively usually increases progressively once children start school. Some children experience significant difficulties in learning to attend. As a result, they are easily distractible and do not persist for long with tasks. If this is a significant problem, it is referred to by the umbrella term ‘attention-deficit/hyperactivity disorder’ (ADHD). This means attention-deficit with or without hyperactivity. Such children may be overactive and impulsive, although this is not always the case. It is this overactivity that has given rise to the term hyperactivity (‘hyper’ is Greek for ‘over’). All children with attention-deficit/hyperactivity disorder experience difficulty with concentration. There are two forms of the condition: one where overactivity and impulsivity are present and the other where these coexisting problems are absent. The two forms of attention-deficit/hyperactivity disorder may be clarified by describing two children, each with one of the forms of the disorder. George is his mother’s third child. She describes him as completely different from the other two. As a baby he slept very little and cried constantly. As a toddler he was always on the go, ‘as if driven by a motor’. Now that he is nine years old, his teacher describes him as ‘disorganized, disruptive, and fidgety’. His mother reports that he hardly ever sits still at home. He will not sit through a favourite TV programme or a meal. He is still so disorganized that if she did not help him to dress in the morning, he would not be in time for school. He is also very impulsive. He does not seem to think before he acts. He takes terrible risks and often says the first thing that comes into his head.
Clifford A. Behrens
- Published in print:
- 1996
- Published Online:
- November 2020
- ISBN:
- 9780195085754
- eISBN:
- 9780197560495
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/oso/9780195085754.003.0007
- Subject:
- Archaeology, Archaeological Methodology and Techniques
What is the process by which indigenous Amazonian people intensify their utilization of tropical forest resources, and what are the roles of population demography, settlement patterns, and resource degradation in this process? These are the central problems of this chapter. Over the last fifty years, ecologically oriented anthropologists have focused on these questions because of their significance for explaining the socioecological variability found among Amazonian Indians. A common theme in many attempts to account for socioecological variability in the Amazon is that large, sedentary populations necessitate increasing levels of social integration. Therefore, some explanations for this variability have sought factors that limit population density, such as the local availability of arable soils and protein-rich faunal foods. Simple single-factor frameworks have been criticized, yielding slightly more complex kinds of explanation, some based on evolutionary ecology and decision theory. Nevertheless, none of these approaches has successfully managed to relate population growth, village formation, resource degradation, and intensification of land use together in a single formalism that derives its first principles from a comparative analysis of the ethnographic literature. As a result, culture has not been assigned the central role it deserves in any theory purporting to characterize the process of land use intensification among indigenous Amazonians. This paper will review the ethnographic literature on the Amazon to (1) establish an empirical basis for the ingredients required to formulate cultural ecological theories of land-use intensification among indigenous Amazonians and (2) propose a developmental sequence based on increasing sedentism, intensification of land utilization, and growing market demand for production. Thus, this paper attempts to integrate seemingly disparate ideas from the past and present, each with some “ring of truth,” in the kind of mathematical framework advocated but never really achieved by Steward. The resulting paradigm converges on one very much resembling “landscape ecology,” but with greater emphasis on the role of culture and human decision making in a generative process. The need for detailed land-use data on a regional scale implicates the application of new technologies, such as remote sensing and geographical information systems, to test the proposed theories.
David Jhave Johnston
- Published in print:
- 2016
- Published Online:
- January 2017
- ISBN:
- 9780262034517
- eISBN:
- 9780262334396
- Item type:
- chapter
- Publisher:
- The MIT Press
- DOI:
- 10.7551/mitpress/9780262034517.003.0004
- Subject:
- Philosophy, Aesthetics
Software defines what digital poetry is. This chapter explores the temporal implications of animation timelines on the literary imagination. Read it if you are concerned with software studies and/or creative media. It argues that an authoring environment specific to the literary must emerge. The chapter includes a rapid overview of most of the authoring tools and code languages used by contemporary poets.
Software examined includes: After Effects, Mudbox, Mr Softie, Flash, VRML, and Second Life; and the programming languages Processing, RitaJS, Python, and C++.
Adam Treister
- Published in print:
- 2005
- Published Online:
- November 2020
- ISBN:
- 9780195183146
- eISBN:
- 9780197561898
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/oso/9780195183146.003.0012
- Subject:
- Chemistry, Physical Chemistry
Flow cytometry is a result of the computer revolution. Biologists used fluorescent dyes in microscopy and medicine almost a hundred years before the first flow cytometer. Only after electronics became sophisticated enough to control individual cells and computers became fast enough to analyze the data coming out of the instrument, and to make a decision in time to deflect the stream, did cell sorting become viable. Since the 1970s, the capabilities of computers have grown exponentially. According to the famed Moore’s Law, the complexity of a computer chip, as tracked by the number of transistors on it, doubles every 18 months. This rule has held for three decades so far, and new technologies continue to appear to keep that growth on track. The clock speed of chips is now measured in gigahertz (billions of cycles per second), and hard drives are now available with capacities measured in terabytes. Having computers so powerful, cheap, and ubiquitous changes the nature of scientific exploration. We are in the early steps of a long march of biotechnology breakthroughs spawned from this excess of compute power. From genomics to proteomics to high-throughput flow cytometry, the trend in biological research is toward mass-produced, high-volume experiments. Automation is the key to scaling their size and scope and to lowering their cost per test. Each step that was previously done by human hands is being delegated to a computer or a robot so that the implementation is more precise and scales efficiently. From making sort decisions in milliseconds to creating data archives that may last for centuries, computers control the information involved with cytometry, and software controls the computers. As the technology matures and the size and number of experiments increase, the emphasis of software development switches from instrument control to analysis and management. The challenge for computers is not in running the cytometer any more. The more modern challenge for informatics is to analyze, aggregate, maintain, access, and exchange the huge volume of flow cytometry data. Clinical and other regulated use of cytometry necessitates more rigorous data administration techniques. These techniques introduce issues of security, integrity, and privacy into the processing of data.
Aaron Perzanowski and Jason Schultz
- Published in print:
- 2016
- Published Online:
- May 2017
- ISBN:
- 9780262035019
- eISBN:
- 9780262335959
- Item type:
- chapter
- Publisher:
- The MIT Press
- DOI:
- 10.7551/mitpress/9780262035019.003.0008
- Subject:
- Information Science, Library Science
The smart devices that make up the Internet of Things induce consumers to cede control over the products they buy. Devices like smartphones offer real benefits, but combined with embedded software, network connectivity, microscopic sensors and large-scale data analytics, they pose serious threats to ownership and consumer welfare. From coffee makers and toys to cars and medical devices, the products we buy are defined by software. That code gives device makers an increasing degree of control over how, when, and whether those products can be used even after consumers buy them. That shift of control has profound implications for ownership.
Paul-Brian McInerney
- Published in print:
- 2014
- Published Online:
- May 2014
- ISBN:
- 9780804785129
- eISBN:
- 9780804789066
- Item type:
- chapter
- Publisher:
- Stanford University Press
- DOI:
- 10.11126/stanford/9780804785129.003.0006
- Subject:
- Sociology, Politics, Social Movements and Social Change
This chapter shows how competition among groups shapes moral markets. It explains how the Circuit Riders engaged with the new dominant actor in nonprofit technology assistance, NPower. Through successive interactions, new conventions of coordination reduced the uncertainty of interacting in the nonprofit technology assistance market. In response to NPower’s growing dominance, some in the Circuit Rider movement mobilized around an alternative platform, free/open source software. The strategy was an attempt to reassert the founding values of the Circuit Rider movement as articulated in technology. Ultimately, the Circuit Riders had limited success in splitting the technology services market. This chapter illustrates how, once institutionalized, organizational forms and practices like social enterprise are difficult to challenge, but also how social movements can create alternative niches for consumers who share their social values. Because markets are not organized strictly on principles of economic rationality, such pressure can nudge them in socially desirable directions.
Jerome Lewis
- Published in print:
- 2014
- Published Online:
- September 2014
- ISBN:
- 9780262027168
- eISBN:
- 9780262322492
- Item type:
- chapter
- Publisher:
- The MIT Press
- DOI:
- 10.7551/mitpress/9780262027168.003.0007
- Subject:
- Business and Management, Information Technology
Hunter-gatherer land-use in the Congo Basin leaves few traces. One consequence is that hunter-gatherers' presence is invisible on maps and ignored in land-use planning decisions over the areas they inhabit. Governments do not recognise their rights to land, conservationists exclude them from rich forest areas, logging roads open up remaining areas to extractive outsiders, and global warming changes rainfall patterns and the seasonal events that normally guide people to wild foods. A forestry company in Congo-Brazzaville seeking a ‘green’ label for its timber sought anthropological advice on how to respect the rights of forest people. This chapter describes the challenges and the participatory design process that developed in creating icon-driven software on converted military PalmPilots. Maps produced using this technology have become a new way for non-literate communities to be heard by powerful outsiders. A community radio station broadcasting solely in local languages will help forest people to develop their own understanding of the situations facing them, and to share insights, observations, and analyses in order to better secure their long-term interests. The creative interaction of non-literate users and ICT is spawning new developments, from new software builds to monitor illegal logging or wildlife, to geographic information systems for non-literate users.
Alan F. Blackwell
- Published in print:
- 2014
- Published Online:
- September 2014
- ISBN:
- 9780262027168
- eISBN:
- 9780262322492
- Item type:
- chapter
- Publisher:
- The MIT Press
- DOI:
- 10.7551/mitpress/9780262027168.003.0009
- Subject:
- Business and Management, Information Technology
This chapter describes the role of ethnography within a technical design process, as understood from an engineering perspective. It pays particular attention to the distinction between research prototypes and manufactured products, arguing that there is often little resemblance between the academic study of technology users and the pragmatics of design for global markets. Software design depends on structured accounts of social affairs, and constructs these using methods appropriated from the social sciences. However, the designed artefact itself should also be read as a craft achievement that emerges into a socio-economic context.
Hira Agrawal, Thomas F. Bowen, and Sanjai Narain
- Published in print:
- 2013
- Published Online:
- September 2015
- ISBN:
- 9780823244560
- eISBN:
- 9780823268948
- Item type:
- chapter
- Publisher:
- Fordham University Press
- DOI:
- 10.5422/fordham/9780823244560.003.0003
- Subject:
- Information Science, Information Science
Malware enters a software system along three avenues: it is hidden surreptitiously within applications by a malicious developer; it is inserted into the system due to an accidental or a deliberate misconfiguration of the deployment environment; and it is injected into a running application by a malicious user by exploiting a programming flaw in the application logic. This chapter describes three tools developed at Telcordia for blocking all these avenues: the Software Visualization and Analysis Toolsuite (TSVAT) system, the ConfigAssure system, and Runtime Monitoring. TSVAT helps application testers conserve testing resources by guiding them to hidden code. ConfigAssure helps system administrators create vulnerability-free configurations for distributed applications. Runtime Monitoring protects against the exploitation of vulnerabilities not caught by any other technique. These tools have been trialed or are being deployed in real enterprises. Together, they offer a comprehensive defense against attacks on software systems throughout their lifecycle.
Jonathan Band and Masanobu Katoh
- Published in print:
- 2011
- Published Online:
- August 2013
- ISBN:
- 9780262015004
- eISBN:
- 9780262295543
- Item type:
- chapter
- Publisher:
- The MIT Press
- DOI:
- 10.7551/mitpress/9780262015004.003.0005
- Subject:
- Information Science, Information Science
This chapter reviews the interoperability debate in the Pacific Rim, with stops in Australia, Singapore, Hong Kong, South Korea, and the Philippines. It discusses Australia's Copyright Amendment Act of 1984, which placed computer programs under the protection of Australian copyright law; the amendment of Singapore's copyright law to permit software reverse engineering; and the Philippines' crafting of a hybrid of the fair-use provision of the U.S. Copyright Act and article 6 of the European Union (EU) Software Directive. It also suggests that these countries had to deal with political pressure from dominant U.S. software companies and from the Office of the U.S. Trade Representative (USTR) in confronting the issue of reverse engineering.
Catelijne Coopmans
- Published in print:
- 2014
- Published Online:
- May 2014
- ISBN:
- 9780262525381
- eISBN:
- 9780262319157
- Item type:
- chapter
- Publisher:
- The MIT Press
- DOI:
- 10.7551/mitpress/9780262525381.003.0003
- Subject:
- Society and Culture, Technology and Society
This chapter explores how relations between seeing and knowing are articulated in efforts to promote “visual analytics”: the practice of extracting insights from large datasets with the help of on-screen, interactive displays of trends, outliers and other patterns. The focus is on online seminars organized by a software vendor, in which experienced business users demonstrate their practices to a less experienced audience. The chapter discusses how the idea that visual analytics can “reveal” insights is both manifested and qualified on these occasions. The user practices on display convey to audiences the impression that specific insights inhere in data and can be visually apprehended. Paradoxically, the demonstrations simultaneously render such insights conditional and elusive. The chapter characterizes this paradox as “artful revelation,” and proposes that it is rhetorically powerful in helping to foster imaginaries of, and investments in, data-driven discovery.
Hal Abelson and Adolfo Plasencia
- Published in print:
- 2017
- Published Online:
- January 2018
- ISBN:
- 9780262036016
- eISBN:
- 9780262339308
- Item type:
- chapter
- Publisher:
- The MIT Press
- DOI:
- 10.7551/mitpress/9780262036016.003.0014
- Subject:
- Society and Culture, Technology and Society
In this dialogue, Hal Abelson, the acclaimed professor, scientist, and distinguished member of MIT CSAIL, and co-chair of the MIT Council on Educational Technology (MITCET), first discusses the potential of the digital revolution and the Internet. He talks about the reasons that led him to initiate Creative Commons (Abelson was also involved with the start-up of MIT OpenCourseWare, Public Knowledge, the Free Software Foundation, and the Center for Democracy and Technology). He then describes in detail the MIT model: on the one hand, it is based on not making distinctions between teaching and research; on the other, it focuses on radical meritocracy, which gives rise to a culture of open exchange and openness. He moves on to explain details of the philosophy behind the prestigious MIT Course 6, which uses semiconductors to bring together the physical side of electrical engineering and the logic side of IT, thereby generating a range of innovative interactions. He finally talks about the foundations of leadership in education and innovation that MIT has laid and continues to maintain.
Nathan Ensmenger
- Published in print:
- 2010
- Published Online:
- August 2013
- ISBN:
- 9780262050937
- eISBN:
- 9780262289351
- Item type:
- chapter
- Publisher:
- The MIT Press
- DOI:
- 10.7551/mitpress/9780262050937.003.0008
- Subject:
- Information Science, Information Science
This chapter examines the history of the emergence of software engineering during the period from 1968 to 1972. It explains that the term software engineering was first used by hardware engineer J. Presper Eckert in reference to the growing conflict between computer programmers and their corporate employers. It discusses the highlights of the first-ever North Atlantic Treaty Organization (NATO) Conference on Software Engineering intended to address the impending crisis in software production.