Victor J. Katz and Karen Hunger Parshall
- Published in print:
- 2014
- Published Online:
- October 2017
- ISBN:
- 9780691149059
- eISBN:
- 9781400850525
- Item type:
- chapter
- Publisher:
- Princeton University Press
- DOI:
- 10.23943/princeton/9780691149059.003.0005
- Subject:
- Mathematics, History of Mathematics
This chapter takes a look at the scope of mathematics in Ancient and Medieval China. Although the Chinese engaged in numerical calculation as early as the middle of the second millennium BCE, the earliest detailed written evidence of the solution of mathematical problems in China is the Suan shu shu (or Book of Numbers and Computation), a book discovered in a tomb dated to approximately 200 BCE. The Suan shu shu was part of a Chinese intellectual culture shaped in part by China's tempestuous political history. Within this history, Chinese mathematicians—who seemingly worked in isolation and in widely disparate parts of the country—gradually developed new methods for treating the various problems their works needed to address. Here, the chapter discusses various mathematical explorations set out by Chinese scholars, such as the Chinese remainder problem.
Thomas J. Sargent and François R. Velde
- Published in print:
- 2001
- Published Online:
- November 2003
- ISBN:
- 9780199248278
- eISBN:
- 9780191596605
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/0199248273.003.0009
- Subject:
- Economics and Finance, Macro- and Monetary Economics
The Lucas and Stokey (1983) economy without capital is used to exhibit features of the Lucas and Stokey model of optimal taxation and to show how they compare with Barro's (1979) tax‐smoothing model. Computation of optimal fiscal policies for Lucas and Stokey's model requires repeated evaluations of the present value of the government's surplus, an object formally equivalent to an asset price. The functional equation for an asset price is typically difficult to solve. A linear quadratic version of Lucas and Stokey's model is specified, which makes both asset pricing computations and optimal fiscal policy calculations easy. Martingale returns on government debt are discussed, and examples and extensions of Lucas and Stokey's model are given. Two appendices describe the key steps for two basic kinds of stochastic process (a stochastic first‐order linear difference equation and a Markov chain), and discuss time consistency and the structure of debt. Lastly, details are given of the appropriate MATLAB programs.
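The two kinds of stochastic process named in the appendices can be illustrated with a short simulation. This is a generic sketch in Python rather than the chapter's MATLAB programs; the coefficients, transition matrix, and random seed below are illustrative assumptions, not values from the book.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stochastic first-order linear difference equation:
#   x_{t+1} = a * x_t + b + c * w_{t+1},  with w ~ N(0, 1).
a, b, c = 0.9, 1.0, 0.5
x = np.empty(500)
x[0] = 0.0
for t in range(499):
    x[t + 1] = a * x[t] + b + c * rng.standard_normal()
# With |a| < 1 the process is stationary, with long-run mean b / (1 - a).
long_run_mean = b / (1 - a)

# Two-state Markov chain: P[i, j] is the probability of moving from
# state i to state j, so each row of P sums to one.
P = np.array([[0.8, 0.2],
              [0.3, 0.7]])
state = 0
visits = np.zeros(2)
for _ in range(20_000):
    visits[state] += 1
    state = rng.choice(2, p=P[state])
frequencies = visits / visits.sum()
# By the ergodic theorem the visit frequencies approach the stationary
# distribution pi solving pi @ P = pi, here (0.6, 0.4).
```

Both simulations converge to quantities that can be computed analytically, which is what makes the linear quadratic setting tractable for the asset-pricing evaluations the abstract mentions.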
Ed Finn
- Published in print:
- 2017
- Published Online:
- September 2017
- ISBN:
- 9780262035927
- eISBN:
- 9780262338837
- Item type:
- book
- Publisher:
- The MIT Press
- DOI:
- 10.7551/mitpress/9780262035927.001.0001
- Subject:
- Computer Science, Programming Languages
This book explores the cultural figure of the algorithm as it operates through contemporary digital culture. Drawing on sources that range from Neal Stephenson’s Snow Crash to Diderot’s Encyclopédie, from Adam Smith to the Star Trek computer, it explores the gap between theoretical ideas and pragmatic instructions. Humans have always believed that certain invocations—the marriage vow, the shaman’s curse—do not merely describe the world but make it. This book argues that the algorithm—in practical terms, “a method for solving a problem”—has its roots not only in the mathematical concept of “effective computability” but also in cybernetics, philosophy, and magical thinking. After bringing the full history of the term into view, the book describes how the algorithm attempts to translate between the idealized space of computation and a messy reality, with unpredictable and sometimes fascinating results. Case studies of this implementation gap include the development of intelligent assistants like Siri, Google’s goal of anticipating our questions, the rise of algorithmic aesthetics at Netflix, Ian Bogost’s satiric Facebook game Cow Clicker, Uber’s cartoon maps and black box accounting, and the revolutionary economics of Bitcoin. If we want to understand the gap between abstraction and messy reality, we need to build a model of “algorithmic reading” and scholarship that attends to process as part of a new experimental humanities.
Seb Franklin
- Published in print:
- 2015
- Published Online:
- May 2016
- ISBN:
- 9780262029537
- eISBN:
- 9780262331135
- Item type:
- book
- Publisher:
- The MIT Press
- DOI:
- 10.7551/mitpress/9780262029537.001.0001
- Subject:
- Society and Culture, Technology and Society
This book addresses the conditions of knowledge that make the concept of the “information economy” possible while at the same time obscuring its deleterious effects on material social spaces. In so doing, the book traces three intertwined threads: the relationships among information, labor, and social management that emerged in the nineteenth century; the mid-twentieth-century diffusion of computational metaphors; and the appearance of informatic principles in certain contemporary socioeconomic and cultural practices. Drawing on critical theory, media theory, and the history of science, the book names control as the episteme grounding late capitalism. Beyond any specific device or set of technically mediated practices, digitality functions within this episteme as the logical basis for reshaped concepts of labor, subjectivity, and collectivity, as well as for the intensification of older modes of exclusion and dispossession. In tracking the pervasiveness of this logical mode into the present, the book locates the cultural traces of control across a diverse body of objects and practices, from cybernetics to economic theory and management styles, and from concepts of language and subjectivity to literary texts, films, and video games.
Marcos Cruz
- Published in print:
- 2018
- Published Online:
- May 2020
- ISBN:
- 9781474420570
- eISBN:
- 9781474453905
- Item type:
- chapter
- Publisher:
- Edinburgh University Press
- DOI:
- 10.3366/edinburgh/9781474420570.003.0005
- Subject:
- Philosophy, Aesthetics
In this chapter Marcos Cruz suggests a new way of biointegrated design, one that explores non-building, unthinkable novel materials that are products of in-vivo research on living organisms, and forms that are physically built and yield a new aesthetics resulting from novel hybrid techniques of production. Nonhuman creativity comes from this new aesthetics and from the in-vitro mathematical systems and material computation that run in parallel. Not only is a new aesthetics put forward, but also a new way of dealing with environmental issues, critical to future living. The chapter dwells on the shift from the performance of materials to their performativity, as a way of explaining the interactivity between materials and their broader ecology. Moreover, through his work as teacher and as practising architect, the author illustrates how nature itself is potentially programmable matter.
Christof Koch
- Published in print:
- 1998
- Published Online:
- November 2020
- ISBN:
- 9780195104912
- eISBN:
- 9780197562338
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/oso/9780195104912.003.0006
- Subject:
- Computer Science, Mathematical Theory of Computation
The brain computes! This is accepted as a truism by the majority of neuroscientists engaged in discovering the principles employed in the design and operation of nervous systems. What is meant here is that any brain takes the incoming sensory data, encodes them into various biophysical variables, such as the membrane potential or neuronal firing rates, and subsequently performs a very large number of ill-specified operations, frequently termed computations, on these variables to extract relevant features from the input. The outcome of some of these computations can be stored for later access and will, ultimately, control the motor output of the animal in appropriate ways. The present book is dedicated to understanding in detail the biophysical mechanisms responsible for these computations. Its scope is the type of information processing underlying perception and motor control, occurring at the millisecond to fraction of a second time scale. When you look at a pair of stereo images trying to fuse them into a binocular percept, your brain is busily computing away trying to find the “best” solution. What are the computational primitives at the neuronal and subneuronal levels underlying this impressive performance, unmatched by any machine? Naively put and using the language of the electronic circuit designer, the book asks: “What are the diodes and the transistors of the brain?” and “What sort of operations do these elementary circuit elements implement?” Contrary to received opinion, nerve cells are considerably more complex than suggested by work in the neural network community, where, like morons, they are reduced to computing nothing but a thresholded sum of their inputs. We know, for instance, that individual nerve cells in the locust perform an operation akin to a multiplication. Given synapses, ionic channels, and membranes, how is this actually carried out? How do neurons integrate, delay, or change their output gain?
What are the relevant variables that carry information? The membrane potential? The concentration of intracellular Ca2+ ions? What is their temporal resolution? And how large is the variability of these signals that determines how accurately they can encode information? And what variables are used to store the intermediate results of these computations? And where does long-term memory reside? Natural philosophers and scientists in the western world have always compared the brain to the most advanced technology of the day.
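The contrast drawn above between a thresholded sum and a multiplication can be made concrete with a toy sketch. This is an illustrative Python model, not anything from the book: the units, weights, and thresholds are hypothetical.

```python
import numpy as np

def threshold_unit(weights, inputs, theta):
    """The standard neural-network unit: fire (output 1) exactly when
    the weighted sum of its inputs reaches the threshold theta."""
    return 1 if np.dot(weights, inputs) >= theta else 0

def multiplicative_unit(x, y):
    """Toy stand-in for a multiplicative interaction like the one
    reported in locust neurons: the response is the product of two
    input signals, which no single weighted sum can reproduce."""
    return x * y

w = np.array([0.5, 0.5])
fires_both = threshold_unit(w, np.array([1.0, 1.0]), 0.9)  # sum 1.0 >= 0.9
fires_one = threshold_unit(w, np.array([1.0, 0.0]), 0.9)   # sum 0.5 < 0.9
gated = multiplicative_unit(2.0, 0.0)  # either input at zero silences it
```

The multiplicative response is zero whenever either input is silent and grows with their product, a gating behavior that the biophysical machinery of synapses, channels, and membranes must somehow implement.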
Christof Koch
- Published in print:
- 1998
- Published Online:
- November 2020
- ISBN:
- 9780195104912
- eISBN:
- 9780197562338
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/oso/9780195104912.003.0027
- Subject:
- Computer Science, Mathematical Theory of Computation
We now have arrived at the end of the book. The first 16 chapters dealt with linear and nonlinear cable theory, voltage-dependent ionic currents, the biophysical origin of spike initiation and propagation, the statistical properties of spike trains and neural coding, bursting, dendritic spines, synaptic transmission and plasticity, the types of interactions that can occur among synaptic inputs in a passive or active dendritic arbor, and the diffusion and buffering of calcium and other ions. We attempted to weave these disparate threads into a single tapestry in Chaps. 17-19, demonstrating how these elements interact within a single neuron. The penultimate chapter dealt with various unconventional biophysical and biochemical mechanisms that could instantiate computations at the molecular and the network levels. It is time to summarize. What have we learned about the way brains do or do not compute? The brain has frequently been compared to a universal Turing machine (for a very lucid account of this, see Hofstadter, 1979). A Turing machine is a mathematical abstraction meant to clarify what is meant by algorithm, computation, and computable. Think of it as a machine with a finite number of internal states and an infinite tape that can read messages composed with a finite alphabet, write an output, and store intermediate results as memory. A universal Turing machine is one that can mimic any arbitrary Turing machine. We are here not interested in the renewed debate as to whether or not the brain can, in principle, be treated as such a machine (Lucas, 1964; Penrose, 1989), but in whether it is useful to conceptualize nervous systems in this manner. Because brains have limited precision, possess only finite amounts of memory, and do not live forever, they cannot possibly be like “real” Turing machines. It is therefore more appropriate to ask: to what extent can brains be treated as finite state machines or automata?
Such a machine only has finite computational and memory resources (Hopcroft and Ullman, 1979). The answer has to be an ambiguous “it depends.”
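The finite state machine contrasted with the Turing machine above can be sketched in a few lines. This is a generic parity automaton in Python, offered only to illustrate finite states and bounded memory; the state names and alphabet are invented for the example.

```python
# Transition table for a two-state automaton tracking the parity of
# the number of 1s read so far. Finite states, a finite alphabet, and
# no tape to write on -- unlike a Turing machine.
TRANSITIONS = {
    ("even", "0"): "even",
    ("even", "1"): "odd",
    ("odd", "0"): "odd",
    ("odd", "1"): "even",
}

def run_fsm(tape, start="even"):
    """Consume the input one symbol at a time, keeping nothing but the
    current state -- the machine's entire memory."""
    state = start
    for symbol in tape:
        state = TRANSITIONS[(state, symbol)]
    return state
```

For instance, `run_fsm("1101")` returns `"odd"`, since the string contains three 1s; however long the input, the machine's memory never exceeds one of its two states.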
Dana H. Ballard
- Published in print:
- 2015
- Published Online:
- September 2015
- ISBN:
- 9780262028615
- eISBN:
- 9780262323819
- Item type:
- chapter
- Publisher:
- The MIT Press
- DOI:
- 10.7551/mitpress/9780262028615.003.0001
- Subject:
- Neuroscience, Research and Theory
Large-scale systems are hierarchically organized to manage their complexity. Silicon computing makes extensive use of hierarchies, but computational models of the brain have tended to sidestep hierarchical organization. The central premise of the book is that the brain can be seen as an ordered, layered system of hierarchical organization. Higher levels of abstraction provide economical descriptions of cognitive problems that can be searched quickly, while lower levels handle important bookkeeping details. Even with the benefits of such an organization, the slowness of the neural circuitry makes computation a formidable enterprise. The brain’s principal computational tricks are introduced. Finally, the adequacy of the computational enterprise itself is touched upon.
Ed Finn
- Published in print:
- 2017
- Published Online:
- September 2017
- ISBN:
- 9780262035927
- eISBN:
- 9780262338837
- Item type:
- chapter
- Publisher:
- The MIT Press
- DOI:
- 10.7551/mitpress/9780262035927.003.0001
- Subject:
- Computer Science, Programming Languages
The introduction lays out the central argument of the book: that algorithms are best understood as complex “culture machines” that we engage on the levels of magical thinking, epistemology, social identity, and ideology as well as rationality and computation. Opening with a discussion of Neal Stephenson’s novel Snow Crash and our long-running fascination with the power of language to create the world, the chapter proceeds to describe our current relationship to code in terms of the “cathedral of computation.” This metaphor expresses the intermingling of empiricism and belief that underlies our cultural relationships with algorithms—relationships that are growing more powerful, creative, and intimate every day. In order to understand these systems that tell us where to go, whom to date, and what to think about, we need to develop a method for “algorithmic reading.”
Ed Finn
- Published in print:
- 2017
- Published Online:
- September 2017
- ISBN:
- 9780262035927
- eISBN:
- 9780262338837
- Item type:
- chapter
- Publisher:
- The MIT Press
- DOI:
- 10.7551/mitpress/9780262035927.003.0002
- Subject:
- Computer Science, Programming Languages
This chapter defines the algorithm as a critical concept across four intellectual strands, beginning with its foundations in computer science and the notion of “effective computability.” The second strand considers cybernetics and ongoing debates about embodiment, abstraction, cognition, and information theory. The third explores magic and its overlap with symbolism, engaging with notions of software, “sourcery,” and the power of metaphors to represent reality. The fourth draws in the long history of technicity and humanity’s coevolution with our cultural tools. Synthesizing these threads, the chapter offers a definition of the algorithm as culture machine in the context of process and implementation, and closes with a summary of the essential facets of algorithmic reading and a brief glimpse of algorithmic imagination.
Ed Finn
- Published in print:
- 2017
- Published Online:
- September 2017
- ISBN:
- 9780262035927
- eISBN:
- 9780262338837
- Item type:
- chapter
- Publisher:
- The MIT Press
- DOI:
- 10.7551/mitpress/9780262035927.003.0007
- Subject:
- Computer Science, Programming Languages
The coda retraces the genealogy of the algorithm to consider our future prospects for achieving the twinned desires embedded in the heart of effective computability: the quest for universal knowledge and perfect self-knowledge. Central to this is the question of algorithmic imagination, particularly given the startling advances in the field of machine learning. The metaphors we use to access and influence the complexity and processes of computational systems will ultimately determine our prospects for true collaboration with intelligent machines. These questions are particularly vital for the humanities, and the chapter argues for a new mode of scholarly and public engagement with computation: the experimental humanities. This is how we can begin to understand the figure of the algorithm as a new territory for cultural imagination and become true collaborators with culture machines rather than their worshippers or, worse, their pets.
Benjamin H. Bratton
- Published in print:
- 2016
- Published Online:
- September 2016
- ISBN:
- 9780262029575
- eISBN:
- 9780262330183
- Item type:
- chapter
- Publisher:
- The MIT Press
- DOI:
- 10.7551/mitpress/9780262029575.003.0004
- Subject:
- Society and Culture, Cultural Studies
The foundational layer of The Stack is “Earth,” and this chapter examines how it operates as the substrate of each layer above. It begins with a consideration of the physical materials that go into the manufacture of information computing technologies. It reviews how Earth has been construed as a site for comprehensive data capture and management and links contemporary cyberinfrastructure projects with experimental art and architecture projects, each seeking to striate and envelop the globe. The chapter discusses the paradoxes of ecological governance, particularly how our empirical knowledge of climate change is provided by a panoptic global information apparatus that is itself a key contributor to the change that it models for us. This paradox symbolizes the difficulties of engineering ‘smart grids’ that can achieve net energy savings despite the extraordinary expenditures necessary to build them. Regarding ecological governance, the chapter also discusses how climate change produces new tactical jurisdictions, both of those who are affected by it in similar ways and of those whose energy output produces it in similar ways.
Benjamin H. Bratton
- Published in print:
- 2016
- Published Online:
- September 2016
- ISBN:
- 9780262029575
- eISBN:
- 9780262330183
- Item type:
- chapter
- Publisher:
- The MIT Press
- DOI:
- 10.7551/mitpress/9780262029575.003.0009
- Subject:
- Society and Culture, Cultural Studies
This chapter discusses the sixth and top layer in The Stack, the User layer. It describes how Users initiate chains of interaction up and down layers, from Interface to Earth and back again. It sees the “user” as a contemporary mediated image of the self, one that is often reduced to narrow and utilitarian frames, but also open to a diverse variety of possible human and non-human agencies. The user position can both over-individuate that agent’s sense of self and also radically multiply it. For example, data generated by Users, which produces traces and shadows of their worldly transactions, initially creates a high-resolution portrait of a single user (as seen, for example, in the Quantified Self movement), but as overlapping external data streams are introduced, the coherency of the user’s subjectivity is dissolved by its overdetermination by external relations and forces. Any durable politics of the User must understand this dynamic of platform sovereignty.
Volker Pantenburg
- Published in print:
- 2016
- Published Online:
- May 2017
- ISBN:
- 9781526107213
- eISBN:
- 9781526120984
- Item type:
- chapter
- Publisher:
- Manchester University Press
- DOI:
- 10.7228/manchester/9781526107213.003.0004
- Subject:
- Society and Culture, Media Studies
This chapter traces the origins and implications of the ‘operational image’, as it has come to be explored in Harun Farocki’s texts and installations since 2000. Defined in Eye/Machine I (2000) as ‘Images without a social goal, not for edification, not for reflection,’ operational images pervade both the military and non-military realms of today’s life. To contextualise this type of image, three concepts are revisited that have explicitly informed Farocki’s understanding: Roland Barthes’ idea of ‘operational language’, Vilém Flusser’s ‘technical image’, and the project of a computer-aided ‘Bildwissenschaft’ based on algorithmic image retrieval. After a closer analysis of some of the counter-operational strategies Farocki employs, the article suggests distinguishing three levels of operationality and understanding Farocki’s project as a filmmaker, video artist, and thinker as a continuous examination of the operational potential of images.
Bryce Huebner
- Published in print:
- 2013
- Published Online:
- January 2014
- ISBN:
- 9780199926275
- eISBN:
- 9780199347193
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199926275.003.0003
- Subject:
- Philosophy, Philosophy of Mind, General
This chapter discusses the organizational and architectural constraints on collective mentality. It is argued that collective mentality is unlikely to be realized by an organizational structure that simply aggregates individual beliefs and desires; and that we should commit to the taxonomy of folk psychology at the level of cognitive systems, and deny that a naturalistic theory should appeal to representational states and processes that are readily mapped onto categories like belief and desire. A componential account of mental representation is sketched, and it is argued that many types of system-level representations are realized by competitive or quasi-competitive algorithms.
Christopher G. Timpson
- Published in print:
- 2013
- Published Online:
- September 2013
- ISBN:
- 9780199296460
- eISBN:
- 9780191741791
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199296460.003.0006
- Subject:
- Philosophy, Philosophy of Science
Some of the philosophical questions raised by the theory of quantum computation are discussed. First it is considered whether the possibility of exponential speed-up in quantum computation provides an argument for a more substantive notion of quantum information than so far allowed. It is concluded that this is not so. Then some questions regarding the status of the Church-Turing hypothesis in the light of quantum computation are considered. In particular, Deutsch’s claim that a physical principle, the Turing Principle, underlies the Church-Turing hypothesis is rebutted. Finally, the question of whether the Church-Turing hypothesis might serve as a constraint on the laws of physics is briefly considered.
Nico Orlandi
- Published in print:
- 2014
- Published Online:
- August 2014
- ISBN:
- 9780199375035
- eISBN:
- 9780199375059
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199375035.003.0005
- Subject:
- Philosophy, Philosophy of Mind
The final chapter concludes the case in favor of the embedded view by showing that the view is compatible with computational theories of mental activity. This is true for two reasons. First, there are perfectly legitimate models of computation that do not describe computations as regimented operations on representations. Second, denying that the visual system performs computations in a narrow, classical sense does not amount to rejecting the classical computational theory of mind in its entirety. We can concede that classicism is true of other mental processes while recognizing that we have little reason to suppose that it is true of perception. The chapter discusses the systematicity and productivity that we find in vision and argues that these two features do not require appeal to classical computations.