S. N. Afriat
- Published in print:
- 1987
- Published Online:
- November 2003
- ISBN:
- 9780198284611
- eISBN:
- 9780191595844
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/0198284616.003.0026
- Subject:
- Economics and Finance, Microeconomics
This is the third of five chapters on optimal programming (the typical mathematics of economics) and related issues concerning choice making. It discusses linear programming, which might appear to be a special case of convex programming, but is more substantial, and is really an embodiment of the theory of systems of linear inequalities (as reflected here). This chapter initiates the subject with reference to systems of linear inequalities and natural questions about them, and all LP (linear programming) theorems are encountered simply in pursuing those. Theorems about linear inequalities that have uses directly on their own are also derived (and are illustrated in many places in this book). The eight sections of the chapter are: linear inequalities; separation theorems; theorems of alternatives; polyhedra and polytopes; LP Duality Theorem; the pivot operation; the Simplex Algorithm; and BASIC program.
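The Duality Theorem the chapter builds toward can be seen in a tiny instance. The sketch below is not from the book (whose listing is a BASIC program); it is an illustrative Python fragment, with made-up coefficients, that finds the optimum of a two-variable LP by enumerating the vertices of its feasible polytope:

```python
from itertools import combinations

# Primal: maximize 3x + 2y  subject to  x + y <= 4,  x + 3y <= 6,  x, y >= 0.
# The optimum of an LP lies at a vertex of the feasible polytope, so we
# intersect constraint lines pairwise and evaluate the objective at each
# feasible intersection.

A = [[1, 1], [1, 3], [-1, 0], [0, -1]]   # rows: constraints a . x <= b
b = [4, 6, 0, 0]
c = [3, 2]

def solve2(a1, b1, a2, b2):
    """Solve the 2x2 system a1 . x = b1, a2 . x = b2; None if singular."""
    det = a1[0] * a2[1] - a1[1] * a2[0]
    if det == 0:
        return None
    x = (b1 * a2[1] - a1[1] * b2) / det
    y = (a1[0] * b2 - b1 * a2[0]) / det
    return (x, y)

def feasible(p):
    return all(row[0] * p[0] + row[1] * p[1] <= bi + 1e-9
               for row, bi in zip(A, b))

vertices = [p for (r1, b1), (r2, b2) in combinations(zip(A, b), 2)
            if (p := solve2(r1, b1, r2, b2)) is not None and feasible(p)]
best = max(vertices, key=lambda p: c[0] * p[0] + c[1] * p[1])
primal_opt = c[0] * best[0] + c[1] * best[1]
print(best, primal_opt)   # optimum 12 at vertex (4, 0)

# Dual: minimize 4u + 6v  subject to  u + v >= 3,  u + 3v >= 2,  u, v >= 0.
# By the LP Duality Theorem its optimum equals the primal's: 12 (at u=3, v=0).
```

Vertex enumeration is exponential in general; it is used here only to make the primal-dual agreement concrete, where the Simplex Algorithm of the chapter pivots between vertices instead.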
Ed Finn
- Published in print:
- 2017
- Published Online:
- September 2017
- ISBN:
- 9780262035927
- eISBN:
- 9780262338837
- Item type:
- book
- Publisher:
- The MIT Press
- DOI:
- 10.7551/mitpress/9780262035927.001.0001
- Subject:
- Computer Science, Programming
This book explores the cultural figure of the algorithm as it operates through contemporary digital culture. Drawing on sources that range from Neal Stephenson’s Snow Crash to Diderot’s Encyclopédie, from Adam Smith to the Star Trek computer, it explores the gap between theoretical ideas and pragmatic instructions. Humans have always believed that certain invocations—the marriage vow, the shaman’s curse—do not merely describe the world but make it. This book argues that the algorithm—in practical terms, “a method for solving a problem”—has its roots not only in the mathematical concept of “effective computability” but also in cybernetics, philosophy, and magical thinking. After bringing the full history of the term into view, the book describes how the algorithm attempts to translate between the idealized space of computation and a messy reality, with unpredictable and sometimes fascinating results. Case studies of this implementation gap include the development of intelligent assistants like Siri, Google’s goal of anticipating our questions, the rise of algorithmic aesthetics at Netflix, Ian Bogost’s satiric Facebook game Cow Clicker, Uber’s cartoon maps and black box accounting, and the revolutionary economics of Bitcoin. If we want to understand the gap between abstraction and messy reality, we need to build a model of “algorithmic reading” and scholarship that attends to process as part of a new experimental humanities.
Alexandre Todorov
- Published in print:
- 2016
- Published Online:
- May 2017
- ISBN:
- 9780262034685
- eISBN:
- 9780262335522
- Item type:
- chapter
- Publisher:
- The MIT Press
- DOI:
- 10.7551/mitpress/9780262034685.003.0006
- Subject:
- Biology, Biomathematics / Statistics and Data Analysis / Complexity Studies
The aim of the RELIEF algorithm is to filter out features (e.g., genes, environmental factors) that are relevant to a trait of interest, starting from a set that may include thousands of irrelevant features. Though widely used in many fields, its application to gene-environment interaction studies has been limited thus far. We provide here an overview of this machine learning algorithm and some of its variants. Using simulated data, we then compare the performance of RELIEF to that of logistic regression for screening for gene-environment interactions in SNP data. Even though performance degrades in larger sets of markers, RELIEF remains a competitive alternative to logistic regression, and shows clear promise as a tool for the study of gene-environment interactions. Areas for further improvements of the algorithm are then suggested.
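The core Relief update is compact enough to sketch. The fragment below is not the chapter's code; it is a minimal illustration of the basic weight rule (nearest hit vs. nearest miss) on made-up toy data, assuming features scaled to [0, 1]:

```python
import random

def relief(X, y, n_iter=None):
    """Minimal Relief weight estimator for binary classification.

    For each instance, find its nearest hit (same class) and nearest miss
    (other class). A feature gains weight when it differs from the miss
    and agrees with the hit, so relevant features accumulate high weights.
    """
    n, d = len(X), len(X[0])
    n_iter = n_iter or n
    w = [0.0] * d

    def dist(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))

    for i in range(n_iter):
        idx = i % n
        xi, yi = X[idx], y[idx]
        hits = [X[j] for j in range(n) if j != idx and y[j] == yi]
        misses = [X[j] for j in range(n) if y[j] != yi]
        near_hit = min(hits, key=lambda h: dist(xi, h))
        near_miss = min(misses, key=lambda m: dist(xi, m))
        for f in range(d):
            w[f] += (abs(xi[f] - near_miss[f])
                     - abs(xi[f] - near_hit[f])) / n_iter
    return w

# Toy data: feature 0 tracks the class label, feature 1 is pure noise.
random.seed(0)
y = [i % 2 for i in range(40)]
X = [[yi * 0.9 + random.uniform(0, 0.1), random.random()] for yi in y]
w = relief(X, y)
print(w)   # weight of feature 0 clearly exceeds that of noisy feature 1
```

Screening then amounts to ranking features by weight and keeping the top scorers; the chapter's variants (e.g., averaging over several nearest neighbours) refine this same update.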
Ed Finn
- Published in print:
- 2017
- Published Online:
- September 2017
- ISBN:
- 9780262035927
- eISBN:
- 9780262338837
- Item type:
- chapter
- Publisher:
- The MIT Press
- DOI:
- 10.7551/mitpress/9780262035927.003.0002
- Subject:
- Computer Science, Programming
This chapter defines the algorithm as a critical concept across four intellectual strands, beginning with its foundations in computer science and the notion of “effective computability.” The second strand considers cybernetics and ongoing debates about embodiment, abstraction, cognition, and information theory. The third explores magic and its overlap with symbolism, engaging with notions of software, “sourcery,” and the power of metaphors to represent reality. The fourth draws in the long history of technicity and humanity’s coevolution with our cultural tools. Synthesizing these threads, the chapter offers a definition of the algorithm as culture machine in the context of process and implementation, and closes with a summary of the essential facets of algorithmic reading and a brief glimpse of algorithmic imagination.
Clifford Siskin
- Published in print:
- 2016
- Published Online:
- May 2017
- ISBN:
- 9780262035316
- eISBN:
- 9780262336345
- Item type:
- chapter
- Publisher:
- The MIT Press
- DOI:
- 10.7551/mitpress/9780262035316.003.0009
- Subject:
- History, History of Science, Technology, and Medicine
To bring the story of system forward to the present, this Coda recounts the author’s involvement in an attempt to reshape knowledge called the Whole Enchilada group. Its work highlighted the ongoing importance of Darwin to current efforts at reform. Building on Daniel Dennett’s focus on the algorithmic in Darwin—the “If … then” format of his arguments—the Coda examines how algorithm and system have been deployed as variations of each other: if we reverse engineer any complex system we will find the simple algorithm that generated it. Just as we engage system as something both conceptual—a way of knowing the world—and as something that is really there, that constitutes part of that world, so algorithm is both a formal mode of producing knowledge about the world and—thanks to its simple-to-complex mechanicity—constitutive of that world. Using David Deutsch’s arguments about knowledge and virtual reality, the Coda then asks “Have we been able to know the world through system because the world is, in fact, itself structured as a system?” This question leads to a turn to Stephen Wolfram’s “new science” and the possibility of significant change: a reformulation of the Enlightenment’s once startling conviction that the world could be known through complex master SYSTEMS into the notion that knowledge—in the form of simple, iterative systems—renders the world. The book ends by emphasizing the persistence of system in the growing consensus that the universe is computational—with system generating the world it helps us to know.
Keith M. Martin
- Published in print:
- 2017
- Published Online:
- July 2017
- ISBN:
- 9780198788003
- eISBN:
- 9780191829956
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/oso/9780198788003.003.0002
- Subject:
- Mathematics, Computational Mathematics / Optimization, Logic / Computer Science / Mathematical Philosophy
This chapter presents several historical cryptosystems. These are all relatively simple, and none are fit for modern use. These cryptosystems serve to illustrate the basic model of a cryptosystem, as well as introduce a number of important design principles for modern encryption algorithms. We demonstrate the importance of a large keyspace, randomness of ciphertext, and positional dependence. We also show how efficiency and security are often traded off against one another when designing a cryptosystem.
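The keyspace point is easy to make concrete. The sketch below is not from the chapter; it uses the shift (Caesar) cipher, one of the standard historical examples, to show why 26 keys are no defence against exhaustive search, and why identical plaintext letters always encrypting to identical ciphertext letters betrays the lack of randomness:

```python
# Shift (Caesar) cipher: rotate each letter by the key, modulo 26.
# With only 26 possible keys, an attacker simply tries them all.

def shift_encrypt(plaintext, key):
    return "".join(
        chr((ord(c) - ord("A") + key) % 26 + ord("A")) if c.isalpha() else c
        for c in plaintext.upper()
    )

def brute_force(ciphertext):
    """Exhaust the entire keyspace; returns all 26 candidate decryptions."""
    return {k: shift_encrypt(ciphertext, -k) for k in range(26)}

ct = shift_encrypt("ATTACK AT DAWN", 3)
print(ct)                   # DWWDFN DW GDZQ -- note the repeated letters
print(brute_force(ct)[3])   # ATTACK AT DAWN
```

An attacker scans the 26 candidates for the one that reads as plausible text; a modern keyspace of 2^128 keys makes the same attack computationally infeasible.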
A.C.C. Coolen, A. Annibale, and E.S. Roberts
- Published in print:
- 2017
- Published Online:
- May 2017
- ISBN:
- 9780198709893
- eISBN:
- 9780191780172
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/oso/9780198709893.003.0001
- Subject:
- Physics, Theoretical, Computational, and Statistical Physics
This introductory chapter sets the scene for the material which follows by briefly introducing the study of networks and describing their wide scope of application. It discusses the role of well-specified random graphs in setting network science onto a firm scientific footing, emphasizing the importance of well-defined null models. Non-trivial aspects of graph generation are introduced. An important distinction is made between approaches that begin with a desired probability distribution on the final graph ensembles and approaches where the graph generation process is the main object of interest and the challenge is to analyze the expected topological properties of the generated networks. At the core of the graph generation process is the need to establish a mathematical connection between the stochastic graph generation process and the stationary probability distribution to which these processes evolve.
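The simplest well-specified null model of the kind the chapter has in mind is the Erdős–Rényi graph G(n, p). The fragment below is an illustrative sketch, not the book's code: each of the n(n−1)/2 possible edges is included independently with probability p, so the ensemble's probability distribution is fully specified and the expected mean degree is (n−1)p:

```python
import random

# Erdos-Renyi G(n, p): every unordered pair (i, j) becomes an edge
# independently with probability p -- a fully specified null model
# against which observed network features can be compared.

def erdos_renyi(n, p, rng):
    edges = set()
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p:
                edges.add((i, j))
    return edges

rng = random.Random(42)
n, p = 200, 0.1
edges = erdos_renyi(n, p, rng)
mean_degree = 2 * len(edges) / n
print(mean_degree)   # concentrates near (n - 1) * p = 19.9
```

Here the target distribution is written down first and the generation process follows trivially; the harder direction the chapter flags is the reverse, starting from a generation process (e.g., edge-rewiring dynamics) and establishing the stationary distribution to which it converges.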