Paul Erickson
- Published in print: 2015
- Published Online: May 2016
- ISBN: 9780226097039
- eISBN: 9780226097206
- Item type: chapter
- Publisher: University of Chicago Press
- DOI: 10.7208/chicago/9780226097206.003.0002
- Subject: History, History of Science, Technology, and Medicine
Despite the ex post facto identification of a number of “anticipations” of game-theoretic results through history, it is generally agreed that modern game theory’s founding work was mathematician John von Neumann and economist Oskar Morgenstern’s 1944 book, Theory of Games and Economic Behavior. Yet to a student of game theory trained in recent decades, the book must seem antique in terms of its notations, style of presentation, and terminology. This chapter is therefore principally devoted to explicating the text of von Neumann and Morgenstern’s book, emphasizing the diverse nature of its contents: a dynamic, set-theoretic depiction of games in “extensive form;” the matrix “normal form” of the game and the celebrated “minimax theorem,” with its rich connections to topology and the theory of fixed points; the “characteristic function form” of games and definition of “solutions” as non-dominated sets of imputations; and finally, the von Neumann – Morgenstern theory of utility, which constructed a measure of utility from axioms of preference ordering. These pieces of the theory were not just selectively appropriated and used by different groups of individuals after 1944, but they were also the outgrowth of varied research interests of the book’s authors in the years preceding its publication.
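For orientation, the central result of the "normal form" part of the book, the minimax theorem, has a compact modern statement (standard notation, not a quotation from von Neumann and Morgenstern):

```latex
% Minimax theorem for a two-person zero-sum game with payoff matrix A
% (rows: player 1's pure strategies, columns: player 2's).
% Mixed strategies x, y range over the probability simplices Delta_m, Delta_n.
\[
  \max_{x \in \Delta_m} \; \min_{y \in \Delta_n} \; x^{\mathsf{T}} A\, y
  \;=\;
  \min_{y \in \Delta_n} \; \max_{x \in \Delta_m} \; x^{\mathsf{T}} A\, y
  \;=\; v ,
\]
% where v is the value of the game: the payoff player 1 can guarantee and
% player 2 can prevent being exceeded.
```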
Bas C. van Fraassen
- Published in print: 1991
- Published Online: November 2003
- ISBN: 9780198239802
- eISBN: 9780191597466
- Item type: book
- Publisher: Oxford University Press
- DOI: 10.1093/0198239807.001.0001
- Subject: Philosophy, Philosophy of Science
Quantum theory was developed in response to a welter of new experimental phenomena, yet appeared to depict a world so esoteric as to be literally unimaginable. Interpretation of the theory became feasible only after von Neumann's theoretical unification, but von Neumann's own interpretation astonishingly implied that in measurement something happens that violates Schroedinger's equation, the theory's cornerstone. This book argues first of all that the phenomena themselves, without theoretical motives, suffice to eliminate ‘common cause’ models, thus requiring a radical departure from classical physics models. The measurement process, however, has an adequate description of itself as a quantum-mechanical process, so that the theory can be seen as complete in a relevant sense. But the question of interpretation, ‘How could the world possibly be the way this theory says it is?’, is not thereby answered. In response to that question the book argues that the theory admits a plurality of interpretations, each of which helps to understand the theory further, but it also advocates one particular interpretation (the Copenhagen Variant of the Modal Interpretation). That interpretation is then applied to such topics as the Einstein–Podolsky–Rosen paradox and the problem of ‘identical’ particles, quantum statistics, identity, and individuation.
Gordon M. Shepherd
- Published in print: 2009
- Published Online: February 2010
- ISBN: 9780195391503
- eISBN: 9780199863464
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780195391503.003.0016
- Subject: Neuroscience, History of Neuroscience
This chapter focuses on the development of theoretical neuroscience. The mid-20th century marked the emergence of several new fields that laid the foundations for general theories of brain function. McCulloch and Pitts applied the symbolic logic metaphor to nerve cell circuits, postulating that specific interconnections could perform basic logic functions such as AND, OR, and AND-NOT gates. John von Neumann drew on this idea of the brain in formulating the classical architecture of the digital computer. Developments in control theory, neurology, and adaptive behavior came together in the new field of cybernetics. The McCulloch–Pitts oversimplified neurons contributed to the rise of artificial intelligence and neural nets. Von Neumann eventually realized that the fundamental computational elements of the nervous system are not oversimplified neurons, but individual synapses distributed on dendritic trees. This insight anticipated current work on developing more realistic large-scale neural networks, drawing on studies at all the levels of organization covered in this book, to simulate how the brain actually carries out its functions.
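As a rough illustration of the logic-gate idea described above (a schematic sketch, not code from the chapter), a McCulloch–Pitts unit fires when the weighted sum of its binary inputs reaches a threshold and no inhibitory input is active:

```python
def mcculloch_pitts(excitatory, inhibitory, threshold):
    """Binary threshold unit: fires (returns 1) iff no inhibitory input is
    active and the number of active excitatory inputs meets the threshold."""
    if any(inhibitory):
        return 0
    return 1 if sum(excitatory) >= threshold else 0

# The basic logic functions mentioned in the chapter, each built from one unit.
AND     = lambda a, b: mcculloch_pitts([a, b], [],  threshold=2)
OR      = lambda a, b: mcculloch_pitts([a, b], [],  threshold=1)
AND_NOT = lambda a, b: mcculloch_pitts([a],    [b], threshold=1)  # a AND (NOT b)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", AND(a, b), OR(a, b), AND_NOT(a, b))
```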
Robert V. Dodge
- Published in print: 2012
- Published Online: May 2012
- ISBN: 9780199857203
- eISBN: 9780199932597
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780199857203.003.0004
- Subject: Economics and Finance, Behavioural Economics
This chapter formally introduces game theory. It provides a simple definition as the study of interactive decision-making by rational decision makers. Game theory is a formal study of vicarious thinking. A number of introductory terms are defined, including “game,” “player,” and “payoff.” Concepts including “rational actor,” “defect,” and “equilibrium” are discussed, and the point is made that a player's goal is to maximize his personal outcome, not to “beat” the other player. A history of game theory and its ideas as seen through the prism of two key figures—John von Neumann and Thomas Schelling—is entertainingly told by BBC presenter and Financial Times correspondent Tim Harford in an excerpt from his book The Logic of Life entitled “Las Vegas: the Edge of Reason.”
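To make the terms “payoff,” “defect,” and “equilibrium” concrete (an illustrative sketch with made-up payoffs, not an example from the chapter), the following checks every strategy pair of a small two-player game for mutual best responses:

```python
# Payoffs for a prisoner's-dilemma-style game (hypothetical numbers).
# payoffs[(row, col)] = (row player's payoff, column player's payoff).
payoffs = {
    ("cooperate", "cooperate"): (3, 3),
    ("cooperate", "defect"):    (0, 5),
    ("defect",    "cooperate"): (5, 0),
    ("defect",    "defect"):    (1, 1),
}
strategies = ["cooperate", "defect"]

def is_equilibrium(r, c):
    """A pair is a (pure) equilibrium if neither player gains by unilaterally
    switching: each maximizes their own payoff rather than trying to 'beat' the other."""
    best_row = all(payoffs[(r, c)][0] >= payoffs[(other, c)][0] for other in strategies)
    best_col = all(payoffs[(r, c)][1] >= payoffs[(r, other)][1] for other in strategies)
    return best_row and best_col

for r in strategies:
    for c in strategies:
        if is_equilibrium(r, c):
            print("equilibrium:", r, c, payoffs[(r, c)])  # -> defect, defect
```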
István Hargittai
- Published in print: 2006
- Published Online: September 2007
- ISBN: 9780195178456
- eISBN: 9780199787012
- Item type: book
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780195178456.001.0001
- Subject: Physics, History of Physics
Five men born at the turn of the 20th century in Budapest (Theodore von Kármán, Leo Szilard, Eugene Wigner, John von Neumann, and Edward Teller) became a special group often referred to as the Martians. Through immigration from Hungary to Germany and then to the United States, they remained friends and continued to work together and influence each other throughout their lives. As a result, their work was integral to some of the most important scientific and political developments of the 20th century. Wigner won a Nobel Prize in theoretical physics; Szilard was the first to see that a chain reaction based on neutrons was possible and initiated the Manhattan Project, but later tried to restrict nuclear arms; von Neumann developed the modern computer for complex problems; von Kármán provided the scientific bases for the US Air Force; and Teller, whose name is also synonymous with the controversial “Star Wars” initiative of the 1980s, was the father of the hydrogen bomb. Each was fiercely opinionated and all were politically active reactionaries against all forms of totalitarianism. They risked their careers for the defense of the United States and the Free World.
Thomas Haigh, Mark Priestley, and Crispin Rope
- Published in print: 2016
- Published Online: May 2017
- ISBN: 9780262033985
- eISBN: 9780262334426
- Item type: chapter
- Publisher: The MIT Press
- DOI: 10.7551/mitpress/9780262033985.003.0007
- Subject: History, History of Science, Technology, and Medicine
This chapter charts the rapid evolution of thinking about programming and computer architecture among members of the ENIAC team from 1944 onward, as what is usually called the “stored program concept” was formulated with John von Neumann and presented in the “First Draft of a Report on the EDVAC.” Use of archival sources makes this account of the process more specific and rigorous than any previously published one, presenting the new approach as an evolution of, and response to, the original ENIAC programming method. The ideas present in the “First Draft” are clearly explained and separated into three distinct clusters: the “EDVAC hardware paradigm,” the “von Neumann architecture paradigm,” and the “modern code paradigm.” The chapter finishes with an exploration of initial understanding and reception of these ideas, reconstructing the late-1940s consensus on what was important about the new approach and why.
Bas C. van Fraassen
- Published in print: 1991
- Published Online: November 2003
- ISBN: 9780198239802
- eISBN: 9780191597466
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/0198239807.003.0008
- Subject: Philosophy, Philosophy of Science
Von Neumann's unification of Schroedinger's and Heisenberg's formalisms came with an interpretation of quantum theory involving two principles. The first is that assertions about the values of observables are equivalent to assertions about the quantum-mechanical state of the system. This is sometimes known as the ‘eigenvalue–eigenstate link’, since it equates an observable having a value with the system being in an eigenstate of that observable. The second is his Projection Postulate—i.e. the postulate that during measurement there is a ‘collapse of the wave packet’. It is argued that the theory does not force these principles on us, and that there are severe difficulties in this interpretation, despite also its more recent defences.
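The two principles have compact textbook statements (standard notation, not drawn verbatim from the chapter):

```latex
% Eigenvalue-eigenstate link: observable A has value a in state |psi>
% exactly when |psi> is an eigenstate of A with eigenvalue a:
\[
  A\,\lvert\psi\rangle \;=\; a\,\lvert\psi\rangle .
\]
% Projection Postulate: if measuring A on |psi> yields outcome a, the state
% "collapses" onto the corresponding eigenspace (P_a the projector onto it):
\[
  \lvert\psi\rangle \;\longmapsto\;
  \frac{P_a \lvert\psi\rangle}{\lVert P_a \lvert\psi\rangle \rVert},
  \qquad
  \Pr(a) \;=\; \lVert P_a \lvert\psi\rangle \rVert^{2} ,
\]
% in contrast with the continuous, unitary Schroedinger evolution
% |psi(t)> = exp(-iHt/hbar)|psi(0)> that holds between measurements.
```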
Thomas Haigh, Mark Priestley, and Crispin Rope
- Published in print: 2016
- Published Online: May 2017
- ISBN: 9780262033985
- eISBN: 9780262334426
- Item type: chapter
- Publisher: The MIT Press
- DOI: 10.7551/mitpress/9780262033985.003.0010
- Subject: History, History of Science, Technology, and Medicine
As soon as Metropolis had completed the initial configuration of ENIAC for the new programming method, and before it was working properly, Klara von Neumann arrived to help. She had taken the leading role in converting the flow diagrams into program code, and together they worked around the clock for several weeks to get both program and machine into a usable state and to shuffle tens of thousands of cards in and out of it during Monte Carlo simulation of each exploding fission bomb. This chapter integrates the narrative of this initial “run,” and of a second batch of calculations carried out in late 1948, with analysis of the structure of the program itself. It finishes with an exploration of further Monte Carlo work run on ENIAC, including reactor simulations, simulation of uranium-hydride bombs, and in 1950 simulation of the “Super” concept for a hydrogen weapon.
Ken Binmore
- Published in print: 2007
- Published Online: May 2007
- ISBN: 9780195300574
- eISBN: 9780199783748
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780195300574.003.0007
- Subject: Economics and Finance, Microeconomics
This chapter describes the theory of two-person, zero-sum games invented by John Von Neumann in 1928. It begins with an application to the computation of economic shadow prices. It shows that a two-person game is strictly competitive if, and only if, it has a zero-sum representation. Such a game can be represented using only the first player's payoff matrix. The minimax and maximin values of the matrix are defined and linked to the concept of a saddle point. The ideas are then related to a player's security level in a game. An inductive proof of Von Neumann's minimax theorem is offered. The connexion between the minimax theorem and the duality theorem of linear programming is explained. The method of solving certain two-person, zero-sum games geometrically with the help of the theorem of the separating hyperplane is introduced. The Hide-and-Seek Game is used as a non-trivial example.
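As a small illustration of the maximin, minimax, and saddle-point ideas listed above (a sketch with an arbitrary example matrix, not one taken from the chapter):

```python
# Payoff matrix for player 1 in a two-person zero-sum game (player 2 receives
# the negative). Rows are player 1's pure strategies, columns player 2's.
A = [
    [3, 1, 4],
    [2, 2, 2],
    [0, 1, 5],
]

# Maximin: player 1 picks the row whose worst-case (column-minimum) payoff is largest.
maximin = max(min(row) for row in A)
# Minimax: player 2 picks the column whose worst-case (row-maximum) payoff is smallest.
minimax = min(max(A[i][j] for i in range(len(A))) for j in range(len(A[0])))
print("maximin =", maximin, " minimax =", minimax)  # both 2 here

# A saddle point is an entry that is simultaneously the minimum of its row and
# the maximum of its column; one exists in pure strategies iff maximin == minimax.
saddles = [(i, j) for i, row in enumerate(A) for j, v in enumerate(row)
           if v == min(row) and v == max(A[k][j] for k in range(len(A)))]
print("saddle points:", saddles)  # -> [(1, 1)], value 2
```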
David Bates
- Published in print: 2007
- Published Online: March 2013
- ISBN: 9780226720807
- eISBN: 9780226720838
- Item type: chapter
- Publisher: University of Chicago Press
- DOI: 10.7208/chicago/9780226720838.003.0012
- Subject: History, History of Science, Technology, and Medicine
This chapter questions the underlying assumptions of both classic Artificial Intelligence, founded in the analogy between the brain and the digital computer, and the newer tradition that construes the mind as an emergent property of interacting, distributed, parallel processes. It specifically explores Gestalt psychology and its brief engagement with cybernetics to suggest that this was perhaps a missed opportunity, and additionally examines John von Neumann's influential automata theory. The structure of insight helped explain the complex, nonmechanical behavior of living, acting organisms. For von Neumann, the creative plasticity of the nervous system served only to highlight the rather simplistic, and inferior, mechanical structure of the early computers, something he was of course well positioned to notice. His terse conclusion was that the logical structures involved in nervous system activity must “differ considerably” from the ones that are familiar in logic and mathematics.
Kristine C. Harper
- Published in print: 2008
- Published Online: August 2013
- ISBN: 9780262083782
- eISBN: 9780262274982
- Item type: chapter
- Publisher: The MIT Press
- DOI: 10.7551/mitpress/9780262083782.003.0005
- Subject: Society and Culture, Technology and Society
Carl-Gustaf Rossby and his colleagues shared one overriding scientific goal—to pursue basic meteorological research aimed at developing a mathematics-based theory of general circulation—which became wedded to John von Neumann’s Computer Project in early 1946. This chapter discusses how this project came into being.
Thomas Haigh, Mark Priestley, and Crispin Rope
- Published in print: 2016
- Published Online: May 2017
- ISBN: 9780262033985
- eISBN: 9780262334426
- Item type: chapter
- Publisher: The MIT Press
- DOI: 10.7551/mitpress/9780262033985.003.0009
- Subject: History, History of Science, Technology, and Medicine
This chapter traces the series of Monte Carlo simulations run on ENIAC from their genesis in January 1947 exchanges between John von Neumann, Robert Richtmyer, and Stanislaw Ulam through the completion of detailed planning work for the initial batch of calculations in December 1947. Close attention to successive drafts illuminates the process by which John and Klara von Neumann worked with Adele Goldstine to transform the former’s outline plan of computation into a fully developed flow diagram documenting the flow of control and manipulation of data for a program written in the new style.
Graham Priest
- Published in print: 2002
- Published Online: October 2011
- ISBN: 9780199254057
- eISBN: 9780191698194
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780199254057.003.0012
- Subject: Philosophy, Logic/Philosophy of Mathematics, Metaphysics/Epistemology
This chapter examines the relevance of theories on sets and classes to the contradictions in the limits of thought. It analyses possible solutions to the set-theoretic paradoxes based on Ernst Friedrich Ferdinand Zermelo's Zermelo-Fraenkel set theory and John von Neumann's notion of a proper class. It shows that, like other contemporary solutions to inclosure contradictions, Zermelo's and von Neumann's are not adequate even in the limited domain for which they were designed. This chapter suggests that the only uniform approach to all these paradoxes is the dialetheic one, and that the limits of thought which are the inclosures are truly contradictory.
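For orientation, the paradox these solutions target and the two escape routes mentioned above can be stated briefly (standard formulations, not the chapter's own notation):

```latex
% Russell's paradox: unrestricted comprehension yields R = {x : x not-in x},
% for which membership is contradictory:
\[
  R = \{\, x \mid x \notin x \,\}
  \quad\Longrightarrow\quad
  R \in R \;\leftrightarrow\; R \notin R .
\]
% Zermelo's separation axiom only allows carving subsets out of an already
% given set a (for each formula phi):
\[
  \forall a \,\exists b \,\forall x \,
  \bigl( x \in b \;\leftrightarrow\; (x \in a \wedge \varphi(x)) \bigr) ,
\]
% while the von Neumann route admits collections like R as "proper classes",
% too large to be members of anything, so the question R in R never arises.
```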
Ivan Moscati
- Published in print: 2018
- Published Online: December 2018
- ISBN: 9780199372768
- eISBN: 9780199372805
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/oso/9780199372768.003.0010
- Subject: Economics and Finance, Microeconomics
Chapter 9 discusses the axiomatic version of expected utility theory (EUT), a theory of decision-making under risk, put forward by John von Neumann and Oskar Morgenstern in their book Theory of Games and Economic Behavior (1944). EUT was a changing factor in the history of utility measurement. In fact, while discussions of the measurability of utility before 1944 focused on the utility used to analyze decision-making between risk-free alternatives, after that year, discussions centered on the utility used to analyze decision-making between risky alternatives. In Theory of Games, the nature of the cardinal utility function u featured in von Neumann and Morgenstern’s EUT, and its relationship with the riskless utility function U of previous utility analysis remained ambiguous. Von Neumann and Morgenstern also put forward an axiomatic theory of measurement, which presents some similarities with Stanley Smith Stevens’s measurement theory but had no immediate impact on utility analysis.
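The expected utility representation the chapter discusses can be summarized as follows (a standard modern statement, not a quotation from Theory of Games):

```latex
% Expected utility of a lottery L that yields outcome x_i with probability p_i:
\[
  U(L) \;=\; \sum_{i} p_i \, u(x_i) .
\]
% If preferences over lotteries satisfy the completeness, transitivity,
% continuity, and independence axioms, such a function u exists and is unique
% up to a positive affine transformation,
\[
  u'(x) \;=\; \alpha\, u(x) + \beta , \qquad \alpha > 0 ,
\]
% which is the sense in which the utility function u of EUT is "cardinal".
```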
John Johnston
- Published in print: 2008
- Published Online: August 2013
- ISBN: 9780262101264
- eISBN: 9780262276351
- Item type: chapter
- Publisher: The MIT Press
- DOI: 10.7551/mitpress/9780262101264.003.0002
- Subject: Computer Science, Artificial Intelligence
This chapter makes a case for the fundamental complexity of cybernetic machines as a new species of automata, existing both “in the metal and in the flesh,” to use Norbert Wiener’s expression, as built and theorized by Claude Shannon, Ross Ashby, John von Neumann, Grey Walter, Heinz von Foerster, and Valentino Braitenberg.
Thomas Haigh, Mark Priestley, and Crispin Rope
- Published in print: 2016
- Published Online: May 2017
- ISBN: 9780262033985
- eISBN: 9780262334426
- Item type: chapter
- Publisher: The MIT Press
- DOI: 10.7551/mitpress/9780262033985.003.0008
- Subject: History, History of Science, Technology, and Medicine
In spring 1947 a project was launched to convert ENIAC to run code written in the new form introduced with the 1945 “First Draft of a Report on the EDVAC.” This was intertwined with the planning of Monte Carlo calculations for Los Alamos. Adele Goldstine worked with a team of contractors led by Jean Bartik and a group of Aberdeen employees under Richard Clippinger to develop a succession of planned “set-ups” to implement a new control mechanism and vocabulary of general purpose instructions for ENIAC. Our analysis focuses particularly on the relationship of this work to concurrent efforts by von Neumann’s team on the design of the Institute for Advanced Study computer and a series of related reports on programming methods. Accounts by participants and historians have differed dramatically in assigning credit for the conversion and on such basic facts as when the conversion was implemented and what version of the design was used. The conversion was finally implemented in March 1948 by Nick Metropolis (of Los Alamos and the University of Chicago) using a variant design he formulated with Klara von Neumann. At this point ENIAC became the first computer ever to execute a program written in the “modern code paradigm.”
Johannes Lenhard
- Published in print: 2019
- Published Online: March 2019
- ISBN: 9780190873288
- eISBN: 9780190873318
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/oso/9780190873288.003.0007
- Subject: Philosophy, Philosophy of Science
This chapter distinguishes two fundamental but opposing conceptions of simulation. The first conceives of simulations as numerical solutions of equations. The second does not involve the concept of solution, but takes simulation as the imitation of the behavior of a complex system by a computer model. The chapter claims that simulation modeling combines both conceptions. Large parts of the sciences involve a compromise (in one way or another) between two diverging forces. Theoretical understanding and epistemic quality stand on the one side; applicability and tractability on the other. What is interesting about simulation is the way in which a balance is achieved—that is, how the conflicting types are combined. The chapter analyzes the relationship between the simulation pioneers John von Neumann, who advocated the solution concept, and Norbert Wiener, who advocated the imitation concept.
Anthony Chaney
- Published in print: 2017
- Published Online: January 2018
- ISBN: 9781469631738
- eISBN: 9781469631752
- Item type: chapter
- Publisher: University of North Carolina Press
- DOI: 10.5149/northcarolina/9781469631738.003.0005
- Subject: History, Environmental History
This chapter explores double-bind theory and the concept of power, including the sexist tendency among proponents to blame the mother. Similarly, radicals, liberals, and secular existentialists challenged the “tragic turn” as bourgeois accommodation to status quo power relations. The holism of systemic approaches foregrounded the old problem of whether nature supplies an ethic. Bateson and the double-bind research team struggled to account for power in the schizophrenic family in a way that blamed neither victim nor victimizer. Bateson's recognition of progressive stalemate in the schizophrenic family drew on the systems theory concept of runaway. Runaway in arms race policies, in turn, reflected political and theoretical conflicts between Norbert Wiener and John von Neumann, the leading mathematicians of the Macy Conferences on Cybernetics. In two essays, Bateson critiqued the centrality of power in von Neumann's game theory. Meanwhile, Bateson's conflicts with the more pragmatic research team members, such as Jay Haley, led him to cast about for a new direction. His eulogy for Frieda Fromm-Reichmann echoed a similar debate over political quietism between Reinhold Niebuhr and Richard Niebuhr.
John Johnston
- Published in print: 2008
- Published Online: August 2013
- ISBN: 9780262101264
- eISBN: 9780262276351
- Item type: chapter
- Publisher: The MIT Press
- DOI: 10.7551/mitpress/9780262101264.003.0005
- Subject: Computer Science, Artificial Intelligence
This chapter examines John von Neumann’s theory of self-reproducing automata and Christopher Langton’s self-reproducing digital loops. Langton’s theory of artificial life (ALife) as a new science based on computer simulations whose theoretical underpinnings combine information theory with dynamical systems theory is contrasted with Francisco Varela and Humberto Maturana’s theory of autopoiesis, which leads to a consideration of both natural and artificial immune systems and computer viruses.
Chris Bleakley
- Published in print: 2020
- Published Online: October 2020
- ISBN: 9780198853732
- eISBN: 9780191888168
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/oso/9780198853732.003.0004
- Subject: Mathematics, History of Mathematics, Logic / Computer Science / Mathematical Philosophy
Chapter 4 tells the story of numerical weather forecasting from its inception to today’s supercomputing algorithms. In 1922, Lewis Fry Richardson proposed that, since the atmosphere is subject to the laws of physics, future weather can be predicted by means of algorithmic calculations. His attempt at forecasting a single day’s weather by means of manual calculations took several months. In the late 1940s, John von Neumann resurrected Richardson’s idea and launched a project to conduct the first weather forecast by computer. The world’s first operational electronic computer, ENIAC, completed a 24-hour forecast in just one day. It appeared that accurate forecasting simply required faster computers. In 1969, Edward Lorenz discovered that tiny errors in weather measurements can accumulate during numerical forecasting to produce large errors. The so-called Butterfly Effect was alleviated by the Monte Carlo simulation method invented by Stanislaw Ulam for particle physics.
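To illustrate the error growth Lorenz identified (a toy numerical sketch, not a calculation from the book), integrating the Lorenz system from two initial conditions that differ by one part in a million shows the trajectories separating rapidly:

```python
def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz system (crude but adequate here)."""
    x, y, z = state
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

a = (1.0, 1.0, 1.0)        # reference trajectory
b = (1.000001, 1.0, 1.0)   # initial condition perturbed by one part in a million

for step in range(1, 2001):
    a, b = lorenz_step(a), lorenz_step(b)
    if step % 500 == 0:
        separation = sum((u - v) ** 2 for u, v in zip(a, b)) ** 0.5
        print(f"t = {step * 0.01:5.1f}  separation = {separation:.6f}")

# The separation grows by orders of magnitude over the run: tiny measurement
# errors eventually swamp a single deterministic forecast, which is why
# ensemble (Monte Carlo style) runs are used to estimate forecast uncertainty.
```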