Ken Binmore
- Published in print: 2007
- Published Online: May 2007
- ISBN: 9780195300574
- eISBN: 9780199783748
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780195300574.003.0007
- Subject: Economics and Finance, Microeconomics
This chapter describes the theory of two-person, zero-sum games invented by John von Neumann in 1928. It begins with an application to the computation of economic shadow prices. It shows that a two-person game is strictly competitive if, and only if, it has a zero-sum representation. Such a game can be represented using only the first player's payoff matrix. The minimax and maximin values of the matrix are defined and linked to the concept of a saddle point. The ideas are then related to a player's security level in a game. An inductive proof of von Neumann's minimax theorem is offered. The connexion between the minimax theorem and the duality theorem of linear programming is explained. The method of solving certain two-person, zero-sum games geometrically with the help of the theorem of the separating hyperplane is introduced. The Hide-and-Seek Game is used as a non-trivial example.
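To make the maximin and minimax values concrete, here is a minimal sketch (not taken from the chapter; the payoff matrix is a hypothetical example) that computes both values for the first player's payoff matrix and checks for a saddle point:

```python
# Minimal sketch (not from the chapter): maximin and minimax values of the
# first player's payoff matrix in a two-person, zero-sum game.
# The example matrix A is hypothetical.

def maximin(payoffs):
    """Row player's security level: the best of the worst-case row payoffs."""
    return max(min(row) for row in payoffs)

def minimax(payoffs):
    """What the column player can hold the row player to:
    the worst of the best-case column payoffs."""
    return min(max(col) for col in zip(*payoffs))

A = [
    [3, 1, 4],
    [2, 5, 2],
    [0, 1, 6],
]

lower = maximin(A)  # 2: the row player can guarantee at least this
upper = minimax(A)  # 3: the column player can prevent anything more than this

print(lower, upper)
if lower == upper:
    print("Saddle point: the game has a value in pure strategies.")
else:
    print("No saddle point in pure strategies; mixed strategies are needed.")
```

Here the two values differ (2 versus 3), so this matrix has no saddle point in pure strategies; von Neumann's minimax theorem says the values coincide once mixed strategies are allowed, which is also where the connexion to linear programming duality enters.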
Thomas Haigh, Mark Priestley, and Crispin Rope
- Published in print: 2016
- Published Online: May 2017
- ISBN: 9780262033985
- eISBN: 9780262334426
- Item type: chapter
- Publisher: The MIT Press
- DOI: 10.7551/mitpress/9780262033985.003.0007
- Subject: History, History of Science, Technology, and Medicine
This chapter charts the rapid evolution of thinking about programming and computer architecture among members of the ENIAC team from 1944 onward, as what is usually called the “stored program concept” was formulated with John von Neumann and presented in the “First Draft of a Report on the EDVAC.” Use of archival sources makes this account of the process more specific and rigorous than any previously published, presenting the new approach as an evolution of, and response to, the original ENIAC programming method. The ideas present in the “First Draft” are clearly explained and separated into three distinct clusters: the “EDVAC hardware paradigm,” the “von Neumann architecture paradigm,” and the “modern code paradigm.” The chapter finishes with an exploration of the initial understanding and reception of these ideas, reconstructing the late-1940s consensus on what was important about the new approach and why.
Bas C. van Fraassen
- Published in print: 1991
- Published Online: November 2003
- ISBN: 9780198239802
- eISBN: 9780191597466
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/0198239807.003.0008
- Subject: Philosophy, Philosophy of Science
Von Neumann's unification of Schroedinger's and Heisenberg's formalisms came with an interpretation of quantum theory involving two principles. The first is that assertions about the values of observables are equivalent to assertions about the quantum-mechanical state of the system. This is sometimes known as the ‘eigenvalue–eigenstate link’, since it equates an observable having a value with the system being in an eigenstate of that observable. The second is his Projection Postulate, i.e. the postulate that during measurement there is a ‘collapse of the wave packet’. It is argued that the theory does not force these principles on us, and that there are severe difficulties in this interpretation, despite its more recent defences.
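As a reference point, the two principles can be written in standard textbook notation (a hedged sketch, not quoted from the chapter), for an observable with a discrete spectral decomposition:

```latex
% Hedged sketch in standard notation (not quoted from the chapter).
% Let the observable A have spectral decomposition A = \sum_i a_i P_i,
% with P_i the projector onto the eigenspace for eigenvalue a_i.
\[
  \Pr(a_i \mid \psi) \;=\; \lVert P_i \psi \rVert^{2},
  \qquad
  \psi \;\longmapsto\; \frac{P_i \psi}{\lVert P_i \psi \rVert}
  \quad\text{on obtaining outcome } a_i .
\]
% Eigenvalue-eigenstate link: A has the value a_i in state \psi
% exactly when P_i \psi = \psi, i.e. \psi is an eigenstate of A.
```

The displayed map is the Projection Postulate (collapse of the wave packet); the comment states the eigenvalue–eigenstate link that the chapter discusses.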
Thomas Haigh, Mark Priestley, and Crispin Rope
- Published in print: 2016
- Published Online: May 2017
- ISBN: 9780262033985
- eISBN: 9780262334426
- Item type: chapter
- Publisher: The MIT Press
- DOI: 10.7551/mitpress/9780262033985.003.0009
- Subject: History, History of Science, Technology, and Medicine
This chapter traces the series of Monte Carlo simulations run on ENIAC from their genesis, in January 1947 exchanges between John von Neumann, Robert Richtmyer, and Stanislaw Ulam, through the completion of detailed planning work for the initial batch of calculations in December 1947. Close attention to successive drafts illuminates the process by which John and Klara von Neumann worked with Adele Goldstine to transform John's outline plan of computation into a fully developed flow diagram documenting the flow of control and manipulation of data for a program written in the new style.
Thomas Haigh, Mark Priestley, and Crispin Rope
- Published in print: 2016
- Published Online: May 2017
- ISBN: 9780262033985
- eISBN: 9780262334426
- Item type: chapter
- Publisher: The MIT Press
- DOI: 10.7551/mitpress/9780262033985.003.0010
- Subject: History, History of Science, Technology, and Medicine
As soon as Metropolis had completed the initial configuration of ENIAC for the new programming method, and before it was working properly, Klara von Neumann arrived to help. She had taken the leading role in converting the flow diagrams into program code, and together they worked around the clock for several weeks to get both program and machine into a usable state and to shuffle tens of thousands of cards in and out of it during Monte Carlo simulation of each exploding fission bomb. This chapter integrates the narrative of this initial “run,” and of a second batch of calculations carried out in late 1948, with analysis of the structure of the program itself. It finishes with an exploration of further Monte Carlo work run on ENIAC, including reactor simulations, simulation of uranium-hydride bombs, and, in 1950, simulation of the “Super” concept for a hydrogen weapon.
Thomas Haigh, Mark Priestley, and Crispin Rope
- Published in print: 2016
- Published Online: May 2017
- ISBN: 9780262033985
- eISBN: 9780262334426
- Item type: chapter
- Publisher: The MIT Press
- DOI: 10.7551/mitpress/9780262033985.003.0008
- Subject: History, History of Science, Technology, and Medicine
In spring 1947 a project was launched to convert ENIAC to run code written in the new form introduced with the 1945 “First Draft of a Report on the EDVAC.” This was intertwined with the planning of Monte Carlo calculations for Los Alamos. Adele Goldstine worked with a team of contractors led by Jean Bartik and a group of Aberdeen employees under Richard Clippinger to develop a succession of planned “set-ups” to implement a new control mechanism and vocabulary of general-purpose instructions for ENIAC. Our analysis focuses particularly on the relationship of this work to concurrent efforts by von Neumann’s team on the design of the Institute for Advanced Study computer and to a series of related reports on programming methods. Accounts by participants and historians have differed dramatically in assigning credit for the conversion and on such basic facts as when the conversion was implemented and what version of the design was used. The conversion was finally implemented in March 1948 by Nick Metropolis (of Los Alamos and the University of Chicago) using a variant design he formulated with Klara von Neumann. At this point ENIAC became the first computer ever to execute a program written in the “modern code paradigm.”
James A. Anderson
- Published in print: 2017
- Published Online: February 2018
- ISBN: 9780199357789
- eISBN: 9780190675264
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780199357789.003.0007
- Subject: Psychology, Cognitive Psychology
Brains and computers were twins separated at birth. In 1943, it was known that action potentials were all or none, approximating TRUE or FALSE. In that year, Walter Pitts and Warren McCulloch wrote a paper suggesting that neurons were computing logic functions and that networks of such neurons could compute any finite logic function. This was a bold and exciting large-scale theory of brain function. Around the same time, the first digital computer, the ENIAC, was being built. The McCulloch-Pitts work was well known to the scientists building ENIAC. The connection between them appeared explicitly in a report by John von Neumann on the successor to the ENIAC, the EDVAC. It soon became clear that biological brain computation was not based on logic functions. However, this idea was believed by many scientists for decades. A brilliant wrong theory can sometimes cause trouble.
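For readers unfamiliar with the 1943 proposal, here is a minimal illustrative sketch (not from the chapter; the weights and thresholds are hypothetical choices) of a McCulloch-Pitts threshold unit computing simple logic functions:

```python
# Illustrative sketch (not from the chapter): a McCulloch-Pitts threshold
# unit. Inputs and output are all-or-none (0 or 1), echoing the idea that
# action potentials approximate TRUE or FALSE. Weights and thresholds below
# are hypothetical values chosen for this example.

def mp_neuron(inputs, weights, threshold):
    """Fire (return 1) iff the weighted sum of inputs reaches the threshold."""
    return 1 if sum(w * x for w, x in zip(weights, inputs)) >= threshold else 0

def AND(x, y):
    return mp_neuron([x, y], [1, 1], threshold=2)

def OR(x, y):
    return mp_neuron([x, y], [1, 1], threshold=1)

for x in (0, 1):
    for y in (0, 1):
        print(x, y, "AND:", AND(x, y), "OR:", OR(x, y))
```

Networks built from such units can realize any finite logic function, which is the sense in which McCulloch and Pitts cast the brain as a logical computing device.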