Nathan L. Ensmenger
- Published in print:
- 2010
- Published Online:
- August 2013
- ISBN:
- 9780262050937
- eISBN:
- 9780262289351
- Item type:
- book
- Publisher:
- The MIT Press
- DOI:
- 10.7551/mitpress/9780262050937.001.0001
- Subject:
- Information Science
This is a book about the computer revolution of the mid-twentieth century and the people who made it possible. Unlike most histories of computing, it is not a book about machines, inventors, or entrepreneurs. Instead, the book tells the story of the vast but largely anonymous legions of computer specialists—programmers, systems analysts, and other software developers—who transformed the electronic computer from a scientific curiosity into the defining technology of the modern era. As the systems that they built became increasingly powerful and ubiquitous, these specialists became the focus of a series of critiques of the social and organizational impact of electronic computing. To many of their contemporaries, it seemed the “computer boys” were taking over, not just in the corporate setting, but also in government, politics, and society in general. This book traces the rise to power of the computer expert in modern American society. Its portrayal of the men and women (a surprising number of the “computer boys” were, in fact, female) who built their careers around the novel technology of electronic computing explores issues of power, identity, and expertise that have only become more significant in our increasingly computerized society. In a recasting of the drama of the computer revolution through the eyes of its principal revolutionaries, the book reminds us that the computerization of modern society was not an inevitable process driven by impersonal technological or economic imperatives, but was rather a creative, contentious, and above all, fundamentally human development.
Steven W. Usselman
- Published in print:
- 2007
- Published Online:
- August 2013
- ISBN:
- 9780262122894
- eISBN:
- 9780262277884
- Item type:
- chapter
- Publisher:
- The MIT Press
- DOI:
- 10.7551/mitpress/9780262122894.003.0009
- Subject:
- Economics and Finance, Economic History
This chapter offers moderate adjustments to the body of work promulgated by the National Academy of Sciences, and to the work of David Mowery, that treats modern computing as the product of massive public investment and government funding. The goal of the chapter, however, is to suggest that private enterprise and private capital—and not just government funding—played significant roles in shaping computing. IBM, in particular, receives close attention here, to determine how the firm contributed to the emergence and refinement of computer storage capacity from the end of World War II until the development of the System/360. The conclusion is that while certain activities of the government may have benefited IBM, the government also drew IBM away from various opportunities that might have allowed it to blossom without intervention from the public sector.
Renée Levine Packer
- Published in print:
- 2010
- Published Online:
- September 2010
- ISBN:
- 9780199730773
- eISBN:
- 9780199863532
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199730773.003.0001
- Subject:
- Music, Popular, History, American
The introduction traces the genesis of the Center of the Creative and Performing Arts as a forum to remedy the growing disparity between what musicians learned in the conservatories and the new compositional techniques and ideas being employed by composers, including improvisation, chance processes, theater pieces, sound installations, mixed media, and electronic and computer music. It presents an overview of contemporary music groups in existence in the early 1960s and notes the changing nature of experimental art making. Lastly, Buffalo's tradition of embracing challenge and innovation is outlined.
Jack Copeland
- Published in print:
- 2017
- Published Online:
- November 2020
- ISBN:
- 9780198747826
- eISBN:
- 9780191916946
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/oso/9780198747826.003.0029
- Subject:
- Computer Science, History of Computer Science
The modern computer age began on 21 June 1948, when the first electronic universal stored-program computer successfully ran its first program. Built in Manchester, this ancestral computer was the world’s first universal Turing machine in hardware. Fittingly, it was called simply ‘Baby’. The story of Turing’s involvement with Baby and with its successors at Manchester is a tangled one. The world’s first electronic stored-program digital computer ran its first program in the summer of 1948 (Fig. 20.1). ‘A small electronic digital computing machine has been operating successfully for some weeks in the Royal Society Computing Machine Laboratory’, wrote Baby’s designers, Freddie Williams and Tom Kilburn, in the letter to the scientific periodical Nature that announced their success to the world. Williams, a native of the Manchester area, had spent his war years working on radar in rural Worcestershire. Kilburn, his assistant, was a blunt-speaking Yorkshireman. By the end of the fighting there wasn’t much that, between them, they didn’t know about the state of the art in electronics. In December 1945 the two friends returned to the north of England to pioneer the modern computer. Baby was a classic case of a small-scale university pilot project that led to successful commercial development by an external company. The Manchester engineering firm Ferranti built its Ferranti Mark I computer to Williams’s and Kilburn’s design: this was the earliest commercially available electronic digital computer. The first Ferranti rolled out of the factory in February 1951. UNIVAC I, the earliest computer to go on the market in the United States, came a close second: the first one was delivered a few weeks later, in March 1951. Williams and Kilburn developed a high-speed memory for Baby that went on to become a mainstay of computing worldwide. It consisted of cathode-ray tubes resembling small television tubes.
Data (zeros and ones) were stored as a scatter of dots on each tube’s screen: a small focused dot represented ‘1’ and a larger blurry dot represented ‘0’. The Williams tube memory, as the invention was soon called, was also used in Baby’s immediate successors, built at Manchester University and by Ferranti Ltd.
Martin Campbell-Kelly
- Published in print:
- 2017
- Published Online:
- November 2020
- ISBN:
- 9780198747826
- eISBN:
- 9780191916946
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/oso/9780198747826.003.0030
- Subject:
- Computer Science, History of Computer Science
In October 1945 Alan Turing was recruited by the National Physical Laboratory to lead computer development. His design for a computer, the Automatic Computing Engine (ACE), was idiosyncratic but highly effective. The small-scale Pilot ACE, completed in 1950, was the fastest medium-sized computer of its era. By the time that the full-sized ACE was operational in 1958, however, technological advance had rendered it obsolescent. Although the wartime Bletchley Park operation saw the development of the electromechanical codebreaking bombe (specified by Turing) and the electronic Colossus (to which Turing was a bystander), these inventions had no direct impact on the invention of the electronic stored-program computer, which originated in the United States. The stored-program computer was described in the classic ‘First draft of a report on the EDVAC’, written by John von Neumann on behalf of the computer group at the Moore School of Electrical Engineering, University of Pennsylvania, in June 1945. The report was the outcome of a series of discussions commencing in the summer of 1944 between von Neumann and the inventors of the ENIAC computer—John Presper Eckert, John W. Mauchly, and others. ENIAC was an electronic computer designed primarily for ballistics calculations: in practice, the machine was limited to the integration of ordinary differential equations and it had several other design shortcomings, including a vast number of electronic tubes (18,000) and a tiny memory of just twenty numbers. It was also very time-consuming to program. The EDVAC design grew out of an attempt to remedy these shortcomings. The most novel concept in the EDVAC, which gave it the description ‘stored program’, was the decision to store both instructions and numbers in the same memory. It is worth noting that during 1936 Turing became a research student of Alonzo Church at Princeton University.
Turing came to know von Neumann, who was a founding professor of the Institute for Advanced Study (IAS) in Princeton and was fully aware of Turing’s 1936 paper ‘On computable numbers’. Indeed, von Neumann was sufficiently impressed with it that he invited Turing to become his research assistant at the IAS, but Turing decided to return to England and subsequently spent the war years at Bletchley Park.
Wendy Hui Kyong Chun
- Published in print:
- 2008
- Published Online:
- August 2013
- ISBN:
- 9780262062749
- eISBN:
- 9780262273343
- Item type:
- chapter
- Publisher:
- The MIT Press
- DOI:
- 10.7551/mitpress/9780262062749.003.0032
- Subject:
- Society and Culture, Media Studies
This chapter briefly discusses the different conceptions of program and programmability in the digital computer field; the discussion continues with two different grammatical definitions of the term “program,” verb and noun. It also focuses on ENIAC, the first working electronic digital computer. The chapter describes the requirements and the process of programming the analog and digital computer machines along with the works and arguments of various computer scientists. It furthermore states that programming an analog computer is descriptive while programming a digital one is prescriptive. The conclusion explains how the programmability concept affected the world of computers, from quantum computers to biology computing fields like DNA and RNA computing.
Jack Copeland
- Published in print:
- 2017
- Published Online:
- November 2020
- ISBN:
- 9780198747826
- eISBN:
- 9780191916946
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/oso/9780198747826.003.0013
- Subject:
- Computer Science, History of Computer Science
There is no such person as the inventor of the computer: it was a group effort. The many pioneers involved worked in different places and at different times, some in relative isolation and others within collaborative research networks. There are some very famous names among them, such as Charles Babbage and John von Neumann—and, of course, Alan Turing himself. Other leading names in this roll of honour include Konrad Zuse, Tommy Flowers, Howard Aiken, John Atanasoff, John Mauchly, Presper Eckert, Jay Forrester, Harry Huskey, Julian Bigelow, Samuel Alexander, Ralph Slutz, Trevor Pearcey, Maurice Wilkes, Max Newman, Freddie Williams, and Tom Kilburn. Turing’s own outstanding contribution was to invent what he called the ‘universal computing machine’. He was first to describe the basic logical principles of the modern computer, writing these down in 1936, 12 years before the appearance of the earliest implementation of his ideas. This came in 1948, when Williams and Kilburn succeeded in wiring together the first electronic universal computing machine—the first modern electronic computer. In 1936, at the age of just 23, Turing invented the fundamental logical principles of the modern computer—almost by accident. A shy boyish-looking genius, he had recently been elected a Fellow of King’s College, Cambridge. The young Turing worked alone, in a spartan room at the top of an ancient stone building beside the River Cam. It was all quite the opposite of a modern research facility—Cambridge’s scholars had been doing their thinking in comfortless stone buildings, reminiscent of cathedrals or monasteries, ever since the university had begun to thrive in the Middle Ages. A few steps from King’s, along narrow medieval lanes, are the buildings and courtyards where, in the seventeenth century, Isaac Newton revolutionized our understanding of the universe. Turing was about to usher in another revolution. He was engaged in theoretical work in the foundations of mathematics. 
No-one could have guessed that anything of practical value would emerge from his highly abstract research, let alone a machine that would change all our lives.
Brian Randell
- Published in print:
- 2017
- Published Online:
- November 2020
- ISBN:
- 9780198747826
- eISBN:
- 9780191916946
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/oso/9780198747826.003.0015
- Subject:
- Computer Science, History of Computer Science
In this chapter I describe my initial attempts at investigating, during the early 1970s, what Alan Turing did during the Second World War. My investigations grew out of a study of the work of Charles Babbage’s earliest successors—in particular, the Irish pioneer Percy Ludgate—a study that led me to plan an overall historical account of the origins of the digital computer. The investigation resulted in my learning about a highly secret programmable electronic computer developed in Britain during the Second World War. I revealed that this computer was named Colossus, and had been built in 1943 for Bletchley Park, the UK government’s wartime codebreaking establishment. However, my attempt to get the details of the machine declassified was unsuccessful, and I came to the conclusion that it might be a long time before anything more would become public about Bletchley Park and Colossus. Around 1970, while I was seeking information about the work of Charles Babbage and Ada Lovelace to use in my inaugural lecture at Newcastle University, I stumbled across the work of Percy Ludgate. In a paper he wrote about Babbage’s ‘automatic calculating engines’, Ludgate mentioned that he had also worked on the design of an Analytical Engine, indicating that he had described this in an earlier paper in the Proceedings of the Royal Dublin Society. From a copy of that paper I learned that an apparently completely forgotten Irish inventor had taken up and developed Babbage’s ideas for what would now be called a program-controlled mechanical computer. Previously I had subscribed to the general belief that over a century had passed before anyone had followed up Babbage’s pioneering 1837 work on Analytical Engines. This discovery led me to undertake an intensive investigation of Ludgate, the results of which I published in the Computer Journal.
With the help of a number of Irish librarians and archivists I managed to find out quite a few details about the tragically short life of this Irish accountant, and even to make contact with one of his relatives. Unfortunately, I found nothing more about his design for a paper-tape-controlled analytical machine beyond what was given in his 1909 paper. My investigations into the background to Ludgate’s work left me with a considerable amount of information on pre-computer technology and on other little-known successors to Babbage.
Jack Copeland and Jonathan Bowen
- Published in print:
- 2017
- Published Online:
- November 2020
- ISBN:
- 9780198747826
- eISBN:
- 9780191916946
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/oso/9780198747826.003.0007
- Subject:
- Computer Science, History of Computer Science
A few months after Alan Turing’s tragically early death, in 1954, his colleague Geoffrey Jefferson (professor of neurosurgery at Manchester University) wrote what might serve as Turing’s epitaph: Alan in whom the lamp of genius burned so bright—too hot a flame perhaps it was for his endurance. He was so unversed in worldly ways, so childlike it sometimes seemed to me, so unconventional, so non-conform[ing] to the general pattern. His genius flared because he had never quite grown up, he was I suppose a sort of scientific Shelley. After his short but brilliant career, Alan Mathison Turing’s life ended 15 days short of his forty-second birthday. His ideas lived on, however, and at the turn of the millennium Time magazine listed him among the twentieth century’s one hundred greatest minds, alongside the Wright brothers, Albert Einstein, DNA busters Crick and Watson, and Alexander Fleming, the discoverer of penicillin. Turing’s achievements during his short lifetime were legion. Best known as the mathematician who broke some of Nazi Germany’s most secret codes, Turing was also one of the ringleaders of the computer revolution. Today, all who click, tap, or touch to open are familiar with the impact of his ideas. We take for granted that we use the same slab of hardware to shop, manage our finances, type our memoirs, play our favourite music and videos, and send instant messages across the street or around the world. In an era when ‘computer’ was the term for a human clerk who did the sums in the back office of an insurance company or science lab, Turing envisaged a ‘universal computing machine’, able to do anything that a programmer could pin down in the form of a series of instructions. He could not have foreseen this at the time, but his universal computing machine changed the way we live: it eventually caught on like wildfire, with sales of personal computers now hovering around the million-a-day mark.
Turing’s universal machine transported us into a world where many young people have never known life without the Internet.
Jack Copeland
- Published in print:
- 2017
- Published Online:
- November 2020
- ISBN:
- 9780198747826
- eISBN:
- 9780191916946
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/oso/9780198747826.003.0017
- Subject:
- Computer Science, History of Computer Science
This chapter summarizes Turing’s principal achievements at Bletchley Park and assesses his impact on the course of the Second World War. On the first day of the war, at the beginning of September 1939, Turing took up residence at Bletchley Park, the ugly Victorian mansion in Buckinghamshire that served as the wartime HQ of Britain’s military codebreakers (Fig. 9.1). There Turing was a key player in the battle to decrypt the coded messages generated by Enigma, the German forces’ typewriter-like cipher machine. Germany’s army, air force, and navy transmitted many thousands of coded messages each day during the Second World War. These ranged from top-level signals, such as detailed situation reports prepared by generals at the battlefronts and orders signed by Hitler himself, down to the important minutiae of war such as weather reports and inventories of the contents of supply ships. Thanks to Turing and his fellow codebreakers, much of this information ended up in Allied hands—sometimes within an hour or two of its being transmitted. The faster the messages could be broken, the fresher the intelligence that they contained, and on at least one occasion the English translation of an intercepted Enigma message was being read at the British Admiralty less than 15 minutes after the Germans had transmitted it. Turing pitted machine against machine. Building on pre-war work by the legendary Polish codebreaker Marian Rejewski, Turing invented the Enigma-cracking ‘bombes’ that quickly turned Bletchley Park from a country house accommodating a small group of thirty or so codebreakers into a vast codebreaking factory. There were approximately 200 bombes at Bletchley Park and its surrounding outstations by the end of the war. As early as 1943 Turing’s machines were cracking a staggering total of 84,000 Enigma messages each month—two messages every minute. Chapter 12 describes the bombes and explains how they worked. 
Turing also undertook, single-handedly at first, a 20-month struggle to crack the especially secure form of Enigma used by the North Atlantic U-boats. With his group he first broke into the current messages transmitted between the submarines and their bases during June 1941, the very month when Winston Churchill’s advisors were warning him that the wholesale sinkings in the North Atlantic would soon tip Britain into defeat by starvation.