Paul J. Nahin
- Published in print: 2017
- Published Online: May 2018
- ISBN: 9780691176000
- eISBN: 9781400844654
- Item type: chapter
- Publisher: Princeton University Press
- DOI: 10.23943/princeton/9780691176000.003.0003
- Subject: Mathematics, History of Mathematics
This chapter presents brief biographical sketches of George Boole and Claude Shannon. George was born in Lincoln, a town in the north of England, on November 2, 1815. His father John, though a simple tradesman (a cobbler), taught George geometry and trigonometry, subjects John had found of great aid in his optical studies. Boole was essentially self-taught, with a formal education that stopped at roughly what today would be the junior year of high school. Eventually he became a master mathematician (who succeeded in merging algebra with logic), one held in the highest esteem by talented, highly educated men who had graduated from Cambridge and Oxford. Claude was born on April 30, 1916, in Petoskey, Michigan. He enrolled at the University of Michigan, from which he graduated in 1936 with bachelor's degrees in both mathematics and electrical engineering. It was in a class there that he was introduced to Boole's algebra of logic.
Paul J. Nahin
- Published in print: 2017
- Published Online: May 2018
- ISBN: 9780691176000
- eISBN: 9781400844654
- Item type: book
- Publisher: Princeton University Press
- DOI: 10.23943/princeton/9780691176000.001.0001
- Subject: Mathematics, History of Mathematics
Boolean algebra, also called Boolean logic, is at the heart of the electronic circuitry in everything we use—from our computers and cars to our home appliances. How did a system of mathematics established in the Victorian era become the basis for such incredible technological achievements a century later? This book combines engaging problems and a colorful historical narrative to tell the remarkable story of how two men in different eras—mathematician and philosopher George Boole and electrical engineer and pioneering information theorist Claude Shannon—advanced Boolean logic and became founding fathers of the electronic communications age. The book takes readers from fundamental concepts to a deeper and more sophisticated understanding of modern digital machines, in order to explore computing and its possible limitations in the twenty-first century and beyond.
Paul J. Nahin
- Published in print: 2017
- Published Online: May 2018
- ISBN: 9780691176000
- eISBN: 9781400844654
- Item type: chapter
- Publisher: Princeton University Press
- DOI: 10.23943/princeton/9780691176000.003.0006
- Subject: Mathematics, History of Mathematics
George Boole and Claude Shannon shared a deep interest in the mathematics of probability. Boole's interest was, of course, not related to the theory of computation—he was a century too early for that—while Shannon's mathematical theory of communication and information processing is replete with probabilistic analyses. There is, nevertheless, an important intersection between what the two men did, which is shown in this chapter. The aim is to provide a flavor of how they reasoned and of the sort of probabilistic problem that caught their attention. Once we have finished with Boole's problem, the reader will see that it uses mathematics that will play a crucial role in answering Shannon's concern about “crummy” relays.
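The abstract does not spell out the mathematics, but a minimal sketch of the kind of probabilistic calculation involved (assuming independent relays, each closing correctly with probability p; the numbers are illustrative, not from the chapter) might look like this:

```python
# A minimal sketch of the probability arithmetic behind "crummy" relays:
# each relay works with probability p, independently of the others.
# A series connection needs every relay to work; a parallel connection
# needs at least one of them to work.

def series_reliability(p: float, n: int) -> float:
    """Probability that a chain of n independent relays all work."""
    return p ** n

def parallel_reliability(p: float, n: int) -> float:
    """Probability that at least one of n independent relays works."""
    return 1 - (1 - p) ** n

if __name__ == "__main__":
    p = 0.9  # hypothetical per-relay reliability
    print(series_reliability(p, 2))    # 0.81 -- worse than a single relay
    print(parallel_reliability(p, 2))  # 0.99 -- redundancy improves reliability
```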
Paul J. Nahin
- Published in print: 2017
- Published Online: May 2018
- ISBN: 9780691176000
- eISBN: 9781400844654
- Item type: chapter
- Publisher: Princeton University Press
- DOI: 10.23943/princeton/9780691176000.003.0007
- Subject: Mathematics, History of Mathematics
The entire point of Shannon's 1948 “A Mathematical Theory of Communication” was to study the theoretical limits on the transmission of information from point A (the source) to point B (the receiver) through an intervening medium (the channel). The information is imagined first to be encoded in some manner before being sent through the channel. Shannon considers two distinct types of channels: the so-called continuous channel that would carry, for example, a continuous signal like the human voice; and the so-called discrete channel that would carry, again for example, a keyboard's output in the form of a digital stream of bits. This chapter focuses on this second case. In a perfect world the digital stream would arrive at the receiver exactly as it was sent, but in the real world the channel is noisy and so, occasionally, a bit will arrive in error.
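As an illustration only (not drawn from Shannon's paper), the noisy discrete channel described above can be modeled as a binary symmetric channel that flips each transmitted bit independently with some small probability:

```python
import random

def binary_symmetric_channel(bits, p_error, seed=0):
    """Send a stream of 0/1 bits through a noisy channel that flips each bit
    independently with probability p_error (a binary symmetric channel)."""
    rng = random.Random(seed)
    return [bit ^ 1 if rng.random() < p_error else bit for bit in bits]

sent = [1, 0, 1, 1, 0, 0, 1, 0]
received = binary_symmetric_channel(sent, p_error=0.1)
errors = sum(s != r for s, r in zip(sent, received))
print(received, "bit errors:", errors)
```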
John Johnston
- Published in print: 2008
- Published Online: August 2013
- ISBN: 9780262101264
- eISBN: 9780262276351
- Item type: chapter
- Publisher: The MIT Press
- DOI: 10.7551/mitpress/9780262101264.003.0002
- Subject: Computer Science, Artificial Intelligence
This chapter makes a case for the fundamental complexity of cybernetic machines as a new species of automata, existing both “in the metal and in the flesh,” to use Norbert Wiener’s expression, as built and theorized by Claude Shannon, Ross Ashby, John von Neumann, Grey Walter, Heinz von Foerster, and Valentino Braitenberg.
Jennifer Iverson
- Published in print: 2019
- Published Online: January 2019
- ISBN: 9780190868192
- eISBN: 9780190929138
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/oso/9780190868192.003.0005
- Subject: Music, History, Western
In the transition from World War II to the Cold War, military innovations were domesticated and repurposed for civilian, scientific, and cultural advancement. Information theory is one such discourse—birthed from Shannon's wartime cryptography work at Bell Labs—that burgeoned outward in a series of connected, interdisciplinary spirals in the 1950s. The WDR studio was a locale where wartime "technology" (defined broadly to include ideas) was reclaimed for cultural gain. After the initial experiments of the early 1950s, composers found themselves hemmed in by technological limits and unhappy with the serial, pointillist music they had so far made. Enter Meyer-Eppler, a former Nazi communications researcher turned phonetics scientist and electronic music expert, whose information-theoretic teachings helped composers solve their problems in several ways: to understand when their music had been too information dense; to incorporate gestures, approximations, and perceptible shapes; and to circumvent the technological limitations of the studio. The core concepts of information theory—perception, sampling and continuity, and probability—became the foundation for much mid-1950s music from a range of composers in the studio and beyond. Working cooperatively, scientists, technicians, and composers participated in a process of culturally reclaiming information theory from its wartime origin, making it the conceptual foundation for 1950s avant-garde music.
Paul Kockelman
- Published in print: 2017
- Published Online: July 2017
- ISBN: 9780190636531
- eISBN: 9780190636562
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780190636531.003.0002
- Subject: Linguistics, Sociolinguistics / Anthropological Linguistics
This chapter begins by outlining some common properties of channels, infrastructure, and institutions. It connects and critiques the assumptions and interventions of three influential intellectual traditions: cybernetics (via Claude Shannon), linguistics and anthropology (via Roman Jakobson), and actor-network theory (via Michel Serres). By developing the relation between Serres’s notion of the parasite and Peirce’s notion of thirdness, it theorizes the role of those creatures who live in and off infrastructure: not just enemies, parasites, and noise, but also pirates, trolls, and internet service providers. And by extending Jakobson’s account of duplex categories (shifters, proper names, meta-language, reported speech) from codes to channels, it theorizes four reflexive modes of circulation any network may involve: self-channeling channels, source-dependent channels, signer-directed signers, and channel-directed signers. The conclusion returns to the notion of enclosure, showing the ways that networks are simultaneously a condition for, and a target of, knowledge, power, and profit.
Colin Koopman
- Published in print: 2019
- Published Online: January 2020
- ISBN: 9780226626444
- eISBN: 9780226626611
- Item type: chapter
- Publisher: University of Chicago Press
- DOI: 10.7208/chicago/9780226626611.003.0006
- Subject: Philosophy, General
This chapter develops an argument for what resistance might look like under conditions of infopower. Equally important, it also describes what forms such resistance to infopower are unlikely to take. A key argument is that resistance calibrated to infopower is irreducible to mainstream theories of democratic deliberation that presuppose information in such a way that they cannot confront it as a political problematic in its own right. The chapter criticizes influential communicative accounts of democracy that have structured much of recent normative political theory. Primary targets include the critical theory of Jürgen Habermas and the work of the American pragmatist philosopher John Dewey. The chapter shows why both of these theories are structurally unable to confront information itself as a political problem. A precursor for a more viable approach is found in the work of Dewey’s interlocutor, and sometimes foil, Walter Lippmann. Rather than suspending communication-centered politics by way of a turn to aesthetics (a prominent option for contemporary political theory), an alternative is sketched in a turn toward technics and technology. On this view, resistance to infopolitical fastening is best mounted at the level of designs, protocols, audits, and other forms of formats.
Paul J. Nahin
- Published in print: 2017
- Published Online: May 2018
- ISBN: 9780691176000
- eISBN: 9781400844654
- Item type: chapter
- Publisher: Princeton University Press
- DOI: 10.23943/princeton/9780691176000.003.0005
- Subject: Mathematics, History of Mathematics
Today's digital circuitry is built with electronic technology that the telephone engineers of the 1930s and the pioneer computer designers of the 1940s would have thought to be magic. The first real digital technology took the form of electromagnetic relays in telephone switching exchanges. Then came vacuum tube digital circuitry, discrete transistors, integrated transistor circuits, and so on. But the one thing that remains the same is the math, the Boolean algebra that is the central star of this book. This chapter describes the technology that Shannon himself used in his switching analyses. It covers switches and the logical connectives, a classic switching design problem, the electromagnetic relay, the ideal diode and the relay logical AND and OR, and the bi-stable relay latch.
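A minimal sketch of the switch-algebra correspondence the chapter works with (the variable names and the lamp example are illustrative, not Shannon's): switches wired in series conduct only if all are closed, behaving as a logical AND, while switches wired in parallel conduct if any is closed, behaving as a logical OR.

```python
# Switches are modeled as booleans: True means "closed" (conducting).

def series(*switches: bool) -> bool:
    """Series connection: current flows only if every switch is closed (AND)."""
    return all(switches)

def parallel(*switches: bool) -> bool:
    """Parallel connection: current flows if at least one switch is closed (OR)."""
    return any(switches)

# Example circuit: a lamp controlled by (A AND B) OR C.
A, B, C = True, False, True
lamp_on = parallel(series(A, B), C)
print(lamp_on)  # True, because C alone completes the circuit
```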
Paul J. Nahin
- Published in print: 2017
- Published Online: May 2018
- ISBN: 9780691176000
- eISBN: 9781400844654
- Item type: chapter
- Publisher: Princeton University Press
- DOI: 10.23943/princeton/9780691176000.003.0010
- Subject: Mathematics, History of Mathematics
Boole and Shannon never studied the physics of computation. Obviously Boole simply could not have, as none of the required physics was even known in his day, and Shannon was nearing the end of his career when such considerations were just beginning. And yet, both Boole's algebra and Shannon's information concepts underlie many of the calculations we make today. This chapter touches on how fundamental physics—the uncertainty principle from quantum mechanics, and thermodynamics, for example—constrains what is possible, in principle, for the computers of the far future. It argues that while there are indeed finite limitations, present-day technology falls so far short of those limits that there will be good employment for computer technologists for a very long time to come.
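The abstract names no specific bound, but a standard example of the thermodynamic constraint it alludes to is Landauer's principle, which sets a floor on the energy dissipated whenever one bit of information is erased:

```latex
% Landauer's bound (a standard illustration, not cited in the abstract):
% the minimum energy dissipated to erase one bit at absolute temperature T,
% where k_B is Boltzmann's constant.
E_{\min} = k_B T \ln 2
         \approx (1.38\times 10^{-23}\,\mathrm{J/K})\times(300\,\mathrm{K})\times 0.693
         \approx 2.9\times 10^{-21}\,\mathrm{J}\quad\text{per bit erased at room temperature.}
```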
John Ross
- Published in print: 2008
- Published Online: March 2012
- ISBN: 9780199228768
- eISBN: 9780191696336
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780199228768.003.0019
- Subject: Psychology, Cognitive Psychology
This chapter looks at changes and developments in the subject of visual perception. The dominance of behaviourism in psychology in the 1950s started to fade as new ideas emerged from research into visual perception. Some of the most influential works include Claude Shannon's proposition that information could be quantified, Hermann von Helmholtz's doctrine of unconscious inference, and Richard Gregory's proposal that percepts are literally hypotheses about the world.
Alan Baddeley
- Published in print: 2008
- Published Online: March 2012
- ISBN: 9780199228768
- eISBN: 9780191696336
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780199228768.003.0003
- Subject: Psychology, Cognitive Psychology
This chapter is concerned with developments in psychology in the 1950s. It suggests that this period witnessed the demise of Gestalt psychology, a distinctive approach to experimental psychology, strongly influenced by the Gestalt principles of perception. During this period, the influence of behaviourism in universities was very strong, and the major focus of most psychology courses was theories of learning. Another significant development came through the information processing approach to the study of human cognition, which reflected a number of separate but related sources. These include communication theory and the attempt by Claude Shannon to measure the flow of information through an electronic communication channel in terms of the capacity of a message to reduce uncertainty.
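As a small worked example of measuring "the capacity of a message to reduce uncertainty" (a sketch, not part of the chapter), Shannon's entropy gives the average number of bits of uncertainty removed by learning which symbol a source produced, assuming the symbol probabilities are known:

```python
import math

def entropy_bits(probabilities):
    """Shannon entropy H = -sum(p * log2 p): the average uncertainty, in bits,
    removed by learning which symbol the source produced."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

print(entropy_bits([0.5, 0.5]))   # 1.0 bit  -- a fair coin flip
print(entropy_bits([0.9, 0.1]))   # ~0.469   -- a biased source is less uncertain
print(entropy_bits([0.25] * 4))   # 2.0 bits -- four equally likely symbols
```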
Susan D'Agostino
- Published in print: 2020
- Published Online: April 2020
- ISBN: 9780198843597
- eISBN: 9780191879388
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/oso/9780198843597.003.0035
- Subject: Mathematics, Educational Mathematics, Applied Mathematics
“Find balance, as in coding theory” explains how mathematicians construct codes to transmit messages as accurately and efficiently as possible. Since “noise” may corrupt messages during transmission, a good code repeats some of the information in a sent message so that errors due to noise in the received message may be detected and (ideally) corrected. However, too much repetition increases not only the code word length but also the transmission time—an undesirable outcome. Mathematician Claude Shannon proved that “optimal” codes with just the right balance between repetition in transmission and error detection exist. Mathematics students and enthusiasts are encouraged to find a balance that is just right between repetition and forward momentum in mathematical and life pursuits. At the chapter’s end, readers may check their understanding by working on a problem concerning International Standard Book Numbers (ISBNs). A solution is provided.
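As a concrete illustration of the repetition-versus-length balance described here (a sketch, not the chapter's own example), a triple-repetition code sends every bit three times and the receiver takes a majority vote, correcting any single flipped bit per triple at the cost of tripling the message length:

```python
def encode_repetition(bits):
    """Encode by sending each bit three times."""
    return [b for bit in bits for b in (bit, bit, bit)]

def decode_repetition(received):
    """Decode by majority vote over each group of three received bits,
    correcting any single bit error within a group."""
    return [1 if sum(received[i:i + 3]) >= 2 else 0
            for i in range(0, len(received), 3)]

codeword = encode_repetition([1, 0, 1])   # [1, 1, 1, 0, 0, 0, 1, 1, 1]
corrupted = codeword[:]
corrupted[4] = 1                          # noise flips one bit
print(decode_repetition(corrupted))       # [1, 0, 1] -- the error is corrected
```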
Paul Kockelman
- Published in print: 2017
- Published Online: July 2017
- ISBN: 9780190636531
- eISBN: 9780190636562
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780190636531.003.0004
- Subject: Linguistics, Sociolinguistics / Anthropological Linguistics
This chapter argues that information is a species of meaning that has been radically enclosed, such that the values in question seem to have become radically portable. They are not so much independent of context, as dependent on contexts which have been engineered so as to be relatively ubiquitous, and hence ostensibly and erroneously ‘context-free’; not so much able to accommodate all contents, as able to assimilate all contents to their contours, and hence ostensibly and erroneously ‘open content’. To make this argument, the chapter highlights the ideas of Donald MacKay in relation to those of Claude Shannon, and it foregrounds the semiotic framework of Charles Sanders Peirce in relation to cybernetics and computer science. It offers two alternative definitions of information. The first focuses on interaction, while the second focuses on institutions, and both effectively mediate between relatively quantitative theories of information and relatively qualitative theories of meaning.
David Sarokin and Jay Schulkin
- Published in print: 2016
- Published Online: May 2017
- ISBN: 9780262034920
- eISBN: 9780262336253
- Item type: chapter
- Publisher: The MIT Press
- DOI: 10.7551/mitpress/9780262034920.003.0002
- Subject: Information Science, Library Science
“Information” has taken on new meanings and new significance in the Information Age. The subject has moved beyond the traditional realm of engineering. Encyclopedias and textbooks that formerly ignored information as a topic now give it a major presence. However, we still overlook the central importance of information itself, focusing instead on information technology.