Norman D. Cook
- Published in print:
- 2004
- Published Online:
- January 2012
- ISBN:
- 9780197263112
- eISBN:
- 9780191734885
- Item type:
- chapter
- Publisher:
- British Academy
- DOI:
- 10.5871/bacad/9780197263112.003.0010
- Subject:
- Archaeology, Prehistoric Archaeology
Speech production in most people is strongly lateralized to the left hemisphere (LH), but language understanding is generally a bilateral activity. At every level of linguistic processing that has been investigated experimentally, the right hemisphere (RH) has been found to make characteristic contributions, from the processing of the affective aspects of intonation, through the appreciation of word connotations, the decoding of the meaning of metaphors and figures of speech, to the understanding of the overall coherency of verbal humour, paragraphs and short stories. If both hemispheres are indeed engaged in linguistic decoding and both processes are required to achieve a normal level of understanding, a central question concerns how the separate language functions on the left and right are integrated. This chapter reviews relevant studies on the hemispheric contributions to language processing and the role of interhemispheric communications in cognition.
Michael Fishbane
- Published in print:
- 1988
- Published Online:
- November 2003
- ISBN:
- 9780198266990
- eISBN:
- 9780191600593
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/0198266995.003.0018
- Subject:
- Religion, Judaism
Deals with the mantological exegesis of oracles (or prophecies) in ancient Israel. A method is presented for isolating these exegetical traditions; as in earlier chapters, central to it are the recognition of technical terms and the close comparison of parallel material. The analyses fall into two broad categories: the first treats the phenomenon of non-transformative exegesis, where the language of the oracle is explicated or clarified and hidden codes are revealed; the second deals with transformative exegesis, where the terms or meaning of the oracle are re-adapted, reapplied, or revised, all in accord with new understandings of the proper intent and purpose of the oracle. Thus, vague or imprecise predictions are specified for new times (like the ongoing revisions of Jeremiah's 70-year oracle), and earlier material is reused in new theological ways (like the way early material in the book of Isaiah is reused in the later strata).
Marc Mézard and Andrea Montanari
- Published in print:
- 2009
- Published Online:
- September 2009
- ISBN:
- 9780198570837
- eISBN:
- 9780191718755
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780198570837.003.0015
- Subject:
- Physics, Theoretical, Computational, and Statistical Physics
This chapter revisits the problem of decoding low-density parity-check (LDPC) codes. The maximum a posteriori probability (MAP) decoding of a bit is described as a statistical inference problem, and belief propagation is applied to its solution. The corresponding message-passing procedure is analyzed in detail, and the threshold noise level below which this ‘iterative decoding’ achieves perfect decoding is derived. The chapter ends with a general discussion of the relation between message passing and optimal (exact symbol MAP) decoding.
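As an illustration of the kind of message-passing procedure this abstract describes, here is a minimal sketch, not drawn from the chapter, of sum-product belief propagation decoding in the log-likelihood-ratio domain. The toy parity-check matrix `H`, the binary symmetric channel with flip probability `flip_prob`, and the function names are assumptions made for this example only.

```python
# A minimal sketch (not from the chapter) of sum-product belief propagation
# decoding for a toy parity-check code over a binary symmetric channel.
import numpy as np

H = np.array([[1, 1, 0, 1, 0, 0],      # toy parity-check matrix, 3 checks x 6 bits
              [0, 1, 1, 0, 1, 0],
              [1, 0, 1, 0, 0, 1]])

def bp_decode(received, flip_prob, max_iters=50):
    """Approximate bitwise (symbol) MAP decoding via sum-product message passing in the LLR domain."""
    m, n = H.shape
    # channel log-likelihood ratios: log P(y|x=0)/P(y|x=1) for a BSC
    llr_ch = np.where(received == 0, 1.0, -1.0) * np.log((1 - flip_prob) / flip_prob)
    msg_v2c = np.tile(llr_ch, (m, 1)) * H          # variable-to-check messages, edges only
    for _ in range(max_iters):
        # check-to-variable update: tanh rule over the other neighbours of each check
        msg_c2v = np.zeros((m, n))
        for a in range(m):
            for i in np.flatnonzero(H[a]):
                others = [j for j in np.flatnonzero(H[a]) if j != i]
                prod = np.prod(np.tanh(msg_v2c[a, others] / 2.0))
                msg_c2v[a, i] = 2.0 * np.arctanh(np.clip(prod, -0.999999, 0.999999))
        # variable-to-check update and tentative bit decisions
        total = llr_ch + msg_c2v.sum(axis=0)
        for i in range(n):
            for a in np.flatnonzero(H[:, i]):
                msg_v2c[a, i] = total[i] - msg_c2v[a, i]
        decoded = (total < 0).astype(int)
        if not np.any(H @ decoded % 2):            # all parity checks satisfied: stop
            break
    return decoded

print(bp_decode(np.array([0, 0, 1, 0, 0, 0]), flip_prob=0.1))  # all-zero codeword with one flipped bit
```

On this input the extrinsic messages from the two violated checks outweigh the channel evidence on the corrupted bit, and the sketch returns the all-zero codeword after a single iteration.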
Ranulfo Romo, Adrián Hernández, Luis Lemus, Rogelio Luna, Antonio Zainos, Verónica Nácher, Manuel Alvarez, Yuriria Vázquez, Silvia Cordero, and Liliana Camarillo
- Published in print:
- 2008
- Published Online:
- May 2009
- ISBN:
- 9780195369007
- eISBN:
- 9780199865253
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780195369007.003.0014
- Subject:
- Neuroscience, Molecular and Cellular Systems, Development
This chapter uses a highly simplified sensory discrimination task to show that the comparison between stored and ongoing sensory information takes place in a distributed fashion. There is also a continuum between sensory- and motor-related activities. Neurons from areas central to the S1 cortex do not simply wait for a signal encoding decision; rather, they participate at every step of its generation by combining working memory and sensory inputs. This process is carried out by two complementary neuronal responses. This dual representation is found in all areas central to the S1 cortex examined in this task, and might serve to compute optimally the entire perceptual process of the task. This coding scheme has also been found in some cortices of monkeys performing tasks that require behavioral decisions based on a comparison operation.
Edmund T. Rolls
- Published in print:
- 2002
- Published Online:
- May 2009
- ISBN:
- 9780195134971
- eISBN:
- 9780199864157
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780195134971.003.0023
- Subject:
- Neuroscience, Behavioral Neuroscience, Molecular and Cellular Systems
This chapter considers the functions of the orbitofrontal cortex. It shows that the orbitofrontal cortex is involved in decoding and representing some primary reinforcers such as taste and touch; in learning and reversing associations of visual and other stimuli to these primary reinforcers; in controlling and correcting reward-related and punishment-related behavior; and, thus, in emotion.
Mike Fortun
- Published in print:
- 2008
- Published Online:
- March 2012
- ISBN:
- 9780520247505
- eISBN:
- 9780520942615
- Item type:
- chapter
- Publisher:
- University of California Press
- DOI:
- 10.1525/california/9780520247505.003.0002
- Subject:
- Biology, Evolutionary Biology / Genetics
The author has a brief history of collaboration with Skúli Sigurdsson; the first publication of his academic career was coauthored with Sigurdsson. But two different career paths on the two different tectonic plates that diverge from each other under Iceland had kept additional layers from sedimenting into the friendship. So it was really through his work as advance man in September 1998, and on later trips, that Sigurdsson and the author became friends. In September 1998, on his first trip to Iceland, the author thought that deCODE Genetics was an interesting and potentially admirable project. Other ethnographers of Iceland at the time have suggested that “Iceland is emerging as the site of biotechnology and bioethics.”
Mike Fortun
- Published in print:
- 2008
- Published Online:
- March 2012
- ISBN:
- 9780520247505
- eISBN:
- 9780520942615
- Item type:
- chapter
- Publisher:
- University of California Press
- DOI:
- 10.1525/california/9780520247505.003.0006
- Subject:
- Biology, Evolutionary Biology / Genetics
In this chapter, the author narrates his trip to Húsavík, Iceland for a meeting with deCODE Genetics CEO Kári Stefánsson. Kári and the deCODE team were presenting their plans, wants, needs, wishes, and demands to a group of about fifteen physicians, nurses, and administrators. In this milieu, the author was at a sign nadir, clueless as to what was being said. But Kári was smooth, working the floor, speaking softly and confidently, pausing at the overhead projector to point to one of the few simple Icelandic phrases on the PowerPoint transparency. There was some questioning, but it was easy to tell that everyone seemed pleased, especially the deCODErs. Up in a corner of the meeting hall, on the wall near the ceiling, there were two fading words written in neat script: uppruni hvala, which means “origin of whales.”
Mike Fortun
- Published in print:
- 2008
- Published Online:
- March 2012
- ISBN:
- 9780520247505
- eISBN:
- 9780520942615
- Item type:
- chapter
- Publisher:
- University of California Press
- DOI:
- 10.1525/california/9780520247505.003.0007
- Subject:
- Biology, Evolutionary Biology / Genetics
In the deCODE mass media stories, no one leveraged the saga effect better than Robert Kunzig in his December 1998 piece in Discover, “Blood of the Vikings.” The issue was on U.S. newsstands as the Althingi sped and slogged to its final vote on the Health Sector Database legislation. There is no question that deCODE Genetics, genomics, the 1990s, and Iceland are all subject to the laws of fable, even if such laws should turn out to be unruly, unwritten, or unreadable. Speculation is surely one element of the unruly laws of fable. It involutes a future into the present, complementing the mythic foldings of past into present, generating anticipation; the excitement, thrill, and risk of awaiting the arrival of what might, or might not, come. Ever slow on the uptake, the author learned about how fable crosses with history not from the deCODE events themselves, but from the fable of another expat who returned to Iceland at the same time as deCODE Genetics CEO Kári Stefánsson: Keiko the killer whale, a.k.a. Free Willy.
Li Zhaoping
- Published in print:
- 2014
- Published Online:
- August 2014
- ISBN:
- 9780199564668
- eISBN:
- 9780191772504
- Item type:
- book
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199564668.001.0001
- Subject:
- Neuroscience, Development, Behavioral Neuroscience
This book explains computational principles and models of biological visual processing, in particular, of primate vision. Vision scientists unfamiliar with mathematical details should be able to conceptually follow the theoretical principles and their relationship with physiological, anatomical, and psychological observations, without going through the more mathematical pages. For readers with a physical science background, especially those from machine vision, this book serves as an analytical introduction to biological vision. It can be used as a textbook or a reference book in a vision course, or a computational neuroscience course, for graduate students or advanced undergraduate students. It is also suitable for self-learning by motivated readers. For readers with a focused interest in just one of the topics in the book, it is feasible to read just the chapter on this topic without having read or fully comprehended the other chapters. In particular, Chapter 2 is a brief overview of experimental observations on biological vision, Chapter 3 is on encoding of visual inputs, Chapter 5 is on visual attentional selection driven by sensory inputs, and Chapter 6 is on visual perception or decoding. There are many examples throughout the book to illustrate the application of computational principles to experimental observations.
Marc Mézard and Andrea Montanari
- Published in print:
- 2009
- Published Online:
- September 2009
- ISBN:
- 9780198570837
- eISBN:
- 9780191718755
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780198570837.003.0011
- Subject:
- Physics, Theoretical, Computational, and Statistical Physics
Low-density parity-check (LDPC) codes are among the most efficient error-correcting codes in use. This chapter introduces an important family of LDPC ensembles, based on random factor graphs, and studies some of their basic properties. It focuses on performance under optimal decoding, when no constraint is imposed on the computational complexity of the decoding procedure. Bounds on their performance are derived through an analysis of the geometric properties of their codebook. In particular, it shows that appropriately chosen LDPC ensembles allow for reliable communication at rates close to Shannon's capacity.
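To make the closing claim concrete, the following sketch, not taken from the chapter, compares the design rate of a (dv, dc)-regular LDPC ensemble with the Shannon capacity of a binary symmetric channel and samples a parity-check matrix from a configuration-model ensemble of random factor graphs. The ensemble parameters, the channel model, and the function names are illustrative assumptions.

```python
# A minimal sketch (not from the chapter) relating the design rate of a
# (dv, dc)-regular LDPC ensemble to the Shannon capacity of a binary
# symmetric channel; reliable communication requires rate < capacity.
import numpy as np

def binary_entropy(p):
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

def sample_regular_ldpc(n, dv, dc, rng=np.random.default_rng(0)):
    """Configuration-model sample of a (dv, dc)-regular parity-check matrix."""
    assert (n * dv) % dc == 0
    m = n * dv // dc                              # number of parity checks
    stubs = np.repeat(np.arange(n), dv)           # dv edge stubs per variable node
    rng.shuffle(stubs)
    H = np.zeros((m, n), dtype=int)
    for a in range(m):
        H[a, stubs[a * dc:(a + 1) * dc]] ^= 1     # attach dc stubs to each check
    return H

dv, dc, p = 3, 6, 0.08
design_rate = 1 - dv / dc                         # 1/2 for the (3, 6) ensemble
capacity = 1 - binary_entropy(p)                  # BSC capacity, ~0.598 bits per use
print(design_rate, capacity, design_rate < capacity)
print(sample_regular_ldpc(n=12, dv=dv, dc=dc).sum(axis=0))  # column weights (<= dv; double edges collapse)
```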
Marc Mézard and Andrea Montanari
- Published in print:
- 2009
- Published Online:
- September 2009
- ISBN:
- 9780198570837
- eISBN:
- 9780191718755
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780198570837.003.0021
- Subject:
- Physics, Theoretical, Computational, and Statistical Physics
In the limit of large block length, iteratively decoded low-density parity-check (LDPC) codes exhibit two phase transitions. At low noise, the bit error rate under belief propagation decoding vanishes. In a second regime, belief propagation decoding fails but maximum a posteriori probability (MAP) decoding succeeds. Finally, above a second noise threshold, decoding is impossible even with unbounded computational power. This chapter develops a common approach to these two transitions through the study of ‘metastable’ configurations of the bits that are not codewords. It identifies the belief propagation phase transition with the onset of a dynamical glass phase, detected through the one-step replica symmetry breaking approach. This is a structural phenomenon that spoils the performance of a large variety of decoders, from general iterative message-passing schemes to simulated annealing.
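A hedged sketch of how such a noise threshold can be located numerically: the recursion below is standard density evolution for a (dv, dc)-regular ensemble over the binary erasure channel, chosen here only because it reduces to a scalar fixed-point equation; the chapter's analysis is more general, and the channel, parameters, and function names are assumptions of this example.

```python
# A minimal sketch (not from the chapter) of density evolution for a
# (dv, dc)-regular LDPC ensemble over the binary erasure channel: the BP
# threshold is the largest erasure probability eps for which the erasure
# fraction carried by the messages is driven to zero.
def bp_threshold(dv, dc, tol=1e-10, iters=10_000):
    lo, hi = 0.0, 1.0
    for _ in range(60):                       # bisection on the channel parameter
        eps = (lo + hi) / 2
        x = eps                               # erasure probability of variable-to-check messages
        for _ in range(iters):
            x_new = eps * (1 - (1 - x) ** (dc - 1)) ** (dv - 1)
            if abs(x_new - x) < tol:
                break
            x = x_new
        if x < 1e-6:                          # decoding succeeds: eps is below threshold
            lo = eps
        else:
            hi = eps
    return (lo + hi) / 2

print(round(bp_threshold(3, 6), 4))           # ~0.4294 for the (3, 6) ensemble
```

For the (3, 6) ensemble this gives a BP threshold of roughly 0.429; the MAP threshold and the Shannon limit (0.5 for rate-1/2 codes on this channel) lie above it, consistent with the intermediate regime described in the abstract.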
Marc Mézard and Andrea Montanari
- Published in print:
- 2009
- Published Online:
- September 2009
- ISBN:
- 9780198570837
- eISBN:
- 9780191718755
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780198570837.003.0003
- Subject:
- Physics, Theoretical, Computational, and Statistical Physics
This chapter provides an elementary introduction to some basic concepts in theoretical computer science. It includes basic notions of graph theory and an informal introduction to computational complexity, presenting the basic classes P, NP, and NP-complete. These notions are illustrated by discussions of the minimal spanning tree and satisfiability problems, and by applications from statistical physics (spin glasses and maximum cuts), and from coding theory (decoding complexity).
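As a small companion to the minimal spanning tree problem mentioned here, the following sketch, not from the book, implements Kruskal's algorithm with a union-find structure; it is a standard polynomial-time (class P) procedure, in contrast to the NP-complete satisfiability problem also discussed. The graph, edge list, and function names are illustrative.

```python
# A minimal illustration (not from the chapter) of the minimal spanning tree
# problem it mentions: Kruskal's algorithm, a classic polynomial-time
# procedure, using a union-find structure over the vertices.
def minimum_spanning_tree(n_vertices, edges):
    """edges: list of (weight, u, v); returns the list of tree edges."""
    parent = list(range(n_vertices))

    def find(x):                          # union-find with path compression
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    tree = []
    for w, u, v in sorted(edges):         # consider edges in order of increasing weight
        ru, rv = find(u), find(v)
        if ru != rv:                      # keep the edge only if it joins two components
            parent[ru] = rv
            tree.append((w, u, v))
    return tree

edges = [(1, 0, 1), (4, 0, 2), (3, 1, 2), (2, 1, 3), (5, 2, 3)]
print(minimum_spanning_tree(4, edges))    # [(1, 0, 1), (2, 1, 3), (3, 1, 2)]
```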
Li Zhaoping
- Published in print:
- 2014
- Published Online:
- August 2014
- ISBN:
- 9780199564668
- eISBN:
- 9780191772504
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199564668.003.0006
- Subject:
- Neuroscience, Development, Behavioral Neuroscience
This chapter gives an account of experimental and computational investigations of visual perception or recognition. Perceptions, including illusions, are viewed as the outcomes of inferring or decoding properties of visual scenes from the neural responses to visual inputs. Emphasis is on understanding perception at both the physiological and behavioral levels through the use of computational principles. Maximum-likelihood decoding and Bayesian decoding approaches are introduced, and examples show how these approaches can be used to understand, e.g., contrast detection, color discrimination, motion direction perception, depth illusion, and the influences of context and prior experience on visual perception. Limits on visual decoding performance, due to inefficient use of the visual input information and likely caused by the attentional bottleneck, are highlighted. Likely neural architectures for implementing decoding are discussed.
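As a toy illustration of the maximum-likelihood decoding approach mentioned above (not an example from the chapter), the sketch below decodes a stimulus value from the spike counts of a small neural population, assuming Gaussian tuning curves and independent Poisson noise; the tuning parameters, population size, and function names are all assumptions of this example.

```python
# A minimal sketch (not from the chapter) of maximum-likelihood decoding of a
# stimulus (e.g., a motion direction) from the spike counts of a population of
# neurons with assumed Gaussian tuning curves and independent Poisson noise.
import numpy as np

rng = np.random.default_rng(1)
preferred = np.linspace(0, 180, 12, endpoint=False)       # preferred directions (deg), assumed
tuning = lambda s: 2 + 20 * np.exp(-0.5 * ((s - preferred) / 25.0) ** 2)  # mean rates for stimulus s

def ml_decode(counts, candidates=np.linspace(0, 180, 721)):
    """Return the candidate stimulus maximizing the Poisson log likelihood of the counts."""
    log_like = [np.sum(counts * np.log(tuning(s)) - tuning(s)) for s in candidates]
    return candidates[int(np.argmax(log_like))]

true_stimulus = 72.0
counts = rng.poisson(tuning(true_stimulus))                # one trial of spike counts
print(ml_decode(counts))                                   # estimate close to 72, up to Poisson noise
```

A Bayesian decoder of the kind the chapter also introduces would additionally weight each candidate by a prior over stimuli before taking the maximum (or the posterior mean).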
Russell Samolsky
- Published in print:
- 2011
- Published Online:
- May 2012
- ISBN:
- 9780823234790
- eISBN:
- 9780823241248
- Item type:
- chapter
- Publisher:
- Fordham University Press
- DOI:
- 10.5422/fordham/9780823234790.003.0003
- Subject:
- Literature, 20th-century and Contemporary Literature
This chapter examines Heart of Darkness' apocalyptic drive to power by establishing a dialectic of “hollowing out” and “filling in” as the mechanisms by which the text incorporates the African genocide into its textual field in a radical inflation of its technique of delayed decoding. It considers the Kurtz/Marlow pairing as the text's meditation on its future reception and performs the political intervention of setting a limit to the power of this text to consume mutilated bodies. Using Freud's analysis of the uncanny, the chapter turns the text's incorporation of African genocide back on itself, releasing an ethical counter-history. What the incorporated bodies now call up is the repressed memory of colonial genocide in the Congo, which is overwhelmed by its will to power over the Rwandan genocide. The chapter concludes by analyzing Heart of Darkness in relation to contemporary discourse on messianism.
Andrew Smith
- Published in print:
- 2010
- Published Online:
- July 2012
- ISBN:
- 9780719074462
- eISBN:
- 9781781700006
- Item type:
- chapter
- Publisher:
- Manchester University Press
- DOI:
- 10.7228/manchester/9780719074462.003.0006
- Subject:
- Literature, 19th-century and Victorian Literature
This chapter addresses the question of how to read and critically decode spectral messages, and it analyses the literary qualities of spirit messages. Among the literary works analysed is Eliot's ‘The Lifted Veil’, through which the chapter explores the relationship between the literary imagination and clairvoyance. The chapter also looks at Browning's poems in order to examine the mysterious transmission of literary ideas.
Pierre Gosselin, Gilles Kirouac, and François Y. Doré
- Published in print:
- 2005
- Published Online:
- March 2012
- ISBN:
- 9780195179644
- eISBN:
- 9780199847044
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780195179644.003.0012
- Subject:
- Psychology, Cognitive Psychology
This chapter presents two studies that examined the encoding and decoding of facial expressions of emotion portrayed by actors. Study 1 explored facial expressions portrayed by actors in two encoding conditions; Study 2 was concerned with the decoding of actors' portrayals of emotions from facial behavior. The results of Study 1 showed that the facial components of emotions portrayed by actors correspond, to a certain degree, to those that characterize genuine emotions, while Study 2 showed that the facial portrayals allowed decoders to judge the emotional category very well in each encoding condition. Studies showing how these problems were solved, or how they might be avoided in the future, are also discussed.
Thomas H. Flowers
- Published in print:
- 2006
- Published Online:
- November 2020
- ISBN:
- 9780192840554
- eISBN:
- 9780191917936
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/oso/9780192840554.003.0013
- Subject:
- Computer Science, History of Computer Science
Before the war in Europe started in 1939, I worked as an engineer in the Dollis Hill communications research laboratories of what was then the British Post Office and is now British Telecom. During the war I continued to work in the laboratories; luckily I was not conscripted into the armed forces. Early in 1942 I was directed to go to GCHQ, the Government Communications Headquarters, then at Bletchley Park. I was told that there I would be briefed concerning some top-secret work which they wanted our laboratories to do for them. At Bletchley Park I met Alan Turing. Turing was working on Enigma at that time, and it was he who wanted the top-secret work done—a machine to assist with decoding Enigma messages once the Bombe had produced the message settings. From then until the end of the war, I was a frequent visitor to GCHQ.

In the early years of the war, Alan Turing had saved Britain from defeat by the U-boats, by breaking the Enigma code used by the German Navy to communicate by radio with their ships at sea. Radio broadcasting is the only possible way of maintaining contact with mobile units like ships, tanks, and troops, but it is not secret—the transmissions can be intercepted by anyone with a suitable radio receiver. Therefore the messages must be encrypted before transmission. Even then the transmissions are secure only so long as the code remains unbroken by the enemy. The Germans were very sure that their high-grade ciphers could not be broken!

British intelligence services had many radio receiving stations at home and abroad, listening continuously to German military radio broadcasts. These stations sent the coded messages they intercepted to Bletchley Park. In 1940 Bletchley Park started to receive teleprinter messages in a code that they could not recognise. The Germans had invented a new coding machine specifically for teleprinter messages. Messages typed into this machine in plain language were automatically encoded before being transmitted. At the receiving end an identical machine automatically decoded the message and printed the plaintext on paper tape.
Marco Giunti
- Published in print:
- 1997
- Published Online:
- November 2020
- ISBN:
- 9780195090093
- eISBN:
- 9780197560600
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/oso/9780195090093.003.0006
- Subject:
- Computer Science, Mathematical Theory of Computation
The definition of a computational system that I proposed in chapter 1 (definition 3) employs the concept of Turing computability. In this chapter, however, I will show that this concept is not absolute, but instead depends on the relational structure of the support on which Turing machines operate. Ordinary Turing machines operate on a linear tape divided into a countably infinite number of adjacent squares. But one can also think of Turing machines that operate on different supports. For example, we can let a Turing machine work on an infinite checkerboard or, more generally, on some n-dimensional infinite array. I call an arbitrary support on which a Turing machine can operate a pattern field. Depending on the pattern field F we choose, we in fact obtain different concepts of computability. At the end of this chapter (section 6), I will thus propose a new definition of a computational system (a computational system on pattern field F) that takes into account the relativity of the concept of Turing computability. If F is a doubly infinite tape, however, computational systems on F reduce to computational systems.

Turing (1965) presented his machines as an idealization of a human being that transforms symbols by means of a specified set of rules. Turing based his analysis on four hypotheses:

1. The capacity to recognize, transform, and memorize symbols and rules is finite. It thus follows that any transformation of a complex symbol must always be reduced to a series of simpler transformations. These operations on elementary symbols are of three types: recognizing a symbol, replacing a symbol, and shifting the attention to a symbol that is contiguous to the symbol which has been considered earlier.

2. The series of elementary operations that are in fact executed is determined by three factors: first, the subject's mental state at a given time; second, the symbol which the subject considers at that time; third, a rule chosen from a finite number of alternatives.
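To make the elementary operations in this excerpt concrete, here is a minimal sketch, not from the chapter, of an ordinary Turing machine on a linear tape: at each step it reads the symbol under the head, writes a symbol, shifts attention to an adjacent square, and updates its state according to a finite rule table. The rule-table format and function name are assumptions of this example; the pattern-field generalization the chapter develops is not modeled here.

```python
# A minimal sketch (not from the chapter) of an ordinary Turing machine on a
# linear tape, illustrating the elementary operations the excerpt describes:
# read a symbol, write a symbol, shift attention to an adjacent square, and
# change state according to a finite rule table.
def run_turing_machine(rules, tape, state="start", head=0, max_steps=1000):
    """rules: {(state, symbol): (new_state, new_symbol, move)} with move in {-1, +1}."""
    tape = dict(enumerate(tape))                    # sparse tape; unwritten squares read as '_'
    for _ in range(max_steps):
        symbol = tape.get(head, "_")
        if (state, symbol) not in rules:            # no applicable rule: halt
            break
        state, tape[head], move = rules[(state, symbol)]
        head += move
    cells = [tape[i] for i in sorted(tape)]
    return state, "".join(cells)

# Example: a two-rule machine that flips every bit until it reaches a blank square.
flip_rules = {
    ("start", "0"): ("start", "1", +1),
    ("start", "1"): ("start", "0", +1),
}
print(run_turing_machine(flip_rules, "10110"))      # ('start', '01001')
```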
Michael Fortun
Roberto Reis (ed.)
- Published in print:
- 2008
- Published Online:
- March 2012
- ISBN:
- 9780520247505
- eISBN:
- 9780520942615
- Item type:
- book
- Publisher:
- University of California Press
- DOI:
- 10.1525/california/9780520247505.001.0001
- Subject:
- Biology, Evolutionary Biology / Genetics
Part detective story, part exposé, and part travelogue, this book investigates one of the signature biotechnology stories of our time and, in so doing, opens a window onto the high-speed, high-tech, and high-finance world of genome science. It investigates how deCODE Genetics, in Iceland, became one of the wealthiest, as well as one of the most scandalous, companies of its kind with its plan to use the genes and medical records of the entire Icelandic population for scientific research. Delving into the poetry of W. H. Auden, the novels of Halldór Laxness, and the perils of Keiko the killer whale, the book maps the contemporary genomics landscape at a time when we must begin to ask questions about what “life” is made of in the age of DNA, databases, and derivatives trading.
Simona Cocco and Rémi Monasson
- Published in print:
- 2005
- Published Online:
- November 2020
- ISBN:
- 9780195177374
- eISBN:
- 9780197562260
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/oso/9780195177374.003.0010
- Subject:
- Computer Science, Mathematical Theory of Computation
The computational effort needed to deal with large combinatorial structures varies considerably with the task to be performed and the resolution procedure used [425]. The worst-case complexity of a decision or optimization problem is defined as the time required by the best algorithm to treat any possible input to the problem. For instance, the worst-case complexity of the problem of sorting a list of n numbers scales as n log n: there exist several algorithms that can order any list in at most ~ n log n elementary operations, and none with asymptotically fewer operations. Unfortunately, the worst-case complexities of many important computational problems, called NP-complete, are not known. Partitioning a list of n numbers in two sets with equal partial sums is one among hundreds of known NP-complete problems. It is a fundamental conjecture of theoretical computer science that there exists no algorithm capable of partitioning any list of length n, or of solving any other NP-complete problem with inputs of size n, in a time bounded by a polynomial of n. Therefore, when trying to solve such a problem exactly, one necessarily uses algorithms that may take exponential time on some inputs.

Quantifying how “frequent” these hard inputs are for a given algorithm is the question answered by the analysis of algorithms. We will present an overview of recent work by physicists to address this point, and more precisely to characterize the average performance—hereafter simply called complexity—of a given algorithm over a distribution of inputs to a computational problem.

The history of algorithm analysis by physical methods and ideas is at least as old as the use of computers by physicists. One well-established chapter in this history is the analysis of Monte Carlo sampling algorithms for statistical mechanics models. It is well known that phase transitions, that is, abrupt changes in the physical properties of the model, can imply a dramatic increase in the time necessary for the sampling procedure. This phenomenon is commonly known as critical slowing down. The physicist's insight comes from the analogy between the dynamics of algorithms and the physical dynamics of the system. That analogy is quite natural: in fact many algorithms mimic the physical dynamics.
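To illustrate the number-partitioning problem cited above as an NP-complete example (a sketch for exposition, not taken from the chapter), the code below decides the partitioning question by exhaustive search over subsets, whose 2^n cost reflects the exponential worst-case behaviour expected of exact algorithms for NP-complete problems. The function name and test lists are illustrative.

```python
# A minimal sketch (not from the chapter) of the number-partitioning problem
# mentioned above: decide whether a list can be split into two sets with equal
# sums. The exhaustive search below inspects all 2^n subsets, illustrating the
# exponential worst-case cost of an exact, brute-force approach.
from itertools import combinations

def can_partition(numbers):
    total = sum(numbers)
    if total % 2:
        return False                       # an odd total can never split evenly
    target = total // 2
    for r in range(len(numbers) + 1):      # try every subset size
        for subset in combinations(numbers, r):
            if sum(subset) == target:
                return True
    return False

print(can_partition([3, 1, 4, 2, 2]))      # True: {3, 2, 1} and {4, 2}
print(can_partition([2, 3, 7]))            # False
```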