Helmuth Spieler
- Published in print:
- 2005
- Published Online:
- September 2007
- ISBN:
- 9780198527848
- eISBN:
- 9780191713248
- Item type:
- book
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780198527848.001.0001
- Subject:
- Physics, Particle Physics / Astrophysics / Cosmology
Semiconductor sensors patterned at the micron scale combined with custom-designed integrated circuits have revolutionized semiconductor radiation detector systems. Designs covering many square meters with millions of signal channels are now commonplace in high-energy physics, and the technology is finding its way into many other fields, ranging from astrophysics to experiments at synchrotron light sources and medical imaging. This book presents a discussion of the many facets of highly integrated semiconductor detector systems, covering sensors, signal processing, transistors, circuits, low-noise electronics, and radiation effects. To lay a basis for the more detailed discussions in the book and aid in understanding how these different elements combine to form functional detector systems, the text includes introductions to semiconductor physics, diodes, detectors, signal formation, transistors, amplifier circuits, electronic noise mechanisms, and signal processing. A chapter on digital electronics includes key elements of analog-to-digital converters and an introduction to digital signal processing. The physics of radiation damage in semiconductor devices is discussed and applied to detectors and electronics. The diversity of design approaches is illustrated in a chapter describing systems in high-energy physics, astronomy, and astrophysics. Finally, a chapter, ‘Why things don't work’, discusses common pitfalls, covering interference mechanisms such as power supply noise, microphonics, and shared current paths (‘ground loops’), together with mitigation techniques for pickup noise reduction at both the circuit and system level. Beginning at a basic level, the book provides a unique introduction to a key area of modern science.
Andrew J. Connolly, Jacob T. VanderPlas, and Alexander Gray
- Published in print:
- 2014
- Published Online:
- October 2017
- ISBN:
- 9780691151687
- eISBN:
- 9781400848911
- Item type:
- chapter
- Publisher:
- Princeton University Press
- DOI:
- 10.23943/princeton/9780691151687.003.0010
- Subject:
- Physics, Particle Physics / Astrophysics / Cosmology
This chapter summarizes the fundamental concepts and tools for analyzing time series data. Time series analysis is a branch of applied mathematics developed mostly in the fields of signal processing and statistics. Contributions to this field, from an astronomical perspective, have predominantly focused on unevenly sampled data, low signal-to-noise data, and heteroscedastic errors. The chapter starts with a brief introduction to the main concepts in time series analysis. It then discusses the main tools from the modeling toolkit for time series analysis. Despite being set in the context of time series, many tools and results are readily applicable in other domains, and for this reason the examples presented will not be strictly limited to time-domain data. Armed with the modeling toolkit, the chapter goes on to discuss the analysis of periodic time series and the search for temporally localized signals, and concludes with a brief discussion of stochastic processes.
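The emphasis above on unevenly sampled data can be made concrete with the Lomb-Scargle periodogram, the standard tool for detecting periodicity when observations do not fall on a regular grid. A minimal sketch using scipy's `lombscargle` (the observation times, period, and noise level are illustrative, not taken from the chapter):

```python
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(0)
# Irregularly spaced observation times over ~100 days
t = np.sort(rng.uniform(0, 100, 300))
true_period = 7.0  # days (illustrative)
y = np.sin(2 * np.pi * t / true_period) + 0.3 * rng.normal(size=t.size)

# Scan a grid of angular frequencies and compute the periodogram
freqs = np.linspace(0.01, 3.0, 5000)  # rad/day
power = lombscargle(t, y - y.mean(), freqs, normalize=True)

best_period = 2 * np.pi / freqs[np.argmax(power)]
```

The periodogram peak recovers the injected period without any interpolation or resampling, which an FFT-based approach would require on data like this.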
R. Duncan Luce
- Published in print:
- 1991
- Published Online:
- January 2008
- ISBN:
- 9780195070019
- eISBN:
- 9780199869879
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780195070019.003.0005
- Subject:
- Psychology, Cognitive Models and Architectures
This chapter explores the limited literature available on designs in which there is little or no experimenter-imposed time structure on signal presentations. The chapter first examines the earliest literature, in which signals occur infrequently. The next section considers experiments that resemble typical simple reaction-time experiments in that the signal rate is relatively high. The final part of the chapter is devoted to studies on the impact the processing of one signal has on the processing of a second signal when the second occurs very shortly after the first.
Marc O. Ernst and Massimiliano Di Luca
- Published in print:
- 2011
- Published Online:
- September 2012
- ISBN:
- 9780195387247
- eISBN:
- 9780199918379
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780195387247.003.0012
- Subject:
- Psychology, Cognitive Neuroscience, Cognitive Psychology
The brain receives information about the environment from all the sensory modalities, including vision, touch, and audition. To interact efficiently with the environment, this information must eventually converge to form a reliable and accurate multimodal percept. This process is often complicated by the existence of noise at every level of signal processing, which makes the sensory information derived from the world unreliable and inaccurate. There are several ways in which the nervous system may minimize the negative consequences of noise in terms of reliability and accuracy. Two key strategies are to combine redundant sensory estimates and to use prior knowledge. This chapter elaborates further on how these strategies may be used by the nervous system to obtain the best possible estimates from noisy signals.
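The strategy of combining redundant sensory estimates described here has a standard quantitative form: under independent Gaussian noise, the statistically optimal combination weights each cue by its reliability (inverse variance), and the combined estimate is never less reliable than either cue alone. A minimal sketch (the cue values and variances are illustrative):

```python
# Reliability-weighted (maximum-likelihood) combination of two noisy cues,
# assuming independent Gaussian noise on each estimate.

def combine(x1, var1, x2, var2):
    """Inverse-variance weighted average of two estimates."""
    w1 = 1.0 / var1
    w2 = 1.0 / var2
    x_combined = (w1 * x1 + w2 * x2) / (w1 + w2)
    var_combined = 1.0 / (w1 + w2)  # never larger than either input variance
    return x_combined, var_combined

# Vision estimates 10.0 (variance 1.0), touch estimates 12.0 (variance 4.0):
# the more reliable visual cue dominates the combined percept.
x, v = combine(10.0, 1.0, 12.0, 4.0)
# x = 10.4, v = 0.8
```

The pull toward the more reliable cue, and the reduced combined variance, are exactly the signatures reported in multisensory integration experiments of this kind.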
Dean J. Krusienski, Dennis J. McFarland, and José C. Principe
- Published in print:
- 2012
- Published Online:
- May 2012
- ISBN:
- 9780195388855
- eISBN:
- 9780199932689
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780195388855.003.0007
- Subject:
- Neuroscience, Techniques
The purpose of a brain-computer interface (BCI) is to detect and quantify characteristics of brain signals that indicate what the user wants the BCI to do, to translate these measurements in real time into the desired device commands, and to provide concurrent feedback to the user. The brain-signal characteristics used for this purpose are called signal features, or simply features. Feature extraction is the process of distinguishing the pertinent signal characteristics from extraneous content and representing them in a compact and/or meaningful form, amenable to interpretation by a human or computer. This chapter focuses on feature extraction. It begins with an overview of some basic principles of digital signal processing and a discussion of common techniques used to enhance signals prior to feature extraction. It then covers method selection, typical processing protocols, and major established methods for BCI feature extraction.
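Band-power features are among the most common BCI features of the kind the chapter surveys. A minimal sketch of extracting mu- and beta-band power from one channel with Welch's method (the sampling rate, band edges, and synthetic signal are illustrative, not from the chapter):

```python
import numpy as np
from scipy.signal import welch

fs = 250.0  # Hz; a typical EEG sampling rate (illustrative)
rng = np.random.default_rng(1)
t = np.arange(0, 4.0, 1 / fs)
# Synthetic single-channel signal: a 10 Hz mu-band rhythm plus white noise
sig = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.normal(size=t.size)

# Power spectral density estimate via Welch's averaged periodogram
f, psd = welch(sig, fs=fs, nperseg=256)

def band_power(f, psd, lo, hi):
    """Total PSD in a frequency band (sum of the bins in [lo, hi])."""
    mask = (f >= lo) & (f <= hi)
    return psd[mask].sum()

mu_power = band_power(f, psd, 8, 12)     # mu-band feature
beta_power = band_power(f, psd, 18, 26)  # beta-band feature
```

In a real BCI these per-band powers, computed per channel and per time window, would form the feature vector passed to the translation stage.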
Helmuth Spieler
- Published in print:
- 2005
- Published Online:
- September 2007
- ISBN:
- 9780198527848
- eISBN:
- 9780191713248
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780198527848.003.0005
- Subject:
- Physics, Particle Physics / Astrophysics / Cosmology
This chapter first describes logic functions and illustrates how false signals are formed due to timing errors. Basic logic circuits are described (NMOS, PMOS, CMOS) together with propagation delays and power dissipation. The application of logic circuitry in analog-to-digital converters is illustrated, with a detailed discussion of conversion flaws (channel profile, integral and differential non-linearity, rate dependence). Different types of analog-to-digital converters are described (flash, successive approximation, Wilkinson, and pipelined ADCs) with their strengths and shortcomings. The chapter closes with a brief discussion of sampling, the Nyquist criterion, and digital signal processing.
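The successive-approximation converter mentioned above performs a bit-by-bit binary search against an internal DAC, settling one bit per clock from MSB to LSB. A minimal behavioural sketch (the resolution and reference voltage are illustrative):

```python
# Behavioural model of a successive-approximation (SAR) ADC:
# binary search for the largest code whose DAC output stays below v_in.

def sar_adc(v_in, n_bits=8, v_ref=1.0):
    """Convert v_in (0..v_ref) to an n-bit code by successive approximation."""
    code = 0
    for bit in range(n_bits - 1, -1, -1):
        trial = code | (1 << bit)
        # DAC output for the trial code; keep the bit if still at or below v_in
        if trial * v_ref / (1 << n_bits) <= v_in:
            code = trial
    return code

code = sar_adc(0.4, n_bits=8, v_ref=1.0)
# 0.4 V on an 8-bit, 1 V range: floor(0.4 * 256) = 102
```

The n-clock conversion time, independent of the input value, is what distinguishes the SAR architecture from the single-clock flash and input-dependent Wilkinson converters the chapter compares it with.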
Alan K. Bowman and Roger S. O. Tomlin
- Published in print:
- 2005
- Published Online:
- January 2012
- ISBN:
- 9780197262962
- eISBN:
- 9780191734533
- Item type:
- chapter
- Publisher:
- British Academy
- DOI:
- 10.5871/bacad/9780197262962.003.0002
- Subject:
- Archaeology, Archaeological Methodology and Techniques
The imaging of ancient documents presents several challenges, the nature of which is determined by the character of the text, the material on which it is written, and the state of preservation. This chapter discusses the struggle to read and interpret Latin manuscripts from Roman Britain. These manuscripts come mainly in three forms: texts written in ink on thin wooden leaves, texts inscribed with a metal stylus on wax-coated wooden stilus tablets, and texts incised on sheets of lead. The chapter focuses on the problems of imaging and signal processing for the texts found on the Vindolanda stilus tablets. These problems in interpreting ancient texts arise from two identifiable sources of difficulty. The first is the problem of seeing and identifying, in abraded and damaged documents, what is to be read. The second arises from the character of the text itself, which determines the reader's ability to decipher and interpret it.
Larry R. Squire
- Published in print:
- 2009
- Published Online:
- May 2009
- ISBN:
- 9780195380101
- eISBN:
- 9780199864362
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780195380101.003.0013
- Subject:
- Neuroscience, History of Neuroscience
This chapter presents an autobiography of Nobuo Suga. Suga and his collaborators explored the neural mechanisms for parallel and hierarchical processing of biosonar information and the cortical maps representing different types of biosonar information. They also explored the role of the corticofugal (descending) auditory system in the improvement and adjustment of auditory signal processing and the neural circuit for plastic changes in the central auditory system elicited by auditory fear conditioning. His early years, career, and achievements are discussed.
Robert M. Stern, William J. Ray, and Karen S. Quigley
- Published in print:
- 2000
- Published Online:
- March 2012
- ISBN:
- 9780195113594
- eISBN:
- 9780199846962
- Item type:
- book
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780195113594.001.0001
- Subject:
- Psychology, Health Psychology
This is a revised edition of a book on psychophysiological recording. It includes information on the most up-to-date equipment used today for brain scanning and discusses other equipment not available in 1980. A new chapter on signal processing and analysis has been added, and discussions cover nonlinear systems as well as cognitive psychophysiology.
Moody T. Chu and Gene H. Golub
- Published in print:
- 2005
- Published Online:
- September 2007
- ISBN:
- 9780198566649
- eISBN:
- 9780191718021
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780198566649.003.0002
- Subject:
- Mathematics, Applied Mathematics
Inverse eigenvalue problems arise in a remarkable variety of applications. This chapter briefly highlights a few applications. The discussion is divided into six categories of applications: feedback control, applied mechanics, inverse Sturm-Liouville problem, applied physics, numerical analysis, and signal and data processing. Each category covers some additional problems.
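As a toy illustration of the construction underlying these problems: a symmetric matrix with a prescribed spectrum can always be built from an orthogonal similarity transform. The inverse eigenvalue problems surveyed in the chapter add structural constraints (e.g. Jacobi/tridiagonal form, or nonnegativity) that make the reconstruction far harder; the eigenvalues below are illustrative only.

```python
import numpy as np

# Build a symmetric matrix with a prescribed spectrum: A = Q diag(L) Q^T
# for any orthogonal Q. Structured inverse eigenvalue problems constrain
# the form of A, which removes this freedom in choosing Q.
rng = np.random.default_rng(0)
target = np.array([1.0, 3.0, 7.0])            # prescribed eigenvalues
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))  # random orthogonal matrix
A = Q @ np.diag(target) @ Q.T

recovered = np.sort(np.linalg.eigvalsh(A))    # spectrum of A, ascending
```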
Joanna Demers
- Published in print:
- 2010
- Published Online:
- September 2010
- ISBN:
- 9780195387650
- eISBN:
- 9780199863594
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780195387650.003.0002
- Subject:
- Music, History, Western, History, American
Chapter 2 explores how electronica conceives of sound material as a metaphor. Compared with post-Schaefferian electroacoustic music, electronica spends less time dictating listeners’ responses. Sounds of the outside world, sounds of other works, and sounds newly created all figure in electronica. What matters in electronica is not the origins of sound so much as the metaphors that portray sound as malleable material, the product of construction, reproduction, or destruction. These metaphors often correspond to actual sound-production techniques. Construction is often synonymous with sound synthesis, reproduction with sound sampling, and destruction with the defacement of the phonographic medium. But of course, the most interesting moments in electronica occur when the metaphor describing sound does not correspond with the actual means of producing it. When digital signal processing hides or disguises the provenance of a sound, listeners can hear in an old sound something supposedly new.
Jay M. Goldberg, Victor J. Wilson, Kathleen E. Cullen, Dora E. Angelaki, Dianne M. Broussard, Jean A. Büttner-Ennever, Kikuro Fukushima, and Lloyd B. Minor
- Published in print:
- 2012
- Published Online:
- May 2012
- ISBN:
- 9780195167085
- eISBN:
- 9780199932153
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780195167085.003.0012
- Subject:
- Neuroscience, Sensory and Motor Systems
The cerebellum is critical for proper timing and patterning during voluntary movements. In conjunction with the vestibular system, it also modulates the motor pathways that control balance and gaze. There are five main regions of the cerebellum that receive inputs from the vestibular nerve or vestibular nuclei: the nodulus and ventral uvula; the flocculus and ventral paraflocculus; the oculomotor vermis of the posterior lobe; vermal lobules I and II of the anterior lobe; and the deep cerebellar nuclei. This chapter focuses on the signal processing in each of these regions. The inputs to and projections from each region are considered, followed by a review of the findings of single-unit and lesion studies that have provided insights into the functional roles of each region. The chapter begins with a consideration of the general features of cerebellar structure and function.
Peter Manning
- Published in print:
- 2013
- Published Online:
- September 2013
- ISBN:
- 9780199746392
- eISBN:
- 9780199332496
- Item type:
- book
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199746392.001.0001
- Subject:
- Music, Theory, Analysis, Composition, Popular
In this new edition of the classic text on the history and evolution of electronic music, Peter Manning extends the definitive account of the medium from its birth to include key developments from the dawn of the 21st century to the present day. After explaining the antecedents of electronic music from the turn of the 20th century to the Second World War, Manning discusses the emergence of the early ‘classical’ studios of the 1950s, and the subsequent evolution of more advanced analogue technologies during the 1960s and ‘70s, leading in turn to the birth and development of the MIDI synthesizer. Attention then turns to the characteristics of the digital revolution, from the pioneering work of Max Mathews at Bell Telephone Laboratories in the 1950s to the wealth of resources available today, facilitated by the development of the personal computer and allied digital technologies. The scope and extent of the technical and creative developments that have taken place since the late 1990s are considered in an extended series of new and updated chapters. These include topics such as the development of the digital audio workstation, laptop music, the Internet, and the emergence of new performance interfaces. Manning offers a critical perspective of the medium in terms of the philosophical and technical features that have shaped its growth. Emphasizing the functional characteristics of emerging technologies and their influence on the creative development of the medium, Manning covers key developments in both commercial and the non-commercial sectors to provide readers with the most comprehensive resource available on the evolution of this ever-expanding area of creativity.
Andrew P. Yonelinas
- Published in print:
- 2002
- Published Online:
- March 2012
- ISBN:
- 9780198508809
- eISBN:
- 9780191687396
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780198508809.003.0003
- Subject:
- Psychology, Cognitive Psychology
The examination of recognition memory confidence judgements indicates that there are two separate components or processes underlying episodic memory. Recollection is assumed to reflect a threshold process whereby qualitative information about the study event is retrieved, whereas familiarity reflects a classical signal-detection process whereby items exceeding a familiarity response criterion are accepted as having been studied. Evidence from cognitive, neuropsychological, and neuroimaging studies indicates that the model agrees with existing recognition results, and that recollection and familiarity are behaviourally, neurally, and phenomenologically distinct memory retrieval processes.
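The signal-detection account of familiarity invoked here has a standard quantitative summary: sensitivity d′ and criterion c, computed from hit and false-alarm rates under the equal-variance Gaussian model. A minimal sketch (the rates are illustrative, not data from the chapter):

```python
from scipy.stats import norm

def d_prime(hit_rate, fa_rate):
    """Sensitivity index d' under the equal-variance Gaussian model."""
    return norm.ppf(hit_rate) - norm.ppf(fa_rate)

def criterion(hit_rate, fa_rate):
    """Response criterion c; positive values indicate a conservative observer."""
    return -0.5 * (norm.ppf(hit_rate) + norm.ppf(fa_rate))

dp = d_prime(0.85, 0.20)   # ~1.88 for these rates
c = criterion(0.85, 0.20)  # slightly liberal observer here
```

Shifting the familiarity response criterion trades hits against false alarms while leaving d′ fixed, which is what makes confidence-based ROC analysis of the kind described above possible.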
Hermann Kolanoski and Norbert Wermes
- Published in print:
- 2020
- Published Online:
- September 2020
- ISBN:
- 9780198858362
- eISBN:
- 9780191890710
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/oso/9780198858362.003.0017
- Subject:
- Physics, Atomic, Laser, and Optical Physics, Particle Physics / Astrophysics / Cosmology
The electronic readout and processing of detector signals generated by radiation in detectors is today by far the most common form of signal acquisition in particle physics. This chapter discusses typical procedures for the electronic readout of detectors, with special attention to small, noise-prone signals. An overview is given of standard signal processing techniques, such as amplification, pulse shaping, discrimination, and digitization, together with recent developments in microelectronics. In applications with high data rates, as at modern accelerator experiments or in (X-ray) image processing, dead times can occur; these are discussed in a dedicated section, as are the waveguide properties of signal cables. Often the signals, in particular those of semiconductor detectors, are so small that electronic noise and its suppression play an important role.
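The readout chain named above (amplification, pulse shaping, discrimination, digitization) can be sketched numerically. The following Python sketch is illustrative only and not taken from the chapter: it applies a discrete CR-RC shaper (an assumed time constant of 100 ns) to an idealized preamplifier step, then performs a simple leading-edge discrimination and an 8-bit digitization.

```python
import numpy as np

def cr_rc_shape(signal, tau, dt):
    """One-pole discrete approximation of an analog CR-RC shaper.

    A CR (high-pass, differentiating) stage followed by an RC
    (low-pass, integrating) stage turns a step-like preamplifier
    output into a unipolar pulse peaking near t = tau.
    """
    a = tau / (tau + dt)           # pole coefficient for both stages
    cr = np.zeros_like(signal)     # CR (high-pass) stage output
    rc = np.zeros_like(signal)     # RC (low-pass) stage output
    for i in range(1, len(signal)):
        cr[i] = a * (cr[i - 1] + signal[i] - signal[i - 1])
        rc[i] = rc[i - 1] + (1 - a) * (cr[i] - rc[i - 1])
    return rc

dt = 1e-9                          # 1 ns sampling step
t = np.arange(0.0, 2e-6, dt)
step = 1.0 * (t > 0.2e-6)          # idealized preamplifier step output
shaped = cr_rc_shape(step, tau=100e-9, dt=dt)

threshold = 0.1                    # discriminator threshold (arbitrary units)
fired = shaped.max() > threshold   # leading-edge discrimination
adc = np.round(shaped / shaped.max() * 255).astype(int)  # 8-bit digitization
```

With equal CR and RC time constants the shaped pulse peaks at roughly 1/e of the step amplitude, which is why the (assumed) threshold of 0.1 fires here.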
Max A. Little
- Published in print:
- 2019
- Published Online:
- October 2019
- ISBN:
- 9780198714934
- eISBN:
- 9780191879180
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/oso/9780198714934.003.0007
- Subject:
- Mathematics, Logic / Computer Science / Mathematical Philosophy, Mathematical Physics
Linear systems theory, based on the mathematics of vector spaces, is the backbone of all “classical” DSP and a large part of statistical machine learning. The basic idea -- that linear algebra applied to a signal can be of substantial practical value -- has counterparts in many areas of science and technology. In other areas of science and engineering, linear algebra is often justified by the fact that it is an excellent model for real-world systems. For example, in acoustics the theory of (linear) wave propagation emerges from the linearization of small pressure disturbances about the equilibrium pressure in classical fluid dynamics. Similarly, the theory of electromagnetic waves is also linear. Except when a signal emerges from a justifiably linear system, in DSP and machine learning we do not have any particular correspondence to reality to back up the choice of linearity. However, the mathematics of vector spaces, particularly when applied to systems which are time-invariant and jointly Gaussian, is highly tractable, elegant, and immensely useful.
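The two properties that make this machinery tractable, linearity (superposition) and time invariance, can be checked directly for any system realized as a convolution. A minimal Python sketch (the impulse response `h` and the inputs are arbitrary assumptions, not drawn from the text):

```python
import numpy as np

rng = np.random.default_rng(0)
h = np.array([0.5, 0.3, 0.2])      # impulse response of an assumed LTI filter

def system(x):
    """A linear time-invariant system: convolution of the input with h."""
    return np.convolve(x, h)

x1 = rng.standard_normal(16)
x2 = rng.standard_normal(16)
a, b = 2.0, -3.0

# Linearity (superposition): T(a*x1 + b*x2) == a*T(x1) + b*T(x2)
lhs = system(a * x1 + b * x2)
rhs = a * system(x1) + b * system(x2)
linear = np.allclose(lhs, rhs)

# Time invariance: delaying the input by d samples delays the output by d
d = 4
shifted_in = np.concatenate([np.zeros(d), x1])
time_invariant = np.allclose(system(shifted_in)[d:], system(x1))
```

Both checks pass for convolution, which is exactly why LTI systems are fully characterized by their impulse response.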
Peter Manning
- Published in print:
- 2013
- Published Online:
- September 2013
- ISBN:
- 9780199746392
- eISBN:
- 9780199332496
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199746392.003.0023
- Subject:
- Music, Theory, Analysis, Composition, Popular
The development of synthesis and signal processing software since the late 1990s, facilitated by the exponential growth in the power and versatility of the associated hardware, has materially transformed the opportunities for exploring the medium of computer music during the early part of the 21st century. Although the foundations for these developments had been clearly established by the end of the last century, the true extent and significance of what was to follow could not have been fully anticipated, not least in terms of the resulting empowerment of the individual, working without the enhanced support of a research institution. This important shift in the locus of activity has opened up new perspectives for the future development of the medium, and these form an integral part of the critique that emerges from this chapter.
Jonathan Sterne
- Published in print:
- 2021
- Published Online:
- August 2020
- ISBN:
- 9780197511121
- eISBN:
- 9780197511169
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/oso/9780197511121.003.0007
- Subject:
- Music, Psychology of Music, History, Western
Many music technologists have sought to reproduce the sonic signature of analog audio devices—amplifiers, compressors, signal processors, instruments—in the software domain. Assessing the effectiveness of the analog model involves a confusing mix of math, sound, electronics, appearances, and feelings, all of which are negotiated differently by users and designers. Drawing on ethnographic research at major software companies, along with close analysis of the technologies themselves, this chapter argues that “analog modeling” in the digital domain is one of the latest chapters in the long history of hearing tests, for it is in the moment of the listening test that engineers and users attempt to resolve competing epistemologies of sound. The listening test thus offers a privileged point of entry into both the classification and the experience of digital sound technologies.
Jussi Parikka
- Published in print:
- 2016
- Published Online:
- January 2018
- ISBN:
- 9781474409483
- eISBN:
- 9781474426954
- Item type:
- chapter
- Publisher:
- Edinburgh University Press
- DOI:
- 10.3366/edinburgh/9781474409483.003.0010
- Subject:
- Philosophy, Aesthetics
This chapter addresses a non-linear signal archaeology that connects Cold War architectures to current politics of global surveillance. In the wake of the NSA/PRISM/Snowden revelations in June 2013, it was discovered that Britain still has a “secret listening post” in the heart of Berlin. The story about Britain’s involvement in Berlin is indicative of some continuities in the Cold War narratives that persist, and some media technological practices that never disappeared: from the Teufelsberg listening post in Berlin to the current NSA culture, we are forced to admit the significance of what Thomas Elsaesser referred to as the S/M perversion of cinematic media: the centrality of technical media in Surveillance and Military. Indeed, excavating “signal architecture archaeologies” means looking at those non-human spaces built for signals – a preparation for the war conducted over signals, or what nowadays is referred to as “cyberwar”. This theme haunts the abandoned buildings and remnants of the Cold War like Teufelsberg, which is approached poetically as a haunted signal space: the ghosts that characterise military architectures are not dead souls of humans, but the non-human pings of massive infrastructures of signal processing.
Peter Manning
- Published in print:
- 2004
- Published Online:
- October 2011
- ISBN:
- 9780195144840
- eISBN:
- 9780199849802
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780195144840.003.0018
- Subject:
- Music, Theory, Analysis, Composition
In considering the development of the personal computer as a musical tool, attention has so far focused primarily on applications that make relatively modest demands on the underlying technology, for example the development of software for editing and regulating control data within the MIDI environment. Tasks such as audio synthesis and signal processing were to prove a far more challenging prospect, in many instances requiring the services of additional hardware. The evolution of audio processing systems based on the personal computer is discussed, taking into account some interesting institutional research programs dating from the pre-MIDI era that directly influenced these developments.