C. Julian Chen
- Published in print:
- 2007
- Published Online:
- September 2007
- ISBN:
- 9780199211500
- eISBN:
- 9780191705991
- Item type:
- book
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199211500.001.0001
- Subject:
- Physics, Condensed Matter Physics / Materials
The scanning tunneling microscope (STM) and the atomic force microscope (AFM), both capable of visualizing and manipulating individual atoms, are the cornerstones of nanoscience and nanotechnology today. The inventors of the STM, Gerd Binnig and Heinrich Rohrer, were awarded the Nobel Prize in Physics in 1986. Both microscopes are based on mechanically scanning an atomically sharp tip over a sample surface, with quantum-mechanical tunneling or atomic forces between the tip and the atoms on the sample as the measurable quantities. This book presents the principles of STM and AFM, together with the experimental details. Part I presents the principles from a unified point of view: the Bardeen theory of the tunneling phenomenon and the Herring-Landau theory of the covalent-bond force. The similarity between these two theories, both rooted in the Heisenberg-Pauling concept of quantum-mechanical resonance, points to the equivalence of tunneling and the covalent-bond force. The Tersoff-Hamann model of STM is presented, including its original derivation. The mechanisms of atomic-scale imaging with both STM and AFM are discussed. Part II presents the instrumentation and experimental techniques of STM and AFM, including piezoelectric scanners, vibration isolation, electronics and control, mechanical design, tip treatment and characterization, scanning tunneling spectroscopy, and atomic force detection techniques. Part II ends with illustrative applications of STM and AFM in various fields of research and technology.
Gastone Gilli and Paola Gilli
- Published in print:
- 2009
- Published Online:
- September 2009
- ISBN:
- 9780199558964
- eISBN:
- 9780191720949
- Item type:
- book
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199558964.001.0001
- Subject:
- Physics, Crystallography: Physics
The effects of the hydrogen bond (H-bond) are well known: it makes seawater liquid, joins cellulose microfibrils in sequoia trees, and shapes DNA into chromosomes and polypeptide chains into wool, hair, muscles, or enzymes. Its very nature, however, is much less well known, and we may still wonder why O-H···O energies range from less than 1 to more than 30 kcal/mol for no evident reason. This H-bond puzzle is tackled here by a new approach aimed at a full rationalization and comprehensive interpretation of the H-bond in terms of classical chemical-bond theories, starting from the very root of the problem: an extended compilation of H-bond energies and geometries derived from modern thermodynamic and structural databases. From this analysis new concepts emerge: new classes of systematically strong H-bonds (CAHBs and RAHBs: charge- and resonance-assisted H-bonds); a full classification of H-bonds into six classes (the chemical leitmotifs); and an assessment of the covalent nature of all strong H-bonds. This finally leads to three distinct though mutually consistent theoretical models able to rationalize the H-bond and predict its strength, based on classical VB theory (the electrostatic-covalent H-bond model, ECHBM), the matching of donor-acceptor acid-base parameters (the PA/pKa equalization principle), and the shape of the H-bond proton-transfer pathway (transition-state H-bond theory, TSHBT). A number of important chemical and biochemical systems in which strong H-bonds play an important functional role are surveyed, such as enzymatic catalysis, ion transport through cell membranes, crystal packing, prototropic tautomerism, and the molecular mechanisms of functional materials. Particular attention is paid to the drug-receptor binding process and to the interpretation of the enthalpy-entropy compensation phenomenon.
Mary McClintock Fulkerson
- Published in print:
- 2007
- Published Online:
- September 2007
- ISBN:
- 9780199296477
- eISBN:
- 9780191711930
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199296477.003.0004
- Subject:
- Religion, Religion and Society
This chapter focuses on worship practices. Three services are described, highlighting the combination of inscribed and incorporative practices as experienced by the author of this book. Following this participatory account is a reflection on the variety of proffered subject positions and affective resonances of each worship style.
Ben Brubaker, Daniel Bump, and Solomon Friedberg
- Published in print:
- 2011
- Published Online:
- October 2017
- ISBN:
- 9780691150659
- eISBN:
- 9781400838998
- Item type:
- chapter
- Publisher:
- Princeton University Press
- DOI:
- 10.23943/princeton/9780691150659.003.0010
- Subject:
- Mathematics, Combinatorics / Graph Theory / Discrete Mathematics
This chapter deals with noncritical resonances. A short pattern is resonant at i if lᵢ₊₁ = bᵢ. This property depends only on the associated prototype, so resonance is actually a property of prototypes. A first (middle) row entry aᵢ is called critical if it is equal to one of its four neighbors, which are lᵢ, lᵢ₊₁, bᵢ, and bᵢ₋₁. We say that the resonance at i is critical if either aᵢ or aᵢ₊₁ is critical. The chapter introduces the relevant theorem, stating that if t is a strict pattern with no critical resonances, then t′ is also strict with no critical resonances. It also chooses a pair of canonical indexings of Γ = Γₜ and Δ′ = Δₜ′.
Klaus Eder and Hans‐Jörg Trenz
- Published in print:
- 2003
- Published Online:
- April 2004
- ISBN:
- 9780199252268
- eISBN:
- 9780191601040
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/0199252262.003.0006
- Subject:
- Political Science, European Union
Chapter 6 sets out to explain the dynamics of multi-level governance as regards the evolution of forms of public communication and the making of a European public sphere. The central theoretical concern is discussed with empirical reference to core areas of governance in the fields of justice and home affairs. So far, research has mainly taken an intergovernmentalist perspective, which fails to explain the institutional dynamics of intensifying cooperation in these fields, cooperation that is slowly integrating the ‘European security community’ into an encompassing ‘area of justice, freedom and rights’. The intergovernmentalist account neglects two significant factors: first, that governments act within an expanding transnational field made up of norms, discourses, and institutions that increasingly constrain their action; second, that competitive actors within the field are increasingly linked to public monitoring of their activities. Consequently, the transnational field is transformed into a public space attended by different audiences with shifting attention and expectations. The term ‘transnational resonance structures’ is introduced to account for the integration and legitimation of forms of ‘loose coupling’ between international, European, and domestic politics as the organizing principle of governance in Europe.
Vince D. Calhoun and Tom Eichele
- Published in print:
- 2010
- Published Online:
- May 2010
- ISBN:
- 9780195372731
- eISBN:
- 9780199776283
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780195372731.003.0011
- Subject:
- Neuroscience, Techniques
Independent component analysis (ICA) is increasingly utilized as a tool for evaluating the hidden spatiotemporal structure contained within brain imaging data. This chapter first provides a brief overview of ICA and how ICA is applied to functional magnetic resonance imaging (fMRI) data. It then discusses group ICA and the application of group ICA for data fusion, with an emphasis on the methods developed within our group. It also discusses, within a larger context, the many alternative approaches that are feasible and currently in use.
Tom Eichele and Vince D. Calhoun
- Published in print:
- 2010
- Published Online:
- May 2010
- ISBN:
- 9780195372731
- eISBN:
- 9780199776283
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780195372731.003.0012
- Subject:
- Neuroscience, Techniques
This chapter introduces and applies the concept of parallel spatial and temporal unmixing with group independent component analysis (ICA) for concurrent electroencephalography-functional magnetic resonance imaging (EEG-fMRI). Hemodynamic response function (HRF) deconvolution and single-trial estimation in the fMRI data were employed, and the single-trial weights were used as predictors for the amplitude modulation in the EEG. For illustration, data from a previously published performance-monitoring experiment were analyzed, in order to identify error-preceding activity in the EEG modality. EEG components that displayed such slow trends, and which were coupled to the corresponding fMRI components, are described. Parallel ICA for analysis of concurrent EEG-fMRI on a trial-by-trial basis is a very useful addition to the toolbelt of researchers interested in multimodal integration.
Michael Wibral, Christoph Bledowski, and Georg Turi
- Published in print:
- 2010
- Published Online:
- May 2010
- ISBN:
- 9780195372731
- eISBN:
- 9780199776283
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780195372731.003.0014
- Subject:
- Neuroscience, Techniques
This chapter presents various strategies of combining separately recorded electroencephalography/magnetoencephalography (EEG/MEG) and functional magnetic resonance imaging (fMRI) data sets. To help the experimenter decide in the first place whether to use concurrent recordings of EEG and fMRI or separate recordings, it attempts to weigh the relative merits of combined versus separate EEG/MEG and fMRI measurements, and puts them in perspective with respect to various experimental goals. The principle of MEG recording and its advantages, as compared to EEG, are also described; these particular advantages of MEG recordings are important to consider because, at present, they are only available when data are recorded separately, due to the current incompatibility of MRI and MEG measurement equipment.
Ingmar Gutberlet
- Published in print:
- 2010
- Published Online:
- May 2010
- ISBN:
- 9780195372731
- eISBN:
- 9780199776283
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780195372731.003.0004
- Subject:
- Neuroscience, Techniques
This chapter focuses on the performance of parallel and concurrent electroencephalography (EEG) and functional magnetic resonance imaging (fMRI) measurements. Topics discussed include the technical challenges of recording EEG in the MR environment, types of MR-compatible equipment, patient safety during combined recordings, data acquisition considerations, and subject considerations.
Andrew P. Bagshaw and Christian-G. Bénar
- Published in print:
- 2010
- Published Online:
- May 2010
- ISBN:
- 9780195372731
- eISBN:
- 9780199776283
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780195372731.003.0005
- Subject:
- Neuroscience, Techniques
The simultaneous recording of electroencephalography (EEG) and functional magnetic resonance imaging (fMRI) has opened several new perspectives within the field of functional MRI. It allows the detailed temporal and spectral information contained in EEG to be incorporated into the analysis of fMRI signals. In addition, it permits the study of spontaneous brain activity that can only be observed with EEG, such as epileptic spikes, sleep spindles, or alpha waves. However, the price to pay for this technical advance is the deterioration of the quality of both signals, and, in particular, of EEG. This chapter first reviews the pros and cons of sparse and continuous EEG-fMRI. It then introduces advanced recording strategies aimed at improving the quality of EEG within a continuous sampling scheme.
Tom Eichele, Matthias Moosmann, Lei Wu, Ingmar Gutberlet, and Stefan Debener
- Published in print:
- 2010
- Published Online:
- May 2010
- ISBN:
- 9780195372731
- eISBN:
- 9780199776283
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780195372731.003.0006
- Subject:
- Neuroscience, Techniques
The simultaneous recording of electroencephalogram (EEG) and functional magnetic resonance imaging (fMRI) provides several advantages over multimodal integration based on separate EEG and fMRI recording protocols. However, the recording and analysis of simultaneous EEG-fMRI is not without pitfalls. The potential benefits of simultaneous recordings come at the expense of a massive, inevitable presence of artifacts, which corrupt the EEG signals recorded in the MR environment. This chapter presents different methods of EEG artifact correction. It also discusses the limitations of currently available approaches as well as possible future directions.
Karen Mullinger and Richard Bowtell
- Published in print:
- 2010
- Published Online:
- May 2010
- ISBN:
- 9780195372731
- eISBN:
- 9780199776283
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780195372731.003.0007
- Subject:
- Neuroscience, Techniques
This chapter first considers the detrimental effect of main magnetic field inhomogeneity in MRI, focusing on the particular effects of the differences in magnetic susceptibility between the materials used in EEG recording (electrodes and leads) and the human head. It then examines the importance of a uniform RF field in MRI, and analyzes how the presence of an EEG cap may affect the homogeneity of this field, thus reducing image quality. The ways in which these phenomena affect the signal-to-noise ratio (SNR), and cause local signal loss in EPI data collected for fMRI experiments, are discussed. The safety aspect of simultaneous EEG-fMRI, which is an extremely important consideration, is explored. Guidance is given relating to how to test the safety of an experimental set-up, and the importance of testing each new EEG recording arrangement in the scanner is emphasized. Finally, the chapter suggests some methods that may be used to overcome the problems of reduced MR image quality, which can be encountered when performing simultaneous EEG and fMRI.
Giancarlo Valente, Fabrizio Esposito, Federico de Martino, Rainer Goebel, and Elia Formisano
- Published in print:
- 2010
- Published Online:
- May 2010
- ISBN:
- 9780195372731
- eISBN:
- 9780199776283
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780195372731.003.0009
- Subject:
- Neuroscience, Techniques
This chapter examines the most relevant aspects concerning the use of independent component analysis (ICA) for the analysis of functional magnetic resonance imaging (fMRI) data. In particular, after illustrating the fMRI-ICA model (“Problem formulation and application to fMRI”), the chapter compares the most commonly used ICA algorithms in the context of fMRI data analysis. The problems of choosing the dimensionality of the ICA decomposition, and of selecting the “meaningful” components, are considered. Optimizations of the ICA algorithms for dealing with the specific spatiotemporal properties of the fMRI data, and extensions of the ICA to multisubject fMRI studies, are described. For each of these aspects, different approaches from various groups are briefly reviewed.
Christina S. Kraus, John Marincola, and Christopher Pelling (eds)
- Published in print:
- 2010
- Published Online:
- May 2010
- ISBN:
- 9780199558681
- eISBN:
- 9780191720888
- Item type:
- book
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199558681.001.0001
- Subject:
- Classical Studies, European History: BCE to 500CE
This volume collects essays written by colleagues and friends as a tribute to Tony Woodman, Gildersleeve Professor of Latin at the University of Virginia. These essays, like Woodman's own work, cover topics in Latin poetry, oratory, and Greek and Roman historiography. Recurrent themes are the importance of rhetoric and rhetorical training, the skilful use of language and recurrent motifs in narrative, the use and adaptation of topoi, the importance of intertextuality, and the subtle and varied ways in which literary texts can have a contemporary resonance for their own day.
Grant Hardy
- Published in print:
- 2010
- Published Online:
- May 2010
- ISBN:
- 9780199731701
- eISBN:
- 9780199777167
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199731701.003.0005
- Subject:
- Religion, Religion and Literature, World Religions
One important feature of Mormon's narration is his insertion of purported primary source documents into his history. Bernard Duyfhuizen's theories of “narratives of transmission” are applied to the Book of Mormon. Several examples of embedded documents are examined in detail, including a memoir of Zeniff, sermons of both King Benjamin and Alma the Younger, and letters of Helaman. In addition, this chapter looks at how such documents are integrated into the narrative. Mormon's editorial tactics can be analyzed in terms of “coherence,” “avoidance,” and “resonance,” all of which describe ways that the embedded documents are connected to larger storylines.
Eiichi Yamaguchi
- Published in print:
- 2006
- Published Online:
- September 2007
- ISBN:
- 9780199297320
- eISBN:
- 9780191711237
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199297320.003.0009
- Subject:
- Business and Management, Innovation
This chapter amplifies a number of the preceding themes and provides new insights through a case study of the blue LED invention. Distinguishing between ‘paradigm-disruptive’ innovation and ‘performance-disruptive’ innovation, it is argued that large firms face difficult problems with both, but especially the former (because of their ‘competency-enhancing’ innovation bias as well as growing bureaucracy, and more recently, restructuring). Using the blue LED case, it is argued that there is a greater chance of top managers following a hunch or creating a ‘field of resonance’ with their scientists in smaller companies. The implication of this study is a call for hastening the shift to a more open innovation system in Japan, with a greater role for startups and networks of innovative small firms.
Alan Corney
- Published in print:
- 2006
- Published Online:
- September 2007
- ISBN:
- 9780199211456
- eISBN:
- 9780191705915
- Item type:
- book
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199211456.001.0001
- Subject:
- Physics, Atomic, Laser, and Optical Physics
This book gives an account of the progress that has been made in the fields of atomic physics and laser spectroscopy during the last fifty years. The first five chapters prepare the foundations of atomic physics, classical electromagnetism, and quantum mechanics, which are necessary for an understanding of the interaction of electromagnetic radiation with free atoms. The application of these concepts to processes involving the spontaneous emission of radiation is then developed in Chapters 6, 7, and 8, while stimulated emission and the properties of gas and tunable dye lasers form the subject matter of Chapters 9 to 14. The last four chapters are concerned with the physics and applications of atomic resonance fluorescence, optical double-resonance, optical pumping, and atomic beam magnetic resonance.
André Longtin
- Published in print:
- 2009
- Published Online:
- February 2010
- ISBN:
- 9780199235070
- eISBN:
- 9780191715778
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199235070.003.0004
- Subject:
- Mathematics, Biostatistics
This chapter concerns the influence of noise and periodic rhythms on the firing patterns of neurons in their subthreshold regime. Such a regime conceals many computations that lead to successive decisions to fire or not fire, and noise and rhythms are important components of these decisions. We first consider a Type II neuron model, the FitzHugh-Nagumo model, characterized by a resonant frequency. In the subthreshold regime, noise induces firings with a regularity that increases with noise intensity. At a certain finite noise level, the regularity may be maximized, but this depends on the numerical implementation of an absolute refractory period. We discuss measures of this coherence resonance based on the coefficient of variation (CV) of interspike intervals and spike train power spectra. We then characterize its phase locking to periodic input, and how this locking is modified by noise. This lays the foundation for understanding how noise can express subthreshold signals in the spike train. We discuss measures and qualitative features of this stochastic resonance across all time-scales of periodic forcing. We show how the resonance relates to firing once per forcing cycle, on average, or submultiples thereof at higher forcing frequencies where refractory effects come into play. For slow forcing the optimal noise is independent of forcing period. We then discuss coherence resonance and stochastic resonance in the quadratic integrate-and-fire model of Type I dynamics. The presence of a full coherence resonance depends on the interpretation of the model, particularly the boundaries for firing and reset. Our study is motivated by the observation of randomly phase-locked firing activity in a large number of neurons, especially those involved in transducing physical stimuli such as temperature, sound, pressure, and electric fields, but also in central neurons involved in the generation of various rhythms.
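The CV measure mentioned in this abstract can be illustrated with a minimal sketch (not taken from the chapter): a quadratic integrate-and-fire neuron driven only by noise in its subthreshold regime, whose interspike intervals are collected and summarized by their coefficient of variation. All parameter values here are illustrative assumptions chosen so that noise-induced escapes over the firing threshold occur at a reasonable rate.

```python
import math
import random

# Noise-driven quadratic integrate-and-fire (QIF) neuron, Euler-Maruyama scheme.
# With I < 0 the deterministic model is subthreshold: stable rest at v = -1,
# unstable threshold at v = +1; only noise can trigger a spike.
random.seed(42)

dt = 0.005            # integration time step
T = 1000.0            # total simulated time
I = -1.0              # subthreshold drive (illustrative)
sigma = 1.5           # noise intensity (illustrative)
v_peak, v_reset = 10.0, -10.0

v, t = -1.0, 0.0
spike_times = []
sqrt_dt = math.sqrt(dt)
while t < T:
    v += (v * v + I) * dt + sigma * sqrt_dt * random.gauss(0.0, 1.0)
    if v >= v_peak:            # spike detected: record time and reset
        spike_times.append(t)
        v = v_reset
    t += dt

# Interspike intervals and their coefficient of variation (CV):
# CV near 1 indicates Poisson-like irregularity, CV near 0 clock-like firing.
isis = [b - a for a, b in zip(spike_times, spike_times[1:])]
mean = sum(isis) / len(isis)
var = sum((x - mean) ** 2 for x in isis) / len(isis)
cv = math.sqrt(var) / mean
print(f"{len(isis)} intervals, mean ISI = {mean:.1f}, CV = {cv:.2f}")
```

Sweeping `sigma` and plotting the resulting CV would trace out the coherence-resonance curve the chapter discusses; as the abstract notes, the depth of any CV minimum depends on details such as the reset boundary and any refractory period imposed.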
Alan Corney
- Published in print:
- 2006
- Published Online:
- September 2007
- ISBN:
- 9780199211456
- eISBN:
- 9780191705915
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199211456.003.0018
- Subject:
- Physics, Atomic, Laser, and Optical Physics
This chapter develops the theory of the hyperfine structure of atoms involving nuclear magnetic dipole and electric quadrupole moments. The Zeeman effect in weak, intermediate, and strong magnetic ...
More
This chapter develops the theory of the hyperfine structure of atoms involving nuclear magnetic dipole and electric quadrupole moments. The Zeeman effect in weak, intermediate, and strong magnetic fields is considered. The experimental measurement of the hyperfine structure of ground state atoms by the techniques of optical pumping, atomic beam magnetic resonance, and optical double resonance is explained. The caesium beam atomic clock, the importance of hyperfine structure experiments in hydrogen, and the investigation of hyperfine structure of excited states are discussed.Less
This chapter develops the theory of the hyperfine structure of atoms involving nuclear magnetic dipole and electric quadrupole moments. The Zeeman effect in weak, intermediate, and strong magnetic fields is considered. The experimental measurement of the hyperfine structure of ground state atoms by the techniques of optical pumping, atomic beam magnetic resonance, and optical double resonance is explained. The caesium beam atomic clock, the importance of hyperfine structure experiments in hydrogen, and the investigation of hyperfine structure of excited states are discussed.
V. Andrew Stenger
- Published in print:
- 2006
- Published Online:
- February 2010
- ISBN:
- 9780198565741
- eISBN:
- 9780191723971
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780198565741.003.0017
- Subject:
- Neuroscience, Behavioral Neuroscience
Neuroimaging plays a major role in furthering our understanding of the orbitofrontal cortex (OFC). This chapter presents some of the technical challenges, limitations, and potential solutions with regard to using Magnetic Resonance Imaging (MRI) to study the OFC. The proximity of the OFC to the sinus results in signal loss and distortion due to inhomogeneity in magnetic susceptibility. Several techniques have proven useful in reducing signal loss and distortion including shorter echo times, thinner slice acquisitions, shimming, post-processing distortion correction using field maps, reduced data acquisition methods, parallel imaging, use of rapid acquisition trajectories including reverse spiral and spiral in-out sequences, gradient compensation, and tailored radiofrequency pulses. The advantages and disadvantages of each of these imaging techniques are discussed.